How to execute GET requests as fast as possible in a Python Flask app
I have a task:
Send GET requests to two API endpoints, /api/1 and /api/2, which return lists of ids.
For each id from step 1, send a GET request to /api/x/${id}/data.
I then use this data to fill the elements on a web page. The page is built with Python Flask and uWSGI.
The number of URLs I'm sending requests to is currently 30 and will be at most 60 in the future.
My current solution uses asyncio and aiohttp, but I think I'm using them wrong.
Here is my simplified code:
```python
import asyncio
import json

import aiohttp
import requests


class MyClass:
    async def fetch_page(self, session, url):
        async with session.get(url) as response:
            return await response.text()

    async def pull_data(self):
        first_endpoint = requests.get('https://domain/api/1')
        first_endpoint = json.loads(first_endpoint.text)
        second_endpoint = requests.get('https://domain/api/2')
        second_endpoint = json.loads(second_endpoint.text)
        endpoints = {**first_endpoint, **second_endpoint}

        urls = []
        for data in endpoints:
            urls.append("https://domain/api/x/" + str(data['id']) + "/data")

        async with aiohttp.ClientSession() as session:
            html = await asyncio.wait([self.fetch_page(session, url) for url in urls])
            for x in html:
                # here I get the result data and send it
                ...
```
And this is how I call it:
```python
loop = asyncio.get_event_loop()

@app.route('/')
def index():
    global loop
    my_obj = MyClass()
    loop.run_until_complete(my_obj.pull_data())
```
First point: if there will be more endpoints in step 1 later, I need to parallelise that part of the code too, but how? Should I create a new aiohttp session, or somehow reuse the existing one?
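One pattern that addresses both parts of this question: a single session can be shared by the listing requests and the detail requests, with `asyncio.gather` collecting results in order (unlike `asyncio.wait`, which returns `(done, pending)` sets of Task objects rather than results). The sketch below is self-contained, so `FakeSession` and its `fetch` method are hypothetical stand-ins that simulate the network with `asyncio.sleep`; with real aiohttp you would pass a `ClientSession` and await `fetch_page` instead.

```python
import asyncio

# Hypothetical stand-in for an aiohttp.ClientSession: fetch() simulates
# a network round trip so the sketch runs without a real server.
class FakeSession:
    async def fetch(self, url):
        await asyncio.sleep(0.01)  # simulated network latency
        if url.endswith('/api/1'):
            return [{'id': 1}, {'id': 2}]
        if url.endswith('/api/2'):
            return [{'id': 3}]
        return {'url': url, 'data': 'payload'}

async def pull_data(session):
    # Fetch both listing endpoints concurrently with the SAME session --
    # no need to open a second one.
    listings = await asyncio.gather(
        session.fetch('https://domain/api/1'),
        session.fetch('https://domain/api/2'),
    )
    ids = [item['id'] for listing in listings for item in listing]
    # Then fan out over all detail URLs, again sharing the session.
    # gather() returns the results in the order the coroutines were passed.
    details = await asyncio.gather(
        *(session.fetch(f"https://domain/api/x/{i}/data") for i in ids)
    )
    return details

results = asyncio.run(pull_data(FakeSession()))
```

With real aiohttp, the same shape applies: open one `ClientSession` in `pull_data` and pass it to every coroutine, since the session pools connections across requests.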
Also, my page loads in 2 s, which is still slow. What load time is achievable in this scenario? (Without fetching the data, the page always loads in under 0.5 s.)
So my question is: which library/approach should I choose for this, and how do I use it correctly to speed up my page load? I would be happy with a 1 s load time.
The other approach I was considering is to run a separate process alongside this Flask app that fetches the data and caches it somewhere, but I'm afraid I then won't always get fresh data on each page reload.
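If staleness of a few seconds is acceptable, a small time-to-live cache can reconcile the two goals: each reload sees data at most `ttl` seconds old, and only one fetch per `ttl` window hits the upstream APIs. A minimal sketch; `TTLCache` and `fetch_fn` are illustrative names, not from any library:

```python
import threading
import time

class TTLCache:
    """Caches the result of fetch_fn and refreshes it once it is older
    than ttl seconds, so readers see data at most ttl seconds stale."""
    def __init__(self, fetch_fn, ttl=5.0):
        self._fetch_fn = fetch_fn
        self._ttl = ttl
        self._lock = threading.Lock()
        self._value = None
        self._fetched_at = 0.0

    def get(self):
        with self._lock:
            now = time.monotonic()
            if self._value is None or now - self._fetched_at > self._ttl:
                self._value = self._fetch_fn()  # refresh on demand
                self._fetched_at = now
            return self._value

# Demo with a counting fetch function standing in for the real API calls.
calls = []
def fetch_fn():
    calls.append(time.monotonic())
    return {'data': len(calls)}

cache = TTLCache(fetch_fn, ttl=0.05)
first = cache.get()
second = cache.get()   # within ttl: served from cache, no fetch
time.sleep(0.06)
third = cache.get()    # ttl expired: refreshed
```

The trade-off is explicit: `ttl` is the maximum staleness a reload can observe, so it can be tuned down toward "always fresh" at the cost of more upstream requests.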
Tags: python, flask, python-asyncio, aiohttp
Your Flask app will always depend on the slowest GET request. You can use as much yield/await as you like; if your API endpoints don't respond fast enough, you can't accelerate it. How fast is a single request? Does the server scale to multiple requests? If the server the API runs on (maybe a single-threaded Flask API) isn't capable of serving your multiple parallel requests fast enough, or even in parallel, you can't accelerate it.
– olisch
Nov 9 at 22:48
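The point about the slowest request can be made concrete: with concurrent requests, total wall time is bounded below by the slowest single request, not by the sum of all of them. A self-contained sketch with simulated latencies (`asyncio.sleep` stands in for real GET requests):

```python
import asyncio
import time

# Simulated API calls with different latencies, standing in for real
# GET requests, to show that concurrent wall time is roughly the
# slowest request rather than the sum of all requests.
async def fake_request(latency):
    await asyncio.sleep(latency)
    return latency

async def main():
    latencies = [0.05, 0.1, 0.2]
    start = time.perf_counter()
    await asyncio.gather(*(fake_request(l) for l in latencies))
    return time.perf_counter() - start

elapsed = asyncio.run(main())
# elapsed is close to 0.2 s (the slowest request), not 0.35 s (the sum)
```

So once the fan-out is concurrent, the remaining lever is the latency of the single slowest upstream call, which only the API server can improve.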
Can you denormalize the data a bit such that a single request fetches more of your desired resources?
– duhaime
Nov 9 at 22:53
@duhaime no I can't
– roboo
Nov 9 at 22:59
As a suggestion, you could move to Quart (pgjones.gitlab.io/quart/index.html); it's an async fork of Flask and I have found it really good.
– Jack Herer
Nov 10 at 1:15
And although this article (hackernoon.com/…) is a couple of years old, it will help you understand what you need to know.
– Jack Herer
Nov 10 at 1:16
asked Nov 9 at 22:24 by roboo; edited Nov 9 at 22:59 by lgwilliams