Issue
What's the correct way to reuse Python Requests connections in Django across multiple HTTP requests? This is what I'm doing currently:
import requests

def do_request(data):
    return requests.get('http://foo.bar/', data=data, timeout=4)

def my_view1(request):
    req = do_request({'x': 1})
    ...

def my_view2(request):
    req = do_request({'y': 2})
    ...
So, I have one function that makes the request. This function is called in various Django views. The views get called in separate HTTP requests by users. My question is: Does Python Requests automatically reuse the same connections (via urllib3 connection pooling)?
Or do I have to first create a Requests session object to work with?
s = requests.Session()

def do_request(data):
    return s.get('http://foo.bar/', data=data, auth=('user', 'pass'), timeout=4).text
And if so, does the session object have to be created in global scope or should it be inside the function?
def do_request(data):
    s = requests.Session()
    return s.get('http://foo.bar/', data=data, auth=('user', 'pass'), timeout=4).text
I can have multiple HTTP requests at the same time, so the solution needs to be thread safe. I'm a newbie to connection pooling, so I really am not sure, and the Requests docs aren't that extensive here.
Solution
Create a session and keep it alive: either pass it through your functions and return it, or create the session object at global or class level, so the same session (and its pooled connections) is reused whenever it is referenced. It will work like a charm.
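For illustration, here is a minimal sketch of the module-level variant described above, reusing the placeholder URL, credentials, and view names from the question:

import requests

# One Session per worker process; its underlying urllib3 connection pool
# is reused by every call made through it.
session = requests.Session()
session.auth = ('user', 'pass')

def do_request(data):
    # All views go through the shared session, so keep-alive connections
    # to foo.bar can be reused instead of opening a new one each time.
    return session.get('http://foo.bar/', data=data, timeout=4)

def my_view1(request):
    resp = do_request({'x': 1})
    ...

def my_view2(request):
    resp = do_request({'y': 2})
    ...

Because the session is created once at module import time, every view call in the same worker process reuses it, which is the behaviour the question is after.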
Answered By - Vikas Ojha