When using an API that provides streaming data, I want to process every response from the server as close to real time as possible.
I would appreciate advice on whether there is a more appropriate approach for the following situation:
the server sends a response for every event, plus a heartbeat message every 15 seconds to keep the connection alive.
Currently I am using requests on Python 3.4 and writing:

    res = requests.get(url, headers=HEADERS, params=payload, stream=True)  # stream=True
    # process one line at a time with iter_lines()
    for line in res.iter_lines(chunk_size=64):
        line = line.decode('utf-8')
        # do something...
When I initially used iter_lines() without specifying chunk_size, the default was chunk_size=512, and five or six heartbeat messages were processed together at once.
By reducing chunk_size I was able to shorten the processing interval somewhat, but I still have not reached the point of handling each response individually.
I am also concerned that certain responses will be blocked in the meantime because of I/O waits.
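As background on why heartbeats arrive in batches: iter_lines() only splits lines out of whatever chunk the underlying read hands it, so with chunk_size=512 a line is not seen until a full chunk containing it has been read. A simplified, self-contained sketch of that line-reassembly logic (not the actual requests implementation, just an illustration of the buffering):

```python
def iter_lines_from_chunks(chunks):
    """Reassemble newline-delimited messages from arbitrary byte chunks,
    roughly the way requests' iter_lines() does internally."""
    pending = b""
    for chunk in chunks:
        pending += chunk
        lines = pending.split(b"\n")
        pending = lines.pop()  # the last piece may be an incomplete line
        for line in lines:
            yield line
    if pending:
        yield pending


# A line split across two chunks is only yielded once the second chunk
# arrives -- which is exactly the latency observed with large chunk_size.
chunks = [b"heartbeat\nheart", b"beat\nhb\n"]
print(list(iter_lines_from_chunks(chunks)))
```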
I think this could be solved with a library or framework that handles asynchronous processing, so I have read documentation for asyncio, aiohttp, and Twisted, but I do not know whether asyncio is the right choice.
I think this part of asyncio is relevant, but I do not understand it well:
http://docs.python.jp/3/library/asyncio-stream.html
Could you tell me as much as possible about the above two points?
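For reference, a minimal sketch of what the linked asyncio streams API looks like in use. It runs a throwaway local server so the example is self-contained; the event payloads are hypothetical, and the async/await syntax requires Python 3.5+ (on 3.4 you would write @asyncio.coroutine with yield from instead):

```python
import asyncio


async def handle(reader, writer):
    # Server side: emit two newline-delimited events, then close.
    writer.write(b"heartbeat\n")
    writer.write(b"data: 42\n")
    await writer.drain()
    writer.close()


async def read_events(host, port):
    # Client side: readline() returns as soon as each '\n' arrives,
    # so every event is handled individually, without chunk buffering.
    reader, writer = await asyncio.open_connection(host, port)
    events = []
    while True:
        line = await reader.readline()
        if not line:  # b"" means the server closed the connection
            break
        events.append(line.decode("utf-8").rstrip("\n"))
    writer.close()
    return events


async def main():
    # Port 0 lets the OS pick a free port for the demo server.
    server = await asyncio.start_server(handle, "127.0.0.1", 0)
    port = server.sockets[0].getsockname()[1]
    events = await read_events("127.0.0.1", port)
    server.close()
    await server.wait_closed()
    return events


if __name__ == "__main__":
    print(asyncio.run(main()))  # each line is handled as soon as it arrives
```

The key difference from iter_lines() is that StreamReader.readline() yields control to the event loop while waiting, so other tasks can run during the I/O wait instead of being blocked.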
If you want to take it a step further, create an Event class and define specialized event subclasses such as Heartbeat(Event). A worker thread pushes event objects onto a queue, and on the main thread you run an event loop that calls the appropriate handler for each event.
What do you think of an approach like this? I believe the same thing can be done with asyncio, but I have not tried it yet, so I cannot say.
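The worker-thread-plus-queue design above can be sketched as follows; the event types, the simulated input lines, and the handlers are hypothetical stand-ins for the real stream:

```python
import queue
import threading


class Event:
    def __init__(self, payload):
        self.payload = payload


class Heartbeat(Event):
    pass


class Message(Event):
    pass


def worker(lines, q):
    # Worker thread: parse each raw line into a typed event and push it.
    for line in lines:
        if line == "heartbeat":
            q.put(Heartbeat(line))
        else:
            q.put(Message(line))
    q.put(None)  # sentinel: no more events


def run_event_loop(q, handlers):
    # Main thread: pop events and dispatch to the handler for each type.
    # q.get() blocks only this loop; the worker keeps reading meanwhile.
    results = []
    while True:
        event = q.get()
        if event is None:
            break
        results.append(handlers[type(event)](event))
    return results


if __name__ == "__main__":
    q = queue.Queue()
    lines = ["heartbeat", "data: 1", "heartbeat"]  # stand-in for the stream
    t = threading.Thread(target=worker, args=(lines, q))
    t.start()
    handlers = {Heartbeat: lambda e: "hb", Message: lambda e: e.payload}
    print(run_event_loop(q, handlers))
    t.join()
```

Because queue.Queue is thread-safe, the worker can keep reading from the socket while the main thread is busy in a handler, so a slow handler no longer delays the network read.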
© 2024 OneMinuteCode. All rights reserved.