Python Co-routine

Looking back at the websocket demo, the Python websockets module takes advantage of asyncio, which provides an elegant coroutine-based API.

So what is a coroutine (协程)? This talk is worth rewatching whenever the details fade: it covers the pros and cons of multi-processing, multi-threading, and coroutines, then starts from generators + yield and works its way up to asyncio, all from a Python point of view. Demo code on GitHub: https://github.com/jreese/pycon

A coroutine is a variant of a function that enables concurrency via cooperative multitasking: a task yields control when it is waiting on an external resource, i.e., once it has finished its current piece of work and needs something else before it can continue, it hands control over to another task, which makes coroutines a great fit for IO-bound work. All tasks can run in one thread in one process (it simulates multithreading inside a single thread and is still single-threaded at heart; note that the Python threading module is effectively the same because of the GIL). This beats multi-threading and multi-processing in some situations, because switching between kernel mode and user mode costs system time and resources, whereas coroutine switches are controlled by the user and never have to trap into the kernel.
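
Here is a minimal sketch of that idea (the worker/main names and the 0.1s/0.15s delays are made up for illustration): two asyncio tasks interleave on one thread, yielding control whenever they wait on a simulated IO call.

import asyncio
import threading

async def worker(name: str, delay: float) -> None:
    for i in range(3):
        # awaiting asyncio.sleep stands in for waiting on an external resource;
        # the task yields control here so the other task can run
        await asyncio.sleep(delay)
        print(f"{name} step {i} on thread {threading.current_thread().name}")

async def main() -> None:
    # both tasks run concurrently on the same event loop, in a single thread
    await asyncio.gather(worker("task-a", 0.1), worker("task-b", 0.15))

asyncio.run(main())

Both workers report MainThread, confirming that the "concurrency" is cooperative scheduling inside one thread rather than parallel execution.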

To analyze a function's bytecode, the Python dis module can help. For example:

from dis import dis

def func():
    pass

dis(func)  # print the disassembled bytecode of func
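
dis is also handy for seeing what yield compiles to. As a quick sketch (the gen function is just for illustration), disassembling a generator function shows a YIELD_VALUE instruction that a plain function like func above does not have:

from dis import dis

def gen():
    received = yield 1   # yields 1 out, can receive a value back via send()

dis(gen)                 # in CPython the output contains a YIELD_VALUE opcode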

The video walks through an example implemented in naive Python code using the yield keyword: yield can not only produce data, it can also receive values from the outside through the generator's send() method. When the whole yield process finishes, a StopIteration exception is raised. This captures the basic idea behind asyncio.
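
Below is a minimal sketch in that spirit (not the exact code from the talk): a generator yields values out, receives values back through send(), and raises StopIteration when it finishes.

def echo_coroutine():
    total = 0
    while total < 3:
        # yield sends `total` out and pauses until send() delivers a value in
        received = yield total
        print(f"received: {received}")
        total += 1
    return "done"   # the return value rides along on StopIteration

coro = echo_coroutine()
print(next(coro))          # advance to the first yield -> prints 0
print(coro.send("hello"))  # resumes with "hello", prints it, then yields 1
print(coro.send("world"))  # resumes with "world", prints it, then yields 2
try:
    coro.send("bye")       # the loop ends, so StopIteration is raised
except StopIteration as e:
    print(e.value)         # "done"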

For example, asyncio with aiohttp:

import asyncio
import time
# aiohttp: an HTTP library with asyncio support
from aiohttp import request

URLS = [
    "https://2019.northbaypython.org",
    "https://duckduckgo.com",
    "https://jreese.sh",
    "https://news.ycombinator.com",
    "https://python.org",
]

# Coroutines with aiohttp
async def fetch(url: str) -> str:
    async with request("GET", url) as r:
        return await r.text("utf-8")


async def main():
    coros = [fetch(url) for url in URLS]
    # gather runs the tasks concurrently
    # *coros: unpack the awaitable objects as positional arguments
    results = await asyncio.gather(*coros)
    for result in results:
        print(f"{result[:20]!r}")

if __name__ == "__main__":
    asyncio.run(main())
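
To see what gather actually buys, here is a rough sketch that uses asyncio.sleep to stand in for the network round trips (so it runs without aiohttp); the fake_fetch name and the 0.3s delays are invented for illustration. The sequential version takes roughly the sum of the delays, while the gathered version takes roughly the longest single delay.

import asyncio
import time

async def fake_fetch(delay: float) -> float:
    await asyncio.sleep(delay)   # stands in for a network round trip
    return delay

async def main() -> None:
    delays = [0.3, 0.3, 0.3]

    start = time.monotonic()
    for d in delays:
        await fake_fetch(d)      # sequential: each await waits for the previous one
    print(f"sequential: {time.monotonic() - start:.2f}s")   # ~0.90s

    start = time.monotonic()
    await asyncio.gather(*(fake_fetch(d) for d in delays))  # concurrent
    print(f"gathered:   {time.monotonic() - start:.2f}s")   # ~0.30s

asyncio.run(main())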