Async socket server as a producer with workers consuming it


I started playing with the asyncio module, and I was wondering whether it is possible to build a TCP server that puts work into a queue so that a pool of workers can execute it.

I tried merging code from the examples in the Python documentation.

import asyncio

async def handle_echo(reader, writer):
    data = await reader.read(100)
    message = data.decode()
    addr = writer.get_extra_info('peername')

    print(f"Received {message!r} from {addr!r}")

    print(f"Send: {message!r}")
    writer.write(data)
    await writer.drain()

    print("Close the connection")
    writer.close()

async def main():
    server = await asyncio.start_server(
        handle_echo, '127.0.0.1', 8888)

    addr = server.sockets[0].getsockname()
    print(f'Serving on {addr}')

    async with server:
        await server.serve_forever()

asyncio.run(main())

and the worker:

import asyncio
import random
import time


async def worker(name, queue):
    while True:
        # Get a "work item" out of the queue.
        sleep_for = await queue.get()

        # Sleep for the "sleep_for" seconds.
        await asyncio.sleep(sleep_for)

        # Notify the queue that the "work item" has been processed.
        queue.task_done()

        print(f'{name} has slept for {sleep_for:.2f} seconds')


async def main():
    # Create a queue that we will use to store our "workload".
    queue = asyncio.Queue()

    # Generate random timings and put them into the queue.
    total_sleep_time = 0
    for _ in range(20):
        sleep_for = random.uniform(0.05, 1.0)
        total_sleep_time += sleep_for
        queue.put_nowait(sleep_for)

    # Create three worker tasks to process the queue concurrently.
    tasks = []
    for i in range(3):
        task = asyncio.create_task(worker(f'worker-{i}', queue))
        tasks.append(task)

    # Wait until the queue is fully processed.
    started_at = time.monotonic()
    await queue.join()
    total_slept_for = time.monotonic() - started_at

    # Cancel our worker tasks.
    for task in tasks:
        task.cancel()
    # Wait until all worker tasks are cancelled.
    await asyncio.gather(*tasks, return_exceptions=True)

    print('====')
    print(f'3 workers slept in parallel for {total_slept_for:.2f} seconds')
    print(f'total expected sleep time: {total_sleep_time:.2f} seconds')


asyncio.run(main())

As soon as I started coding, a lot of questions came to my mind.

Does the server create its own event loop?

Can I run the server while workers consume jobs from a queue that the server fills?

Is there any good guide for this kind of application, or one that guides people through the maze of new terms that async brings?

1 Answer

Answered by WisdomPill:

I'm not really sure, but the issue seems to have been that the queue was creating its own event loop, so I had to create it inside the main async function. Switching between start_serving and serve_forever made no difference. I am still trying things out and studying the documentation, so I will accept this answer for the moment.
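To illustrate the pitfall, here is a minimal sketch of my own (not part of the original answer). The exact behaviour depends on the Python version: on 3.8/3.9, asyncio.Queue captured the current event loop at construction time, while since 3.10 it binds lazily to the running loop and this failure mode is gone (on newer versions this sketch simply waits forever, since nothing ever fills the queue).

import asyncio

# Created at import time: on Python 3.8/3.9 the queue captures
# (and possibly creates) the default event loop here, which is NOT
# the loop that asyncio.run() creates below.
broker = asyncio.Queue()

async def consume():
    # get() on an empty queue creates a Future on the captured loop;
    # awaiting it from the other loop can raise
    # "RuntimeError: ... got Future ... attached to a different loop".
    return await broker.get()

asyncio.run(consume())

The working version follows: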

from asyncio import Queue, create_task, gather, run, start_server


async def do_work(name: str, broker: Queue):
    # Consume jobs from the queue forever.
    while True:
        data = await broker.get()

        print(f'worker `{name}` is consuming {data}')

        # Mark the job as processed so broker.join() could track progress.
        broker.task_done()


async def main():
    # Create the queue inside the coroutine so that it is bound to the
    # event loop created by asyncio.run().
    broker = Queue(maxsize=512)

    async def handler(reader, writer):
        # read() with no size argument reads until the client
        # closes its end of the connection (EOF).
        data = await reader.read()
        message = data.decode()
        addr = writer.get_extra_info('peername')

        print(f'Received {message!r} from {addr!r}')

        print(f'Send: {message!r}')
        writer.write(data)
        await writer.drain()

        print('Add work')
        await broker.put(data)

        print('Close the connection')
        writer.close()
        await writer.wait_closed()

    server = await start_server(handler, '127.0.0.1', 8888)

    addr = server.sockets[0].getsockname()
    print(f'Serving on {addr}')

    # await server.start_serving()

    workers = []
    for i in range(3):
        worker = create_task(do_work(f'worker-{i}', broker))
        workers.append(worker)

    # await gather(*workers)
    async with server:
        await server.serve_forever()


if __name__ == '__main__':
    run(main())
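For completeness, here is a small client sketch to exercise the server above (my own addition, not part of the original answer; it assumes the server is listening on 127.0.0.1:8888). Because the handler reads until EOF, the client has to close its writing side with write_eof() before the server will echo anything back.

import asyncio


async def send_job(payload: bytes):
    reader, writer = await asyncio.open_connection('127.0.0.1', 8888)

    writer.write(payload)
    await writer.drain()
    writer.write_eof()  # the handler reads until EOF, so signal end of data

    echoed = await reader.read()  # read the echoed payload back
    print(f'Server echoed: {echoed!r}')

    writer.close()
    await writer.wait_closed()


if __name__ == '__main__':
    asyncio.run(send_job(b'do some work'))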