
how multi processes can be used for each task? #90

Open
doncat99 opened this issue Mar 22, 2021 · 1 comment


doncat99 commented Mar 22, 2021

Description

I tested the following two methods. Method 1 shows that multiple processes are used for task dispatching, while in method 2 all tasks run in a single process. Can anyone show me how to get multiple processes involved in method 2? Thanks in advance.

Details

import asyncio
import logging

import aiomultiprocess as amp
from aiohttp import request
from tqdm import tqdm

logger = logging.getLogger(__name__)


async def get(url):
    async with request("GET", url) as response:
        result = await response.text("utf-8")
        logger.info(len(result))
        return result


async def method_1():
    urls = ["https://jreese.sh", "https://noswap.com", "https://omnilib.dev"] * 4
    async with amp.Pool() as pool:
        async for result in pool.map(get, urls):
            logger.info(len(result))


async def method_2():
    pool_tasks = []
    calls_list = ["https://jreese.sh", "https://noswap.com", "https://omnilib.dev"] * 4
    async with amp.Pool() as pool:
        for call in calls_list:
            pool_tasks.append(pool.apply(get, args=[call]))
        for coro in tqdm(
            asyncio.as_completed(pool_tasks),
            total=len(pool_tasks),
            ncols=90,
            desc="total",
            position=0,
            leave=True,
        ):
            await coro

  • OS: Mac
  • Python version: 3.8.2
  • aiomultiprocess version: 0.9.0
  • Can you repro on master?
  • Can you repro in a clean virtualenv?
@doncat99
(Author)

My fault: it is only because the default `childconcurrency` is set to 16, and my data length is about 12. Cheers!
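To illustrate the resolution above: `aiomultiprocess.Pool` defaults to `childconcurrency=16`, meaning each worker process can run up to 16 coroutines at once, so a dozen tasks can all be absorbed by a single process. Lowering `childconcurrency` (e.g. `amp.Pool(childconcurrency=1)`) forces the same workload across several processes. The arithmetic can be sketched with a simplified stdlib model (the function name is illustrative; aiomultiprocess actually round-robins tasks across per-process queues):

```python
import math


def processes_busy(n_tasks: int, childconcurrency: int = 16, processes: int = 4) -> int:
    """Rough estimate of how many worker processes end up busy when each
    worker can run up to `childconcurrency` coroutines concurrently.

    Simplified model: tasks are packed onto workers up to
    `childconcurrency` at a time.
    """
    return min(processes, math.ceil(n_tasks / childconcurrency))


# With the defaults, 12 tasks fit inside a single worker process:
print(processes_busy(12, childconcurrency=16))  # 1

# Lowering childconcurrency spreads the same 12 tasks across workers:
print(processes_busy(12, childconcurrency=1))  # 4
```

This is why method 2 appeared single-process: with 12 URLs and a per-child budget of 16, one worker never fills up.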
