FastAPI asynchronous background tasks block other requests?

Your task is defined as async, which means FastAPI (or rather Starlette) will run it in the asyncio event loop.
And because somelongcomputation is synchronous (i.e. not waiting on some I/O, but doing computation), it will block the event loop for as long as it is running.

I see a few ways of solving this:

  • Use more workers (e.g. uvicorn main:app --workers 4). Each worker is a separate process, so this allows up to 4 somelongcomputation calls to run in parallel.

  • Rewrite your task to not be async (i.e. define it as def task(data): ...). Starlette will then run it in a separate thread; see the sketch below.
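
    A minimal sketch (assuming the task is scheduled via FastAPI's BackgroundTasks, as the question implies; the route is a hypothetical placeholder):

    from fastapi import BackgroundTasks, FastAPI

    app = FastAPI()

    def task(data):  # plain def: Starlette runs this in a worker thread
        somelongcomputation(data)

    @app.post("/endpoint")  # hypothetical route
    async def endpoint(background_tasks: BackgroundTasks):
        background_tasks.add_task(task, "some data")
        return {"status": "accepted"}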

  • Use fastapi.concurrency.run_in_threadpool, which will also run it in a separate thread. Like so:

    from fastapi.concurrency import run_in_threadpool

    async def task(data):
        otherdata = await db.fetch("some sql")
        # run the blocking computation in a worker thread so the event loop stays free
        newdata = await run_in_threadpool(lambda: somelongcomputation(data, otherdata))
        await db.execute("some sql", newdata)
    
    • Or use asyncio's run_in_executor directly (which run_in_threadpool uses under the hood):

      import asyncio

      async def task(data):
          otherdata = await db.fetch("some sql")
          loop = asyncio.get_running_loop()
          # passing None uses the event loop's default ThreadPoolExecutor
          newdata = await loop.run_in_executor(None, lambda: somelongcomputation(data, otherdata))
          await db.execute("some sql", newdata)

      You could even pass a concurrent.futures.ProcessPoolExecutor as the first argument to run_in_executor to run it in a separate process. Note that everything sent to another process must be picklable, so the lambda above would not work there; see the sketch below.
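
      A sketch of the process-pool variant (assuming somelongcomputation is defined at module level, which pickling requires):

      import asyncio
      from concurrent.futures import ProcessPoolExecutor

      process_pool = ProcessPoolExecutor()  # create once and reuse; spawning processes is expensive

      async def task(data):
          otherdata = await db.fetch("some sql")
          loop = asyncio.get_running_loop()
          # pass the function and its arguments directly instead of a lambda,
          # since arguments crossing the process boundary must be picklable
          newdata = await loop.run_in_executor(process_pool, somelongcomputation, data, otherdata)
          await db.execute("some sql", newdata)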

  • Spawn a separate thread / process yourself, e.g. using concurrent.futures; a fire-and-forget sketch follows below.
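
    A minimal sketch (the route, pool size, and fire-and-forget handling are illustrative assumptions):

    from concurrent.futures import ThreadPoolExecutor
    from fastapi import FastAPI

    app = FastAPI()
    executor = ThreadPoolExecutor(max_workers=4)  # module-level pool, created once

    @app.post("/endpoint")  # hypothetical route
    async def endpoint(data: str):
        executor.submit(somelongcomputation, data)  # returns a Future we deliberately ignore
        return {"status": "accepted"}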

  • Use something more heavy-handed like Celery (also mentioned in the FastAPI docs); a minimal sketch is below.
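
    A minimal Celery sketch (the broker URL and module name are assumptions; a separate worker process must be started, e.g. celery -A tasks worker):

    from celery import Celery

    celery_app = Celery("tasks", broker="redis://localhost:6379/0")  # hypothetical broker

    @celery_app.task
    def task(data):
        somelongcomputation(data)

    # inside the FastAPI endpoint, enqueue instead of running inline:
    # task.delay(data)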
