Mastering Asynchronous Processing with FastAPI: A Guide to Background Tasks and WebSocket Usage
📋 Article Highlights (Summary)
- Who is this for?
  - Mid-to-senior developers who can write synchronous APIs but want to tackle asynchronous and real-time communication
- What you’ll learn
  - Basics of `async def`/`await` for asynchronous functions
  - Implementing asynchronous batch jobs with FastAPI’s background tasks
  - Setting up WebSocket endpoints and integrating clients
  - Sample bidirectional real-time communication with a frontend
  - Production considerations (connection limits, error handling, etc.)
- Benefits
  - Offload heavy or long-running work from your API responses to improve user experience
  - Easily add chat, notifications, and other real-time features
🎯 Target Audience
- Developer D (27): Knows how to write synchronous REST APIs but wants to offload tasks like image processing or email sending
- Engineer E (30s): Learning chat functionality and eager to run WebSockets in FastAPI
- Side-Project Builder F (24): Building a real-time game prototype while keeping server load low
♿ Accessibility Level
- Readability: Short sentences and bulleted lists reduce eye travel; key terms are annotated with furigana
- Structure: Headings and subheadings are nested for logical flow in screen readers
- Code Examples: Use monospaced font, consistent indentation, and rich comments
- Summaries: Each section ends with key takeaways for easy review
1. Asynchronous Basics: What Are `async`/`await`?
Python 3.5+ introduced asynchronous functions written with `async def` and `await`.
- Synchronous: Blocks other work until complete
- Asynchronous: Allows other tasks to run concurrently during long operations
```python
import asyncio

async def say_after(delay: int, message: str):
    await asyncio.sleep(delay)
    print(message)

async def main():
    # Run both coroutines concurrently; total time is ~2s, not 3s
    await asyncio.gather(
        say_after(1, "Hello"),
        say_after(2, "FastAPI!"),
    )

asyncio.run(main())
```
Key Takeaways
- Use `async def` to define async functions
- Use `await` to pause until another async task finishes
- Use `asyncio.gather` to run multiple tasks concurrently
2. Offloading Long Jobs with Background Tasks
2.1 FastAPI’s BackgroundTasks
FastAPI’s built-in `BackgroundTasks` lets you queue work to run after a response is sent.
```python
from fastapi import FastAPI, BackgroundTasks
import time

app = FastAPI()

def write_log(message: str):
    time.sleep(5)  # Heavy work (e.g., file I/O)
    with open("log.txt", "a") as f:
        f.write(message + "\n")

@app.post("/process/")
async def process_data(data: dict, background_tasks: BackgroundTasks):
    background_tasks.add_task(write_log, f"Processed: {data}")
    return {"status": "accepted"}
```
- Request → immediate response → heavy work runs in background
- Users can update the UI without waiting
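To confirm the response really does return before the heavy work finishes, here is a rough timing check. It is only a sketch: it assumes the app above is saved as `main.py`, served locally with `uvicorn main:app`, and that the `httpx` client library is installed.

```python
# Rough timing check: the response should arrive well before the
# 5-second sleep in write_log finishes.
import time

import httpx

start = time.perf_counter()
resp = httpx.post("http://127.0.0.1:8000/process/", json={"user_id": 1})
elapsed = time.perf_counter() - start

print(resp.json())             # {'status': 'accepted'}
print(f"took {elapsed:.2f}s")  # typically well under 5 seconds
```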
Key Takeaways
- Accept `BackgroundTasks` as a parameter
- Call `add_task` with either sync or async functions to queue them
2.2 Multiple Tasks & Error Handling
```python
import asyncio
from fastapi import FastAPI, BackgroundTasks

app = FastAPI()

async def notify_user(user_id: int, message: str):
    await asyncio.sleep(2)
    # Send email/SMS here

@app.post("/notify/{user_id}")
async def notify(user_id: int, background_tasks: BackgroundTasks):
    background_tasks.add_task(notify_user, user_id, "You have a notification")
    return {"detail": "notification queued"}
```
- `notify_user` is `async def` and runs concurrently on the event loop
- Exceptions inside background tasks do not affect the main API response (see the logging sketch below)
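Because the response has already been sent, a crash in a background task can disappear silently. One option is to wrap the task body and log failures explicitly. This is a minimal sketch using the standard `logging` module; `send_notification` is a stand-in for your real email/SMS call.

```python
import asyncio
import logging

from fastapi import FastAPI, BackgroundTasks

logger = logging.getLogger("notifications")
app = FastAPI()

async def send_notification(user_id: int, message: str):
    # Stand-in for a real email/SMS call
    await asyncio.sleep(2)

async def notify_user_safe(user_id: int, message: str):
    try:
        await send_notification(user_id, message)
    except Exception:
        # Log instead of letting the failure vanish; the HTTP response
        # has already been sent at this point.
        logger.exception("notification to user %s failed", user_id)

@app.post("/notify/{user_id}")
async def notify(user_id: int, background_tasks: BackgroundTasks):
    background_tasks.add_task(notify_user_safe, user_id, "You have a notification")
    return {"detail": "notification queued"}
```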
3. Implementing Real-Time with WebSocket
3.1 Basic WebSocket Endpoint
FastAPI makes it easy to create a WebSocket endpoint.
```python
from fastapi import FastAPI, WebSocket, WebSocketDisconnect

app = FastAPI()

@app.websocket("/ws")
async def websocket_endpoint(ws: WebSocket):
    await ws.accept()
    try:
        while True:
            data = await ws.receive_text()
            await ws.send_text(f"You said: {data}")
    except WebSocketDisconnect:
        print("Client disconnected")
```
- A simple echo server
- `while True` keeps the connection open for bidirectional messages
Key Takeaways
- Use `@app.websocket` to define the endpoint
- Call `accept()`, then `receive_*()` and `send_*()` to exchange messages
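The same pattern extends to structured payloads. Here is a minimal sketch of a JSON echo endpoint built on `receive_json()`/`send_json()`; the `/ws-json` route is added purely for illustration.

```python
from fastapi import FastAPI, WebSocket, WebSocketDisconnect

app = FastAPI()

@app.websocket("/ws-json")
async def websocket_json(ws: WebSocket):
    await ws.accept()
    try:
        while True:
            payload = await ws.receive_json()      # parse an incoming JSON message
            await ws.send_json({"echo": payload})  # reply with a JSON object
    except WebSocketDisconnect:
        print("Client disconnected")
```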
3.2 Frontend Connection Example
```html
<!DOCTYPE html>
<html>
  <head>
    <meta charset="utf-8">
    <title>FastAPI WebSocket</title>
  </head>
  <body>
    <input id="msg" placeholder="Type a message">
    <button onclick="send()">Send</button>
    <pre id="log"></pre>
    <script>
      const ws = new WebSocket("ws://127.0.0.1:8000/ws");
      ws.onmessage = e => {
        document.getElementById("log").textContent += e.data + "\n";
      };
      function send() {
        const input = document.getElementById("msg");
        ws.send(input.value);
        input.value = "";
      }
    </script>
  </body>
</html>
```
- Open in your browser for a real-time chat demo
- No CORS needed on same origin
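If you prefer testing without a browser, you can also connect from Python. This is a small sketch assuming the third-party `websockets` package (`pip install websockets`) and the echo server from 3.1 running locally.

```python
import asyncio

import websockets

async def main():
    # Connect to the echo endpoint from section 3.1
    async with websockets.connect("ws://127.0.0.1:8000/ws") as ws:
        await ws.send("Hello from Python")
        reply = await ws.recv()
        print(reply)  # -> "You said: Hello from Python"

asyncio.run(main())
```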
4. Production Considerations
- Connection Limits
  - WebSockets maintain open connections → resource pressure
  - Limit concurrent connections via Uvicorn/Gunicorn or a reverse proxy
- Error & Disconnect Handling
  - Catch `WebSocketDisconnect` to clean up resources
- Monitoring Background Tasks
  - For high volumes, consider a task queue like Celery or RQ
- Security
  - Pass tokens in query params or headers to authenticate WebSocket connections (a combined sketch follows this list)
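As a concrete starting point for the connection-limit, disconnect-handling, and authentication items above, here is a minimal sketch; `MAX_CONNECTIONS` and `verify_token` are illustrative placeholders, not FastAPI built-ins.

```python
from fastapi import FastAPI, WebSocket, WebSocketDisconnect

app = FastAPI()

MAX_CONNECTIONS = 100                      # illustrative cap; tune per deployment
active_connections: set[WebSocket] = set()

def verify_token(token: str | None) -> bool:
    # Placeholder: validate a real session/JWT token here
    return token == "secret-demo-token"

@app.websocket("/ws")
async def websocket_endpoint(ws: WebSocket, token: str | None = None):
    # Reject unauthenticated clients (token arrives as a ?token=... query param)
    if not verify_token(token):
        await ws.close(code=1008)          # 1008 = policy violation
        return
    # Simple per-process connection cap
    if len(active_connections) >= MAX_CONNECTIONS:
        await ws.close(code=1013)          # 1013 = try again later
        return
    await ws.accept()
    active_connections.add(ws)
    try:
        while True:
            data = await ws.receive_text()
            await ws.send_text(f"You said: {data}")
    except WebSocketDisconnect:
        pass                               # client went away; nothing to send back
    finally:
        active_connections.discard(ws)     # always release the slot
```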
Key Takeaways
- Implement load control and robust exception handling
- Use dedicated brokers or queues for scalability
5. Conclusion & Next Steps
We’ve covered FastAPI’s two core async features, BackgroundTasks and WebSockets:
- `async`/`await` fundamentals
- Offloading long jobs with `BackgroundTasks`
- Real-time endpoints via `@app.websocket`
- Frontend integration with sample HTML
- Production tips: limits, error handling, security
Next:
- Message Broker Integration: Use Redis or RabbitMQ with Pub/Sub
- Frontend Frameworks: Build a rich chat UI with React or Vue
- Distributed Task Queues: Scale with Celery or FastAPI-specific background libraries
FastAPI’s async capabilities are highly extensible—experiment and build your own real-time APIs! 💡