Flask with Celery: Building Asynchronous APIs for Heavy Tasks
When building modern web applications with Flask, performance and scalability are critical. While Flask is great for handling web requests, each request handler runs synchronously by default: the worker serving a request is tied up until the handler returns. That's not ideal for tasks like sending emails, processing images, or handling large data computations.
To address this, developers turn to Celery, a powerful asynchronous task queue/job queue based on distributed message passing. Integrating Flask with Celery allows you to offload long-running or background tasks, improving the responsiveness of your APIs. In this blog, we’ll explore how to use Flask and Celery to build asynchronous APIs effectively.
Why Use Celery with Flask?
Heavy operations can slow down your API, degrade user experience, and cause timeouts. With Celery, such operations are handled in the background, letting the Flask app quickly respond to the user. This is especially useful for:
- Sending bulk emails
- Generating reports
- Performing file uploads or image processing
- Interacting with third-party APIs
- Running periodic jobs
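All of these fit the same pattern: accept the work, hand it off to a background worker, return a task id immediately, and let the client poll for the result. Here is a minimal in-process sketch of that pattern using only the standard library (Celery does the same thing, but across separate worker processes coordinated through a message broker):

```python
# Toy version of the "submit now, fetch result later" pattern.
# Celery replaces the thread pool with external worker processes
# and the dict with a broker/result backend.
import uuid
from concurrent.futures import ThreadPoolExecutor

executor = ThreadPoolExecutor(max_workers=2)
tasks = {}  # task_id -> Future

def submit(fn, *args):
    """Queue the work and return a task id right away."""
    task_id = str(uuid.uuid4())
    tasks[task_id] = executor.submit(fn, *args)
    return task_id

def status(task_id):
    """What a /task-status endpoint would report."""
    return "SUCCESS" if tasks[task_id].done() else "PENDING"

def slow_add(a, b):
    return a + b  # stands in for a long-running job

task_id = submit(slow_add, 2, 3)
result = tasks[task_id].result()  # block only when the value is needed
```

The caller gets `task_id` back immediately, which is exactly what the Flask endpoints below will return to the client.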
Setting Up Flask and Celery
Here’s a basic setup to integrate Celery with Flask.
Step 1: Install Required Packages
```bash
pip install Flask celery redis
```
Celery needs a message broker to queue tasks. Redis is a popular choice, but you can also use RabbitMQ or others.
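The Redis URL used below follows the scheme `redis://<host>:<port>/<db-number>`. If you ever need to sanity-check or reuse its parts, the standard library can take it apart:

```python
# Break a broker URL into its components.
from urllib.parse import urlparse

url = urlparse("redis://localhost:6379/0")
scheme, host, port = url.scheme, url.hostname, url.port
db = url.path.lstrip("/")  # "0" -> Redis database number
```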
Step 2: Create a Flask App
```python
from flask import Flask, jsonify, request
from celery import Celery

app = Flask(__name__)
app.config['CELERY_BROKER_URL'] = 'redis://localhost:6379/0'
app.config['CELERY_RESULT_BACKEND'] = 'redis://localhost:6379/0'

celery = Celery(
    app.name,
    broker=app.config['CELERY_BROKER_URL'],
    backend=app.config['CELERY_RESULT_BACKEND'],  # needed to look up results later
)
celery.conf.update(app.config)
```
Step 3: Define a Celery Task
```python
import time

@celery.task
def long_task(n):
    time.sleep(n)  # simulate a long-running process
    return f"Task completed after {n} seconds"
```
Step 4: Create an Endpoint to Trigger the Task
```python
@app.route('/start-task', methods=['POST'])
def start_task():
    seconds = int(request.json['seconds'])
    task = long_task.apply_async(args=[seconds])
    return jsonify({"task_id": task.id, "status": "Task started"}), 202
```
Step 5: Add a Route to Check Task Status
```python
@app.route('/task-status/<task_id>')
def task_status(task_id):
    task = long_task.AsyncResult(task_id)
    return jsonify({"status": task.status, "result": task.result})
```
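From the client's perspective, the flow is: POST to `/start-task`, keep the returned `task_id`, then poll `/task-status/<task_id>` until the status is terminal. A sketch of that polling loop, with a stand-in fetch function in place of a real HTTP call so it runs on its own (the endpoint names match the routes above; the interval and timeout values are illustrative choices):

```python
import time

def poll_until_done(fetch_status, task_id, interval=0.01, timeout=5.0):
    """Poll a status endpoint until the task reaches a terminal state.

    fetch_status(task_id) stands in for an HTTP GET to /task-status/<id>
    and should return a dict like {"status": ..., "result": ...}.
    """
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        payload = fetch_status(task_id)
        if payload["status"] in ("SUCCESS", "FAILURE"):
            return payload
        time.sleep(interval)
    raise TimeoutError(f"task {task_id} did not finish within {timeout}s")

# Stand-in for the real endpoint: still pending on the first two polls.
calls = {"n": 0}
def fake_fetch(task_id):
    calls["n"] += 1
    if calls["n"] < 3:
        return {"status": "PENDING", "result": None}
    return {"status": "SUCCESS", "result": "Task completed after 3 seconds"}

payload = poll_until_done(fake_fetch, "abc123")
```

In a real client you would swap `fake_fetch` for an HTTP request, and likely use exponential backoff rather than a fixed interval.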
Running Flask and Celery
1. Start the Redis server.
2. Run the Flask app.
3. Start a Celery worker:

```bash
celery -A your_flask_file_name.celery worker --loglevel=info
```
Benefits of This Setup
- Scalability: Tasks run outside the request/response cycle, improving app throughput.
- Non-blocking APIs: Your API can return a response instantly and update the result later.
- Modularity: Decouples your web logic from background processing logic.
Final Thoughts
Integrating Flask with Celery is a great way to improve the efficiency of your applications, especially when dealing with time-consuming or CPU-intensive tasks. It enables better user experience, faster response times, and a scalable architecture.