Flask with Celery: Building Asynchronous APIs for Heavy Tasks

When building modern web applications with Flask, you may encounter time-consuming tasks, such as sending emails, generating reports, or processing images. Running these operations inside the request-response cycle can slow down your API and hurt the user experience. To handle such heavy tasks efficiently, you can offload them to background workers using Celery, a powerful distributed task queue.

In this blog, we'll explore how to integrate Flask with Celery to build asynchronous APIs that improve performance and scalability.


Why Use Celery with Flask?

Flask is synchronous by default: a long-running operation ties up the worker handling it, leaving it unable to serve other requests until the work finishes. Celery allows you to:

Run tasks asynchronously in the background

Improve API response time

Scale workers horizontally to handle load

Schedule tasks (e.g., periodic cleanups)


How Celery Works

Celery runs as a separate process from your Flask app and uses a message broker (like Redis or RabbitMQ) to queue tasks. The workflow is:

Client hits Flask API

Flask pushes the task to a broker

Celery worker picks it up and processes it

Result can be stored or pushed back asynchronously
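
The same pattern can be sketched with nothing but the standard library, where a `queue.Queue` stands in for the broker and a thread for the Celery worker (illustrative only, not how Celery is implemented):

```python
# Minimal sketch of the broker/worker pattern using only the stdlib.
import queue
import threading

broker = queue.Queue()   # stands in for Redis/RabbitMQ
results = {}             # stands in for the result backend

def worker():
    while True:
        task_id, payload = broker.get()
        results[task_id] = payload.upper()  # "process" the task
        broker.task_done()

threading.Thread(target=worker, daemon=True).start()

# The "Flask view" just enqueues and returns immediately:
broker.put(('task-1', 'hello'))

broker.join()                 # wait here only so we can inspect the result
print(results['task-1'])      # HELLO
```

Celery adds the pieces this sketch lacks: serialization, retries, multiple worker processes across machines, and durable queues.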


Setting Up Flask with Celery

Step 1: Install Dependencies

```bash
pip install Flask Celery redis
```

You also need Redis installed and running:

```bash
brew install redis             # macOS
sudo apt install redis-server  # Ubuntu
redis-server                   # start Redis (if it isn't already running)
```


Step 2: Create Flask App

```python
# app.py
from flask import Flask, jsonify

from tasks import long_task

app = Flask(__name__)

@app.route('/start-task', methods=['POST'])
def start_task():
    # .delay() enqueues the task on the broker and returns immediately
    task = long_task.delay()
    return jsonify({'task_id': task.id, 'status': 'Task started'}), 202

if __name__ == '__main__':
    app.run(debug=True)
```


Step 3: Configure Celery

```python
# celery_config.py
from celery import Celery

def make_celery(app_name=__name__):
    return Celery(
        app_name,
        broker='redis://localhost:6379/0',   # where tasks are queued
        backend='redis://localhost:6379/0',  # where results are stored
    )

celery = make_celery()
```


Step 4: Define a Celery Task

```python
# tasks.py
import time

from celery_config import celery

@celery.task
def long_task():
    time.sleep(10)  # simulate heavy processing
    return 'Task completed successfully!'
```

Running Your Application

Start the Flask server:

```bash
python app.py
```

In a separate terminal, start the Celery worker:

```bash
celery -A tasks worker --loglevel=info
```

Send a POST request to /start-task, and the task will run in the background while your API responds immediately.
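
For example, from the command line (the response shape matches the view above; the actual id is assigned by Celery):

```shell
curl -X POST http://localhost:5000/start-task
# → {"status": "Task started", "task_id": "<uuid assigned by Celery>"}
```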


Monitoring Task Status (Optional)

You can add an endpoint to check task status using the task ID:


```python
# add to app.py (long_task is already imported there)
@app.route('/check-task/<task_id>', methods=['GET'])
def check_task(task_id):
    task = long_task.AsyncResult(task_id)
    # result is None until the task finishes
    return jsonify({'state': task.state, 'result': task.result})
```
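
You can then poll this endpoint with the `task_id` returned by /start-task (output shown is illustrative):

```shell
curl http://localhost:5000/check-task/<task-id>
# → {"result": null, "state": "PENDING"} while the task is still running,
#   then {"result": "Task completed successfully!", "state": "SUCCESS"}
```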


Conclusion

Integrating Celery with Flask gives your application the power to handle resource-heavy tasks asynchronously, enhancing performance and user experience. Whether you're sending emails, processing video, or crunching data, Celery allows your APIs to remain fast and responsive while heavy lifting happens in the background. As your app scales, Celery’s distributed architecture ensures it can handle growing workloads with ease.
