


How to run periodic tasks in Django using Celery?

Scheduling is an important feature for generating reports, updating the status in database tables, sending marketing emails, etc. These types of periodic tasks can be executed in the Django applications using Celery.


Celery is an open-source Python library that runs asynchronous tasks and can also schedule tasks to run periodically. It uses message brokers such as Redis and RabbitMQ to send and receive messages.

You can find the full source code for the Celery with Django example project in the GitHub repo.

In this article, we will explain the steps needed to configure periodic tasks.

Flow Diagram:


Prerequisites:

  • Ubuntu machine

  • Python 3.8

  • Django 4.0.2 or above


Step 1: Celery Installation

  • If you want to create a Django project from scratch, please read the following article.

  • If you like to run periodic tasks on the existing Django project, then navigate to the project directory.

cd <django_project_directory>

For example:
cd /home/mahen/blog/django/django_celery_project
  • Install the celery package using the pip package manager.

python3 -m pip install -U celery[redis]
  • Verify the installed celery version by executing the following command.

celery --version

It’ll print the version details as shown in the following screenshot.


  • Install Redis using the apt package manager.

sudo apt install redis -y
  • Start redis-server using the following command.

sudo systemctl start redis-server.service
  • Enable redis-server. This makes the redis server start automatically whenever the machine reboots.

sudo systemctl enable redis-server.service
  • Check that the redis server is responding by running the below command.

redis-cli ping
  • If redis is running successfully, it will return a PONG message.

A sample screenshot for the redis-cli ping command is shown below.



Step 2: Celery Integration with Django

  • Now it's time to integrate celery with our Django project.

  • Navigate to the Django core app directory.

cd sample_app/
  • Create a file named celery.py with the following snippet.

from __future__ import absolute_import, unicode_literals
import os

from celery import Celery

# Set the default Django settings module for the 'celery' program.
# "sample_app" is the name of the root app
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'sample_app.settings')

app = Celery('celery_app',
             broker='redis://localhost:6379/0',
             backend='redis://localhost:6379/0'
             )

# Load task modules from all registered Django apps.
app.autodiscover_tasks()
  • In the above code, we created a Celery instance called app; to use Celery within our project, we simply import this instance.

A sample celery.py file can be found in the following git repository file.
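As an alternative to hard-coding the broker and backend URLs in celery.py, the connection settings can live in Django's settings.py and be loaded with config_from_object. A minimal sketch, assuming the same sample_app layout and settings entries prefixed with CELERY_ (e.g. a hypothetical CELERY_BROKER_URL):

```python
# sample_app/celery.py -- variant that reads its config from Django settings
import os

from celery import Celery

os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'sample_app.settings')

app = Celery('celery_app')

# Pick up every setting whose name starts with CELERY_ from settings.py,
# e.g. CELERY_BROKER_URL = 'redis://localhost:6379/0'
app.config_from_object('django.conf:settings', namespace='CELERY')

# Load task modules from all registered Django apps.
app.autodiscover_tasks()
```

This keeps environment-specific URLs out of the code, so development and production can differ only in their settings files.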

  • Add the below code to the __init__.py file present in the current directory.

from __future__ import absolute_import, unicode_literals
from .celery import app as celery_app

__all__ = ('celery_app',)

A sample __init__.py file can be found in the following git repository file.

  • Now navigate to the project root directory.

cd ../
  • And run the following command to test the above configurations in the Django app directory.

celery -A sample_app worker -l info

Note: sample_app is the name of the package that contains the celery.py file (the Django core app).

An example execution screenshot is given below for your reference.


Step 3: Creating tasks

  • Now we create the tasks that are going to be executed periodically or at a specific interval.

  • Tasks can be created within any app of the project under the filename tasks.py (so that celery can auto-discover them)

  • Navigate to any of the Django app directories from the Django core app directory.

cd ../test_app/
  • Create a file named tasks.py

touch tasks.py
  • Import the celery instance and create tasks named task_one and task_two as follows.

from sample_app.celery import app

@app.task
def task_one():
    print("task one called and worker is running good")
    return "success"

@app.task
def task_two(data, *args, **kwargs):
    print(f"task two called with the argument {data} and worker is running good")
    return "success"

Note: the @app.task decorator registers a function as a celery task.

A sample tasks.py file can be found in the following git repository file.

Step 4: Running Worker

A worker is a celery application, which runs and manages tasks.

  • Navigate to the Django project directory.

cd <django_project_directory>

For example:
cd /home/mahen/blog/django/django_celery_project
  • Start the worker using the following command.

celery -A sample_app worker -l info

If the worker is started successfully, then it will print the tasks in the terminal as shown in the below screenshot.


  • Now that the tasks are loaded, we are going to test the celery tasks, by invoking them from the Django shell.

    • Open a new terminal without stopping the celery worker (i.e. without stopping celery -A sample_app worker -l info)

    • Activate Django Shell by executing the below command

      python3 manage.py shell

    • Import the tasks into the shell.

      from test_app.tasks import task_one, task_two

    • Then call the tasks using delay().
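The calls inside the shell look like this (a sketch of the session; delay() returns an AsyncResult handle immediately, and the print output appears in the worker's terminal, not in the shell):

```python
# Run inside `python3 manage.py shell` while the worker is running
from test_app.tasks import task_one, task_two

task_one.delay()           # queue task_one with no arguments
task_two.delay("hello")    # queue task_two with one positional argument
```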




Now go to the terminal where the celery worker is running and you'll see that task_one and task_two were called and printed their results, as shown in the following screenshot.


Step 5: Set up Periodic Tasks

  • To run tasks at regular intervals, we need a scheduler (Beat) and a schedule configuration for the respective tasks. We will cover the scheduler in the next step; for now, we add the configuration for the tasks.

  • In the settings.py file under the Django core app directory, please append the following snippet.

from celery.schedules import crontab
from datetime import datetime

CELERY_BEAT_SCHEDULE = {                       # scheduler configuration
    'Task_one_schedule': {                     # whatever name you want
        'task': 'test_app.tasks.task_one',     # name of task with path
        'schedule': crontab(),                 # crontab() runs the task every minute
    },
    'Task_two_schedule': {                     # whatever name you want
        'task': 'test_app.tasks.task_two',     # name of task with path
        'schedule': 30,                        # 30 runs this task every 30 seconds
        'args': (datetime.now(),),             # arguments for the task, as a tuple
                                               # (evaluated once, when settings load)
    },
}


The configuration is placed under the variable CELERY_BEAT_SCHEDULE.

Configuration is given as the key, value pair. Key is the name for the configuration and the value dictionary contains the configuration.

  • The task key defines the task by its dotted path ({app_name}.tasks.{task_name}).

  • The schedule key defines the time interval for the task. (for more crontab configurations click here).

  • The args key defines the arguments to be passed into the task.
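For reference, a few crontab schedules beyond the every-minute default, using celery's documented crontab arguments (a sketch; these expressions would go in the 'schedule' key above):

```python
from celery.schedules import crontab

crontab()                                              # every minute
crontab(minute='*/15')                                 # every 15 minutes
crontab(minute=0, hour=0)                              # daily at midnight
crontab(minute=0, hour='9-17', day_of_week='mon-fri')  # hourly, 9am-5pm, weekdays
```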

A sample settings.py can be found in the following git repository file.

Step 6: Running Beats

  • Celery Beat is a scheduler; it reads the schedule configuration mentioned in Step 5 and dispatches the tasks at the defined times.

  • Navigate to the Django project directory.

cd <django_directory>
  • Start the beat by running the following command in the new terminal.

celery -A sample_app beat -l info


Always start the celery worker before starting beat.

Beat will dispatch the configured tasks to the worker, which prints the results. A sample screenshot for the same is shown below.


Navigate to the worker terminal and check the execution results as shown in the following reference screenshot.


Step 7: Saving Celery results

  • We can save the results of celery tasks to our database for various purposes. To do that, we need a library named django-celery-results.

  • Install the Django celery results by running the command.

python3 -m pip install django-celery-results
  • Add django_celery_results to the INSTALLED_APPS in our project’s settings.py.

INSTALLED_APPS = [
    ...
    'test_app',
    'django_celery_results',
]
  • Run the migrations command to create the Celery database.

python manage.py migrate django_celery_results
  • Now you can see that the django_celery_results tables have been created.

An example screenshot is given below for your reference.
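Once results are being stored, they can be inspected from the Django shell through the TaskResult model that django_celery_results provides. A sketch, to be run inside python3 manage.py shell:

```python
from django_celery_results.models import TaskResult

# Latest stored results, newest first
for r in TaskResult.objects.order_by('-date_done')[:5]:
    print(r.task_name, r.status, r.result)
```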


  • Append the following line to the settings.py file.
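Per the django-celery-results documentation, the setting that routes task results into the Django database is:

```python
# settings.py -- store task results via django_celery_results
CELERY_RESULT_BACKEND = 'django-db'
```

Note that this setting takes effect only if celery reads its config from Django settings; with a backend hard-coded in celery.py, pass backend='django-db' to the Celery(...) constructor instead.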


Step 8: Run Celery as a daemon

  • In order to start celery automatically after a server reboot, we are going to run celery under the supervisor service.

  • On Ubuntu, we can install supervisor as a Debian package, to do so run the following command.

1sudo apt-get install supervisor -y
  • Navigate to the supervisor configuration directory and create a file named celery.conf.

cd /etc/supervisor/conf.d/
touch celery.conf
  • Add the following lines to the celery.conf file.

; ==========================================
;  celery worker config
; ==========================================

[program:worker]
command=/home/mahen/blog/django/.venv/bin/celery -A sample_app worker -l info
directory=/home/mahen/blog/django/sample_app
user=ubuntu
numprocs=1
stdout_logfile=/var/log/celery/worker.log
stderr_logfile=/var/log/celery/worker.err.log
autostart=true
autorestart=true
startsecs=10
stopwaitsecs=600
killasgroup=true
priority=998
; priority 998 executes first and then 999

; ========================================
;  celery beat config
; ========================================

[program:beat]
command=/home/mahen/blog/django/.venv/bin/celery -A sample_app beat -l info
directory=/home/mahen/blog/django/sample_app
user=ubuntu
numprocs=1
stdout_logfile=/var/log/celery/beat.log
stderr_logfile=/var/log/celery/beat.err.log
autostart=true
autorestart=true
startsecs=10
stopwaitsecs=600
killasgroup=true
priority=999


[program:worker]: This is the name of the supervisor daemon process (a name of your choice).

command: The command used to run the process, with the full path to celery.

To find the path to celery, run which celery and prepend its output to the celery command.

directory: The directory in which the process runs. Here it is the path of the Django project.

user: Username of the server or computer. Usually ubuntu here.

stdout_logfile: Logfile location (any).

stderr_logfile: Error logfile location (any).

A sample configuration file celery.conf can be found in the following git repository file.
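One assumption in the config above: supervisor does not create log directories on its own, so the /var/log/celery directory used by the logfile settings must exist before the processes start.

```shell
sudo mkdir -p /var/log/celery
```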

  • To fetch the latest celery configuration changes added to the supervisor, please execute the below command.

1sudo supervisorctl reread
  • To make the supervisor update the celery configuration, run the following command.

1sudo supervisorctl update
  • To view the supervisor processes status run the following command.

sudo supervisorctl
# (or)
sudo supervisorctl status all
  • If the supervisor processes have not started yet, run the following command to start them.

1sudo supervisorctl start all

An example screenshot is given below for your reference.


Helpful Commands for your reference:

sudo supervisorctl start process_name

sudo supervisorctl stop process_name

sudo supervisorctl restart process_name