Getting started with periodic tasks

David
Feb 22, 2021 · 5 min read

In this post we’ll explore creating and running periodic tasks in Django using Celery.

A quick word about Celery.

This post assumes some basic knowledge of Django and Celery. If you are new to Celery, I’d recommend reviewing the official Celery documentation here: https://docs.celeryproject.org/en/latest/index.html
But here are the basics:
Celery is a distributed task queue that requires an external message broker for sending and receiving messages. This post explores periodic tasks in a Django project. The setup consists of the Django project, a broker (Redis), a task scheduler (celery beat), and workers (celery workers). The intent is to define a task in Django, create periodic tasks that execute at explicitly scheduled intervals, place jobs on a queue, and have those jobs picked up and executed by a worker. We’ll walk through this entire process.

Dependencies

For this project we’ll need a broker; the examples will use Redis. There are also a number of Python packages needed. We’ll use the following (pip-installable) packages in our project:

celery==5.0.5
Django==2.2.16
django-celery-beat==2.2.0
redis==3.5.3
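
If you’re starting from a fresh virtualenv, the packages above can be installed in one shot, and Redis can be run locally however you prefer, for example via Docker (a sketch, assuming Docker is installed):

pip install celery==5.0.5 Django==2.2.16 django-celery-beat==2.2.0 redis==3.5.3
docker run -d -p 6379:6379 --name redis-broker redis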

For the examples used in this post, we’ll create a Django project called testcelery with an app called demo.
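
If you’re building along from an empty directory, the project and app can be created with the standard Django commands:

django-admin startproject testcelery
cd testcelery
python manage.py startapp demo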

Django Configuration

django-celery-beat is a Django app that enables storing periodic tasks in the database. We’ll add this app, as well as our demo app, to the installed apps in the Django settings.py file.

INSTALLED_APPS = [
    ...
    'django_celery_beat',
    'demo',
]

The celery worker and celery beat instances use settings defined in the settings.py file in the Django project. The snippet below includes the minimum settings we’ll need. For a full list of settings, see the official documentation: https://docs.celeryproject.org/en/stable/userguide/configuration.html

# celery settings
BROKER_URL = 'redis://localhost:6379'
CELERY_RESULT_BACKEND = 'redis://localhost:6379'
CELERYBEAT_SCHEDULER = 'django_celery_beat.schedulers:DatabaseScheduler'
CELERY_ACCEPT_CONTENT = ['application/json']
CELERY_TASK_SERIALIZER = 'json'
CELERY_RESULT_SERIALIZER = 'json'

Additional celery configuration resides in a celery.py file in the project directory (in the same directory as the settings.py file). The contents look like this:

from celery import Celery
from django.conf import settings


# Create the Celery application and pull its configuration from Django settings.
celery_app = Celery('demo')
celery_app.config_from_object(settings)

# Discover tasks.py modules in all installed Django apps.
celery_app.autodiscover_tasks()


@celery_app.task(bind=True)
def debug_task(self):
    print('Request: {0!r}'.format(self.request))

Lastly, in the project’s __init__.py file (alongside settings.py) we add the following so the app loads whenever Django starts:

from .celery import celery_app

Now we’re ready to create some tasks. Create a tasks.py file in the demo app directory. We’ll start with something simple: a few tasks that log messages to show task start and completion. For each task we use the @shared_task decorator and include a time_limit, name, and queue. Here’s what tasks.py contains:

from celery import shared_task
import time
import logging


@shared_task(time_limit=30, max_retries=0, name='demo.test_task_1', queue='normal')
def test_task_1():
    logging.info('started test task 1')
    time.sleep(2)
    logging.info('completed test task 1')


@shared_task(time_limit=30, max_retries=0, name='demo.test_task_2', queue='normal')
def test_task_2():
    logging.info('started test task 2')
    time.sleep(3)
    logging.info('completed test task 2')


@shared_task(time_limit=30, max_retries=0, name='demo.test_task_3', queue='normal')
def test_task_3():
    logging.info('started test task 3')
    time.sleep(4)
    logging.info('completed test task 3')

For the next steps, make sure the broker is running, and open three terminal sessions: one to run the Django server, one to run celery beat, and one to run the celery worker. All three terminal sessions need access to the same code, so make sure the same virtualenv is activated and the DJANGO_SETTINGS_MODULE environment variable is set. We should be in the testcelery directory (where manage.py is located) in all three terminals.

In the first terminal, run migrations, create a superuser, and run the Django server. In your browser, navigate to the admin page and you’ll see a Periodic Tasks section provided by django-celery-beat.
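
A quick sketch of those first-terminal commands (assuming the virtualenv is active):

python manage.py migrate
python manage.py createsuperuser
python manage.py runserver

Note that migrate also creates the django-celery-beat tables that store the schedules.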

This section allows us to create interval schedules and periodic tasks. If we click on Periodic tasks and then on ADD PERIODIC TASK to open the add task page, we can see our three tasks listed in the Task (registered) field.

Let’s add some periodic tasks, one per registered task. Under the schedule section, we’ll add new interval schedules, in seconds, so we can see the periodic tasks being scheduled and executed.

For testing purposes I’ve added three periodic tasks, one for each registered task, each on a short interval.
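
The same schedules can also be created programmatically with the django-celery-beat models, which is handy for seeding environments. Here’s a minimal sketch, run from the Django shell, using an arbitrary 10-second interval:

from django_celery_beat.models import IntervalSchedule, PeriodicTask

# An interval schedule that fires every 10 seconds.
schedule, _ = IntervalSchedule.objects.get_or_create(
    every=10,
    period=IntervalSchedule.SECONDS,
)

# One periodic task per registered task, all sharing the same interval.
for task_name in ['demo.test_task_1', 'demo.test_task_2', 'demo.test_task_3']:
    PeriodicTask.objects.get_or_create(
        name='run {}'.format(task_name),
        task=task_name,
        interval=schedule,
    )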

To see our periodic tasks in action we’ll start a celery worker and celery beat in the other terminal windows.

We can start the celery worker with the following command. The -A option refers to our Django project to read settings from. The -Q option tells the worker which queue to read jobs from; this is the same queue name used in the shared_task decorator. The -l option sets the log level.

celery -A testcelery worker -Q normal -l info
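
With the worker running, you can also queue a task ad hoc from the Django shell to confirm the plumbing before any schedule fires; a quick sketch:

python manage.py shell
>>> from demo.tasks import test_task_1
>>> test_task_1.delay()  # sends a job to the 'normal' queue; watch the worker log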

We can start celery beat with the following command. Celery beat is a cron-like process that reads the periodic tasks and puts a job on the queue at each scheduled interval, to be consumed by a worker.

celery -A testcelery beat -l info
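
Because we set CELERYBEAT_SCHEDULER in settings.py, beat picks up the django-celery-beat database scheduler automatically. If you’d rather not rely on that setting, the scheduler can also be passed on the command line:

celery -A testcelery beat -l info --scheduler django_celery_beat.schedulers:DatabaseScheduler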

In the beat terminal, you’ll see a log line each time a due task is sent to the queue; notice the task names and timestamps.

In the worker terminal, you’ll see each job being received and executed, including the ‘started’ and ‘completed’ log messages from our tasks.

Back in the Django admin, we can see the last run timestamp listed in the periodic task list.

Because the periodic tasks live in the database, changes, such as to the interval, can be made on the fly. The django-celery-beat package includes several features for managing periodic tasks. For example, the positional and keyword argument fields allow passing parameters to the tasks in tasks.py, as shown below.
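
As an illustration of argument passing, a hypothetical parameterized task (not one of the three above) could be added to the same tasks.py, with the periodic task’s Positional Arguments field set to ["world"] or its Keyword Arguments field set to {"name": "world"} in the admin:

@shared_task(time_limit=30, max_retries=0, name='demo.greet', queue='normal')
def greet(name):
    # 'name' is filled from the JSON arguments configured on the periodic task
    logging.info('hello, %s', name)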
