How to install Celery on Django and Create a Periodic Task

Posted on February 16, 2014 (updated December 28, 2015) by Marina Mele

Updated in December 2015! – Now for Celery 3.1.19 and Django 1.8.7.

This post explains how to set up Celery with Django, using RabbitMQ as a message broker.

It also explains how to create a periodic task.

The Broker: RabbitMQ

First, we need to choose what is called a Message Broker, which Celery requires in order to send and receive messages. Here we will use RabbitMQ, which is feature-complete, stable, durable and easy to install. Moreover, it is the default broker, so it does not require additional configuration 🙂

Check out how to install it for your particular system here. If you are using Mac OS X, you can install it with Homebrew (and if you want to install Homebrew first, check this post):

$ brew install rabbitmq

The RabbitMQ server scripts are installed into /usr/local/sbin. This directory is not automatically added to your PATH, so open or create a .bash_profile file in your home folder and add the following line:

export PATH=$PATH:/usr/local/sbin
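
If you use bash, reload the profile so the new PATH takes effect in your current shell, and check that the server script is now found:

$ source ~/.bash_profile
$ which rabbitmq-server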

The server can then be started with

$ sudo rabbitmq-server -detached

where the -detached flag tells the server to run in the background. To stop the server, use

$ sudo rabbitmqctl stop

You can find a detailed description of how to use RabbitMQ with Celery here.

After installing RabbitMQ, we need to create a RabbitMQ user and a virtual host, and allow that user to access the virtual host. We start the server first:

$ sudo rabbitmq-server -detached
$ sudo rabbitmqctl add_user myuser mypassword
$ sudo rabbitmqctl add_vhost myvhost
$ sudo rabbitmqctl set_permissions -p myvhost myuser ".*" ".*" ".*"
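
You can check that the user, virtual host and permissions were created correctly with:

$ sudo rabbitmqctl list_users
$ sudo rabbitmqctl list_vhosts
$ sudo rabbitmqctl list_permissions -p myvhost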

Then, open your Django project's settings.py file and configure RabbitMQ by adding this line:

BROKER_URL = "amqp://myuser:mypassword@localhost:5672/myvhost"

This tells Celery where your broker (your queue) is located. Here, we are running Celery on the same machine as RabbitMQ, so we use localhost to find it.
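
In general, the broker URL has the form amqp://user:password@hostname:port/vhost, so if RabbitMQ runs on a different machine you only need to replace localhost with its hostname. For example (the hostname below is just a placeholder):

BROKER_URL = "amqp://myuser:mypassword@rabbit.example.com:5672/myvhost"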

Celery

Celery is on the Python Package Index (PyPI) and can be easily installed with pip or easy_install. Remember to activate your virtual environment first (if you want to use virtualenv to create a virtual environment, check this post).

$ pip install celery

Next, add this package to your requirements.txt file, so that both the production environment and the development environment on your local machine will use it. Recall that you can check the packages used by the current environment with

$ pip freeze

You will see that you have installed celery, pytz, billiard, kombu, anyjson and amqp. Write them all in your requirements file. You can also write them there directly with

$ pip freeze > requirements.txt
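
Your requirements file will then list these packages with pinned versions, similar to the sketch below (only the celery version comes from this post; replace the X.Y.Z placeholders with whatever pip freeze reports on your machine):

amqp==X.Y.Z
anyjson==X.Y.Z
billiard==X.Y.Z
celery==3.1.19
kombu==X.Y.Z
pytz==X.Y.Z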

Now, we need to create a Celery instance, called a Celery app. Create a file at the same level of your settings.py file:

$ touch myprojectfolder/myproject/celery.py

And write the following code:

# Make sure "from celery import Celery" imports the celery package,
# not this module (needed on Python 2, as in the official Celery docs)
from __future__ import absolute_import

import os
from celery import Celery
from django.conf import settings

# Tell Celery to use the default Django settings module
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'myproject.settings')

app = Celery('myproject')
app.config_from_object('django.conf:settings')
# Autodiscover all the tasks.py modules that are in your app folders
app.autodiscover_tasks(lambda: settings.INSTALLED_APPS)

Then, to ensure that the app is loaded when Django starts, you need to import this app in the __init__.py file.

Open the __init__.py file that is at the same level as the settings.py and celery.py files and write:

from .celery import app as celery_app
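
With the app loaded, autodiscover_tasks will pick up any tasks.py module inside your INSTALLED_APPS. As a quick check that everything is wired up, you can define a plain (non-periodic) task with the shared_task decorator; the app and function names below are just an example:

# myapp/tasks.py
from celery import shared_task

@shared_task
def add(x, y):
    # Trivial task, only used to verify that the worker picks it up
    return x + y

Then, once a worker is running (see "Run it!" below), you can call it asynchronously from python manage.py shell:

>>> from myapp.tasks import add
>>> add.delay(2, 3)  # returns an AsyncResult; the worker executes the task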

Moreover, for security purposes, you should specify a list of accepted content-types in the settings.py file. In this case, we will set json as our content type:

CELERY_ACCEPT_CONTENT = ['json']

Then, we need to specify the task and result serializers accordingly:

CELERY_TASK_SERIALIZER = 'json'
CELERY_RESULT_SERIALIZER = 'json'

Finally, we can specify the time zone we are in:

CELERY_TIMEZONE = 'Europe/Madrid'

Note: In Celery 3.0+ the setting CELERY_ENABLE_UTC is enabled by default (it is set to True). When enabled, dates and times in messages are converted to UTC.
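
If you prefer to be explicit about it, you can set it yourself in the settings.py file:

CELERY_ENABLE_UTC = True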

Django-celery

If you want to store task results in the Django database, you’ll have to install the django-celery package. This package defines a result backend to keep track of the state of the tasks. To install it use:

$ pip install django-celery

Remember to include it in your requirements file. Then, add it to the installed apps in your settings file:

INSTALLED_APPS = (
    ...
    'djcelery',
    ...
)

Next, we need to create the corresponding database tables of this app, which can be done with:

$ python manage.py migrate djcelery

As we have told Celery to use our settings.py file, we can configure the django-celery result backend and its database scheduler by adding these lines to settings.py:

CELERY_RESULT_BACKEND = 'djcelery.backends.database:DatabaseBackend'
CELERYBEAT_SCHEDULER = 'djcelery.schedulers.DatabaseScheduler'

Create a Periodic Task

One thing you might want to use in your project is a scraper: for example, an application that runs periodically at night to update some data for your website.

Choose or create an application in your Django project to hold the scraper. Then, create and edit the file myapp/utils/scrapers.py (note: you must have an empty __init__.py file inside the utils folder). The scrapers.py file must contain a function that performs your desired operations, like accessing an API and modifying your database.

In this example, we just write:

def scraper_example(a, b):
    return a + b
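
You can test this function on its own from a Django shell before wiring it to Celery:

$ python manage.py shell
>>> from myapp.utils import scrapers
>>> scrapers.scraper_example(2, 3)
5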

Then, create the file myapp/tasks.py and edit it:

from celery.task.schedules import crontab
from celery.decorators import periodic_task
from myapp.utils import scrapers
from celery.utils.log import get_task_logger
from datetime import datetime

logger = get_task_logger(__name__)

# A periodic task that will run every minute (the symbol "*" means every)
@periodic_task(run_every=(crontab(hour="*", minute="*", day_of_week="*")))
def scraper_example():
    logger.info("Start task")
    now = datetime.now()
    result = scrapers.scraper_example(now.day, now.minute)
    logger.info("Task finished: result = %i" % result)

Here, we have created a periodic task that runs every minute. It writes two messages to the logger, marking the beginning and the end of the task, and calls our scraper function in between.
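
Note that instead of the periodic_task decorator, you can also declare the schedule in settings.py through the CELERYBEAT_SCHEDULE setting. A sketch of an equivalent entry for our task (the entry name is arbitrary) would be:

from celery.schedules import crontab

CELERYBEAT_SCHEDULE = {
    'scraper-example-every-minute': {
        'task': 'myapp.tasks.scraper_example',
        'schedule': crontab(hour='*', minute='*', day_of_week='*'),
    },
}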

Run it!

Ok, so we have our periodic task created, but how can we run it? First, start your RabbitMQ server:

$ sudo rabbitmq-server -detached

Next, start a Celery worker:

$ python manage.py celeryd --verbosity=2 --loglevel=DEBUG

If the installation is correct, you should see something like this near the top of the output:

transport: amqp://myuser@localhost:5672/myvhost
results: djcelery.backends.database:DatabaseBackend

And a list of the application tasks:

[tasks]
  .  celery.backend_cleanup
  .  ......etc
  .  myapp.tasks.scraper_example

Next, open a new terminal tab and start celerybeat, which will periodically send the registered tasks to RabbitMQ:

$ python manage.py celerybeat --verbosity=2 --loglevel=DEBUG

If you go back to the Celery worker tab, you will see the results of your tasks 🙂

And finally, open another tab and start your Django development server:

$ python manage.py runserver

Note: Beat needs to store the last run times of the tasks in a local database file, which by default is celerybeat-schedule.db and is placed at the same level as your manage.py file. If you are using Git for version control, you should add this file to your .gitignore.
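
For example, you could add this pattern to your .gitignore file (it also covers the case where the file is created without the .db extension):

celerybeat-schedule*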
