Celery Tips¶

Following on from yesterday’s post about Virtualenv Tips, today I’ll be talking about Celery tips. Yesterday I covered how to run celery with upstart easily; below I’ll be expanding on that, as well as talking about how to set it up using supervisord.

Note: Also interesting, I wrote a Big list of django tips back in 2008 that still has a lot of good information.

Running celery in development¶

When you run celery in production, you should be using a queue on the backend. When you’re running celery in development, however, it’s nice to exercise the code paths without actually needing a queue. This is where the CELERY_ALWAYS_EAGER setting comes in handy. It makes celery run your tasks in-process, synchronously, so your code paths still execute correctly. I talk about this and more in my djangocon talk.
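As a minimal sketch, the development override is a single setting; the second line is an extra setting I find pairs well with it (it re-raises task exceptions in-process), not something required:

```python
# settings_dev.py -- minimal sketch of a development configuration
CELERY_ALWAYS_EAGER = True  # run tasks synchronously, in-process; no queue needed
CELERY_EAGER_PROPAGATES_EXCEPTIONS = True  # surface task errors immediately in dev
```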

Killing long-running tasks¶

On ReadTheDocs I would run into problems with celery tasks that never returned. Luckily, celery has a way to handle this: the CELERYD_TASK_TIME_LIMIT setting lets you set the number of seconds a task can run before it is killed. This makes sure a runaway task won’t take down all of your backend processing.
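A sketch of what that looks like in settings; the 15-minute value is an arbitrary example, and the soft-limit line is an assumption on my part (it raises an exception inside the task before the hard kill, giving it a chance to clean up):

```python
# settings.py -- hard-kill any task that runs longer than 15 minutes (example value)
CELERYD_TASK_TIME_LIMIT = 60 * 15  # seconds; the worker process is killed past this
# Assumption: the companion soft limit raises SoftTimeLimitExceeded inside the
# task shortly before the hard limit, so the task can clean up gracefully.
CELERYD_TASK_SOFT_TIME_LIMIT = 60 * 14
```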

Use the JSON serializer for interoperability¶

I was talking on IRC to Eric Florenzano and he mentioned that you should use the json serializer if you want to be able to add celery tasks from other languages. This allows another language to put a message that looks like a celery task in the queue, and it should just work.
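Switching the serializer is one setting, and the JSON body a non-Python producer would publish looks roughly like the sketch below (the task name and values here are hypothetical examples, not from the post):

```python
import json
import uuid

# settings.py -- celery's default serializer is pickle, which only Python reads
CELERY_TASK_SERIALIZER = "json"

# Roughly the message body another language would put on the queue
# (task name and arguments are hypothetical):
body = json.dumps({
    "id": str(uuid.uuid4()),      # unique task id
    "task": "myapp.tasks.add",    # dotted path to the registered task
    "args": [2, 2],
    "kwargs": {},
    "retries": 0,
})

decoded = json.loads(body)
```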

Explicitly set the number of workers¶

When you run celery, it defaults to having the number of worker processes equal to the number of cores the machine has. If you are running multiple queue workers on the same machine, it is a good idea to use fewer. You can set this with the CELERYD_CONCURRENCY setting, or by passing -c <num> on the command line.
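As a sketch, pinning the concurrency is one setting; the value 4 is an arbitrary example, e.g. for two queue workers sharing an 8-core box:

```python
# settings.py -- cap the worker at 4 processes instead of one per core,
# e.g. when two queue workers share the same 8-core machine (example numbers)
CELERYD_CONCURRENCY = 4
# Equivalent on the command line: pass -c 4 when starting the worker
```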