How to run a Celery worker with a Django app scalable by AWS Elastic Beanstalk?

This is how I set up Celery with Django on Elastic Beanstalk with scalability working fine. Please keep in mind that the 'leader_only' option for container_commands works only on environment rebuild or deployment of the app. If the service runs long enough, the leader node may be removed by Elastic Beanstalk. To deal with that, you may have … Read more
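For illustration, a leader_only container command lives in an .ebextensions config file roughly like this (a minimal sketch: the file name and the migrate command are placeholders, not the full Celery setup from the answer):

    # .ebextensions/01_commands.config
    container_commands:
      01_migrate:
        # Runs only on the leader instance, and only during a deployment
        # or environment rebuild, never when the group merely scales out.
        command: "python manage.py migrate --noinput"
        leader_only: true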

Running "unique" tasks with Celery

Based on MattH's answer, you could use a decorator like this:

    import functools

    from django.core.cache import cache

    def single_instance_task(timeout):
        def task_exc(func):
            @functools.wraps(func)
            def wrapper(*args, **kwargs):
                lock_id = "celery-single-instance-" + func.__name__
                acquire_lock = lambda: cache.add(lock_id, "true", timeout)
                release_lock = lambda: cache.delete(lock_id)
                # Only run the task body if we won the lock; otherwise
                # another instance is already running.
                if acquire_lock():
                    try:
                        func(*args, **kwargs)
                    finally:
                        release_lock()
            return wrapper
        return task_exc

then use it like so:

    from datetime import timedelta

    from celery.task import periodic_task  # Celery 3.x-style periodic task

    @periodic_task(run_every=timedelta(minutes=1))
    @single_instance_task(60 * 10)
    def fetch_articles():
        ...

… Read more
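Note that the lock hinges on cache.add being atomic: add sets the key only if it is not already present, which a shared backend like memcached guarantees server-side, so two workers cannot both acquire the lock. A per-process backend like locmem cannot coordinate across workers at all.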

Run a Scrapy spider in a Celery task

The Twisted reactor cannot be restarted. A workaround for this is to let the Celery task fork a new child process for each crawl you want to execute, as proposed in the following post: Running Scrapy spiders in a Celery task. This gets around the "ReactorNotRestartable" issue by utilizing the multiprocessing … Read more
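A minimal sketch of that pattern (run_spider and the spider name argument are placeholders here, and it assumes the worker runs inside a Scrapy project so get_project_settings() can find your spiders):

    from multiprocessing import Process

    from celery import shared_task
    from scrapy.crawler import CrawlerProcess
    from scrapy.utils.project import get_project_settings

    def _crawl(spider_name):
        # Runs in the child process: a brand-new Twisted reactor,
        # started exactly once and never restarted.
        process = CrawlerProcess(get_project_settings())
        process.crawl(spider_name)
        process.start()  # blocks until the crawl finishes

    @shared_task
    def run_spider(spider_name):
        # Fork a fresh child per crawl so the long-lived worker
        # process never has to restart a reactor.
        p = Process(target=_crawl, args=(spider_name,))
        p.start()
        p.join()

If the worker's pool complains that daemonic processes are not allowed to have children, swapping multiprocessing for billiard (the fork of multiprocessing that Celery itself uses) is a common tweak.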

How to run Celery on Windows?

Celery 4.0+ no longer officially supports Windows, but it still works on Windows for some development/test purposes. Use eventlet instead, as below:

    pip install eventlet
    celery -A <module> worker -l info -P eventlet

It works for me on Windows 10 + Celery 4.1 + Python 3 (Celery's default prefork pool is what breaks on Windows in 4.x; the eventlet pool sidesteps it). This solution solved the following exception: [2017-11-16 21:19:46,938: … Read more
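To try the eventlet pool end to end, a minimal module could look like this (the module name tasks and the Redis broker URL are assumptions; substitute your own):

    # tasks.py
    from celery import Celery

    # Broker URL is an assumption: point it at whatever broker you run.
    app = Celery('tasks', broker='redis://localhost:6379/0')

    @app.task
    def add(x, y):
        return x + y

Start it on Windows with celery -A tasks worker -l info -P eventlet, then call add.delay(2, 2) from another shell to confirm the worker picks the task up.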