If we use Celery beat and run about 1000 tasks on the same crontab schedule, will the tasks run one by one, or will some of them not run at all (because they run out of time)? Redis as the message broker.
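For scale, a sketch of what such a schedule could look like (app and task names are assumptions). Beat only enqueues one message per entry when it comes due; how many of those tasks execute in parallel is then decided by the workers' concurrency, not by beat itself:

    # Minimal sketch: 1000 beat entries firing on the same crontab.
    from celery import Celery
    from celery.schedules import crontab

    app = Celery("proj", broker="redis://localhost:6379/0")

    app.conf.beat_schedule = {
        f"job-{i}": {
            "task": "proj.tasks.do_work",        # hypothetical task
            "schedule": crontab(minute="*/10"),  # all 1000 entries fire together
        }
        for i in range(1000)
    }

The 1000 messages land in the broker at the same moment; workers then drain them as fast as their concurrency allows, so tasks are delayed rather than dropped.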
I'm trying to write a Celery (v4.2.1) integration test for my Django (v2.2.3) application. There are a bunch of outdated articles about this around, but none of them work for me.
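A minimal sketch of the pattern the Celery testing docs suggest, using the fixtures shipped in celery.contrib.pytest. The celery_worker fixture starts a real in-process worker, so .delay() does a full publish/consume round-trip instead of running eagerly:

    # conftest.py (assumption: a pytest-based suite)
    pytest_plugins = ("celery.contrib.pytest",)

    # test_tasks.py
    def test_task_roundtrip(celery_app, celery_worker):
        @celery_app.task
        def mul(x, y):
            return x * y

        # round-trips through the in-process worker started by the fixture
        assert mul.delay(4, 4).get(timeout=10) == 16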
I have a periodic task scheduled to run every 10 minutes. Sometimes this task completes in 2-3 minutes, sometimes it takes 20 minutes. Is there any way, using Celery, to keep a new run from starting while the previous one is still in progress?
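A common workaround is a lock that makes a run skip itself if the previous one is still going. A sketch using Django's cache as the lock backend (the task and helper names are hypothetical):

    from celery import shared_task
    from django.core.cache import cache

    @shared_task
    def periodic_job():
        # cache.add() is atomic: it returns False if the key already exists
        if not cache.add("periodic-job-lock", "locked", timeout=60 * 30):
            return "skipped: previous run still in progress"
        try:
            do_the_work()  # hypothetical
        finally:
            cache.delete("periodic-job-lock")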
I have this in my /var/log/celery/w1.log. I'm following the steps for Celery here. I have this in my celery.py: from __future__ import absolute_import, unicode_literals
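For reference, the celery.py that the Celery/Django first-steps guide arrives at ("proj" being the project-name placeholder):

    from __future__ import absolute_import, unicode_literals
    import os
    from celery import Celery

    # make Django settings available before the app is configured
    os.environ.setdefault("DJANGO_SETTINGS_MODULE", "proj.settings")

    app = Celery("proj")
    app.config_from_object("django.conf:settings", namespace="CELERY")
    app.autodiscover_tasks()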
I'm currently managing my API using Celery tasks and a Kubernetes cluster on Google Cloud Platform. Celery is automatically logging the input and output of each task.
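If the goal is to keep arguments and results out of the logs, one option is to raise the level of the two loggers those lines come from. A sketch, with the logger names assumed from Celery's source layout:

    import logging

    # hides "Received task ...[id] args/kwargs" lines
    logging.getLogger("celery.worker.strategy").setLevel(logging.WARNING)
    # hides "Task ... succeeded in ...s: <return value>" lines
    logging.getLogger("celery.app.trace").setLevel(logging.WARNING)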
FOR THOSE READING THIS: I have decided to use RQ instead, which doesn't fail when running code that uses the multiprocessing module. I suggest you use that.
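For comparison, a minimal RQ sketch (the function path is hypothetical; assumes a local Redis and a worker started with "rq worker"):

    from redis import Redis
    from rq import Queue

    q = Queue(connection=Redis())
    # enqueue by dotted path; the worker imports and calls it
    job = q.enqueue("myapp.jobs.crunch_numbers", 42)  # hypothetical function
    print(job.id)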
In my web application I start background jobs with Celery without storing their IDs. Some of the tasks are periodic and some are triggered by user interaction.
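If the IDs were never stored, one option is to ask the workers themselves. A sketch using the remote-control inspect API (the app import path is an assumption; the inspect calls return None when no worker replies):

    from proj.celery import app  # hypothetical app location

    i = app.control.inspect()
    for worker, tasks in (i.active() or {}).items():     # currently executing
        for t in tasks:
            print(worker, t["id"], t["name"])
    for worker, tasks in (i.scheduled() or {}).items():  # eta/countdown tasks
        for t in tasks:
            print(worker, t["request"]["id"])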
I've installed RabbitMQ and it's running. I've successfully run add_user as well as add_vhost. But in the next step of the documentation it says to set_permissions.
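For reference, the documented rabbitmqctl sequence (user, password and vhost are placeholders); the three trailing patterns on set_permissions are the configure, write and read permission regexes:

    rabbitmqctl add_user myuser mypassword
    rabbitmqctl add_vhost myvhost
    rabbitmqctl set_permissions -p myvhost myuser ".*" ".*" ".*"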
I am trying to run my Django server on an Ubuntu instance on AWS EC2. I am using gunicorn to run the server like this: gunicorn --workers 4 --bind 127.0.0.1:8…
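A sketch of a systemd unit that keeps such a gunicorn command running on the instance (user, paths and the port are assumptions):

    [Unit]
    Description=gunicorn daemon for the Django app
    After=network.target

    [Service]
    User=ubuntu
    WorkingDirectory=/home/ubuntu/myproject
    ExecStart=/home/ubuntu/venv/bin/gunicorn --workers 4 --bind 127.0.0.1:8000 myproject.wsgi:application
    Restart=on-failure

    [Install]
    WantedBy=multi-user.target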
I've hit a really nasty situation. I have the following setup: a Django model representing an FSM with a django-fsm field, and a Celery task that sends…
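The usual race in this setup is the worker loading the row before the transition's transaction commits, so the task sees the old state. A common fix, sketched with hypothetical model and task names, is to enqueue only after commit:

    from django.db import transaction

    from myapp.tasks import notify_state_change  # hypothetical task

    def advance_and_notify(obj):
        with transaction.atomic():
            obj.start()  # hypothetical django-fsm transition
            obj.save()
            # the task is enqueued only once the new state is committed
            transaction.on_commit(lambda: notify_state_change.delay(obj.pk))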
I would like to implement a distributed job execution system with Celery. Given that RabbitMQ doesn't support priorities and I badly need this feature, …
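One common workaround is to emulate priority classes with dedicated queues and pin workers to them. A sketch (app, task and queue names are assumptions):

    from celery import Celery

    app = Celery("proj", broker="amqp://localhost//")
    app.conf.task_routes = {
        "proj.tasks.urgent_job": {"queue": "high"},
        "proj.tasks.bulk_job": {"queue": "low"},
    }
    # Give the high-priority queue more capacity, e.g.:
    #   celery -A proj worker -Q high -c 8
    #   celery -A proj worker -Q low  -c 1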
I want to listen to an existing SQS queue through Celery. I have already done publishing to a queue via Celery and then consuming from that queue through workers.
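A sketch of pointing Celery's SQS transport at an existing queue via kombu's predefined_queues transport option (available in newer kombu releases; region, account and queue URL are placeholders). Note that a Celery worker only understands Celery's own task-message format; consuming arbitrary SQS messages needs a custom consumer instead:

    from celery import Celery

    app = Celery("proj", broker="sqs://")
    app.conf.broker_transport_options = {
        "region": "us-east-1",
        "predefined_queues": {
            "existing-queue": {
                "url": "https://sqs.us-east-1.amazonaws.com/123456789012/existing-queue",
            },
        },
    }
    # route everything through the existing queue instead of "celery"
    app.conf.task_default_queue = "existing-queue"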