Django + docker-compose + Celery + Redis - How to use Redis deployed on my own remote server?

I have a Django app deployed in Docker containers.

I have three config environments: dev, preprod and prod. dev is my local environment (localhost), while preprod and prod are remote Linux environments. Everything works when using the "public" Redis server and the standard config.

But I need to use our own Redis instance, deployed in a Docker container named redis_cont on a remote server (192.168.xx.xx).

I do not really know how to configure this, or whether it is even possible. I would appreciate some help.

docker-compose

version: '3.7'

services:
    web:
        restart: always
        build: 
            context: ./app
            dockerfile: Dockerfile.dev
        command: python manage.py runserver 0.0.0.0:8000
        volumes:
            - ./app:/usr/src/app
        ports:
            - 8000:8000
        env_file:
            - ./.env.dev
        entrypoint: [ "/usr/src/app/entrypoint.dev.sh" ]
        depends_on: 
            - redis
        healthcheck:
            test: ["CMD", "curl", "-f", "http://localhost:8000/"]
            interval: 30s
            timeout: 10s
            retries: 50
    redis:
        container_name: redis_cont  # <= container running on the remote Linux server
        image: "redis:alpine"
    celery:
        build: 
            context: ./app
            dockerfile: Dockerfile.dev
        command: celery -A core worker -l info
        volumes:
            - ./app:/usr/src/app
        env_file:
            - ./.env.dev
        depends_on:
            - web
            - redis
    celery-beat:
        build: 
            context: ./app
            dockerfile: Dockerfile.dev
        command: celery -A core beat -l info
        volumes:
            - ./app:/usr/src/app
        env_file:
            - ./.env.dev
        depends_on:
            - web
            - redis

settings.py

from celery.schedules import crontab

CELERY_BROKER_URL = 'redis://redis:6379'
CELERY_RESULT_BACKEND = 'redis://redis:6379'
CELERY_ACCEPT_CONTENT = ['application/json']
CELERY_TASK_SERIALIZER = 'json'
CELERY_RESULT_SERIALIZER = 'json'
CELERY_BEAT_SCHEDULE = {
    'hello': {
        'task': 'project.tasks.hello',
        'schedule': crontab()  # execute every minute
    },
}



Solution 1:[1]

Since the containers are not created by the same docker-compose file, they do not share a network: redis_cont simply does not exist to the services built inside the isolated network of your docker-compose project.

If the Redis container's port is published on the remote server and it is reachable via ip:port, you should be able to use it directly in your settings.py, with no need to add a new service to your own compose file; both sides are sketched below.
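As a minimal sketch of the remote side (assuming the container there is managed by its own compose file; with a plain docker run, the equivalent flag would be -p 6379:6379), publishing the port is what makes Redis reachable from other hosts:

services:
    redis:
        container_name: redis_cont
        image: "redis:alpine"
        ports:
            - 6379:6379

On the Django side, settings.py can then point at the remote host directly. This sketch keeps the 192.168.xx.xx placeholder from the question and assumes the default port 6379 with no password; with authentication enabled, the URL form would be redis://:password@192.168.xx.xx:6379/0:

CELERY_BROKER_URL = 'redis://192.168.xx.xx:6379/0'
CELERY_RESULT_BACKEND = 'redis://192.168.xx.xx:6379/0'

The redis service and the redis entries under depends_on can then be dropped from the local compose file. Since the port is now reachable from the whole network, setting a password via requirepass in the Redis config is worth considering.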


Note

To establish communication between services in the same docker-compose file, you should use the service name (web, celery-beat, etc. in your case), not the container name.
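Since the question already wires a per-environment env file (.env.dev, etc.) into every service, one way to reconcile the two (a sketch, not part of the original answer; CELERY_BROKER_URL as an env-file key is just an illustrative name) is to read the broker URL from the environment, so dev keeps the service name while preprod/prod use the remote ip:port:

# settings.py
import os

# Fall back to the compose service name for local dev.
CELERY_BROKER_URL = os.environ.get('CELERY_BROKER_URL', 'redis://redis:6379')
CELERY_RESULT_BACKEND = os.environ.get('CELERY_RESULT_BACKEND', CELERY_BROKER_URL)

.env.dev would then contain CELERY_BROKER_URL=redis://redis:6379 and the preprod/prod env files CELERY_BROKER_URL=redis://192.168.xx.xx:6379, with no other code changes between environments.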

Sources

This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.

Source: Stack Overflow

Solution 1: zoot