I would like my `docker-compose.yml` file to use the `.env` file in the same directory as the `docker-compose.yml` file to set some environment variables, and for those to take precedence over any env vars set in the shell. Right now I have:

```
$ echo $DB_USER
tommyboy
```

and in my `.env` file I have:

```
$ cat .env
DB_NAME=directory_data
DB_USER=myuser
DB_PASS=mypass
DB_SERVICE=postgres
DB_PORT=5432
```
I have this in my `docker-compose.yml` file:

```yaml
version: '3'
services:
  postgres:
    image: postgres:10.5
    ports:
      - 5105:5432
    environment:
      POSTGRES_DB: directory_data
      POSTGRES_USER: ${DB_USER}
      POSTGRES_PASSWORD: password
  web:
    restart: always
    build: ./web
    ports: # to access the container from outside
      - "8000:8000"
    environment:
      DEBUG: 'true'
      SERVICE_CREDS_JSON_FILE: '/my-app/credentials.json'
      DB_SERVICE: host.docker.internal
      DB_NAME: directory_data
      DB_USER: ${DB_USER}
      DB_PASS: password
      DB_PORT: 5432
    command: /usr/local/bin/gunicorn directory.wsgi:application --reload -w 2 -b :8000
    volumes:
      - ./web/:/app
    depends_on:
      - postgres
```
In my Python 3/Django 3 project, I have this in my application's `settings.py` file:

```python
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.postgresql',
        'NAME': os.environ['DB_NAME'],
        'USER': os.environ['DB_USER'],
        'PASSWORD': os.environ['DB_PASS'],
        'HOST': os.environ['DB_SERVICE'],
        'PORT': os.environ['DB_PORT']
    }
}
```
However when I run my project, using `docker-compose up`, I see:

```
maps-web-1 |   File "/usr/local/lib/python3.9/site-packages/django/db/backends/postgresql/base.py", line 187, in get_new_connection
maps-web-1 |     connection = Database.connect(**conn_params)
maps-web-1 |   File "/usr/local/lib/python3.9/site-packages/psycopg2/__init__.py", line 127, in connect
maps-web-1 |     conn = _connect(dsn, connection_factory=connection_factory, **kwasync)
maps-web-1 | psycopg2.OperationalError: FATAL:  role "tommyboy" does not exist
```
It seems like the Django container is using the shell's env var instead of what is passed in, and I was wondering if there's a way to have the Python/Django container use the `.env` file at the root for its env vars.
I thought at first I had misread your question, but I think my original comment was correct. As I mentioned earlier, it is common for your local shell environment to override things in a `.env` file; this allows you to override settings on the command line. In other words, if you have in your `.env` file:

```
DB_USER=myuser
```

And you want to override the value of `DB_USER` for a single `docker-compose up` invocation, you can run:

```
$ DB_USER=tommyboy docker-compose up
```

That's why values in your local environment take precedence.
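The precedence rule can be sketched in a few lines of Python -- a simplified model of how Compose merges the two sources, not its actual implementation:

```python
def resolve_variables(dotenv_text, shell_env):
    """Merge values from a .env file with the shell environment.

    Models Compose's rule: shell variables win over the .env file.
    """
    values = {}
    for line in dotenv_text.splitlines():
        line = line.strip()
        # Skip blank lines and comments; keep KEY=VALUE pairs.
        if not line or line.startswith("#") or "=" not in line:
            continue
        key, _, value = line.partition("=")
        values[key.strip()] = value.strip()
    # The shell environment takes precedence over the file.
    values.update(shell_env)
    return values

dotenv = """\
DB_NAME=directory_data
DB_USER=myuser
DB_PASS=mypass
"""

# With DB_USER=tommyboy exported in the shell, the shell value wins:
resolved = resolve_variables(dotenv, {"DB_USER": "tommyboy"})
print(resolved["DB_USER"])   # tommyboy
print(resolved["DB_NAME"])   # directory_data
```

Variables not set in the shell still fall through to the `.env` values, which is why only `DB_USER` was affected in your setup.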
When using `docker-compose` with things that store persistent data -- like Postgres! -- you will occasionally see what seems to be weird behavior when working with environment variables that are used to configure the container. Consider this sequence of events:

1. We run `docker-compose up` for the first time, using the values in your `.env` file.

2. We confirm that we can connect to the database as the `myuser` user:

   ```
   $ psql -h localhost -p 5105 -U myuser directory_data
   ```

3. We stop the container by typing CTRL-C.

4. We start the container with a new value for `DB_USER` in our environment:

   ```
   $ DB_USER=tommyboy docker-compose up
   ```

5. We try connecting using the `tommyboy` username:

   ```
   $ psql -h localhost -p 5105 -U tommyboy directory_data
   ```

   ...and it fails.
What's going on here?
The `POSTGRES_*` environment variables you use to configure Postgres are only relevant if the database hasn't already been initialized. When you stop and restart a service with `docker-compose`, it doesn't create a new container; it just restarts the existing one.

That means that in the above sequence of events, the database was originally created with the `myuser` username, and starting it the second time with `DB_USER` set in our environment didn't change anything.

The solution here is to use the `docker-compose down` command, which deletes the containers:

```
$ docker-compose down
```

And then create a new one with the updated environment variable:

```
$ DB_USER=tommyboy docker-compose up
```

Now we can access the database as expected:

```
$ psql -h localhost -p 5105 -U tommyboy directory_data
```
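The whole lifecycle above can be condensed into a toy model -- a hypothetical `FakePostgresContainer` class for illustration, not real Docker code -- showing why a restart keeps the old role while `down`/`up` picks up the new one:

```python
class FakePostgresContainer:
    """Toy model of the Postgres container's init-once behavior."""

    def __init__(self):
        self.initialized = False
        self.role = None

    def start(self, postgres_user):
        # POSTGRES_USER only takes effect the first time the
        # data directory is initialized; restarts skip this step.
        if not self.initialized:
            self.role = postgres_user
            self.initialized = True

# docker-compose up (first run): database initialized with myuser
c = FakePostgresContainer()
c.start("myuser")

# CTRL-C, then DB_USER=tommyboy docker-compose up:
# the same container restarts, so initialization is skipped
c.start("tommyboy")
print(c.role)   # myuser

# docker-compose down deletes the container...
c = FakePostgresContainer()
# ...and the next up initializes a fresh one
c.start("tommyboy")
print(c.role)   # tommyboy
```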