
I have a main service in my docker-compose file that uses the postgres image and, though I seem to connect to the database successfully, the data I write to it is not kept beyond the lifetime of the container (my setup is based on this tutorial).

Here's my docker-compose file:

main:
  build: .
  volumes:
    - .:/code
  links:
    - postgresdb
  command: python manage.py insert_into_database
  environment:
    - DEBUG=true


postgresdb:
  build: utils/sql/
  volumes_from:
    - postgresdbdata
  ports:
    - "5432"
  environment:
    - DEBUG=true


postgresdbdata:
  build: utils/sql/
  volumes:
    - /var/lib/postgresql
  command: true
  environment:
    - DEBUG=true

and here's the Dockerfile I'm using for the postgresdb and postgresdbdata services (which essentially creates the database and adds a user):

FROM postgres

ADD make-db.sh /docker-entrypoint-initdb.d/
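
To give an idea of what that means, a make-db.sh along these lines would create a database and a user (the database and user names below are just placeholders):

#!/bin/bash
# Placeholder init script: the official postgres image runs anything placed in
# /docker-entrypoint-initdb.d/ the first time the data directory is initialized.
set -e

psql -v ON_ERROR_STOP=1 --username "$POSTGRES_USER" <<-EOSQL
    CREATE USER myapp WITH PASSWORD 'myapp';
    CREATE DATABASE myapp OWNER myapp;
EOSQL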

How can I get the data to persist after the main service has finished running, so that I can use it later (for example, when I call something like python manage.py retrieve_from_database)? Is /var/lib/postgresql even the right directory, and would boot2docker have access to it, given that it's apparently limited to /Users/?

Thank you!

Is Auto-Commit set to true, or are you committing your changes manually? – Politank-Z Apr 24 at 16:03
    
auto-commit on sqlalchemy (that's what I'm using)? I believe I'm committing the changes "manually" --that is, by running python manage.py insert_into_database within the main service and letting that commit to Postgres (which worked before I started using Docker). Is this what you mean? – miguel5 Apr 24 at 16:07
    
Yes. Knowing Postgres, a failure to commit seemed the likeliest explanation. I am out of my depth otherwise and shall withdraw. – Politank-Z Apr 24 at 16:11
    
Thanks for your answer, @Politank-Z! I think it's a problem with the way I set up Docker though, because I had no problems with Postgres before I tried it... – miguel5 Apr 24 at 16:14

1 Answer


The problem is that Compose creates a new version of the postgresdbdata container each time it restarts, so the old container and its data get lost.

A secondary issue is that your data container shouldn't actually be running; data containers are really just a namespace for a volume that can be imported with --volumes-from, which still works with stopped containers.

For the time being, the best solution is to take the postgresdbdata container out of the Compose config and create it by hand instead. Do something like:

$ docker run --name postgresdbdata postgresdb echo "Postgres data container"
Postgres data container

The echo command will run and the container will exit, but as long as you don't docker rm it, you will still be able to refer to it with --volumes-from, and your Compose application should work fine.
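
The Compose file itself then only needs the other two services. A sketch of what that could look like (this assumes Compose falls back to the existing postgresdbdata container when no service of that name is defined, which is what the approach above relies on):

main:
  build: .
  volumes:
    - .:/code
  links:
    - postgresdb
  command: python manage.py insert_into_database
  environment:
    - DEBUG=true

postgresdb:
  build: utils/sql/
  volumes_from:
    - postgresdbdata   # now refers to the standalone container created above
  ports:
    - "5432"
  environment:
    - DEBUG=true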

I see... So what can I do, Adrian? – miguel5 Apr 24 at 17:58
    
@miguel5 You caught me in between drafts - I've updated my answer. – Adrian Mouat Apr 24 at 18:11
