Simple Heroku-like workflow with git and docker compose
Inspired by Aria’s post about simple deployment with git using a new feature in git 2.3.0, receive.denyCurrentBranch = updateInstead, I have been experimenting with putting together a Heroku-like workflow (i.e. deploy with just git push heroku master) with just git and docker.
Since Heroku essentially consists of a compute worker and a bunch of services that are exposed to the main process through environment variables, it fits quite nicely with the docker ecosystem. Specifically, the setup can be achieved quite seamlessly with docker-compose.
The primary reason I wanted to move away from Heroku is cost. Most of what I do is just at the hobby level, but with Heroku’s recent changes to their pricing, my hobby app will be forced to sleep if it runs for more than 18 hours a day. That seems reasonable, but I kept running into issues with an app that is pinged every 10 minutes through a cron job.
For this new setup, I use Digital Ocean as the hosting provider. It isn’t exactly free either, but for $5-10/month I can host all of my hobby apps. Also, I was already paying for the droplet to run a couple of other services.
Setup
Docker Compose
You develop your app as usual, then add a docker-compose.yml file to declare all the different services that your app relies on (the main suspect is usually a database). Even if your app consists of a single container, you can still use docker-compose to run it. The reason for using compose is its smart recreate feature, which only rebuilds a container if its configuration has changed.
Here’s an example docker-compose.yml file I have for a node application:
app:
  build: .
  links:
    - db:mongodb
  environment:
    PORT: 3000
  ports:
    - "3000:3000"

db:
  image: mongo
  volumes:
    - ./data/db:/data/db
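For reference, bringing this stack up on the server is just the standard compose commands; rerunning them after a change is what triggers the smart recreate behaviour mentioned above (nothing here is specific to this app):

:; docker-compose build      # build the app image from the Dockerfile
:; docker-compose up -d      # start app and db in the background

# Rerunning up -d later only recreates the containers whose image or
# configuration actually changed; the rest are left untouched.
:; docker-compose up -d
:; docker-compose ps         # check that both containers are running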
Git
On the hosting server, create a new git repository that will serve as your remote. For example:
:; git init myapp
:; cd myapp
:; git config receive.denyCurrentBranch updateInstead
This allows the remote repo to accept pushes to its currently checked-out branch and update the working tree in place.
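Since updateInstead only exists in git 2.3.0 and newer, it’s worth double-checking the server’s git before going further:

:; git --version                           # needs to be 2.3.0 or newer
:; git config receive.denyCurrentBranch    # should print: updateInstead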
Now, back on the local machine, add the new remote:
:; git remote add do ssh://droplet-ip/path-to-remote-myapp
- do – the name of the git remote; it can be anything we want it to be
- droplet-ip – the IP address of the remote hosting server, in this case a Digital Ocean droplet
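Before wiring up any hooks, a quick sanity check that the remote is reachable over ssh doesn’t hurt:

:; git remote -v        # the "do" remote should be listed
:; git ls-remote do     # should connect without errors (output is empty for a fresh repo)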
Lastly, the remote git repository needs to be configured to run docker-compose up -d every time it receives a push update. This can be achieved simply with git hooks:
:; cd myapp
:; cd .git/hooks
:; touch post-update
:; chmod +x post-update
The post-update file should have the following content:
#!/usr/bin/env bash
docker-compose build
docker-compose up -d
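One caveat: per the githooks documentation, hooks triggered by a push (post-update included) are executed from $GIT_DIR rather than the root of the working tree, so if docker-compose complains that it can’t find docker-compose.yml, a slightly more defensive version of the hook is:

#!/usr/bin/env bash
# Push hooks run inside .git; move to the work tree and clear GIT_DIR
# so that docker-compose (and any git commands it spawns) behave normally.
cd ..
unset GIT_DIR
docker-compose build
docker-compose up -d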
With this setup, every time we make a change to our app locally, we can simply deploy it with
:; git push do
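A deploy is then just an ordinary commit-and-push cycle (the commit message below is only an example); the post-update hook on the droplet rebuilds and restarts the containers:

:; git add -A
:; git commit -m "Update landing page copy"
:; git push do master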
WordPress
This setup can be used with any application stack out there. I have also used it quite successfully to set up a WordPress site. Following the same instructions described above, here’s a sample docker-compose.yml file I use for a WordPress instance.
wordpress:
  image: wordpress
  external_links:
    - wordpress_db:mysql
  environment:
    WORDPRESS_DB_NAME: wp_blog
  volumes:
    - .:/var/www/html/wp-content/themes/mytheme
    - ./uploads:/var/www/html/wp-content/uploads
    - ./plugins:/var/www/html/wp-content/plugins
  ports:
    - "5000:80"
A couple of notes here:
- An external mysql database container, already running outside of this compose setup, is used here, which explains the external_links instead of the usual links (see the example after this list for how such a container might be started).
- The current working directory is mounted as the theme for the installation.
- The uploads and plugins directories are mounted to store any uploaded media and any plugins installed for this WordPress site, so that the data persists even if the container is killed. These two directories are therefore .gitignore-ed.
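For completeness, the external database container referenced by external_links could have been started with something along these lines (the name wordpress_db matches the compose file above; the password and host data path are placeholders, not from the original setup):

:; docker run -d \
     --name wordpress_db \
     -e MYSQL_ROOT_PASSWORD=changeme \
     -v /srv/mysql-data:/var/lib/mysql \
     mysql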
In this setup, the docker-compose up -d step only needs to be run once. Since the container runs the stock WordPress image, it doesn’t need to be rebuilt with every code change. The post-update git hook is still used, however, to run an npm build step that compiles the css and js assets for the theme:
#!/usr/bin/env bash
npm prune && npm install
npm run build
I used to use VVV for my local WordPress setup, but after switching over to docker, I can now truly guarantee that my site runs the same way locally and in production. The setup is also much simpler. I can focus on working on the theme, which is my main goal; everything else is scripted and automated.
Conclusion
This new setup is very refreshing to me, as it greatly simplifies the deployment process, and by using docker it is readily scalable out of the box. The best part, however, is that I get to control every part of the process instead of relying on an opaque architecture like Heroku’s. All of this is achieved with very little sysadmin overhead on my part.