Deploying WordPress with Docker Compose on Ubuntu
How to deploy custom WordPress themes and plugins in a local container managed by Docker Compose on Ubuntu
I know, there are lots of tutorials about running WordPress with Docker, and I followed some of them to get it working on my machine too. But I ran into issues they didn’t cover, so I want to share how to avoid them. Like me, you may run multiple servers under the same operating system, and that’s an annoying problem: you need to reserve local addresses and ports that could already be taken by other services. In that case, Docker Compose won’t work as expected, and it isn’t easy to understand why at first.
Docker CE (Community Edition)
First off, you need to install Docker. I’ll show you how to do so on Ubuntu 19.10 and later, but (as long as you have a working Windows 10 Pro copy) you can do the same under WSL 2: it was the best choice for a startup I worked with. Unfortunately, it doesn’t run on Windows 10 Home,¹ because you need full Hyper-V support to get it running. Let’s say that a “native” Linux installation is preferable; you can follow the same steps below on any Debian-based system. Here’s the official way to install Docker CE on Ubuntu:
$ sudo apt-get update
$ sudo apt-get install apt-transport-https ca-certificates curl gnupg-agent software-properties-common
$ curl -fsSL https://download.docker.com/linux/ubuntu/gpg | sudo apt-key add -
$ sudo add-apt-repository "deb [arch=amd64] https://download.docker.com/linux/ubuntu $(lsb_release -cs) stable"
$ sudo apt-get update
$ sudo apt-get install docker-ce docker-ce-cli containerd.io
$ sudo usermod -aG docker username
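One gotcha: the usermod step (where username is your own account name) only takes effect after you log out and back in, or run newgrp docker. A quick check, sketched here, tells you whether the docker group is already active in your current session:

```shell
# Check whether the current session already has the docker group;
# until it does, the `docker` command still requires sudo.
if id -nG | grep -qw docker; then
  echo "docker group active: you can run docker without sudo"
else
  echo "docker group not active yet: log out and back in (or run: newgrp docker)"
fi
```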
Once Docker is installed, you need an extra step to get Docker Compose: it’s a lightweight tool that starts multiple containers without resorting to heavier solutions like Kubernetes. Running WordPress doesn’t require such a complex infrastructure, so I won’t cover Kubernetes here. You could also build and run the containers with Docker CE alone, but I find Compose easier to maintain across multiple devices; in fact, I tend to work with different operating systems and machines, so I need a consistent setup I can easily share between them.
$ sudo curl -L "https://github.com/docker/compose/releases/download/1.25.5/docker-compose-$(uname -s)-$(uname -m)" -o /usr/local/bin/docker-compose
$ sudo chmod +x /usr/local/bin/docker-compose
Note that the version mentioned above was the latest at the time of writing; check the official repository on GitHub for the current stable release. Docker Compose is as small as a single executable you run in the terminal emulator. FYI, I use Tilix on Ubuntu Budgie running on DigitalOcean, but it works on all platforms, macOS included; it lets you start a full development environment for WordPress, controlled by a simple YAML configuration file.² You can find a basic setup in the Docker documentation.
Why do you need further instructions? Well, the basic setup doesn’t cover some important aspects. I guess you’re doing this to develop WordPress themes and/or plugins locally, so you need to know where their files are saved. Maybe you already have them stored somewhere: in that case, you want to point the containers at their local path. That’s what I needed to do when I started looking for a solution like this, but I wasn’t able to find a working answer to my questions. Now, have a look at what I did in my docker-compose.yml file to solve this:
# image: mariadb:latest # MariaDB
command: mysqld --default-authentication-plugin=mysql_native_password
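Only a couple of lines of the file fit above, so here’s a sketch of what the whole docker-compose.yml can look like. It’s based on the official WordPress Compose example; the ./my-theme and ./my-plugin paths, the service names, and the passwords are placeholders you should adapt to your own project:

```yaml
version: "3.3"

services:
  db:
    image: mysql:5.7
    # image: mariadb:latest  # MariaDB works as a drop-in replacement
    command: mysqld --default-authentication-plugin=mysql_native_password
    restart: always
    volumes:
      - db_data:/var/lib/mysql           # keep the database outside the container
    environment:
      MYSQL_ROOT_PASSWORD: somewordpress # placeholder: change me
      MYSQL_DATABASE: wordpress
      MYSQL_USER: wordpress
      MYSQL_PASSWORD: wordpress          # placeholder: change me

  wordpress:
    image: wordpress:latest
    depends_on:
      - db
    restart: always
    ports:
      - "80"                             # random host port; use "8000:80" to pin it
    environment:
      WORDPRESS_DB_HOST: db:3306
      WORDPRESS_DB_USER: wordpress
      WORDPRESS_DB_PASSWORD: wordpress
      WORDPRESS_DB_NAME: wordpress
    volumes:
      # link your local theme and plugin sources into the container
      - ./my-theme:/var/www/html/wp-content/themes/my-theme
      - ./my-plugin:/var/www/html/wp-content/plugins/my-plugin

volumes:
  db_data: {}
```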
Before any WordPress configuration, you have to provide a MySQL database. I didn’t specify a custom address, nor a port, because doing so broke the Docker Compose setup I was working on. You can see that the container uses the same data path as a standard installation: the two can’t run together unless you mount this path to another directory. So, if you want to get it running under localhost, you can’t leave another database server active on it; the same goes for MariaDB, and for Apache and NGINX, covered in the next section.
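If localhost:3306 is already taken by a native MySQL or MariaDB server and you don’t want to uninstall it, one workaround (not the route I took) is to publish the container’s port on a different host port. A hedged sketch, with 3307 as an arbitrary free port:

```yaml
services:
  db:
    image: mysql:5.7
    ports:
      - "3307:3306"  # host port 3307 -> container port 3306, dodging the native server
```

WordPress inside the Compose network still reaches the database as db:3306; only tools on the host (a local MySQL client, for instance) need the remapped 3307.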
The most important thing to notice above is under volumes: those paths link your existing plugins and themes into WordPress. If you don’t specify them, you won’t be able to access anything, because without a mount the files stay inside the container and are gone once you shut it down. I know, this configuration conflicts with any local HTTP server you may have installed, and that’s why I suggest my solution to those who don’t have running servers at all. I wanted to focus on theme and plugin development.
$ cd /path/to/project # the directory containing docker-compose.yml
$ docker-compose up -d
You may also specify a static address (and modify the default ports) under ports for all the services, from MySQL to WordPress, but I wasn’t able to get one: with Apache installed and running on my machine, most of them were already reserved. I was forced to look up the randomly assigned address with docker-compose logs, and it changed every time I restarted the machine. A Docker Compose setup like that would have been useless for my purpose, since it would always need an updated wp-config.php file. I fixed it by removing Apache.
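Before pinning a port under ports, it’s worth checking whether something on the host is already listening on it; that’s exactly what kept breaking my setup. A small sketch using bash’s /dev/tcp redirection (3306 is just an example value):

```shell
# Returns success if something on localhost is listening on the given port.
# Uses bash's /dev/tcp pseudo-device, so run this with bash, not plain sh.
port_in_use() {
  (exec 3<>"/dev/tcp/127.0.0.1/$1") 2>/dev/null
}

if port_in_use 3306; then
  echo "3306 is taken: pick another host port in docker-compose.yml"
else
  echo "3306 is free"
fi
```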
¹EDIT: a new version of Docker Desktop for Windows 10 Home has been released. (2020/03/05)
²EDIT: an open source Compose Specification is now available. (2020/04/07)
I chose to keep the default networking configuration of Docker Compose because, as a front-end developer, I have several services running on the same local address with custom ports: Angular, React, and Vue.js, to name a few. BTW, my own setup includes phpMyAdmin as well; I shared it on GitHub, because I find it really useful to have a simple WordPress installation to work with. Of course, this solution isn’t safe for production, but you can clone it if you just need a working environment to start building new themes and plugins.