When integrating a PHP application with a PostgreSQL database, with both services running as Docker containers built from the official PHP and Postgres images, you might encounter (as I have) an error that looks something like this:
Uncaught Error: Call to undefined function pg_connect() in ...
It’s actually a simple error: pg_connect() is undefined because the PHP extension that provides it isn’t installed, so the app cannot talk to the PostgreSQL database at all. But when I first stumbled on it I had a hard time figuring out how to fix it. Something was clearly missing from the Docker setup, but I did not know what it was until I sought help from a teammate.
Apparently the official PHP Docker image does not contain the PDO and PGSQL drivers necessary for a successful connection. Silly me for assuming it does.
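You can see this for yourself by listing the modules the stock image ships with (the image tag here is just an example):

```shell
# pgsql and pdo_pgsql are missing from the output of the stock image
docker run --rm php:8-apache php -m
```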
The fix is simple. We have to create a Dockerfile that extends our PHP image with the required drivers. It contains the following code:
# Use the official PHP image as the base (the tag here is just an example)
FROM php:8-apache

# Install PDO and PGSQL Drivers
RUN apt-get update \
&& apt-get install -y libpq-dev \
&& docker-php-ext-configure pgsql --with-pgsql=/usr/local/pgsql \
&& docker-php-ext-install pdo pdo_pgsql pgsql
Easy peasy. And to use this Dockerfile from a docker-compose.yml file, we’ll need the build key with its context and dockerfile options, replacing the single image key that does not use a Dockerfile:
services:
  app:  # "app" is a placeholder service name
    build:
      context: .
      dockerfile: Dockerfile
    # All your other settings follow ...
And that should be all that you need to do! 🙂
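Once the image is rebuilt, a quick way to confirm the drivers are actually loaded, assuming your compose service is named app:

```shell
# Should print pgsql and pdo_pgsql once the new image is in use
docker-compose exec app php -m | grep -i pgsql
```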
When I first tried Docker a couple of years back, I did not find it much different from using a virtual machine. Perhaps that was because I was experimenting with it on Windows, or perhaps because Docker was still a relatively new tool back then. I remember not having a pleasant experience installing and running it on my machine; at the time it was just easier to run and debug Selenium tests on a VM.
I tried again recently.
Building a Docker image containing test code with dependencies automatically installed, with Docker Toolbox on Windows 7
And I was both surprised and delighted to be able to build a Docker image with existing test code and its dependencies automatically installed, right out of the box. This is very promising; I can now build development environments or tools that run on any machine I own, or share them with teams. To use them, we just need to install Docker and download the shared image. No more setup problems! Of course, there’s still a lot to test – we’ll probably want to keep the image slim, automatically update test code from a remote repository, among other cool things. I’ll try those next.
Here’s what the Dockerfile looks like:
# Start from the official Ruby image (the tag here is just an example)
FROM ruby:3
RUN mkdir -p /usr/src/app
COPY . /usr/src/app/
# bundle install must run where the Gemfile lives
WORKDIR /usr/src/app
RUN gem install bundler
RUN bundle install
Short and easy to follow. Then we build the image by running the following command in the terminal (from the project root directory):
docker build -t [desired_image_name] .
To run and access the image as a container:
docker run -i -t [image_name]:[tag_name] /bin/bash
And from there we can run our cucumber tests inside the container the same way as we do on our local machine.
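For example, from the container’s shell (the feature file path here is purely illustrative):

```shell
# Run the whole suite, or point at a single feature file
bundle exec cucumber
bundle exec cucumber features/login.feature
```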