Dockerizing Our Legacy Apps: Some Notes

I spent the past few weeks of January tinkering with Docker on both Windows 7 and macOS. I played with it a lot because I thought it would be useful for a brand-new project we have at work, and I figured that integrating our legacy application code with it would help me learn the tool faster. The exercise did help me understand Docker better, including some nuances around application performance and database connections. I was able to dockerize our legacy apps too! 🙂

Some notes worth remembering from the exercise:

  • Docker Toolbox on Windows 7 and Docker for Mac both have a performance issue with volume mounts through docker-compose. Legacy apps composed of a large number of files (especially with dependency directories) will run, but they will be painfully slow out of the box. Fortunately for Docker for Mac users, docker-sync offers an effective workaround: it runs a separate sync container for the application code, configured outside the docker-compose file (see the docker-sync sketch after this list). Unfortunately, I have not found any workaround for this performance issue for Windows 7 (and perhaps Windows 8) Docker Toolbox users.
  • Often we have to update the hosts file of our machine so that we can run applications locally using a distinct, easy-to-remember URL in the browser. If we dockerize our apps, this means we also need to add those extra hosts to the relevant Docker containers, which we can do with the extra_hosts option in docker-compose (example after this list).
  • The official PHP Docker image does not include the pdo_pgsql and pgsql extensions, which handle the connection between the application and the Postgres database. To install those extensions on top of the official image, we need a custom Dockerfile and can build it from the docker-compose file with the build and context directives (sketch after this list).
  • Sometimes we need to copy the Postgres DB files out of a running container to set up a proper volume mount of the database data from host to container. We can copy that data with the convenient docker cp <source> <destination> command (shown after this list). This worked for me on Docker for Mac. On Windows 7 Docker Toolbox, however, a container is unable to use the copied data, perhaps because of OS and filesystem differences between host and container, so I had to resort to backing up and restoring data every time I stop and start my application containers.
  • As a tester, what Docker provides me is a convenient way to test all sorts of interesting application configurations on a single machine, as much as I want: I can see whether the apps break if I change some service config, and find out which configurations work and which don't. I can add or remove a service, or even update an existing service to a new version, like upgrading PHP from 5.6 to 7.1, and immediately see what impact that has on the apps themselves. These kinds of tests are often left to operations engineers, but I'm glad there is now a way to run them on my own machine, before application changes even reach a dedicated testing server.
  • Even if Docker makes it easy to set up an application development environment from scratch with docker-compose and Dockerfiles, it is still important to maintain a wiki of the machine configuration a programmer needs in order to reset or build the apps with only a command or two: subtle things like custom docker-compose files, .env and php.ini files, hosts files, Nginx configs, and shortcuts that wrap long docker commands in shell scripts or make targets.
  • Makefile targets can help wrap specific scripts in easy-to-remember commands with context (see the sketch after this list). I assume a Rakefile does the same thing in Ruby, or a Jakefile in JavaScript.
  • Dockerizing our legacy apps pushed me into a discussion with our programmers about how they run said applications on their machines. Most of them actually just test code changes directly on Staging or another available development server. That says a lot about one habit we have as a development team, and it's likely the reason why our apps are a pain to set up locally.
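
To make the docker-sync workaround above more concrete, here is a minimal sketch of what the setup can look like, based on docker-sync's documented configuration; the sync name, path, and strategy are placeholders and would need adjusting to the actual project:

```yaml
# docker-sync.yml (kept next to docker-compose.yml; names and paths are hypothetical)
version: "2"
syncs:
  legacy-app-sync:            # becomes an external volume that docker-compose can mount
    src: './legacy-app'       # host directory holding the application code
    sync_strategy: 'unison'   # strategy depends on the docker-sync version and host OS
```

The docker-compose file then declares legacy-app-sync as an external volume and mounts it into the application container, and docker-sync start (or docker-sync-stack start) runs the sync container alongside the usual docker-compose up.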
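
The extra_hosts entry mentioned above is a small addition to the compose file; the service name, hostname, and IP here are made-up examples:

```yaml
# docker-compose.yml excerpt (service name, hostname, and IP are placeholders)
services:
  web:
    image: nginx:latest
    extra_hosts:
      - "legacyapp.local:192.168.99.100"   # hostname the app expects, mapped to the host/VM IP
```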
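
For the missing Postgres extensions, a custom Dockerfile on top of the official PHP image might look like the following; the PHP version tag is only an example:

```dockerfile
# Dockerfile extending the official PHP-FPM image with the Postgres extensions
# (the version tag is an example; use whatever the legacy app actually runs on)
FROM php:5.6-fpm
RUN apt-get update \
    && apt-get install -y libpq-dev \
    && docker-php-ext-install pdo_pgsql pgsql
```

The docker-compose service then uses build and context to point at the directory holding this Dockerfile instead of pulling a stock image.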
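
The docker cp step is a one-liner; the container name and destination path below are placeholders:

```bash
# Copy the Postgres data directory out of a running container
# (container name and destination are made up; /var/lib/postgresql/data is the
#  default data directory in the official postgres image)
docker cp legacy_db_1:/var/lib/postgresql/data ./pgdata
```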
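
And the Makefile shortcuts could be as simple as this sketch; the target names and the commands they wrap are just examples:

```makefile
# Makefile with hypothetical shortcuts for longer docker commands
# (recipe lines must be indented with a tab)
up:       ## start the app containers in the background
	docker-compose up -d

down:     ## stop and remove the app containers
	docker-compose down

rebuild:  ## rebuild images after a Dockerfile change, then restart
	docker-compose build --no-cache
	docker-compose up -d
```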

An Experience Writing and Running Test Code on Cloud9’s Online IDE

What if for some reason we can’t completely run our test code on a local machine?

In the past few months, I've run into JSON gem installation problems on some of my colleagues' Windows 10 PCs, including one I couldn't resolve at all. Hence the question above. I had never had issues setting up our test code on machines from scratch before, but apparently things can break sometimes. Installations on Windows especially tend to be tricky, where some setting in an application, a firewall, or the registry prevents another piece of software from working well.

So, what to do if we can't completely install the required dependencies for our test code on a particular computer, even after days of re-installing and testing possible solutions found online? We could reinstall the operating system itself and start completely from scratch, which sounds logical, but that wipes out all the applications we've installed and use. Setting all of those back up would take time, though it might be good in the long run. Another option is to use a virtual machine and set up our working test code there, but VMs can hog memory easily, which a laptop with only 4GB of RAM does not handle very well. What about online IDEs? I knew they existed, but I'd never had a good reason to try one until now.

Cloud9 is at the top of my search results for an online IDE, and here’s what running test code on their platform looks like:

Here’s what it looks like running HTTP layer checks on Cloud9 IDE

Easy peasy. I created a Cloud9 account, then a workspace, bound that account's SSH key to our remote test code repository, retrieved the test code, installed dependencies via bundler, and the tests ran smoothly in the IDE's terminal; the rough sequence is sketched below. It feels pleasant enough, and I can see myself using this if I'm somewhere remote, need to make a quick code change, and don't have a work laptop with me. The IDE feels similar to Sublime Text after switching to a dark theme, and looks adequate.
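
For reference, the terminal side of that setup was roughly the following; the repository URL and the test command are placeholders for whatever the suite actually uses:

```bash
# Rough sequence inside the Cloud9 workspace terminal
# (the repo URL and the rspec command are placeholders)
git clone git@bitbucket.org:our-team/test-code.git
cd test-code
bundle install        # install the gems declared in the Gemfile
bundle exec rspec     # run the HTTP-layer checks
```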

Some caveats/notes:

  • a credit card is required for a Cloud9 account (no charges until you upgrade to a premium plan)
  • we cannot run browser tests in the workspace since no browsers are installed there, and they won't run even if we install one via the terminal
  • we could probably run browser tests on another machine in the cloud if we have access to one and can connect to it from our test code

In conclusion, online IDEs are not exactly a total replacement for actual machines where we often have all our desired tools, but they're good enough for automating checks through the HTTP layer, and they're very easy to set up and start with. I actually use Cloud9 for learning how to build Ruby on Rails web apps now, and I did not have to worry about installing anything to run my test project. 🙂