Docker C++ development and CI
I want to use Docker to containerize my C++ projects. Common advice on the Internet is that it is better to use two containers: one for development and one for production. But I do not understand how to set up the development container to support Continuous Integration.
The questions are the following:
- For CI, should the project be compiled during `docker build` or during `docker run`?
- If during `docker build`, how do I get the compiled project out of the development container and into the production container? At the `docker build` stage you cannot mount volumes.
- Are there any recipes for using Docker in C++ development?
Maybe I'm not heading in the right direction at all, and the development container should, for example, include Eclipse? But what about CI then? All the tutorials I find usually refer to PHP, which has no compile stage.
Solution 1:
There are different patterns for using Docker with C++.
Compile Projects with Ready Docker Images
This pattern uses Docker images that contain your desired compiler and third-party libraries (created with `docker build`), and compiles your project with the `docker run` command. This is typically the most common way to deal with CI environments (see the sketch after the pros and cons below).
Pros:
- You can mount your project folder into the container and let it do the work. At the end, you have your binaries on your host system.
- The image used for testing can be the same as for production (depending on the project of course)
- Docker images are smaller (as long as you keep your code in a mounted volume)
- Easy to update base images and packages (e.g. from Ubuntu 16.04 to 16.10)
Cons:
- Every developer / QA engineer must configure the launch of the Docker container themselves (docker-compose to the rescue)
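Here is a minimal sketch of this pattern, assuming a CMake-based project and a hypothetical toolchain image called `cpp-toolchain` built from a `Dockerfile.toolchain` that installs the compiler, CMake and your third-party libraries (both names are placeholders, not something prescribed by Docker):

```sh
# Build the reusable toolchain image once; it contains only the compiler,
# CMake and third-party libraries, never your source code.
docker build -t cpp-toolchain -f Dockerfile.toolchain .

# Compile by mounting the project into the container; the resulting
# binaries land in ./build on the host.
docker run --rm -v "$(pwd)":/src -w /src cpp-toolchain \
  sh -c "cmake -S . -B build && cmake --build build -j"
```

The same `docker run` line can go verbatim into a CI job, since the CI server only needs the toolchain image and the checked-out sources.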
Build Docker Images Project-Wide
In this case you encapsulate all the requirements and the code in one Docker image, which means you have to rebuild it every time your code changes, remove your previously running containers, pull the new image on your CI server, and so on (see the Dockerfile sketch below).
Pros:
- Very specialized images - one docker image -> one snapshot of your code
- Faster to share with QA
Cons:
- Huge Docker Images
- A `docker build` must be triggered every single time your code changes, which reruns the compilation, etc. -> slow
- More complicated to update / too many layers
Remember that Docker has a maximum number of allowed layers per image (as far as I know it is still 127); also, the more layers you have, the bigger the image will be.
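For completeness, here is a sketch of a project-wide image under this second pattern; the base image `gcc:12`, the single-file build and the binary name `myapp` are illustrative assumptions, not part of the original answer:

```dockerfile
# Everything in one image: toolchain, source code and build output.
FROM gcc:12
COPY . /src
WORKDIR /src
# Single-file build for illustration; a real project would invoke its build system.
RUN g++ -O2 -o myapp main.cpp
CMD ["./myapp"]
```

Every change to the sources invalidates the COPY layer and everything after it, forcing a full rebuild; that is exactly the slowness mentioned in the cons above.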
Consider Multi-Stage Builds
Multi-stage builds are a more recent addition to Docker. They can speed up the build process by creating a cached intermediate image which holds all the necessary tools, while only the artifacts you actually need are copied into the final image.
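This also answers the question of how to get compiled binaries from a development container into a production container without volumes: `COPY --from` copies artifacts between stages inside a single `docker build`. A minimal sketch, where the base images, paths and the binary name `myapp` are assumptions for the example:

```dockerfile
# Stage 1: build with the full toolchain.
FROM gcc:12 AS builder
COPY . /src
WORKDIR /src
RUN g++ -O2 -static -o myapp main.cpp

# Stage 2: minimal runtime image; only the binary is copied over.
FROM alpine:3.19
COPY --from=builder /src/myapp /usr/local/bin/myapp
ENTRYPOINT ["/usr/local/bin/myapp"]
```

Only the final stage ends up in the tagged image, so the production image stays small even though the build stage contains the whole toolchain.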
Conclusion
I would definitely go for the first solution, because it gives you more flexibility and you can keep your changes across different environments, as long as you use Docker volumes; this is especially true if you use a package manager such as Conan or vcpkg.
For an example of a Docker/C++ image following solution #1, please see one of my Docker images.
For releases, you can build your Docker image using COPY or ADD in an extremely simple Dockerfile with only the bare minimum dependencies required by your project (the best results come from statically compiling your code), as sketched below.
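A hedged sketch of such a release Dockerfile, assuming a statically linked binary at the placeholder path `build/myapp` was already produced by the development container or the CI job:

```dockerfile
# Minimal release image: no toolchain, no distribution, just the binary.
FROM scratch
COPY build/myapp /myapp
ENTRYPOINT ["/myapp"]
```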
Sources
This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.
Source: Stack Overflow