# My Development Environment (follow-up)

Continuous Testing, Integration, Deployment and Docs as Code

This is a follow-up to my previous post on my development environment.

I spent some time extending my template project in two different directions. First, I configured a continuous testing/integration/deployment pipeline; second, I added a setup to work with a docs-as-code style project.

## A CI/CD Pipeline

There’s nothing particularly fancy about the pipeline I set up; it’s mostly what you get in GitLab by default. There are three stages: build, test, and deploy:

• The build stage compiles the project and runs a few basic checks.
• The test stage runs the unit tests and evaluates the code coverage.
• Finally, the deploy stage generates the documentation with Doxygen and PlantUML and publishes the resulting HTML pages.
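A minimal `.gitlab-ci.yml` sketch of this three-stage layout could look as follows; the job names, registry paths and commands are illustrative placeholders, not my actual configuration (the `ctest --output-junit` flag needs a recent CMake, and `gcovr --cobertura` a recent gcovr):

```yaml
stages:
  - build
  - test
  - deploy

build-job:
  stage: build
  image: registry.gitlab.com/mygroup/myproject/builder:latest  # hypothetical builder image
  script:
    - cmake -S . -B build
    - cmake --build build

test-job:
  stage: test
  image: registry.gitlab.com/mygroup/myproject/builder:latest
  script:
    - ctest --test-dir build --output-junit junit.xml
    - gcovr --cobertura coverage.xml

pages:
  stage: deploy
  image: registry.gitlab.com/mygroup/myproject/docs:latest  # Doxygen + PlantUML only
  script:
    - doxygen Doxyfile
    - mv html public
  artifacts:
    paths:
      - public
```

Naming the deploy job `pages` is the GitLab convention for publishing the `public` directory via GitLab Pages.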

The pipeline relies on two containers stored in the GitLab container registry. One provides the compilers and other tools needed to build the code; the other contains only Doxygen and PlantUML to generate the documentation.
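The documentation image can stay quite small. A sketch of its Dockerfile might look like this; the package selection is an assumption on my part, and the real image may differ:

```dockerfile
# Small documentation-only image: Doxygen plus PlantUML
# (Graphviz is needed for Doxygen's generated diagrams).
FROM ubuntu:21.10

RUN apt-get update && apt-get upgrade -y && \
    apt-get install -y --no-install-recommends \
        doxygen graphviz plantuml && \
    rm -rf /var/lib/apt/lists/*
```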

The test stage generates a JUnit XML file as well as a Cobertura XML file to publish the unit test results and the code coverage. Both integrate well with GitLab CI/CD artefact reports; see the GitLab documentation on JUnit and Cobertura reports for details.
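Publishing both files is then just a matter of declaring them as artefact reports in the test job, roughly like this (note that newer GitLab versions replace `reports:cobertura` with `reports:coverage_report`):

```yaml
unit-tests:
  stage: test
  script:
    - ctest --test-dir build --output-junit junit.xml
    - gcovr --cobertura coverage.xml
  artifacts:
    reports:
      junit: junit.xml
      cobertura: coverage.xml
```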

The containers are rebuilt periodically, and the build process includes an `apt update` as well as an `apt upgrade` step. So the builds are not really reproducible, even though I pinned the underlying base image to Ubuntu 21.10. Reproducible builds are an interesting concept, but they are out of scope for this project. There is also no mechanism that probes whether upgrading any of my dependencies would break anything. For the moment I don’t really see a need to check this.
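In GitLab, such periodic rebuilds can be driven by a pipeline schedule, with the image jobs restricted to scheduled runs. A sketch, assuming a Docker-capable runner and a hypothetical `docker/builder` directory holding the Dockerfile:

```yaml
rebuild-builder-image:
  stage: build
  rules:
    - if: '$CI_PIPELINE_SOURCE == "schedule"'  # only run in scheduled pipelines
  script:
    - docker build -t "$CI_REGISTRY_IMAGE/builder:latest" docker/builder
    - docker push "$CI_REGISTRY_IMAGE/builder:latest"
```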

## Docs as Code

At my current job we develop software, at least to some extent, according to a V-model. This usually involves numerous requirements, all sorts of diagrams to document the design, and usually some bookkeeping on the testing status. We use tools such as Doors, Enterprise Architect and a handful of other proprietary products to manage all this data. I don’t want to argue about whether these tools are good or not, but the process is sometimes not as smooth as one would expect. We are, for example, required to link requirements to designs and to test results. This can be tricky: some of the aforementioned tools rely on databases where you can’t simply add a field for a URL, or they use proprietary data formats that no other tool can read or write. We’ve resorted to quite a few weird hacks in the past to establish the necessary links.

Some of my colleagues have tried to solve this issue by proposing a docs-as-code environment. The idea is to keep all the requirements, designs and documentation in simple files that can be version controlled and stored together with the source code. They use Sphinx to create readable documents and to do the linking. I actually prefer Doxygen and wondered whether I could achieve something similar by creating all the documentation and design with Doxygen and PlantUML.

My solution looks as follows:

• Requirements are collected in Markdown files and can easily be stored in your source code repository. Doxygen supports Markdown, so the requirements automatically become part of the final documentation. Using dox files (i.e. Doxygen’s native format) would have been an alternative, but I didn’t like the fact that I would have had to create files that are basically large C++ comment blocks. Also, there are many more tools available to help you create links and tables of contents for Markdown than for dox files.
• The software designs are created with PlantUML. PlantUML works perfectly well together with Doxygen. I had a look at Mermaid as well, but I’m not sure if it can be integrated in Doxygen as easily as PlantUML.
• The linking between requirements, design and code works semi-automatically. Doxygen creates many links out of the box, but links between requirements themselves, for example, have to be added manually.
• Code coverage and unit tests are evaluated in my CI pipeline. I use gcovr for code coverage, which can generate HTML output that I could probably integrate into the Doxygen documentation. The unit tests rely on CTest and the GoogleTest framework. Here it’s probably possible to create some kind of HTML report that can be integrated into the Doxygen output as well. But as already mentioned, for the moment I prefer hooking into the GitLab CI/CD artefact reports.
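As an illustration of the requirements part, a Markdown file can give each requirement a Doxygen header ID, so that other requirements, design pages and code comments can reference it with `\ref`. The IDs, wording and test name below are invented:

```markdown
# Requirements {#requirements}

## REQ-001: Persistent configuration {#req-001}

The application shall store its configuration in a file
that survives a restart.

Derived from: \ref req-000
Verified by: the unit test `ConfigTest.SurvivesRestart`
```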
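And for the design part, PlantUML diagrams can be embedded directly in Doxygen comments between `@startuml` and `@enduml`. The class and diagram here are made up for illustration:

```cpp
/// \brief Stores and restores the application configuration.
///
/// @startuml
/// class ConfigStore {
///   +load() : Config
///   +save(cfg : Config)
/// }
/// ConfigStore ..> Config
/// @enduml
class ConfigStore;
```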