CI is awesome. CD makes me uncomfortable.

Compiling software is slightly more complicated on Windows than on Unix-based systems.

This is why pre-compiled versions of libsodium are available for download, in addition to the source code.

These packages include static and dynamic libraries for all Visual Studio versions from Visual Studio 2010 to Visual Studio 2019, both for x86 and x86_64, as well as MinGW32 and MinGW64.

New packages are built for each release, but also every time a new update to these compilers is available, and every time a code-impacting change is made to the stable branch.

I’ve been meticulously testing each of these versions on different hardware, and manually reviewing the compiled code for unwanted compiler optimizations.

This turned out to be time-consuming, but it has proven totally worth it: it spotted interesting bugs that only happen on some CPUs or with specific compiler versions and patches.

Unfortunately, people want more. Users have asked for Windows/ARM builds, 32-bit VS2010 builds (which the Express edition doesn’t support), as well as binaries for ridiculously obsolete Linux distributions that I don’t use, will never use, but that .NET still supports.

I can’t handle all of these extra environments. I don’t have the hardware, I don’t have the software, and I can’t afford to update all of these things after every single commit. This is not fun at all, and it reaches the limit of how much time I’m willing to spend working for free, for other people, on things I personally have zero interest in.

People repeatedly suggested I use hosted continuous delivery services to fulfill their needs instead of manually building releases.

So, this is happening. Visual Studio and MinGW builds are now automatically made by Azure Pipelines, pushed to the distribution server, and signed.

Builds for macOS and a bunch of Linux distributions will follow, so that the .NET Core packages can include all of them.

Azure Pipelines is great. Once you understand how it works and where to click to achieve what you want, it works really well, minus the occasional arbitrary network connection error.

And automatically getting up-to-date packages after a commit to selected branches is convenient. And yes, it’s a huge time saver over building these packages manually, especially since my boxes are not particularly fast.

But while continuous integration is awesome, and something I can’t live without, using the same tools for continuous delivery makes me uncomfortable. Let me explain why.

Packages that critical applications may depend on now depend on a third-party service.

Microsoft can backdoor the artifacts: they will be considered perfectly genuine, will be signed, and will be delivered and deployed.

Microsoft doing that is very unlikely.

However, the security of these packages now also depends on my Microsoft account not being accessible by a malicious party. Unfortunately, breaches happen all the time. Passwords are leaked. Accounts get hacked. And depending on a 3rd-party CD service adds new attack vectors to the delivery chain.

But the main reason I don’t feel comfortable with this is that these packages will not have received much testing and scrutiny before being delivered.

First, they will be built in an environment that is not the one the software was developed on.

I keep all of my development environments constantly up to date. CI/CD services, however, are constantly running behind. The package from the CD service can thus behave differently from the one I built locally, due to a missing compiler update on the CD side. For a concrete example that affected libsodium: some early versions of Visual Studio 2010 didn’t properly preserve registers around assembly code.
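To make the kind of code involved concrete, here is a hypothetical illustration, not libsodium’s actual code: 32-bit MSVC inline assembly (x64 MSVC doesn’t support __asm at all), where the surrounding C code is only correct if the compiler properly tracks which registers the __asm block touches.

```c
/* Hypothetical illustration only, not libsodium code: 32-bit MSVC __asm.
 * The compiler is supposed to notice which general-purpose registers the
 * block uses and keep the surrounding optimized code consistent with that;
 * a bug in that bookkeeping silently corrupts neighboring computations. */
#include <stdint.h>

uint32_t rotl32(uint32_t x, uint32_t b)
{
    uint32_t result;

    __asm {
        mov ecx, b        ; rotation count
        mov eax, x        ; value to rotate
        rol eax, cl
        mov result, eax
    }
    return result;
}
```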

Next, CI/CD services are never up to date, but many users are. Libsodium binaries for Visual Studio 2019 were available the day the final version of Visual Studio 2019 was released. Being dependent on a 3rd-party service makes this way more difficult, especially with software that requires user interaction in order to be installed.

Next, CI/CD services don’t provide much hardware diversity or choice. Jobs run on whatever hardware is currently assigned to your tasks, period. This is a problem for software that has different code paths depending on CPU features.
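As a minimal sketch of why this matters (this is not libsodium’s actual code; the feature check and function names are made up for illustration, and it assumes an x86/x86_64 target with a reasonably recent GCC/Clang or MSVC), runtime dispatch means a code path only runs, and therefore only gets tested, on hardware that actually reports the feature:

```c
/* A minimal sketch, not libsodium's actual code: runtime dispatch on a CPU
 * feature (AVX2). Assumes an x86/x86_64 target; __get_cpuid_count needs a
 * reasonably recent GCC or Clang, __cpuidex is the MSVC equivalent. */
#include <stddef.h>
#include <stdint.h>
#include <stdio.h>

#if defined(_MSC_VER)
# include <intrin.h>
static int cpu_has_avx2(void)
{
    int regs[4];
    __cpuidex(regs, 7, 0);
    return (regs[1] >> 5) & 1;      /* CPUID.(EAX=7,ECX=0):EBX bit 5 = AVX2 */
}
#else
# include <cpuid.h>
static int cpu_has_avx2(void)
{
    unsigned int eax, ebx, ecx, edx;
    if (__get_cpuid_count(7, 0, &eax, &ebx, &ecx, &edx) == 0) {
        return 0;
    }
    return (ebx >> 5) & 1;          /* CPUID.(EAX=7,ECX=0):EBX bit 5 = AVX2 */
}
#endif

/* Two hypothetical implementations of the same primitive. In real code the
 * AVX2 one would use intrinsics; here both are plain C to keep this short. */
static void stream_xor_ref(uint8_t *buf, size_t len)
{
    size_t i;
    for (i = 0; i < len; i++) {
        buf[i] ^= 0x5a;
    }
}

static void stream_xor_avx2(uint8_t *buf, size_t len)
{
    stream_xor_ref(buf, len);       /* stand-in for the vectorized version */
}

int main(void)
{
    uint8_t buf[32] = { 0 };
    void (*impl)(uint8_t *, size_t) =
        cpu_has_avx2() ? stream_xor_avx2 : stream_xor_ref;

    impl(buf, sizeof buf);
    printf("AVX2 code path selected: %s\n", cpu_has_avx2() ? "yes" : "no");
    return 0;
}
```

If none of the machines in the CI pool happen to support AVX2, every run can be green while the AVX2 path has never executed once.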

Non-interactive builds are not great for reviewing compiled code. In order to understand compiler behavior, I typically keep tweaking the source code, recompiling, and watching the output. This is simple, quick, and efficient. I can’t imagine doing the same thing if every single change required triggering and waiting for the CD pipeline.

Thanks to a 3rd-party continuous delivery service, binaries for environments I’ve never used are built, and are now immediately available for download.

The code compiles. The test suites are passing. On the CD service. Everything’s fine, then. Probably.