A vulnerability not mentioned in the article is the normalisation of executing code built specifically for a single user or device, with no validation of reproducibility and no way for anyone to verify that the custom build-and-download service hasn't been generating backdoored builds.
You want to be running the same build of xz-utils that Andres Freund is running, or at least a build that other security researchers can later obtain to check for supply-chain implants in open source software[1].
There's a write-up at Mozilla[2] from years ago describing an abandoned attempt to ensure their release builds are publicly logged in a Merkle tree. Google has written up their implementation for Pixel firmware builds, but apps delivered through the Google Play Store seem to be vulnerable (unless there is another log I have been unable to find)[3]. Apple seems worse than Google here: its firmware and app distribution systems target builds to individual devices with no build transparency at all.
For an example of binary transparency done well, Gentoo's ebuild repository (a single Git repository, and therefore a Merkle tree, containing source checksums) is possibly the largest and most widely distributed Merkle tree of open source software.
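The core idea can be sketched in a few lines: fold per-package source checksums into a single root hash, so that publishing (and pinning) the root commits everyone to the same view of the whole repository. This is a minimal sketch, not Gentoo's or any real log's actual format; the entries are made up.

```python
import hashlib

def h(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def merkle_root(leaves: list[bytes]) -> bytes:
    """Fold a list of leaf hashes into a single root hash."""
    level = [h(b"\x00" + leaf) for leaf in leaves]  # domain-separate leaves
    while len(level) > 1:
        if len(level) % 2:  # duplicate the last node on odd-sized levels
            level.append(level[-1])
        level = [h(b"\x01" + level[i] + level[i + 1])
                 for i in range(0, len(level), 2)]
    return level[0]

# Illustrative "repository" of package -> source-tarball checksum entries.
entries = [
    b"app-arch/xz-utils-5.6.1: sha256=...",
    b"sys-apps/coreutils-9.4: sha256=...",
]
root = merkle_root([h(e) for e in entries])
# Changing any entry changes the root, so anyone holding the published
# root can detect a tampered or targeted view of the repository.
```

Real transparency logs (e.g. Certificate Transparency, RFC 6962) add append-only consistency proofs on top of this, but the tamper-evidence property comes from the same construction.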
[1] After the xz-utils backdoor, some researchers (including some posting to oss-security about their efforts) undertook automated or semi-automated scans of open source software for unexplained high-entropy files that could contain hidden malicious code. This is impossible with customised per-user/per-device builds unless every single build is published for later analysis, with a public log (Merkle tree) accompanying those published builds.
[2] https://wiki.mozilla.org/Security/Binary_Transparency
[3] https://developers.google.com/android/binary_transparency/ov...
For Google Play: https://developer.android.com/guide/app-bundle/code-transpar...
As far as I know there's no centralised log, it's left up to app developers to publish their key/a log of transparency files.
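The entropy scan described in [1] can be sketched as follows. This is a toy illustration, not any researcher's actual tool: the 7.5 bits/byte threshold is arbitrary, and the file names are made up (the .xz test-file name echoes the real backdoor's carrier but the contents here are random).

```python
import math
import os
from collections import Counter

def shannon_entropy(data: bytes) -> float:
    """Shannon entropy in bits per byte; 8.0 looks indistinguishable from random."""
    if not data:
        return 0.0
    n = len(data)
    return -sum((c / n) * math.log2(c / n) for c in Counter(data).values())

def flag_high_entropy(files: dict[str, bytes], threshold: float = 7.5) -> list[str]:
    """Return paths whose contents look unexplainably random.
    A real scan would first exclude known-compressed formats
    (archives, images, signatures) to cut down false positives."""
    return [path for path, data in files.items()
            if shannon_entropy(data) > threshold]

# Toy tree: readable source code vs. an opaque random blob.
tree = {
    "src/lzma/encoder.c": b"int lzma_encode(void) { return 0; }\n" * 50,
    "tests/files/bad-3-corrupt_lzma2.xz": os.urandom(4096),
}
print(flag_high_entropy(tree))  # only the opaque test file is flagged
```

The xz backdoor hid its payload in exactly this kind of place: an opaque binary "test" file that no reviewer could read.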
Using a build service like that is a priori saying "I'm not valuable enough for a targeted attack".
> automated/semi-automated scans of open source software builds to check for unexplained high entropy files which could contain hidden malicious code
That's easily defeated, though: you just "spread out" the entropy.
Yeah I remember Google's certificate transparency team basically designing firmware transparency for all of Linux (not just Android) as well.
This is a nice idea, and one I also advocate for. However, it's important to keep in mind that reproducibility relies on determinism, and much of what goes into a build pipeline is inherently nondeterministic: decisions made at compile time can differ from run to run, even setting flags aside. In fact, that's the point of an optimizing compiler, and as many reproducible-build projects have discovered, turning on optimizations pretty much guarantees the build won't reproduce.
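One classic source of build nondeterminism, separate from compiler behaviour, is embedded timestamps; gzip, for instance, stores an mtime in its header. A small sketch of the problem and the usual fix (pinning the timestamp, as SOURCE_DATE_EPOCH conventions do); the "build" here is just compressing a source string:

```python
import gzip
import hashlib
import io
import time

def build_artifact(source: bytes, mtime) -> bytes:
    """Compress `source`; gzip embeds the given mtime in its header."""
    buf = io.BytesIO()
    with gzip.GzipFile(fileobj=buf, mode="wb", mtime=mtime) as f:
        f.write(source)
    return buf.getvalue()

src = b"int main(void) { return 0; }\n"

# A naive build stamps the current time, so byte-identical inputs
# yield different outputs on different runs:
a = build_artifact(src, mtime=time.time())
time.sleep(1.1)  # gzip's mtime field has 1-second resolution
b = build_artifact(src, mtime=time.time())
assert hashlib.sha256(a).digest() != hashlib.sha256(b).digest()

# Pinning the timestamp restores determinism:
c = build_artifact(src, mtime=0)
d = build_artifact(src, mtime=0)
assert hashlib.sha256(c).digest() == hashlib.sha256(d).digest()
```

Timestamps, file ordering, and build paths are largely solvable this way; it's the compile-time decisions you mention that are the harder residue.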