Over the past two decades, the global software supply chain has evolved into a complex combination of code, binaries, networked APIs, and their configuration files. Although this combination has become incredibly useful to consumers and corporations by increasing reuse and development velocity across the software industry, recent attacks such as SolarWinds and Codecov have clearly demonstrated that this complexity has also created countless openings for abuse and theft.
Even the Biden administration has taken notice, issuing an Executive Order (EO) that focuses on securing the software supply chain. Closing these gaps and establishing durable confidence in a digital world requires an enormous amount of investment and determination.
Consider an analogy: In hiring contractors for a major remodeling project, a homeowner finds a contractor somewhere on the Internet and has a few short discussions with them to nail down the project’s scope and pricing.
To get the project started, the contractor asks for a copy of the house keys and says they plan to use subcontractors for some of the jobs and that those subcontractors have subcontractors who will need access to the key copies as well to complete the work in a timely manner.
Does that seem reasonable? Of course not. Yet that's how software supply chain risk assessment happens (or rather, does not happen) in many of today's software ecosystems. Installing software is not like installing a kitchen appliance. Running software with little regard for its trustworthiness on a system containing sensitive data is more like handing copies of the house keys to any number of people the homeowner doesn't know.
The complexities are vast
To further the adoption of supply chain integrity best practices, Google and the Open Source Security Foundation (OpenSSF) proposed Supply-chain Levels for Software Artifacts (SLSA) to formalize criteria around software supply chain integrity. These are based on strong, proven mechanisms developed for Google's internal development and deployment workflows. SLSA’s key elements for protecting against common attack points include:
- Strong controls against unilateral access for all components of the chain; two-person review is required.
- Hermetic builds: All inputs (including transitive dependencies) are declared at the start.
- Deployment artifacts carry tamper-evident digitally signed metadata.
- Deployment is gated by a policy engine that enforces requirements for the target environment.
- Crucially, the components of this source-build-deploy workflow, and the platform it runs on top of, are themselves built and deployed using this secure workflow.
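To make the "tamper-evident digitally signed metadata" step concrete, here is a minimal sketch. It uses a toy symmetric HMAC scheme purely for illustration; the key, builder ID, and function names are all hypothetical, and a real deployment would use asymmetric signatures so that consumers never hold signing material.

```python
import hashlib
import hmac
import json

# Hypothetical signing key held by the build service; illustration only.
SIGNING_KEY = b"build-service-secret"

def attach_provenance(artifact: bytes, builder_id: str) -> dict:
    """Produce tamper-evident metadata describing who built which bytes."""
    metadata = {
        "builder": builder_id,
        "artifact_sha256": hashlib.sha256(artifact).hexdigest(),
    }
    payload = json.dumps(metadata, sort_keys=True).encode()
    metadata["signature"] = hmac.new(SIGNING_KEY, payload, "sha256").hexdigest()
    return metadata

def verify_provenance(artifact: bytes, metadata: dict) -> bool:
    """Reject the artifact if either it or its metadata was altered."""
    claimed = dict(metadata)
    signature = claimed.pop("signature")
    payload = json.dumps(claimed, sort_keys=True).encode()
    expected = hmac.new(SIGNING_KEY, payload, "sha256").hexdigest()
    return (hmac.compare_digest(signature, expected)
            and claimed["artifact_sha256"] == hashlib.sha256(artifact).hexdigest())

artifact = b"compiled binary bytes"
meta = attach_provenance(artifact, "builder://ci.example/v1")
assert verify_provenance(artifact, meta)            # untouched: accepted
assert not verify_provenance(b"tampered bytes", meta)  # swapped bytes: rejected
```

A policy engine at deployment time would run a check like `verify_provenance` before admitting the artifact into the target environment.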
Today's software, especially in many open-source ecosystems, relies on a large number of software packages and libraries published by separate vendors or open-source projects. If 50 to 100 subcontractors on a remodeling project seems outlandish, consider that the Kubernetes project, which provides the platform for many modern cloud environments, has several hundred software library dependencies. To put this into perspective, take a look at the dependency graph (think of it as a relationship graph) for the Kubernetes project.
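The scale comes from transitivity: installing one package pulls in its dependencies' dependencies. A minimal sketch of that fan-out, using made-up package names rather than any real ecosystem:

```python
from collections import deque

# Toy dependency graph with hypothetical package names; real graphs
# like Kubernetes' contain hundreds of nodes.
DEPS = {
    "app":           ["web-framework", "json-lib"],
    "web-framework": ["http-client", "json-lib"],
    "http-client":   ["tls-lib"],
    "json-lib":      [],
    "tls-lib":       [],
}

def transitive_deps(package: str) -> set:
    """Collect every package reachable from `package`, direct or not."""
    seen, queue = set(), deque(DEPS.get(package, []))
    while queue:
        dep = queue.popleft()
        if dep not in seen:
            seen.add(dep)
            queue.extend(DEPS.get(dep, []))
    return seen

print(sorted(transitive_deps("app")))
# → ['http-client', 'json-lib', 'tls-lib', 'web-framework']
```

Declaring "app" asks for two packages, but four separate projects (four "subcontractors") end up with the keys.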
It's reasonable to assume that vendors and open-source projects are generally honest and well-meaning; however, supply chains are only as strong as their weakest link, and even one dependency with one malicious developer could lead to a compromise.
An attacker might try to slip an innocent-looking change into a project's source code, for example to create a secret backdoor (a bypass of the customary security controls) to use after the software is deployed. Considering that even a single critical library can consist of half a million lines of code or more, it's difficult to stay confident that such attacks will always be noticed.
Finally, it's difficult to verify the trustworthiness and integrity of software on an ongoing basis. Software files generally come without a detailed account of all the suppliers who contributed to them, or any evidence that none of the components were tampered with by an attacker.
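A sketch of what such an account could look like in miniature: a hypothetical manifest pairing each component with the digest recorded when it was published, audited against the bytes actually present. The component names and contents here are invented for illustration.

```python
import hashlib

def digest(data: bytes) -> str:
    return hashlib.sha256(data).hexdigest()

# Hypothetical minimal bill of materials: component name -> digest
# recorded at publication time.
components = {
    "parser-lib": b"original parser code",
    "net-lib":    b"original networking code",
}
manifest = {name: digest(data) for name, data in components.items()}

def audit(current: dict, manifest: dict) -> list:
    """Names of components whose bytes no longer match their recorded digest."""
    return [name for name, data in current.items()
            if digest(data) != manifest.get(name)]

assert audit(components, manifest) == []              # pristine install
components["net-lib"] = b"backdoored networking code"  # simulated tampering
assert audit(components, manifest) == ["net-lib"]      # tampering surfaces
```

The hard part in practice is not the hash comparison but producing and distributing trustworthy manifests for every supplier in the chain, which is exactly the gap the article describes.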
Laying the foundation
Open source software is pervasive today, making up the majority of the software being built and distributed. Security risks span essentially all software companies and open-source projects; only an industry-wide commitment involving a global community of developers can make real progress.
First, we need a common set of criteria for establishing whether components of the supply chain adhere to security best practices. Google co-created the Open Source Security Foundation (OpenSSF), a cross-industry forum under the Linux Foundation working to improve open source software security, and is collaborating with it on a number of initiatives to do just that.
The maintainers and security engineers across the open source ecosystem need our support. That's why Google has invested over $4 million in the last 12 months to underwrite salaries for critical infrastructure work in Linux, Node.js, Rust, and Python. We encourage our industry peers to consider making this a key part of both their software security and open source strategies.
We’re hopeful about the progress being made, but we still have only just started laying the foundation.
Eric Brewer, vice president and fellow, Google Cloud; Royal Hansen, vice president, security, Google