Google proposed base security standards for critical open-source packages last week, recognizing that open-source code makes up a substantial share of modern software. But it’s still unclear how, exactly, to define a critical package.

Open-source code is ubiquitous for a variety of reasons. By not reinventing the wheel, developers save time and money throughout development and testing. And the security of open-source packages can rival that of commercial ones.

“We see open source in around 90% of programs we scan,” said Chris Wysopal, founder and chief technology officer of automated application security tester Veracode. “The only reason a program doesn’t have open-source code is if someone deliberately decides not to use it.”

The problem, noted Google in a broader blog post about mitigating third-party risk from open-source software, is that in a post-SolarWinds era, less structured projects are extremely vulnerable to malicious actors and human error.

Indeed, open-source software has been a vector for attacks in the past. GitHub said in December that one in 500 alerts to developers about vulnerable dependencies in their code stems from malicious code submissions.

There have also been high-profile supply chain hacks of open-source projects, including event-stream in 2018 and bootstrap-sass in 2019. To be clear, there have also been high-profile supply chain hacks of commercial software.

But some of the organizational protections available in the commercial space are not used in the open-source space.

“What stands out to me [about the Google standards suggestions] is that these are what commercial software does,” said Wysopal.

The Google blog advocates for various structural standards. The first is ensuring that multiple independent parties review all code before it’s committed. The classic joke in open source, famously illustrated by XKCD, is that mighty software stacks can lean heavily on a project maintained by a single developer. Without independent code review, a lone developer’s mistakes can slip through unnoticed.

Google also suggests the owners and maintainers of projects should not be anonymous, and if contributors are anonymous, their contributions should be met with additional scrutiny.

Just knowing names does not guarantee identity; Google suggests all projects use modern identity verification techniques. The software giant wants better validation that compiled binaries are what they say they are, and it wants notifications of key indicators of risk – like when project ownership changes hands.

What remains unclear from Google’s suggestions is which projects qualify as critical and would therefore require the changes. Kubernetes, notes the blog, depends on a thousand packages, each with its own dependencies. It is easy to say that Kubernetes is critical, but wading through that chain of dependencies to determine what else might qualify is no small feat.
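Wading through that chain amounts to computing the transitive closure of a dependency graph: everything a critical package pulls in, directly or indirectly, arguably inherits its criticality. A minimal sketch, assuming a toy dictionary graph standing in for real package metadata (the package names below are illustrative, not an actual Kubernetes dependency list):

```python
from collections import deque

def transitive_dependencies(graph, root):
    """Breadth-first walk of a dependency graph, returning every
    package reachable from `root` (its full transitive closure)."""
    seen = set()
    queue = deque([root])
    while queue:
        pkg = queue.popleft()
        for dep in graph.get(pkg, []):
            if dep not in seen:
                seen.add(dep)
                queue.append(dep)
    return seen

# Hypothetical graph: a "critical" root pulls in dependencies of dependencies.
graph = {
    "kubernetes": ["etcd", "grpc"],
    "grpc": ["protobuf", "zlib"],
    "etcd": ["grpc"],
}
print(sorted(transitive_dependencies(graph, "kubernetes")))
# → ['etcd', 'grpc', 'protobuf', 'zlib']
```

Even in this four-node toy, a second-level dependency like protobuf surfaces only by walking the graph; at the scale of a thousand direct dependencies, that closure balloons quickly, which is exactly the difficulty the blog points to.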

Google writes that its ideas aren’t the only possible framework. “We presented one way to frame this discussion,” the blog said, “and defined a set of goals that we hope will accelerate industrywide discourse and the ultimate solutions.”

Wysopal agreed that standards still need to be ironed out. But the first step, he said, is a critical one.

“Google is taking a leadership role, and leadership is important,” he said.