Last week, Approov released a report based on data from cybersecurity researcher Alissa Knight that revealed serious vulnerabilities in the Fast Healthcare Interoperability Resources (FHIR) API ecosystem. The findings sparked an immediate firestorm on public forums, with critics decrying the research as painting FHIR in a negative light.
But Knight and others stress one important point: the research did not question the FHIR standard itself; the vulnerabilities were introduced by the data aggregators and app developers.
Knight spent a year reverse engineering mHealth and FHIR apps, reviewing code, intercepting network traffic between the apps and backend APIs, and running the same tests against the APIs with web app clients. Initially, the research focused on the electronic health record platforms, where Knight expected to find vulnerabilities.
But she discovered the vulnerabilities existed within the new ecosystem of FHIR apps that plug in and run on top of the EHR. The security flaws discovered by Knight meant that threat actors would no longer need to breach a hospital network to gain access; they'd simply need to target the aggregators.
FHIR is necessary for interoperability. But the research confirms the standard is simply not being implemented securely: those tasked with implementations aren't following best practices or recommended security controls, some "as simple as applying scopes to tokens to ensure the authenticated user can only request" their own patient records, according to the report.
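The scope check the report describes can be illustrated with a minimal sketch. This is hypothetical code, not taken from the report or any real implementation; the function and claim names are illustrative, loosely modeled on SMART on FHIR patient-level scopes.

```python
# Minimal sketch (hypothetical): enforcing that an access token scoped to
# one patient can only be used to request that patient's own records.

def can_access_record(token_claims: dict, requested_patient_id: str) -> bool:
    """Allow access only if the token carries a patient-level read scope
    AND its patient context matches the record being requested."""
    scopes = token_claims.get("scope", "").split()
    # A patient-level read scope must be present...
    if "patient/*.read" not in scopes and "patient/Patient.read" not in scopes:
        return False
    # ...and the token's patient context must match the requested record.
    return token_claims.get("patient") == requested_patient_id

# A token issued for patient "123" should not be able to read patient "456".
token = {"scope": "patient/*.read", "patient": "123"}
print(can_access_record(token, "123"))  # True
print(can_access_record(token, "456"))  # False
```

Skipping the second check is exactly the class of flaw the report describes: an authenticated user whose token is honored for any patient's records, not just their own.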
FHIR creator Grahame Grieve himself expressed gratitude for Knight's research, as it shined a "spotlight on the industry's security practices." The hope is that the next time the ecosystem is examined, it will be much harder to find security issues in FHIR implementations.
The research was sparked by Grieve, who invited Knight to hack the ecosystem and find vulnerabilities as a way to remediate some of the biggest risks before the interoperability push continues with inherent flaws in the ecosystem that could put the data at risk.
But instead of seeing the research as a way to accomplish those data sharing goals with security baked into the process, FHIR advocates took Knight to task for “misleading” the public on the standard.
Many of the negative responses to the report stemmed from concerns that Knight and the reporting on the research were denigrating the FHIR standard, and that healthcare providers would use the findings as a reason not to comply with the interoperability and information blocking requirements.
However, the Department of Health and Human Services and its Office of the National Coordinator for Health Information Technology are continuing to roll out these rules, and the time for comments has closed. The Centers for Medicare and Medicaid Services began the enforcement period for the Interoperability and Patient Access final rule on July 1, which aims to further data sharing across the healthcare sector and to support a patient's right to access their health information.
As such, the report is designed to become a mechanism for communicating possible risks with these implementations and what needs to be done as interoperability moves forward.
Knight has since released a follow-up report to clarify the biggest misconceptions being shared online and held a two-hour discussion with FHIR developers to clear the air. SC Media spoke with several privacy and security leaders to get to the bottom of the issues and help the healthcare community understand the vulnerabilities and what needs to be done.
Setting the record straight
It's important to note that FHIR is a blueprint or framework whose implementation is up to the implementer. And the discovered vulnerabilities are "limited to the implementations by the aggregators and app developers tested and further reduced to just the time [available] for testing," Knight reported.
“FHIR is a critical and important step in the interoperability of EHR systems whose coming is well overdue,” she explained. “My work in this area is not to disparage the hard work of its creators, but of what can go wrong when it isn’t implemented properly — a shift left and shield right approach to cybersecurity.”
Again, FHIR is not a security protocol, nor does it define any functionality related to security, explained Dirk Schrader, global vice president of Security Research, NNT, part of Netwrix. Instead, it’s up to the API developers leveraging FHIR to create any needed security functions.
To Schrader, the report demonstrates that those implementing FHIR are not including some of “the most basic security aspects.” But as noted, these issues are not new and they’re not limited to FHIR implementations.
Consider previous reports, lawsuits, and even regulatory settlements with mHealth developers over security lapses and routine sharing of user data with third parties without transparency. Schrader explained that the report’s findings are consistent with “one recurring issue with digitalized healthcare devices.”
“Once it does work in its medical function, any additional considerations about security effects are neglected, forgotten or ignored,” said Schrader. “That leads to all sorts of vulnerabilities and security gaps, affecting the infrastructure, the identities – both system and data – as well as the data, the PHI handled by these apps and medical devices.”
These vulnerabilities are found in a number of healthcare ecosystems and medical processes that generate large amounts of patient health data, including radiology, patient monitoring, medication management, diagnostics, and electronic medical records, he added.
All healthcare ecosystems, including APIs leveraging FHIR, “must include a security-minded way of enabling the flow of information between these processes to make sure that any further attacks can be shut down and prohibited,” Schrader concluded.
Understanding HIPAA and its limitations
Industry stakeholders, including former ONC Chief Privacy Officer Lucia Savage, now chief privacy and regulatory officer for Omada Health, have long stressed that the onus for health app privacy does not fall on ONC. It's Congress that will need to protect consumer data privacy.
However, the drive for a consumer data privacy law has quieted amid congressional changes and efforts to curtail the pandemic.
Savage told SC Media that Knight's work, and other white hat hacking efforts in healthcare, should be seen as a positive step. While the report focuses on the technical missteps of aggregators, it does not detail whether those aggregating data fall under the Health Insurance Portability and Accountability Act (HIPAA).
If those data aggregators are considered HIPAA business associates (i.e. those providing services to the health system and health plan customers), Savage stressed that they are required to apply all existing security standards outlined in the security rule, including segmentation, minimum necessary use, encryption, and other measures.
These measures are well established and should come as no surprise to data aggregators. As such, “aggregators who are business associates should know and do better,” said Savage. “If the aggregators are violating HIPAA, they should be investigated and fined by the Office for Civil Rights and should be punished in the market for their services.”
“Likewise, health systems and health plans planning on paying for aggregator services, such as healthcare operations analytics should be doing thorough due diligence before they buy those services,” she added.
However, it's possible the aggregator organizations are serving the consumer directly, meaning the data is being collected and services are being performed for the consumer. In that case, Savage explained, the aggregator falls outside of HIPAA regulations.
As a result, Savage took issue with some of the report's recommendations: HIPAA-covered entities required to provide patients with their own data should not be expected to tighten the process or "put more controls that make it harder for consumers to get their own data."
What’s truly needed is a nationwide consumer privacy law that will force aggregators to level-up their security standards for services provided to consumers.
“Rather than make it harder for consumers to implement their rights, level the privacy and security playing field for consumers wherever their data is, so they can stop worrying about it as much,” she explained.
“The best security engineers know exactly what best practices are, and nothing prevents even consumer-facing orgs from meeting the HIPAA minimums in their security engineering, or exceeding it, without a law; but sometimes, it does take a law to make businesses do the right thing, and a law is certainly something that consumers can rely on,” Savage concluded.
Listen up, data aggregators
As the interoperability plans move forward, the healthcare sector must continue to improve how data is protected and shared throughout the sector, explained Mohammad Jouni, chief technology officer of Wellframe.
There's no direct evidence these vulnerabilities are specific to FHIR or its implementations. To Jouni, it's clear the "same vulnerabilities would have likely existed if the implementers who were assessed used custom API contracts instead of the FHIR specifications."
Companies are being encouraged to use FHIR, which provides these implementers with access to “a number of libraries, tools and vendors that can help strengthen the security of the implementation, or leverage managed services from cloud providers that makes it easier and cost effective to keep systems patched,” said Jouni.
The report should be viewed as an encouraging next step for the security community, to assess the strength of the FHIR communication and to support healthcare in improving secure data sharing. There’s a need for API and app developers to move swiftly and remediate the risks exposed in Knight’s research. Much like when medical device security vulnerabilities are disclosed, the aim is not to shame the vendor: it’s to protect the safety and security of the healthcare system and the patients it serves. In short, the sector is only as secure as the weakest link.
And the research would indicate that the API ecosystem has work to do. As noted by Jouni, there can be no excuse for failing to ensure the APIs are secured with proper authentication and authorization frameworks followed.
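Jouni's point about authentication and authorization frameworks can be sketched as two distinct gates, layered in order. This is a hypothetical illustration, not any vendor's actual request pipeline; all names and token fields are invented for the example.

```python
# Hypothetical sketch: authenticate first (is the token valid at all?),
# then authorize (is this caller allowed to see this particular record?).
import time

def authenticate(token: dict) -> bool:
    """Reject unsigned or expired tokens before any data is served."""
    return token.get("signature_valid", False) and token.get("exp", 0) > time.time()

def authorize(token: dict, patient_id: str) -> bool:
    """Only the patient named in the token may read that patient's record."""
    return token.get("patient") == patient_id

def handle_request(token: dict, patient_id: str) -> str:
    if not authenticate(token):
        return "401 Unauthorized"
    if not authorize(token, patient_id):
        return "403 Forbidden"
    return "200 OK"

good = {"signature_valid": True, "exp": time.time() + 3600, "patient": "123"}
print(handle_request(good, "123"))  # 200 OK
print(handle_request(good, "456"))  # 403 Forbidden
```

The failures Knight found map onto the second gate: the APIs authenticated callers but then skipped, or misapplied, the per-record authorization check.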
“In simple terms, it doesn't matter if your house is painted blue, white or green, you still need to invest in a good security system and use it properly to protect your house from intruders.”