Over the course of about a year, a single ethical hacker was able to access millions of patient health records and expose systemic risks in software that falls effectively outside the legal jurisdiction of the Health Insurance Portability and Accountability Act of 1996 (HIPAA).
Application Programming Interfaces (APIs) are considered infrastructure (not application) software because they typically work below the application presentation layer, bridging data requests between different (often competing) software applications. The end user (or consumer) sees the result of an API request in a front-facing application, but never the API itself.
Of the five FHIR API implementations I tested in phase two of my research, three contained pervasive vulnerabilities that allowed me to access over four million patient and clinician records – often using a single login. The other two were built by Electronic Health Record (EHR) vendors and I found no vulnerabilities in either of them.
Alissa Knight
Ethical Hacker
Author of “Playing With FHIR”
The white paper, titled “Playing With FHIR,” is a play on words: the underlying specification, called Fast Healthcare Interoperability Resources (FHIR), is a kind of blueprint for building APIs used specifically in healthcare.
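To make the “blueprint” concrete: a FHIR server exposes resources such as Patient over plain REST, so reading a single record is just an authenticated HTTP GET. A minimal sketch, assuming a hypothetical server URL and bearer token (real deployments publish their own endpoints and auth flows):

```python
import urllib.request

# Hypothetical FHIR base URL for illustration only.
FHIR_BASE = "https://fhir.example.com/r4"

def build_patient_request(patient_id: str, token: str) -> urllib.request.Request:
    """Build a FHIR R4 read request for a single Patient resource."""
    return urllib.request.Request(
        f"{FHIR_BASE}/Patient/{patient_id}",
        headers={
            "Accept": "application/fhir+json",   # FHIR's JSON media type
            "Authorization": f"Bearer {token}",  # bearer token, SMART-on-FHIR style
        },
    )

# Sending this request (e.g., with urllib.request.urlopen) would return
# the Patient resource as JSON from a live server.
```

The simplicity is the point: because the resource ID sits right in the URL, the server, not the URL scheme, must enforce who may read which record.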
In fact, many of the vulnerabilities Alissa identified were easily avoidable, and some of the techniques she used were very basic and in common use by entry-level security testers globally. At least some of the vulnerabilities may have been caused by software developers who were overly eager to cash in on a freshly minted regulation called the “information blocking rule.” The new rule went into effect earlier this year, and it’s now clear that some developers (either intentionally or out of ignorance) didn’t adhere to critical security specifications that are clearly outlined in the FHIR blueprint.
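One of those basic techniques is probing for broken object-level authorization (BOLA): log in as one user, then simply swap resource IDs in the request and see whether the server serves other patients’ records. A hedged sketch of such a check, where `fetch` stands in for any hypothetical function that issues the request and returns the HTTP status code:

```python
# BOLA probe sketch: authenticate as one user, then request records
# belonging to other patients by swapping IDs. A correctly secured
# server returns 401/403 for any ID outside the caller's scope.
# `fetch` is a hypothetical callable: fetch(patient_id) -> HTTP status code.
def find_leaked_ids(fetch, own_id, candidate_ids):
    """Return the IDs (other than our own) that the server wrongly served."""
    leaked = []
    for pid in candidate_ids:
        if pid != own_id and fetch(pid) == 200:
            leaked.append(pid)  # 200 for someone else's record = data leak
    return leaked
```

This is the shape of “a single login” exposing millions of records: once one account is authenticated, an unguarded server treats every ID it is handed as fair game.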
The basic intent of the new rule (which is exclusive to healthcare) is to threaten incumbent EHR software vendors and providers with penalties if they intentionally “block” third-party requests for access to their data stores. In reality, however, the rule is more theory than practice: there’s no hard evidence of intentional blocking to date, and there are reasons to believe that attempts to enforce the new rule would be legally challenging at best.
Why? Because there’s no precedent for this kind of rule (either in healthcare or other industries), so it’s legally novel and untested. Beyond that sizable hurdle, the rule has no fewer than eight exceptions, any of which can be relatively easy to claim as a viable defense against allegations of data “blocking.”
In fact, Alissa’s research may well have undermined the entire rule, because one of the eight exceptions is specific to security. With the kind of vulnerabilities she identified, EHR vendors could easily deny (or “block”) any third-party request for data simply by citing the security exception as their justification, and their defense would likely succeed on that one exception alone. I’m not an attorney, and I don’t think EHR vendors would actually hide behind that defense, but given these new risks (and new liabilities), they very likely could.
For one thing, no one really knows how many FHIR APIs are in production today — let alone how many have basic security flaws. Regulations other than “information blocking” (most notably HIPAA, with its detailed protections around the use of personal health information) would not apply, because many of the companies developing (or using) these new APIs would not be considered a “covered entity” or “business associate,” and those legal identifiers are the binding requirement for HIPAA’s jurisdiction.
’Playing with FHIR’ highlights how a rapidly expanding ecosystem of consumer-oriented mobile apps and data aggregators may open new security vulnerabilities for patients and healthcare providers. EHR developers and the healthcare organizations they serve follow specific HIPAA requirements to protect health data. But when non-HIPAA regulated entities hold that same data, those requirements fall away. Strong privacy and security protections should extend to anyone who holds patients’ health information.
Judy Faulkner
CEO & Founder, Epic
Given the unusual and unique wording of the “information blocking rule,” a cynic could argue that the entire rule was an intentional end run around HIPAA, promoted by commercial interests eager to capitalize on lucrative health data in a host of new directions. But who’s to say? One big takeaway is that Alissa found no vulnerabilities in the two APIs she tested from the EHR vendor community.
Cerner believes patients have the right to access their healthcare data in any manner they choose, and Cerner undergoes strict standards to protect data in our systems. However, we share concerns over the lack of consistent regulation of third-party developers. Cerner has been involved in industry and government conversations for years. We believe security and privacy protections should be extended for any entity working with identifiable health information and welcome industry conversation.
David Feinberg
CEO & President, Cerner
There’s no immediate fix because again, the number of FHIR APIs in production today is unknown and opinions vary widely on next steps.
By CHIME’s estimation, what is needed is a national privacy law that gives consumers protection around how their health data is used when released to third parties not governed by HIPAA. Further, the Federal Trade Commission (FTC) must be adequately resourced to address the burgeoning industry of third-party apps. To date, this agency has received only five reports of breaches, which we know does not represent the true state of things. We applaud the FTC for the recent guidance they issued indicating they will be paying much closer attention to these issues, as they explicitly state that third-party apps are included in the definition of personal health records.
Mari Savickis
VP, Public Policy
College of Healthcare Information Management Executives (CHIME)
Alissa’s research is the kind of sunlight that not only exposes technical vulnerabilities, but also the regulatory failings of government (Congress, HHS, ONC, etc.) to get this first attempt at legislating new access to patient data correct. As it stands today, it’s just too easy for developers to avoid the time and considerable cost of securing their APIs. Not all of the APIs are vulnerable, of course, but it’s relatively easy to find the ones that are, so the proverbial barn door is now wide open to bad actors, and the risks to protected health information (PHI) are significant, with very limited legal recourse.
The final takeaway is this: FHIR is a great standard for APIs in healthcare, but until there is industrial-strength certification and binding regulation with real penalties, software developers are effectively rewarded for taking the path of least resistance to revenue, and the exposure can be measured in millions of health records. We can’t expect (and shouldn’t accept) voluntary compliance with security for something as critical as personal health information.
In the meantime, whatever happens (or doesn’t), the advice from Theresa Payton demands everybody’s undivided attention.
Criminals always go where the action is. As APIs continue to be the solution of choice for transformation efforts, the attackers will perfect their tradecraft to attack them, and Gartner estimates that by 2022, API attacks will stand out as the most frequent attack method to compromise web applications. If peer reviews and red teaming are not at the top of your priority list now, read Alissa’s research and then reprioritize.
Theresa Payton
CEO Fortalice Solutions
Former White House CIO
Author of Manipulated
[This article first appeared in Forbes in October of 2021]