Privacy, Data Security, Supply chain

Lack of consumer privacy protections allows data brokers to sell mental health info


The lack of clear consumer privacy protections in the U.S. has empowered the data broker industry, according to a report from a former researcher at Duke University’s Technology Policy Lab. The findings show the majority of third-party data brokers “are willing and able” to sell mental health information, with some actively advertising for consumer health data.

The researcher, Joanne Kim, conducted the study of data brokers and consumers' mental health conditions over the course of two months in an effort to shine a light on the data broker industry and the processes used to sell and exchange mental health data online.

“The largely unregulated and black-box nature of the data broker industry, its buying and selling of sensitive mental health data, and the lack of clear consumer privacy protections in the U.S. necessitate a comprehensive federal privacy law or, at the very least, an expansion of HIPAA’s privacy protections,” according to the report.

The report also suggests the need for a ban on the sale of mental health data on the open market.

The findings were published ahead of two more lawsuits filed against healthcare providers over alleged data-sharing with Meta and other tech giants through Pixel use on websites. LCMC Health Systems and Willis-Knighton Health System are facing claims of egregious privacy violations, joining a host of lawsuits against Meta, Advocate Aurora Health, and WakeMed.

In the last year, privacy leaders and researchers have detailed massive gaps in consumer data privacy protections. Those privacy gray areas include health apps and hospital data scraping via tech tools, as well as the sale of consumer data through these tools to data brokers.

Congress has been working, albeit slowly, to fortify the Health Insurance Portability and Accountability Act or enact a national data privacy law. But for now, the FTC is leveraging its authority to combat bad actors. It’s currently embroiled in a lawsuit against data broker Kochava over its data-sharing practices and just fined GoodRx $1.5 million over similar alleged privacy practices.

While many of these lawsuits and enforcement actions allude to data broker processes, questions remain about the prevalence and scope of these practices. Many consumers are unaware their data is routinely and actively shared by many popular apps.

Few vetting processes or controls for mental-health information

Kim’s research shines a light on these practices, particularly around software-based health-tracking applications that aren’t covered by HIPAA and that “often unknowingly” put consumers’ sensitive mental health data at risk.

The report found that there appears to be a lack of best practices for handling mental health data generated outside of the traditional health setting and “particularly in the areas of privacy and buyer vetting.”

Data brokers both advertise and actively sell data tied to mental-health information. The report shows the companies market the data with “seemingly minimal vetting of customers and seemingly few controls on the use of purchased data.”

Of the 37 data brokers Kim contacted, 26 responded to her inquiries about mental health data. Eleven “firms were ultimately willing and able to sell the requested mental health data.” And “whether this data will be deidentified or aggregated is also often unclear, and many of the studied data brokers at least seem to imply they have capabilities to provide identifiable data.”

While Kim confirmed that the 10 most engaged data brokers did ask about the purpose of the purchase and the intended use cases for the data, the companies did not appear to have any other controls for managing clients, nor did they indicate whether a separate background check was conducted to corroborate Kim’s statements.

These same brokers advertised highly sensitive mental health data, including individuals with depression, attention disorder, insomnia, anxiety, ADHD, and bipolar disorder. The data also included ethnicities, ages, genders, zip codes, religions, children, marital status, net worth, credit scores, dates of birth, and single-parent status.

These records sold for a range of prices: “one data broker charged $275 for 5,000 aggregated counts of Americans’ mental health records.” Others “charged upwards of $75,000 or $100,000 a year for subscription/licensing access to data that included information on individuals’ mental health conditions.”

In one instance, a data broker Kim found to be the “most willing to sell data on depressed and anxious individuals” offered to do so at a price of $2,500 and “stated no apparent, restrictive data-use limitations post-purchase.”

Researcher calls for ban on sale of mental-health data

The study appears to confirm vast privacy violations stemming from the scraping of consumer data, with seemingly no transparency into these practices.

For Kim, the research spotlights “a largely unregulated data brokerage ecosystem that sells sensitive mental health data in large quantities, with either vague or entirely nonexistent privacy protections.” The results vary in terms of whether a broker will advertise data “clearly and explicitly linked to an individual” or provide the data with “some level of obscurity.”

The findings underscore the critical need for a federal privacy law able to provide data protections to consumers. In the short term, Kim explained, there’s a need for federal and state-level bans on the sale of mental-health data.

“Data brokers are collecting, aggregating, analyzing, circulating, and selling sensitive mental health data,” she continued. “This comes as a great concern, especially since the firms seem either unaware of or loosely concerned about providing comprehensive privacy protections.”

The issue has been compounded by the lack of regulation and “the opaque nature of the data broker industry,” Kim concluded. “Inconsistent practices, combined with vague privacy policies, point to a critical need for greater consumer protections, particularly for sensitive [health] data.”

Jessica Davis

The voice of healthcare cybersecurity and policy for SC Media, CyberRisk Alliance, driving industry-specific coverage of what matters most to healthcare and continuing to build relationships with industry stakeholders.
