Pro News

Data broker marketplace research shows loose controls on sensitive mental health info

BY: ALFRED NG, RUTH READER | 02/13/2023 06:01 AM EST

The debate over data privacy on health apps could get a lot more loaded soon: A report from Duke University’s Technology Policy Lab, released this morning, showed that multiple data brokers were willing to sell mental health information about individual customers, including names, addresses and demographic information, with no oversight over who was buying it.

The report comes as regulators are grappling with how to tamp down on broad sharing of sensitive, often health-related data from websites, apps and devices outside the formal health care system. The report doesn’t indicate what specific services the data brokers use to get their information. Data brokers often don’t disclose their sources of information to maintain a competitive edge and a steady supply of data.

Although apps and fitness trackers are not regulated by HIPAA, the 1996 law that imposes strict privacy guards on health care data, research shows that most consumers assume their health data from apps is protected.

Regulators have started to move accordingly: The Federal Trade Commission recently made an example of telehealth platform GoodRx for sharing customer health data with Google, Facebook and other advertisers, despite promising to keep that data protected. GoodRx’s data sharing practices are commonplace and many in the industry view the FTC move as a warning shot. The agency is also planning to propose new rules to regulate data collection.

In the report, Joanne Kim, a recent graduate of the Duke research program, made preliminary inquiries with 10 data brokers who openly advertised selling mental health information online, and found:

  • They offered highly specific sets of information on people’s medical conditions, including on people with depression, bipolar disorder and post-traumatic stress disorder; people with medical conditions such as cancer; and people who have had strokes
  • While some datasets offered only aggregated batches of information, others provided individualized data including people’s addresses and email contact
  • The data is often paired with demographic information such as age, ethnicity, gender, religion, net worth and marital status
  • Some of the data brokers failed to vet potential buyers or limit how the data can be used after it’s sold
  • Others had no restrictions in place at all, with one data broker offering to sell data on people with depression and anxiety, with no use limits, for $2,500

Overall, the report found that, absent regulation, data brokers operate in a self-policing industry in which companies can sell specific information about people’s mental health conditions without limits.
One broker offered mental health information on individual members of the U.S. military, sorted by which branch they belonged to and what conditions they had.

The names of the data brokers were not disclosed in the report to protect the companies’ privacy. Two data brokers also required nondisclosure agreements to even discuss potential offerings.

While researchers in the past have pointed out how health apps and websites are able to collect and share people’s health information, Kim’s project highlights what happens once that data lands in a data broker’s inventory.

“One area where there’s been almost no public research has been what the buy process actually looks like. Not just what does the data broker have or who are the data brokers, but what does it actually look like if you have an email and a credit card and you say, ‘I want to go buy health data off the open market,’” said Justin Sherman, the research lead at Duke University’s Data Brokerage Project who oversaw the study.

Health researchers say that collecting and sharing mental health data can be beneficial.

“Anonymizing information can help inform things such as where critical funding should go, trends in areas of mental illness and help inform future screenings. It can also be helpful in addressing health equity,” a spokesperson for the Pharmaceutical Research and Manufacturers of America said in a statement.

Health apps, despite all the sensitive information they gather, have no heightened protections compared with the other services available on a smartphone.

In the FTC’s settlement with telehealth platform GoodRx over the company’s health data-sharing practices, the agency highlighted how companies use legal gray areas to make it appear as though they are protecting consumer data even when they are not. The FTC said GoodRx misrepresented itself as HIPAA-compliant, even though the law doesn’t apply to it, and misled customers about its data-sharing practices, handing over data to advertisers after promising not to. Under the settlement, the company is no longer allowed to share customer data with third parties and will have to claw back data that it previously shared.

In its response to the FTC action, GoodRx denied wrongdoing and said its advertising practices were compliant with applicable regulations.

Congress is also paying attention to how health data is being handled. Following several reports concerning telehealth companies sharing customer data, Sens. Amy Klobuchar (D-Minn.), Susan Collins (R-Maine), Maria Cantwell (D-Wash.) and Cynthia Lummis (R-Wyo.) last week sent letters to telehealth firms Cerebral, Monument and Workit Health asking them to reveal what data they collect on users, which entities they share patient data with and how they plan to protect user data.

Room for regulation

Health data organizations say that collecting and sharing this sensitive information can be beneficial if the information is properly protected. Researchers, for example, have used health conditions to target ads for clinical trials on Facebook. And sharing data on military members with PTSD can be useful for an organization like the Department of Veterans Affairs, allowing it to offer medical services to the right audience.

But ad targeting can also be used in harmful ways. In 2020 and 2021, the Justice Department settled cases with three data brokers that had sold information on elderly people with Alzheimer’s to scammers who preyed on their condition.

The American Data Privacy and Protection Act, which failed to pass in Congress last year but is expected to return this session, would have classified health information as sensitive data. That protection would have required express, affirmative consent before health data could be shared.

The FTC has also kicked off a rulemaking process to regulate data brokers. Part of the inquiry looks at how regulations on health care data should be changed.
