Most people have at least a vague sense that someone, somewhere, is wreaking havoc with the data footprints created by their online activities: maybe an app they use allows its maker to build a profile of their habits, or maybe they're constantly followed by creepy ads.
It's more than a feeling. Many health technology companies, which provide services ranging from mental health counseling to mail-order ADHD pills, have shockingly lackadaisical privacy practices.
A guide published this month by the Mozilla Foundation found that 26 of 32 mental health apps had weak safeguards. The foundation's analysts documented numerous flaws in their privacy practices.
Jen Caltrider, the leader of Mozilla's project, said the privacy policies of the apps she uses for drumming practice were scarcely different from the policies of the mental health apps the foundation reviewed, despite the far greater sensitivity of what the latter record.
"I don't care if someone knows I practice drums twice a week, but I do care if someone knows I see a therapist twice a week," she said. "This personal data is a pot of gold for them, for their investors."
The stakes are becoming increasingly prominent in the public consciousness. Apps used by women, such as menstrual cycle trackers and other types of fertility-management technology, are now a focus of concern over the potential overturning of Roe v. Wade. Across social media, users have urged one another to delete the data stored by these apps (a right not always granted to users of health apps) for fear the information could be used against them.
"I think these big data groups are in for a day of reckoning," said U.S. Sen. Ron Wyden (D-Ore.). "They have to decide: Are they going to protect the privacy of the women who do business with them? Or are they just going to sell it to the highest bidder?"
Countering those fears is a movement to better control the use of such information through legislation and regulation. While doctors, hospitals, and other health care providers must abide by the privacy protections of the Health Insurance Portability and Accountability Act (HIPAA), the burgeoning health app sector has thinner protections for users.
Although some privacy advocates hope the federal government might step in after years of effort, time is running out for Congress to act before November's midterm elections.
Enter the private sector. This year, a group of nonprofits and corporations released a report calling for a self-regulatory project to guard patients' data when it's outside the health care system, an approach critics liken to the proverbial fox guarding the henhouse.
The project's backers tell a different story. The initiative was developed over two years by two groups: the Center for Democracy and Technology and Executives for Health Innovation. Ultimately, the effort would be administered by BBB National Programs, a nonprofit once affiliated with the Better Business Bureau.
Participating companies might hold a range of data, from genomic to other information, and work with apps, wearables, or other products. Those companies would agree to audits, spot checks, and other compliance activities in exchange for a sort of certification or seal of approval. That activity, the project's developers say, would help plug the privacy leaks in the current system.
"It's a really mixed bag, for ordinary people, for health privacy," acknowledged Andy Crawford, senior counsel for privacy and data at the Center for Democracy and Technology. "HIPAA has decent privacy protections," he said. The rest of the ecosystem, however, has gaps.
There is considerable doubt, however, that the private sector proposal will create a viable regulatory system for health data. Several potentially powerful participants, including Apple, Google, and 23andMe, dropped out during the initiative's incubation. (A 23andMe representative cited "bandwidth issues" and noted the company's involvement in publishing genetic privacy principles. The other two companies did not respond to requests for comment.)
Some participants who left believed the project's ambitions tilted toward corporate interests. But that sentiment wasn't universal; one panelist, Laura Hoffman, a former fellow at the American Medical Association, said for-profit companies were frustrated by the "constraints it would put on profitable business practices that exploit both individuals and communities."
In general, self-regulatory plans work as a combination of carrot and stick. Membership in the program "can be a marketing advantage, a competitive advantage," said Mary Engle, executive vice president of BBB National Programs. And consumers may prefer to use apps or products that promise to protect patients' privacy.
But if those corporations get it wrong, touting their privacy practices while failing to actually protect users, they can find themselves under fire from the Federal Trade Commission. The agency can go after companies that don't live up to their promises under its authority to police unfair or deceptive trade practices.
But there are a few key difficulties, said Lucia Savage, a privacy expert with Omada Health, a startup offering digital care for prediabetes and other chronic conditions. Savage previously served as chief privacy officer for the Office of the National Coordinator for Health Information Technology at the U.S. Department of Health and Human Services. "There is no obligation for anyone to self-regulate," she said. Companies may simply decline to join. And consumers may not know what to look for in a certification of good privacy practices.
"Companies aren't going to self-regulate. They just don't. It's up to policymakers," Mozilla's Caltrider said. She cited her own experience emailing the privacy contacts companies list in their policies, only to be met with silence even after three or four messages. One company later said the person responsible for monitoring the address had left and had not yet been replaced. "I think that's telling," she said.
Then there's enforcement: The FTC's authority covers businesses, not nonprofits, Savage said. And nonprofits can behave just as badly as any rapacious robber baron. This year the suicide hotline Crisis Text Line was embroiled in scandal after Politico reported that it had shared with an artificial intelligence company the online text conversations between users contemplating self-harm and the service's counselors. FTC action can be tough, and Savage wondered whether consumers would really be better off afterward.
Difficulties can be seen in the proposed self-regulatory framework itself. Some key terms, such as "health information," are not fully defined.
It is easy to say that some data, such as genomic data, is health data. For other types of information, the question is harder: researchers are repurposing seemingly mundane data, such as the tone of one's voice, as indicators of health. Settling on the right definition is therefore likely to be a tricky task for any regulator.
Those are exactly the debates happening now, whether in the private sector or in government. Some companies are optimistic that Congress might pass comprehensive privacy legislation. "Americans want a national privacy law," Kent Walker, Google's chief legal officer, said at a recent event hosted by the R Street Institute, a pro-free-market think tank. "We have Congress very close to passing something."
That could be just the tonic critics of the self-regulatory approach are seeking, depending on the details. But some specifics, such as who should enforce the potential law's provisions, remain unresolved.
The self-regulatory initiative is seeking startup funding, potentially from philanthropies, beyond whatever dues or fees would sustain it. Still, Engle of BBB National Programs said action is urgent: "No one knows when legislation will be passed. We can't wait for that. A lot of this data is being collected and not being protected."
KHN reporter Victoria Knight contributed to this article.