Is Our Private Health Data Really Private?


Lynn Julian, a 46-year-old survivor of the 2013 Boston Marathon bombing, worries about the privacy of her weekly telehealth appointments and what she calls "social listening."

A recent experience gave her cause for concern. Soon after taking a vitamin supplement with her phone on the counter nearby, Julian noticed an ad for the small company that makes the product. The ad didn't seem random or part of a mass marketing campaign, but rather targeted based on her behavior in her own home.

The next morning, Julian ate a power bar made by another small company, with her phone nearby. That night, she saw her first commercial for that product, too. It was enough to convince her that she is being watched.

According to Albert Fox Cahn, founder and executive director of the Surveillance Technology Oversight Project (S.T.O.P.), even when it feels like our devices are listening to us, that is rarely true.

“The answer is far more creepy,” Cahn said. “The fact is we are so often subliminally communicating about these same products through our searches, through our social media, through our communications with others, that through all of that communication, we often will be connected with advertisers.”

With the rise of telehealth, digital health apps and fitness trackers, these concerns are only likely to grow.

Between May 2019 and May 2020, telehealth usage grew more than 5,000%. Roughly one in five Americans had a smartwatch or fitness tracker in 2019. An estimated 350,000 digital health apps are available to consumers. Those figures — and the capabilities of apps and trackers — continue to increase.

Erika Barnes, 36, founder and CEO of PetSmitten, loves the health features of her smartwatch, but has nagging questions about her privacy.

“There’s just always a concern in the back of my mind about my data and how it’s being used,” she said.

The maker of Barnes’ smartwatch says it may share user data to support medical research, which causes her to wonder if drug companies can get her data for research purposes. And she isn’t sure she likes the idea of companies having information about the rhythm of her heart, for example.

“Some of the people closest to me in my life don’t know the exact state of my health, nor should they. Companies least of all should have access to some of your most intimate personal information,” Barnes said.

But, despite her concerns, she still uses her smartwatch because it’s so convenient.

Legal risks from health data

Experts warn that, in light of the United States Supreme Court decision to overturn Roe v. Wade, digital surveillance could be used to enforce state abortion bans.

According to Cahn, law enforcement can access your private data through digital forensics, scouring your data based on a subpoena or court order. Or they may engage in a digital dragnet, scanning broad sets of data, such as location and keywords, to identify anyone who may have sought or helped others seek abortion care. Authorities can also access your data through data brokers, who buy and sell health data.

Once someone is a target of a police inquiry, almost any app can expose them to tracking risk, Cahn said. For example, period trackers that show a missed period could help police reconstruct a woman’s reproductive care.

It’s not just period data that could be used against you

“There’s a misconception that femtech (female health technology) apps, including period tracking apps, are the sole method through which law enforcement can access reproductive health data,” said attorney Bethany Corbin, femtech lawyer and senior counsel at Nixon Gwilt Law. “The risk for data access extends much more broadly, including to general health apps and even telehealth and in-person healthcare appointments.”

Privacy laws don't always provide protections for health information

The Health Insurance Portability and Accountability Act of 1996 (HIPAA) limits how healthcare providers and other covered entities can use, share or disclose your protected health information. But not all data is protected, and not all companies that hold your data are subject to these rules.

“HIPAA’s applicability is very narrow,” Corbin said. “Many women assume [their] data will be protected by federal healthcare privacy laws. This is often not the case. Most healthcare apps — and in particular, most femtech apps — fall into regulatory gray zones, in which federal privacy protections do not apply.”

Some states have privacy laws, and the Federal Trade Commission prohibits unfair or deceptive practices. President Biden recently signed an executive order that included a request that the Federal Trade Commission chair take steps to protect consumer privacy. Lawmakers have launched an investigation into how data brokers and app makers collect and sell user data.

Even with these protections and new efforts, Corbin said, HIPAA’s safeguards apply to far less data than consumers realize.

What you can do to protect your health information in apps

“With privacy, it’s never one size fits all,” Cahn said. “For every person, there isn't a simple yes or no answer of whether there's a risk, but what we do see is a spectrum of harm and individuals trying to figure out how to navigate that, to protect their own data.”

Still, there are steps you can take to protect your data.

1. Read the health app’s privacy policy before using it, especially the disclosure section, to understand how it shares data.

“Disclosing data to research institutions for long-term women’s health research may be a disclosure that women are comfortable with, whereas disclosing data to data brokers may be something that women are uncomfortable with,” Corbin said. “Each user should determine their own comfort level with data sharing and find an app that appropriately reflects the user’s values.”
2. Choose apps that collect the minimum data necessary.

“Look for apps that use local data storage, such as storage on your phone or tablet, rather than cloud-based storage, because it will be less likely that the app can share your data in the event they get a subpoena,” Corbin said.
3. Learn how to delete your personal data from apps.

If you want to minimize the amount of personal data stored in your apps, you can delete data without deleting the app itself, a process sometimes called offloading. The exact steps vary by device and app, but the process generally involves clearing the cache or offloading the data through your privacy, security or storage settings.

Getting rid of an app altogether may be safer, but keep in mind that it’s not as simple as just deleting the app, especially if you’ve given the app permission to access other apps, such as your social media accounts, photos, contacts or location. To truly purge your personal health data, delete the data from the app first, then look in the app’s settings for any linked accounts you may have enabled and unlink them. Also check the web version of the app along with the mobile version, because the web version sometimes has additional settings you’ll want to manage.

4. Understand the risks whenever you interact electronically.

Anytime you use an app, it’s possible that your data will be disclosed or made public at some point, whether through intentional disclosures, cyberattacks or data breaches. “Don’t disclose anything to an app or website that you would not want to be made public,” Corbin said.
