During the pandemic, people around the world had to turn to virtual health and wellness apps to manage their mental health. Since then, wearable medical devices, fitness trackers, and health apps have only grown in popularity.
Fitness, weight loss, and mental health apps ask many invasive questions to assess users’ mental health and prepare support plans. They request your personal and medical information, much as a doctor would during a consultation.
But soon after telehealth became popular, cybercriminals discovered that most brand-new telehealth companies are easy targets. Data compromises from criminal hacks soared, and the unintended sharing of sensitive user information by big players such as Fitbit sparked controversy.
But What About HIPAA?
The victims of hundreds of data breaches found no redress under HIPAA regulations. Indeed, a horrifying 2023 Duke University study revealed that data from health-related apps had been circulating freely in the data broker community.
The study showed that data brokers had categorized people according to their mental health diagnoses, e.g., depression, ADHD, anxiety, or bipolar disorder, to lure data buyers. Some of the databases for sale even contained the names and addresses of people looking for help via telehealth apps.
Covered Entities and Data Loopholes
HIPAA’s definitions leave plenty of loopholes for the data-hungry ad industry. For example, telehealth apps designed or operated by companies that do not qualify as Covered Entities or Business Associates under HIPAA don’t have to adhere to its regulations. Shockingly, most fitness trackers, period trackers, and personal health diaries fall into that category. They routinely share or sell their data to data brokers.
- The FDA categorizes an app as a medical device only if it is intended to diagnose, treat, or prevent disease.
- Under HIPAA, healthcare providers and health plans are classified as Covered Entities.
- Third-party companies that handle or store PHI on behalf of a covered entity are categorized as Business Associates.
In practice, for mobile apps, HIPAA only applies if the app handles PHI and is owned or managed by a Covered Entity or a Business Associate.
For example, if a dentist’s mobile app processes PHI, such as patient records, it must conform to HIPAA regulations. However, a fitness app owned by a software company is not subject to HIPAA, because the owner is not a Covered Entity. This is the case even when the app handles equally sensitive information.
Egregious Examples of Companies That Exploit This Loophole
The Duke study found some egregious examples. Certain weight loss apps, for instance, asked users 50 probing questions about their mental health and medical conditions, and yet actively shared that data with analytics firms.
It’s becoming clear that we’ll need more comprehensive privacy regulations. We’ll also need increased public awareness to protect consumers’ sensitive health information.
The Underlying Problem
Some companies bury the outrageous liberties they take with your data in the fine print. The data ends up in the hands of data brokers, who exploit it for targeted advertising.
Most people click to accept any privacy policy, believing that “someone” must have noticed if it held any dangers. But companies hide secrets in the reams of vague text. Here are some widespread examples of legalese that serve only the company’s bottom line:
- “We may collect information about you”: They fail to specify all the data they collect, which allows them to add data points at will. The text suggests that collected information “may include” your location, preferences, contact details, financial details, and voice or video recordings.
- Data sharing in the course of “business activities” and “business purposes”: Some companies interpret your acceptance of their privacy policy as carte blanche to sell, trade, barter, and otherwise monetize your private, even delicate, information, as GoodRx recently did.
- “Legitimate interests”: “Legitimate business interests” is a broad description that can cover many sins, and it is the justification Meta and Google have leaned on for tracking users. But in ongoing court cases, privacy advocates and EU member states are working to ensure that these companies obtain actual consent before tracking people online.
- “Business partners”: The “business partners” are usually advertising firms, data brokers, or affiliates. There’s likely a deal to sell, trade, combine, or enrich your data with information collected elsewhere to improve targeted advertising campaigns for other companies around the globe. Adding insult to injury, your gadget or app’s privacy policy may even state that you are also subject to the business partners’ own privacy policies, leaving you to track down and read each one before you use your app or gadget.
- “De-identified,” “anonymous,” or “aggregated” data: These terms may sound like they offer better privacy protection, but personal data can often be re-identified even after it has been masked or aggregated (see the sketch after this list). The bottom line is that it doesn’t matter much if a company anonymizes your data; the “business partners” can undo that work. Duke University found instances where data brokers offered people’s names, emails, addresses, and race and ethnicity data for sale, along with the number of children in the household.
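To see why “anonymized” rarely means anonymous, consider how a buyer can link a de-identified health dataset back to real people using quasi-identifiers such as ZIP code, birth year, and gender. The snippet below is a minimal, hypothetical sketch; the records and field names are invented purely for illustration.

```python
# Minimal, hypothetical sketch of re-identification by linking quasi-identifiers.
# All records and field names are invented for illustration only.

# "De-identified" records a health app might share: no names, but rich quasi-identifiers.
app_records = [
    {"zip": "27701", "birth_year": 1984, "gender": "F", "condition": "anxiety"},
    {"zip": "90210", "birth_year": 1990, "gender": "M", "condition": "ADHD"},
]

# An identified dataset a broker already holds (voter rolls, marketing lists, etc.).
broker_records = [
    {"name": "Jane Doe", "zip": "27701", "birth_year": 1984, "gender": "F"},
    {"name": "John Roe", "zip": "90210", "birth_year": 1990, "gender": "M"},
]

def reidentify(app_rows, broker_rows):
    """Join the two datasets on the shared quasi-identifiers."""
    matches = []
    for a in app_rows:
        for b in broker_rows:
            if (a["zip"], a["birth_year"], a["gender"]) == (
                b["zip"], b["birth_year"], b["gender"],
            ):
                matches.append({"name": b["name"], "condition": a["condition"]})
    return matches

print(reidentify(app_records, broker_records))
# [{'name': 'Jane Doe', 'condition': 'anxiety'}, {'name': 'John Roe', 'condition': 'ADHD'}]
```

Re-identification research has repeatedly shown that a handful of such attributes is enough to narrow many records down to a single person, which is why “de-identified” data that gets sold onward offers far weaker protection than the label implies.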
So, How Do You Protect Yourself?
We’ll have to learn to ask probing questions before we buy. Here are a few questions to ask and tips to help you better protect your data privacy:
- Who manufactured the product? Overseas companies often flood online retail sites with cheap imitations that face little regulatory scrutiny. Are they concerned about maintaining a great brand experience? Would they even care if your data got hacked?
- Does the company prioritize security? In a world where IoT gadgets can be dangerous, companies that prioritize security in their product design are rare. Don’t skimp on security to get a few extra bells and whistles.
- What data is recorded? Period tracker apps routinely collect information about people’s sexual activity, miscarriages, pregnancy attempts, and location, and they use it to target users with ads. Why else would a period tracker need your location? And since the overturning of Roe v. Wade, detailed period and fertility calendars could be used against people as evidence in abortion-related legal proceedings.
- Are all communications between your gadget and the company encrypted? Unencrypted traffic can be intercepted and sold, or used against you. If the device stays in one place, you can run a VPN on your home Wi-Fi router so that all your IoT gadgets’ traffic leaves your network encrypted. If it’s a mobile phone app, use a mobile VPN to encrypt the app’s traffic.
- Where do they store the data? Poorly secured storage can leave the data exposed or discoverable on the internet, which could impact your job prospects, eligibility for credit, and life or medical insurance coverage.
- Can you hide the IP address and location information? For example, the fitness app Strava was found to leak users’ location information: even low-level attackers could pinpoint users’ home locations using the app’s high-precision API metadata, and this held true even after users had set up privacy zones to hide their activity within specified areas (see the sketch after this list).
- Do they update the software frequently? Cybercriminals can attack via smart gadgets, vulnerable smartphone apps, household appliances, and medical devices. Keep your devices and apps up to date with security fixes.
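To illustrate why high-precision location metadata is so revealing, the sketch below rounds a GPS position to fewer decimal places and estimates roughly how much ground each level of precision covers. It is a simplified illustration of the general idea, using invented coordinates; it does not reflect Strava’s actual API or privacy-zone implementation.

```python
# Rough illustration of how GPS coordinate precision maps to real-world area.
# Hypothetical sketch only; not Strava's API or privacy-zone implementation.
import math

def precision_radius_m(decimals: int, latitude_deg: float = 45.0) -> float:
    """Approximate positional uncertainty, in meters, when coordinates are
    rounded to `decimals` decimal places (one degree of latitude is ~111 km)."""
    lat_m = 111_000 * 10 ** (-decimals)
    lon_m = 111_000 * math.cos(math.radians(latitude_deg)) * 10 ** (-decimals)
    return max(lat_m, lon_m) / 2  # half a grid cell, as a rough radius

home = (45.523456, -122.676543)  # invented coordinates

for decimals in (5, 3, 1):
    rounded = (round(home[0], decimals), round(home[1], decimals))
    print(f"{decimals} decimal places -> {rounded}, "
          f"~{precision_radius_m(decimals):.0f} m uncertainty")

# Roughly: 5 decimals pinpoint a house (~1 m), 3 decimals a city block (~56 m),
# and 1 decimal only a whole neighborhood (~5,500 m).
```

Five or more decimal places resolve a position to about a meter, which is why privacy zones offer little protection if a service still exposes full-precision coordinates in its API metadata.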
Do Users Have a Choice?
Health apps offer support and convenience but also raise significant data privacy concerns. The lack of robust federal regulations leaves users vulnerable to data brokers and third parties who can sell or share their personal health information.
Unfortunately, many health-related companies fall outside health privacy regulations and can legally collect, share, and sell such data. That makes health and fitness apps perfect candidates for data-sharing partnerships with pharmaceutical companies, health insurance providers, and marketing firms. Sadly, the opportunities for bias in targeted advertising are almost unlimited, and the data can even determine what people pay for health care.
Users face a stark choice. They can accept the app’s unjust terms and conditions and live with the consequences: inevitable data selling, targeted ads, and loss of privacy. Or they can decline and stop using the app or service altogether. That is no choice at all.