Medical Privacy Under Threat in the Age of Big Data

https://firstlook.org/theintercept/2015/08/06/how-medical-privacy-laws-leave-patient-data-exposed/


“I didn’t understand the issue of medical privacy. It sounded abstract,” says Deanna Fei, author of the new book Girl in Glass, which covers the premature birth of her daughter Mila and an ensuing storm over medical privacy and ethics. Now she says firmly, “This is an issue of civil rights and social justice. Without the right to medical privacy, ordinary Americans can’t keep information from being used against them.”

Fei’s most intimate story is now public knowledge. A recap: When she went into labor after only five and a half months of pregnancy, she didn’t know if her baby would live or die. She was in pain, bleeding, rushing in a cab to the hospital; and, later, she was staring at the bruised skin of her less than 2-pound daughter, who was too fragile to touch. As baby Mila grew into a healthy one-year-old, a new blow fell. The CEO of AOL, Tim Armstrong, blamed a forthcoming benefits cut on the costs of two “distressed babies” born to employees. One of the employees was Fei’s husband, whose insurance covered the family. People at work started asking him if the comments referred to his family. So Fei decided to speak out. “When I came forward, we were afraid. I was speaking out against my husband’s boss, who runs a powerful company,” she says. “But I just felt it was imperative to speak up to defend my daughter’s basic humanity. I also came to see that to single out any individual for their expenditures undermines the principle of health insurance.” After an uproar, Armstrong quickly apologized and reversed his decision on benefits.

But the episode underscored just how thin the existing protections for individual privacy are in the medical realm. Under the Health Insurance Portability and Accountability Act (HIPAA), it’s illegal for health plans and certain other entities to reveal medical information about the people they insure or treat. Armstrong didn’t name names, but many employees easily deduced them. If AOL self-insures (which, as a large corporation, it is likely to do, though it will not publicly confirm it), then its employee health plan is a covered entity subject to HIPAA. Many medical and legal experts considered Armstrong’s remarks unethical and possibly a violation of existing medical privacy law. The Office for Civil Rights at the Department of Health and Human Services, which is in charge of investigating violations, would say only, “As a matter of policy, the Office for Civil Rights does not release information about current or potential investigations.”

Medical privacy is a high-stakes game, in both human and financial terms, given the growing multibillion-dollar legal market for anonymized medical data. IMS Health Holdings, for example, acquires data from pharmacies and sells it to biotech and pharmaceutical firms. After looking into its filing to become a public company, ProPublica found IMS’s “revenues in 2012 reached $2.4 billion, about 60 percent of it from selling such information.” Medical data-mining firms claim that this is all harmless because the data is truly anonymous, but their case is not airtight by any means. For example, Latanya Sweeney of Harvard’s Data Privacy Lab bought commercially available data and de-anonymized it by cross-referencing the dates of medical events with local news events and public records. She found that a man publicly identified as a missing person was diagnosed with pancreatic cancer and had attempted suicide, for example. A few of the people she identified chose to speak publicly, including retired Vietnam veteran Ray Boylston, who had his bladder removed after a severe motorcycle crash. “I feel I’ve been violated,” he told Bloomberg Businessweek.
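Sweeney’s technique is, at bottom, a join between “anonymized” records and named public information on shared quasi-identifiers such as location and date. The sketch below is a toy illustration of that idea; the names, field names, and data are invented for the example and do not come from any real dataset.

```python
# Toy sketch of a linkage (re-identification) attack: match records that
# lack names against named public events on shared quasi-identifiers.
# All data below is fabricated for illustration.

# "Anonymized" hospital records: no names, but ZIP codes and dates remain.
anonymized_records = [
    {"zip": "98101", "admit_date": "2011-04-30", "diagnosis": "pancreatic cancer"},
    {"zip": "98101", "admit_date": "2011-06-12", "diagnosis": "broken arm"},
]

# Public information: e.g., a local news item naming a person and a date.
news_items = [
    {"name": "J. Doe", "zip": "98101", "event_date": "2011-04-30"},
]

def link(records, public):
    """Return (name, diagnosis) pairs where quasi-identifiers line up."""
    matches = []
    for rec in records:
        for item in public:
            if rec["zip"] == item["zip"] and rec["admit_date"] == item["event_date"]:
                matches.append((item["name"], rec["diagnosis"]))
    return matches

print(link(anonymized_records, news_items))  # [('J. Doe', 'pancreatic cancer')]
```

Even this crude two-field match succeeds; real attacks combine more quasi-identifiers (ZIP code, birth date, sex, admission dates), which is why stripping names alone does not make a dataset anonymous.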

There’s also the risk that medical records will be breached by hackers or, in some cases, by insiders simply printing out files. When Greg Virgin, CEO of the security firm RedJack, gave NPR a “tour” of sites selling stolen data, he found a bundle of 10 Medicare numbers selling for 22 bitcoin, or $4,700 at the time. Complete medical records sell for several times what a stolen credit card number or a Social Security number alone does. The detailed information in a medical record is valuable because it can stand up to even heightened security challenges used to verify identity; in some cases, the information is used to file false claims with insurers or even to order drugs or medical equipment. Many of the biggest recent data breaches, from Anthem to the federal Office of Personnel Management, have seized health care records as the prize.

While many doctors have focused on their personal responsibilities to patients and not ventured into questions of privacy, some have moved to address the problem. Psychiatrist Dr. Deborah Peel remembers patients during the pen-and-paper days asking if they could pay cash and stay out of her paperwork, afraid the information would somehow find its way back to employers. Later, she founded the Patient Privacy Rights organization, realizing that the era in which paper copies of her records circulated to perhaps a dozen entities had been superseded by much wider distribution of information. “The data holders — hospitals, health plans — they now control where our data goes and we have no idea, no chain of custody for our data,” she says.

The HIPAA law, meanwhile, has been changed several times. A provision in the 2009 stimulus bill said that patients should have access to disclosures about where their data is sold or shared. But that provision has never been turned into concrete regulations and implemented. “They do not want us to know how many people and technology vendors and software companies access and use our data. If you’re in a hospital you’ll have more than 100 human accesses per day, but you may have thousands of contacts” with devices and computer systems tracking you as a patient, says Peel. (The Office for Civil Rights at HHS says the agency is “in the process of additional fact finding.”) Addressing the twin needs of medical research and privacy, Peel says, requires a cyber-credential system that would allow researchers to query individuals, who could choose to grant access to only the parts of their medical history they wanted to disclose. “If a million people get their data queried, then we get research and we get privacy,” she argues.

Lawsuits are another response to privacy violations. Indiana attorney Neal Eggeson has filed and won HIPAA-related cases on behalf of individuals by framing the disclosures as medical malpractice. In one case, a Walgreens pharmacist shared information with her husband about his ex-girlfriend, whom she suspected of having given the man a sexually transmitted disease. The husband texted his ex-girlfriend; the jury found Walgreens liable for 80 percent of the $1.44 million in damages. (The case is under appeal.)

Medical data resides in unexpected places — including wearable health devices, which can range from a glucometer for someone with diabetes to a fitness tracker. But the HIPAA and HITECH (Health Information Technology for Economic and Clinical Health) laws are geared to regulate health care providers, insurers, employers and schools, not private device manufacturers. Privacy policies can be murky. For example, the first paragraph of Fitbit’s current policy states, “We will never sell your data, and will only share personally identifiable data when you direct us to (or under the circumstances outlined in our Privacy Policy).” Much later, it adds, “De-identified data that does not identify you may be used to inform the health community about trends; for marketing and promotional use; or for sale to interested audiences.” (Fitbit did not respond to a request for comment.) Sen. Charles Schumer raised concerns in August 2014 about its privacy policies. After Fitbit hired the lobbying firm Heather Podesta + Partners, and tweaked its privacy policy, the senator backpedaled, saying of Fitbit, “This company cares very much about their privacy and security.”

During a 2013 FTC panel on “Connected Health and Fitness,” University of Colorado law professor Scott Peppet said, “I can paint an incredibly detailed and rich picture of who you are based on your Fitbit data,” adding, “That data is so high quality that I can do things like price insurance premiums or I could probably evaluate your credit score incredibly accurately.” In addition to selling its devices, Fitbit also sells an analytics platform to employers. Some employers are using the data to negotiate down their insurance rates. Others, like the energy company BP, offer reduced premiums to employees who walk a million steps in a year and take other measures in a rewards-point system. The question is whether employers will use the system in reverse — for example, deciding that someone’s health metrics make them a bad choice for a promotion.

The threats to individuals seeking to protect their medical data can come externally, from data breaches; internally, from “rogue employees” and others with access; or through loopholes in regulations. Fei, whose daughter Mila is now a healthy two-year-old, has embraced her role as a public advocate. “We need comprehensive laws to safeguard our right to medical privacy,” she says. “Most ordinary Americans don’t understand how vulnerable our health data is. And once we understand, there’s not much that we can do. If an employee is coping with a medical problem, they are dependent on their employer, and there’s little incentive for them to come forward.” Although there’s no way to know how many people have experienced a violation of medical privacy and chosen not to speak up for fear of retaliation or shame, many have responded gratefully to the chance to share their stories. Fei started a website, ourdistressedbabies.org, but found that people were posting about other forms of health-shaming or questionable practices. One man, for example, had high job ratings but was let go after a kidney transplant whose aftercare would have raised the company’s premiums. “The people who wrote to me, they saved me,” Fei says of the way it ended her sense of isolation. “They told me, ‘I was never able to speak up. I hope you will be able to keep talking.’ These stories haunt me.”