For a Longer, Healthier Life, Share Your Data
I’ve always been very careful about my personal information (some might say paranoid). I ad-block, I cookie-block, and I use a password manager and a ton of disposable email addresses. I don’t use fitness-tracking wearables. I even cover my laptop camera. I don’t like the idea of being profiled or running the risk that a data breach might leave me exposed. If you asked me whether I wanted my data collected, analyzed and shared, I would of course have said no.
But then I had a funny experience last year. I make a living by analyzing large data sets, and I was looking into creating an artificial-intelligence app that would tell people whether their symptoms were severe enough to warrant a trip to the doctor or even the emergency room. My foray into the health care field did not last very long. A.I. requires a lot of data, and when it comes to people’s personal health information, it’s incredibly difficult to gain access, even if the data is anonymized. After spending months trying to get the data I needed for my research, I gave up.
But my experience taught me something: I didn’t want to hide my health data. I wanted to give it away.
Although we may not notice it, the scarcity of health care data imposes a significant cost on society. A.I. has the potential to advance medicine across a broad range of subfields.
It could increase our understanding of the human genome, improving our screening and understanding of genetic disorders. Given its success with image recognition, A.I. could assist pathologists by enhancing the resolution of slides or even segmenting out potential cancer cells. And it could increase the speed at which radiologists process scans and improve the accuracy of their diagnoses. The last example is particularly resonant to me, as my father died of mesothelioma, a cancer caused by asbestos, after his chest X-rays were misdiagnosed as lung cancer. Much of the progress in these areas, and many others, is at the very least slowed by the lack of data.
There are a number of overlapping reasons it is difficult to build large health data sets that are representative of our population. One is that the data is spread out across thousands of doctors’ offices and hospitals, many of which use different electronic health record systems. It’s hard to extract records from these systems, and that’s not an accident: The companies don’t want to make it easy for their customers to move their data to a competing provider.
But there is also a fundamental problem with our health care privacy protections, primarily the Health Insurance Portability and Accountability Act, known as HIPAA.
HIPAA was passed in 1996, when artificial intelligence was largely the realm of science fiction movies and computer science dreams. It was intended to safeguard the privacy and confidentiality of patient records (as well as to improve the portability of health coverage when patients switched jobs).
But today one of the main effects of the law is to make it much harder for doctors and hospitals to share data with researchers. The fees they would have to pay for legal experts, statisticians and the other consultants needed to ensure compliance with the law are just too steep to bother.
Julia Adler-Milstein, the director of the Center for Clinical Informatics and Improvement Research at the University of California, San Francisco, told me that “the costs associated with sharing data for research purposes in a HIPAA-compliant way are beyond what many hospitals can justify.” She added, “The fines associated with a potential data breach are also a deterrent.”
These fines are a blunt instrument: they don’t correspond to varying levels of harm, and they create a climate of fear that discourages sharing. Leaking personal information onto the internet has rightfully led to fines in the millions. But so have cases of data loss in which it was unlikely that anyone ever accessed the lost data, because it was stored on a laptop or on thumb drives that may never have even left the office. This isn’t to say that the latter case shouldn’t be fined, only that the current amounts are excessive.
What can be done? One solution is to increase patient control. The government could create a data repository to which patients could upload their information, and that would give them control over how much they wanted to share and with whom. The problem with this plan is that it is unlikely many people would bother to use the platform.
We could offer an incentive by allowing private companies to purchase the data from patients, but millions of people would need to participate. The fact that Chinese companies are already getting hundreds of thousands of records cheaply compounds the problem: The price of an A.I. cancer scanner from an American company that paid millions for its data would risk being undercut by a low-cost Chinese competitor.
A more pragmatic alternative would be to ease some of HIPAA’s more onerous requirements and think more deeply about when we need more privacy and when we could live with less.
Data-sharing agreements should be standardized so that doctors and hospitals don’t have to draft custom ones every time they want to share information.
Some effort has already been made to reform fines by taking into account the “culpability” of the organization — the extent to which a violation is caused by negligence. We should go further and calibrate fines according to the level of verifiable harm. Finally, fines should factor in the size of the institution. A million dollars may not mean much to the Mayo Clinic, but it could cripple a small hospital.
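To make that calibration idea concrete, here is a minimal sketch in Python. Nothing in it comes from the article or from any actual regulation; the base fine, the harm score and the revenue-based size factor are all hypothetical choices used only to illustrate the proposal.

```python
# Hypothetical illustration only: a fine scaled by verifiable harm and by
# institution size, in the spirit of the op-ed's proposal. All numbers are made up.

def calibrated_fine(base_fine: float, harm_level: float, annual_revenue: float,
                    reference_revenue: float = 1_000_000_000) -> float:
    """Scale a base fine by a 0-to-1 harm score and by the institution's
    revenue relative to a large reference health system."""
    harm_factor = max(0.0, min(harm_level, 1.0))                 # clamp the harm score to [0, 1]
    size_factor = min(annual_revenue / reference_revenue, 1.0)   # smaller institutions pay proportionally less
    return base_fine * harm_factor * size_factor

# A lost laptop with no evidence anyone accessed the data, at a small hospital:
print(calibrated_fine(1_000_000, harm_level=0.1, annual_revenue=50_000_000))    # 5000.0
# Records confirmed to have leaked onto the internet from a large system:
print(calibrated_fine(1_000_000, harm_level=1.0, annual_revenue=2_000_000_000)) # 1000000.0
```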
Lowering data-sharing barriers for these small hospitals is especially important if we want A.I. to be equally effective for all Americans. Models behave badly when they haven’t been trained on a representative data sample; facial recognition technology, for example, is far more effective on white men than on black women. If we draw our data exclusively from the large and wealthy health care systems, we risk reproducing that bias in medicine, further marginalizing the poorer and more rural communities that are often served by small hospitals.
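For readers who want to see that effect directly, here is a minimal sketch using entirely synthetic data (assuming NumPy and scikit-learn; none of this comes from the article): a model trained almost exclusively on one group performs noticeably worse on an underrepresented group whose underlying pattern differs.

```python
# Synthetic illustration of bias from an unrepresentative training sample.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

def make_group(n, threshold):
    """Toy patients: one measurement, positive diagnosis above a group-specific cutoff."""
    x = rng.uniform(0, 10, size=(n, 1))
    y = (x[:, 0] > threshold).astype(int)
    return x, y

# Group A dominates the training data; group B, with a different cutoff, is rare.
xa_train, ya_train = make_group(1000, threshold=5.0)
xb_train, yb_train = make_group(20, threshold=7.5)

model = LogisticRegression().fit(
    np.vstack([xa_train, xb_train]),
    np.concatenate([ya_train, yb_train]),
)

xa_test, ya_test = make_group(500, threshold=5.0)
xb_test, yb_test = make_group(500, threshold=7.5)
print("accuracy on well-represented group A:", model.score(xa_test, ya_test))  # close to 1.0
print("accuracy on underrepresented group B:", model.score(xb_test, yb_test))  # markedly lower
```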
Reforming HIPAA does not mean opening up all of our personal data to the highest bidder or for all uses. On the contrary. There are many areas today where the government and consumers should be demanding greater protections — particularly genetic testing, fitness trackers and smart watches, which are virtually unregulated.
But for us to reap the benefits of artificial intelligence, we need to make data sharing simpler. Unlike the far more substantial privacy sacrifices we’ve already made in so many other aspects of our lives, at least we’ll have something to show for our efforts — the potential for a longer, healthier life.
Luke Miner is a data scientist.