Privacy and Discovery

Lucy Steinert
Author

Last year, I was looking into how to help a small business in Jamaica take payments from the U.S. Such transactions are complicated by government legislation around drug trafficking. Almost as soon as I began my initial research, I was being offered all sorts of related services. Assumptions were made about my purchasing power, nationality, and ethnicity, and the advertisements I saw changed accordingly.

Like most people, I use search engines to access information on the internet. I use Facebook, LinkedIn, Instagram, and Pinterest. I also use frequent-flyer point systems.  Delta, JetBlue, and British Airways all know where I am most of the time. The only fee I pay for these services is my information—tons of personal data.

Companies need to make money, and today more and more businesses do it by leveraging the personal information we make available to them. Companies collect our data and pay us by providing us with services. That’s the deal. I may not like it. In fact, I don’t like it. I take lots of precautions: I clear my cookies regularly, use a VPN, scan the user agreements I sign for the more egregious terms, and decide not to use some apps and services as a result.

But I love the easy contact with friends that Facebook provides. I can’t imagine not having LinkedIn to research people I meet professionally. I get a lot of value in exchange for granting these companies access.

In the rare disease community, the issue of privacy is even more difficult. Many of us belong to “private Facebook groups.” These groups are where we can have open conversations about the challenges we face, the drugs we have found helpful both on- and off-label, and the triumphs we achieve. But at what cost?

At TREND, we help communities turn their conversations into real-world evidence. To do this, we collaborate with those communities to distill their data into actionable insights. It can be tricky, but we safeguard privacy at every step of every project.

To understand the intersection of privacy and discovery, we wanted to look back at a few of the most important events in their recent shared history.

February 17th will be the 11th anniversary of the signing of the American Recovery and Reinvestment Act of 2009. It is a sweeping piece of legislation with several ambitious goals, not the least of which was ending the Great Recession.

One piece was the Health Information Technology for Economic and Clinical Health (HITECH) Act, which included 17 billion dollars’ worth of incentives for healthcare providers who implemented approved IT solutions. This led to a massive expansion of electronic health records (EHRs). Gone were the days of pen-and-paper charts; digital was the way of the future, and a wave of implementation got underway. In 2008, just 10 percent of hospitals had EHRs. “By 2017, 86% of office-based physicians had adopted an EHR and 96% of non-federal acute care hospitals have implemented certified health IT.”

Now that so much health data was available in digital form, one might have thought it could be uploaded to a centralized system, something along the lines of a universal database filled with the health records of every American. Such a database would revolutionize everything from insurance rates to research. We could create personalized care based not just on the history of an individual, but on the pertinent histories of all other individuals.

That is not what happened.

Blame cannot be assigned to any one thing, but one reason might have been the Health Insurance Portability and Accountability Act (HIPAA), enacted in 1996, whose Privacy Rule took effect in 2003. It would not be fair to assert that the only reason we have not been able to reap the full benefit of EHRs is that pesky matters of privacy have prevented doctors and researchers from accessing the wealth of data necessary for breakthroughs in treatment…but it is part of the reason.

Concerns over the legal consequences of HIPAA have impeded access to data, prevented entities from providing access to researchers, and even caused researchers to forgo research for fear of being penalized. Many people involved in the quest for new treatments have reported that privacy rules have negatively impacted their ability to conduct meaningful research.

It is easy to understand why there are concerns. Medical records contain highly sensitive information, and strict laws are needed to protect patients and prevent the exposure of information that could cause harm.

At TREND, we understand how difficult it is for many patients to find treatments. We know that many people would not think twice about forgoing their privacy in exchange for just the hope of a treatment; many patients do just that.

We also understand that privacy is sacred, and we have put in place a series of safeguards to protect the privacy of the communities we work with. Names and usernames are stripped from the conversation data we use for our Community Voice Reports. If personally identifiable information is mentioned in a comment, that comment is not included in our data analysis. Our reports include only summaries of conversation data, nothing about the identities of the people participating in the conversations. The collective voice of entire communities is often the only thing powerful enough to drive research forward, so we anonymize and aggregate.
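To make the idea concrete, here is a minimal sketch of what a de-identification and aggregation step like the one described above could look like. It is an illustration only, not TREND’s actual pipeline: the field names, PII patterns, and data shapes are hypothetical, and a production system would use far more thorough detection and review.

```python
import re
from collections import Counter

# Hypothetical, simplified illustration of the de-identification and
# aggregation steps described above -- not TREND's actual pipeline.

EMAIL = re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.-]+\b")
PHONE = re.compile(r"\b(?:\+?1[\s.-]?)?\(?\d{3}\)?[\s.-]?\d{3}[\s.-]?\d{4}\b")

def contains_pii(text):
    """Flag comments that mention obvious personally identifiable details."""
    return bool(EMAIL.search(text) or PHONE.search(text))

def anonymize(posts):
    """Drop author names/usernames and exclude comments that contain PII."""
    return [
        {"topic": p["topic"], "text": p["text"]}  # identity fields are never copied
        for p in posts
        if not contains_pii(p["text"])
    ]

def aggregate(posts):
    """Summarize the conversation as topic counts -- no individual voices."""
    return Counter(p["topic"] for p in posts)

raw = [
    {"username": "jane_d", "topic": "symptom management", "text": "The new dose helped."},
    {"username": "sam22", "topic": "diagnosis journey", "text": "Email me at sam@example.com"},
]

print(aggregate(anonymize(raw)))
# Counter({'symptom management': 1})  -- the comment containing PII was excluded
```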

The tension between privacy and discovery is still up for debate, but TREND protects the privacy of the communities we work for while helping to accelerate discovery. We hope that as technology progresses, unifying patient data will not have to mean that patients sacrifice their privacy. In the meantime, we’ll keep doing everything in our power to inspire solutions while protecting patients and caregivers.