Google, Amazon, insurers and credit card companies have long been able to tell whether you vote, own a dog, spent time in prison or drive a rusty 1997 Chevrolet. Now, that type of information is starting to pop up in front of doctors when you walk into their examination rooms.
A small but fast-growing number of technology companies, including data brokers LexisNexis and Acxiom, sell health care providers detailed analyses of their patients, incorporating criminal records, online purchasing histories, retail loyalty programs and voter registration data.
Some health systems think the data may drive better medical decision-making — helping them identify patients at risk of expensive care or rehospitalization, for instance, and enabling them to connect hurting patients with follow-up care or social work programs. The fact that a hypertension patient lives in a food desert, or lacks carfare to get to appointments, may be more important to her health than any prescription written by a doctor, former AMA President David Barbe noted in an interview.
The medical profession increasingly recognizes that socioeconomic context — the buzz phrase is “social determinants of health” — is vital to a patient’s overall health. The flip side of benevolent concern, however, could be pigeonholing or invasions of privacy.
There are few safeguards on how such outside information can be used within the health system. The algorithms that companies use to classify some patients as “high risk” are rarely made public, so patients may not know their purchasing history or lifestyle could catapult them into a higher-risk tier. For every health plan that uses algorithms to predict substance abuse and help patients get treatment, there could be another that turns patients away when it learns of their history.
Federal law protects personal health information held within the health care system; it doesn’t stop doctors from using outside data to learn more about how their patients live. If Congress overturns the rest of the Affordable Care Act, as Sen. Mitch McConnell has suggested, health insurers could use a patient’s grocery purchases or internet browsing history as the basis for charging a higher premium.
Even if the law doesn’t change, “there is risk that people who have personal issues at home will not seek care because combining their health data and their lifestyle data may have downstream consequences to them” such as discrimination in access to care, insurance or employment, says Crowell & Moring partner Jodi Daniel, a former HHS policy leader.
Laws and regulations don’t apply
At the moment, there’s little the government can do about these data uses. The FTC can crack down on data brokers for unfair or deceptive practices, but many current uses aren’t legally defined as unfair or deceptive, an agency spokesperson said.
The FTC and Congress are getting interested, however, in privacy legislation that might force data brokers to disclose their sources or require consumer consent before gathering their data. The Senate Commerce Committee has been holding hearings that Chairman John Thune said could lead to changes in federal privacy law.
A bill introduced in April by Sens. Amy Klobuchar and John Kennedy would give consumers the right to review data collected on them and to opt out of online data collection. The FTC has held its own hearings to assess whether it needs new enforcement capabilities.
Privacy and data experts increasingly think something needs to be done, and soon. “We should consider some baseline guardrails for use of data and how to protect individuals from unintended consequences,” said Daniel. “In the meantime, I believe patients should be offered the choice” to deny insurers and doctors access to information about their lifestyles.
New laws or regulations may be needed to protect people from discrimination in employment or insurance — the way the 2008 Genetic Information Nondiscrimination Act shelters patients who undergo genetic testing from discrimination by health insurers, Daniel said. Others have called for preemptive limits on what data brokers can do with what they collect.
“Information is being provided to disparate companies for specific purposes, and then it gets in the hands of data brokers and is repurposed for things that people could never have imagined,” says Natasha Duarte, a policy analyst with the Center for Democracy & Technology. Once the information has been collected and sold, tracing a particular data point back to its origin is almost impossible, she said.
Non-voters don’t take care of themselves
The data that brokers peddle to health care providers is often wrong. Even when it isn’t, its usefulness to medicine is debatable. A recent study of 90,000 patients at the Duke University Health System concluded that putting socioeconomic data into a patient’s electronic health record didn’t improve the system’s ability to predict ER visits or hospitalizations. Apparently, doctors can tell that the patient in front of them is in bad shape, whether the cause is an empty refrigerator or a genetic mutation.
Brokers and traders in private data are secretive about how they collect and check it for accuracy, or who buys it and for how much money.
LexisNexis has been quietly marketing a product called Socioeconomic Health Attributes that takes data from “more than 10,000 sources of public and proprietary records” and matches it to individual patients. It’s not the kind of information that normally populates a patient’s medical record.
“Liens, evictions and felonies indicate that individual health may not be a priority,” according to a marketing pitch on the company’s website. Voter registration might be relevant as well because “[i]ndividuals showing engagement in their community may be more likely to engage in their own health.”
Another LexisNexis product, Patient Identity Intelligence, can “continually update patient records with the latest demographic information,” and uses socioeconomic data to “better predict [hospital] readmission compared to models that use only clinical data,” the site says.
“We have seen providers increasingly testing and using social determinants of health over the last year to help improve their predictive modeling and care management programs,” a LexisNexis spokesperson said in an email statement, calling those providers “early adopters.”
LexisNexis would not disclose the sources of its information or how it calculates the “health risk score” it peddles to health care providers. The company directed questions about provider use to the American Medical Association, where a puzzled spokesman said the AMA had no knowledge of the matter.
A disclaimer on the LexisNexis website notes that the data flowing into its Socioeconomic Health Attributes “may contain errors.” Data from public and commercial sources are “sometimes reported or entered inaccurately, processed poorly or incorrectly, and [are] generally not free from defect.”
LexisNexis competitor CentraForce has taken a different tack: instead of selling an individual patient’s data to health care providers, it buys up large volumes of de-identified consumer survey data that includes answers to questions about trust in the medical system or substance use, as well as demographic identifiers. Providers then use its software to match patients, by demographic similarity, to one of the risk categories CentraForce has established through data crunching.
The technology helps providers focus on patients whose behaviors put them at greater risk of hospital readmission or treatment failure, said Steve Newman, CentraForce’s executive chairman. He said the platform has gained traction among insurers running Medicare Advantage plans, which can lose money when their patients rack up high health care costs.
Newman thinks CentraForce’s profiles are often more accurate than what patients tell their doctors.
“We know that patients will tell people in white coats different stories from what they will tell an anonymous survey company,” Newman said. “I can tell [doctors] unequivocally that the precision and accuracy of what [patients] tell us is greater than what they tell you.”
Since it doesn’t rely on personal health information to assign risk profiles, the product doesn’t require patient consent, Newman said. The company recommends that doctors use the profile to gently bring up specific health risks, “rather than in some sort of confrontational way,” he said. The added context “makes the provider, whether it’s a nurse or a physician, seem a little bit smarter, a little bit more attuned to the patient.”
Nuggets of obscure data can add up to insights, the data wranglers say. For example, “pet ownership is correlated with better outcomes. … Dogs more than cats,” says CEO Kurt Waltenbaugh of Carrot Health, which works with health plans to produce patient risk scores by crunching together medical and public records. An increase in a patient’s online purchases might suggest social isolation or poor health, he said.
But while LexisNexis and CentraForce risk scores may mimic the kinds of snap judgments doctors already make when they size up patients, the difference is that behavior outside the medical system may now be influencing the doctors’ judgment.
Patients often aren’t aware their information is being traded. In addition, the information that pops up in the electronic health record may be wrong or not applicable to them, said Tom Tomlinson, a privacy expert at Michigan State University’s Center for Ethics and Humanities in the Life Sciences.
Generalizations about people who don’t vote “don’t tell me whether it’s true of the person in front of me,” Tomlinson said. “What the generalization does is make me presume, without giving the patient the opportunity to prove me wrong.”
Data broker Acxiom sells data and services to identify patients eligible for community and social services based on sources such as retail history and public records. It can make detailed predictions identifying people who are likely to live with smokers even if they do not smoke, for instance, said Heidi West, its vice president for health care.
In the future the company hopes to integrate its assessments into the patient’s digital health record, West said. When it does, she hopes patients will log into Acxiom’s site to ensure their profiles are accurate. Acxiom is under no legal obligation to allow this; by correcting their profiles, individuals would also be improving the data’s value to Acxiom and its clients.
West emphasized that the company wants to sell this information to providers so they can connect patients to appropriate preventive services or better tailor care. “If it even starts to go down the path of underwriting, or anything punitive, we will walk away from it,” she said.
Health care providers that buy Salesforce’s software, which can integrate with EHRs, are using it to sort patients by demographic data — including zip code, ethnicity or primary language — and refer them to relevant community health programs. “It’s the same as knowing what people’s shopping experience is,” chief medical officer Joshua Newman told POLITICO. Some providers use the software to flag patients who could benefit from counseling programs.
This type of use case attracts some physicians to the outside data. The AMA’s Barbe calls it “augmented intelligence” that could help him be a “better partner” to patients. The AMA has a project to identify the social, economic and behavioral factors most helpful to physicians treating conditions such as diabetes.
“There’s a real moral distress in taking care of people and giving them empty recommendations about what they need to do when they go home,” said physician Stacy Lindau, who created a platform matching people to local services based on their demographic data.
Socioeconomic and behavioral health information has long been used by retailers to target advertisements and services to consumers, and more recently insurers have tapped into similar data sources to flag high-risk patients. ProPublica and NPR reported on insurers partnering with data brokers to learn more about their clients’ health based on information found in the public domain.
Some data brokers have even lobbied to update HIPAA; the Claim Your Own Health Data Coalition, whose members include Experian, supported legislation to give data clearinghouses access to personal health information. The brokers say this would let them combine health data with other information and present detailed health analyses to patients, or de-identify those analyses and sell them to payers.
Digital health vendors hesitant
Cerner and Epic, the two largest electronic health records vendors, do not appear interested in incorporating background checks or detailed socio-behavioral profiles directly into their records, although they are starting to electronically connect patient records with community-based service providers.
Epic’s new release, for instance, can flag patients who might have transportation or food security issues. The company is piloting a product with researchers at the University of Wisconsin at Madison that could give clinicians detailed information about a patient’s neighborhood as part of its “precision patient engagement” strategy, said Emily Barey, Epic’s vice president of nursing and community health.
Cerner’s vice president of population health, Marc Overhage, said the company focuses on patient-reported data, prompting patients to answer questions about their socioeconomic background when they log into their patient portals. That information can also be incorporated into scheduling systems so administrators can ask patients likely to have transportation issues if they need a ride to their appointment, for instance.
Some hospital systems, such as Providence St. Joseph Health, are testing the waters by putting basic data about income level, zip code and food availability into patient records. But Rhonda Medows, who leads the population health team at the Oregon-based system, said she has declined pitches from data brokers offering information from police records. It might be useful to know whether a patient has suffered domestic violence, or whether a child has been at risk, but she can ask the patient for that information directly.
“I’d rather have that dialogue [instead of] the police report,” she said. She doesn’t want “people to feel paranoid about tracking.”
And the idea that a doctor can know things about your life outside the clinic that you haven’t told anyone else violates patient privacy and may damage trust in medicine, critics say.
If providers are only looking at cumulative risk scores, and they’re only used to “stratify people and target efforts to address social needs or impact social determinants of health on a personal or public health level, great,” said Karen DeSalvo, the former assistant secretary of health and national coordinator for health IT.
“Right now, given that the ACA is still the law of the land, that data can’t be used to exclude people from coverage,” DeSalvo said. “It gets dicey if that changes.”