HHS is reminding the health care industry that the AI tools it uses can’t discriminate. | Alastair Pike/AFP via Getty Images
As the Trump presidency looms, the Department of Health and Human Services’ Office for Civil Rights is trying to reinforce its anti-discrimination rules on artificial intelligence.
The office sent a missive last week reminding the health care industry that the AI products it uses can’t discriminate against certain patients. The office also offered suggestions for how to comply with OCR’s regulations.
In a final rule last year, HHS reinstated antidiscrimination protections that the Trump administration removed in 2020. Those protections prohibit discrimination based on race, color, national origin, sex, age and disability. Health systems must also ensure their telehealth technologies, in addition to AI products, don’t result in discriminatory care.
The rule requires providers to take “reasonable steps” to identify and mitigate potential discrimination.
What are reasonable steps? OCR highlights a few ways to comply:
— Review Section 1557 of the Affordable Care Act, which prohibits discrimination.
— Look at peer-reviewed studies of the AI product in use.
— Implement an AI registry that monitors AI performance.
— Ask AI vendors for information about their technology and the data it’s trained on.
Once problems are identified, health systems and payers can take a wide variety of approaches to fix them, including setting policies on how the tool is used, training staff and running audits to ensure their AI isn’t running afoul of the law.
In the letter, OCR Director Melanie Fontes Rainer reiterated a point she’s made publicly: Each potential violation is handled case by case. When reviewing whether a health system, provider or insurer made reasonable efforts to suss out discrimination, the office will take into account the entity’s size, resources and the information it had access to at the time of the infraction.
The office will also examine whether the tool was used as intended by the developer and approved by regulators, as well as if the entity had a process in place for evaluating the tool’s impact on patients.
Even so: In October, Fontes Rainer acknowledged that OCR’s rules could change under a new administration, though she doesn’t believe that should be the case. “We should not have rights flipped on and off like a light switch,” she told Ruth onstage at the health care conference HLTH in Las Vegas.
The Trump administration could — as it has in the past — narrow OCR’s civil rights rules to roll back protections for LGBTQ+ patients and other protected classes and give health systems and payers more ability to refuse services and coverage based on religious beliefs.
WELCOME TO FUTURE PULSE
Chappaquiddick, Mass. | Erin Schumaker/POLITICO
This is where we explore the ideas and innovators shaping health care.
Using AI, researchers generated a fluorescent protein that would have taken 500 million years to evolve naturally. | AP
An artificial intelligence model has generated a protein that doesn’t exist in nature, a feat the researchers behind the model say is equivalent to simulating 500 million years of evolution.
How so? The proteins found in nature evolved over billions of years.
Researchers at EvolutionaryScale, a private AI company; the Arc Institute, a nonprofit research organization; and the University of California, Berkeley, trained an AI model called ESM3 on billions of pieces of protein information.
They used it to generate a functional green fluorescent protein with a structure unlike any existing protein, the researchers wrote in a study published in Science on Thursday. Green fluorescent proteins are what make jellyfish glow and give coral its vivid colors.
“We chose the functionality of fluorescence because it is difficult to achieve, easy to measure, and one of the most beautiful mechanisms in nature,” they wrote.
The result? A previously unknown fluorescent protein that’s only 58 percent similar to known ones. Such a protein could have taken 500 million years of natural evolution to emerge, the researchers say.
Why it matters: The model can provide insight into protein sequence, structure and function, which could help researchers understand how existing proteins work and develop new ones for use in medicine and other fields.
But some biotechnology experts and former government officials have raised concerns that this type of model could also generate dangerous pathogens that could be weaponized.
LIFESTYLE
Sen. Chris Murphy (D-Conn.) is writing a book about America's loneliness crisis and our path out of it. | Getty Images
Sen. Chris Murphy (D-Conn.) is writing a book about America’s spiritual crisis, which he argues is fueled by loneliness — and how we can escape it.
“The dizzying pace of globalization, technological and social change, and triumph of profit maximization over the common good have wrought a terrible cost on the American psyche,” Murphy said in a Random House statement announcing the deal this week.
“The country feels as if it’s unspooling, and a Democratic Party that does not understand and address the roots of this unraveling will forever cede the field to demagogues and scapegoaters,” he said.
His efforts include bills to regulate social media for children and legislation laying out a government strategy, including a White House office, to advance social connection.
Why it matters: A growing body of evidence links loneliness and isolation to a greater risk of cardiovascular disease, dementia, stroke, depression and anxiety.
Monkey see, monkey do: Murphy isn’t the first official in Washington to publish a book on loneliness. Dr. Vivek Murthy, the departing surgeon general, beat him to it, publishing “Together: The Healing Power of Human Connection in a Sometimes Lonely World” in 2020.