
The artificial intelligence system is a dream for many doctors: It records their conversations with patients and automatically transcribes the notes into their computer systems. No need to bring a laptop to every patient visit, or spend hours afterward on documentation.

But the product, sold by Nuance Communications and its parent, Microsoft, comes with strings attached: To improve its accuracy, health systems sworn to protect privacy must share patients’ most sensitive data with companies trying to develop their next blockbuster product.


A STAT review found health systems interested in the AI tool are reaching different conclusions about whether that data-sharing is a dealbreaker that could put them in violation of federal privacy rules, or whether it’s possible — and adequate — to inform patients and get their consent.

“Our organization’s policy in general is that we do not allow vendors to keep our data — and this is an exception,” said Ann Cappellari, chief medical information officer at the hospital chain SSM Health, which is using the AI system, called Dragon Ambient eXperience, or DAX.

The exception is necessary, she said, to reach for the benefits of machine learning, a type of AI that automates tasks by learning from large numbers of examples. “I don’t know how you improve and support machine learning without” sharing data on its performance, said Cappellari. But the decision to use the system wasn’t an easy one, stirring internal ethics discussions about patients’ data rights. “We don’t have great resolution to this,” she said.


Those discussions are playing out quietly within hospitals across the country. Lawyers are concerned about complying with privacy laws. Ethicists question whether it’s OK to feed patient data into a corporate development pipeline that doesn’t directly compensate them. Doctors want to protect the sanctity of their conversations with patients, but they also want relief from hours of daily documentation tasks.

Having it both ways is, for them, the hardest part of AI.

“AI is just so blurry,” said Nupur Shah, a physician and informatics fellow helping to guide use of DAX at Rush University in Chicago. She described the legal discussions as “hard” and “sensitive,” occurring on highly technical terrain where clinicians are still learning. “There’s a lot of questions about whether it’s reliable. Can we trust it?” she said. “For the providers using it, we explain it’s AI — machine learning. But these are still nebulous terms.”

Further complicating matters is that health systems are operating in a regulatory vacuum. Many want to adopt AI systems designed to automate, or partially automate, everything from data entry to diagnosis. But the permissibility of the data sharing they require is not spelled out under the federal privacy law known as HIPAA, enacted decades before the advent of the advanced machine learning systems now being pitched to health systems nationwide.

“The lack of clarity is negatively impacting patient care and patient experience,” said one health system executive who asked for anonymity due to ongoing contract discussions with multiple AI vendors. The executive said any decision to buy an AI product comes with a detailed analysis of potential costs and benefits. “But I can’t even get to that part of the conversation because of the data sharing issues. Unless they agree to take that out of the contract, it’s a nonstarter.”

The rollout of DAX — which needs a steady stream of data to sharpen its performance — brings the dilemma into focus. It is designed to record patient conversations on a smartphone or other device, and automatically upload relevant portions of the encounter, such as medical problems and medications prescribed, into the electronic health record (EHR). It can also compose a free-form note about the visit. However, to ensure accuracy, Nuance relies on human experts to review the AI-generated transcript before it is finalized and loaded into the EHR, a process that can take several hours.

The company’s ultimate goal is to make human review optional, so that the AI could instantly upload details of a patient visit. But that level of automation, which the company calls express mode, requires constantly refining the system with data recorded during patients’ private conversations with their doctors.

HIPAA gives health systems wide latitude to share patient data with business associates to help with patient care and business operations, but it is silent on the subject of what, if anything, is owed to patients if their data becomes a component of a product that makes those outside parties a lot of money. If devising a system to compensate patients seems like a stretch, so too does profiting from their data without sharing the value it creates.

In an interview, Nuance executives said the company is more restrictive on the use of patient data than HIPAA would allow. They said patient information is only used to improve the specific product a health system has purchased. It is not used for marketing, nor is it sold to other parties wishing to use it for their own purposes, they said. But to fully benefit from products like DAX, customers must agree to make what for many is still an uncomfortable tradeoff — patient data in exchange for better performing AI.

“One of the promises we have to clients is that you’re going to contribute to make the product better for you and everyone else,” said Peter Durlach, Nuance’s chief strategy officer. “We’re not going to do anything inappropriate with that data. It’s to make sure that when you pay us a certain value, you get the benefit back.”

DAX, still under development, is being used by hundreds of hospitals and other providers across the country, including Boston Children’s Hospital, Cooper University Health in New Jersey and Pennsylvania, Nebraska Medicine, Rush University, and Providence, a health system with hospitals in five western states, including California and Alaska. Many smaller clinics are also using the product, which can be applied to visits in primary care and 30 medical specialties, from cardiology to plastic surgery.

Several health systems said they bought DAX to help address clinician burnout, a problem worsened by the many hours physicians spend documenting their appointments in EHRs. One recent national study found that U.S. physicians spend nearly two hours each day working on documentation outside of working hours. Those tasks eat away at their evenings and time spent with family, so much so that hospitals now regularly track how much “pajama time” gets lost to the EHR.

At Providence, physicians who use DAX in more than 80% of their visits have cut their time spent on documentation by about 36%, from 7.4 minutes per note to 4.7 minutes, said Scott Smitherman, the health system’s chief medical information officer. “That’s not a huge number of minutes,” he said. “But multiplied by 20 visits per day, you’ve ended up with an hour of time you didn’t spend in notes.”

The time savings are lower for those who use DAX less frequently. But Smitherman said adoption of the product was driven by a desire to improve quality of life for clinicians, not as a way to cram in more visits. “The note and the documentation burden has really become the bane of providers’ existence,” he said. “Some providers may choose to see a couple more patients and are happy to do so, but really from a system perspective the reason to do this is decreased burnout.”

As they expand use of DAX, providers must also contend with the novel questions it poses about how to inform patients of the role of AI and obtain their consent for its use. Providence said Nuance is permitted to use patient data to review and edit appointment notes generated by DAX to ensure accuracy.

The company is also allowed “to make secondary uses of patient data,” a spokesperson for the health system said. “But only after the data has been de-identified and only for quality and service improvement activities such as training, improving speech recognition of Nuance products, and software development.”

Officials at Rush University said individual physicians make their own determinations about how to inform patients about the system’s use. The health system developed a script with help from Nuance. Some providers choose to inform patients at the front desk, while others do it in the exam room. But all providers are asked to disclose that data from the encounter will be shared with Nuance.

“We ask that everyone is very explicit in saying that we are sharing PHI (protected health information) with a third-party vendor,” said Melissa Holmes, a physician and clinical informatics specialist. She said the data-sharing arrangement is governed by what’s known under HIPAA as a business associates agreement, a contract that defines how patient data can be used by a contractor working with the health system.

Holmes said the companies have access to the full details of patients’ records during the contract term, but identifying information, such as names, birth dates, and full ZIP codes, must be removed after it ends. Right now, providers are recording the conversations on iPods fitted with an app that uploads the details into the electronic health record, with a human scribe reviewing the accuracy. But the hope is for a more seamless system that fills out the EHR instantly with less visible technology in the background of the visit.

“Our dream in the future is to just have speakers in the exam room, and when people consent to this, there is no iPod — it’s part of the exam room of the future,” Holmes said.  “But part of that is explaining that there is this AI engine involved.”

That future, Nuance executives said, is almost here. Currently the company is employing a large workforce to review and edit the notes generated by DAX. But the data flowing daily from hundreds of customers has allowed the company to begin testing its express mode.

“I can’t tell you how much the providers’ data is actually moving us up the (performance) curve,” said Ken Harper, a vice president at Nuance in charge of developing DAX. He said the data from subspecialists who might focus exclusively on hand injuries are giving the AI a rich and highly detailed dataset to learn from. “That’s how you can really perfect how the AI is learning for that particular provider, and why for DAX the data is so critical to getting you to that fully automated experience.”

Harper added that the company hopes to accelerate the deployment of express mode next year. “That’s really the holy grail of what we’re after,” he said. “And really the only way you can do that is by using the data to keep teaching the machine.”

Cappellari, at SSM Health, said even the use of de-identified data by Nuance creates a gray area. While the health system has long disclosed to patients that their data may be shared with others for research, or to improve their care, those documents typically do not discuss the use of data for product development by for-profit companies.

Cappellari said she understood the concerns patients might have: “This big company has this data about me, and you say it’s de-identified, but is it really?” she said. “And what part should I get? It’s my data you’re sending. It’s my tissue sample. It’s my thing.”

She added that while the value of a better version of DAX is clear for providers, the benefits for patients are not as easy to define. During the rollout, there was a lengthy discussion within the ethics department about data ownership and who should direct its use, and how those questions interface with the imperatives of AI development.

As these products become more automated, Cappellari said, those questions are only going to get harder, and the debates more intense. She said SSM is already revisiting the contractual language on data ownership as it prepares to transition to DAX express mode and to a future where AI begins to perceive things that the humans don’t. “As these medical learnings improve, if that machine can detect the clinician doesn’t know and the patient doesn’t know (such as an elevated cancer risk), is it that company’s obligation to tell the patient?” she asked. “This is what we have expected computers and technology to do for medicine, but what do you do with that? It never will be perfect.”      

This story is part of a series examining the use of artificial intelligence in health care and practices for exchanging and analyzing patient data. It is supported with funding from the Gordon and Betty Moore Foundation.
