Yale Pathology Develops AI Tool That Checks Records to Ensure Accuracy

Provides another layer of scrutiny

Key points

  • Reviews a pathology report against the patient’s medical records to prevent errors
  • Currently being piloted at Yale Pathology

The Yale Department of Pathology has developed a new AI tool that checks a pathologist’s report against the patient’s complete medical record to ensure the accuracy of the final report.

Peter Gershkovich, MD, MHA, associate professor of pathology, Joanna Gibson, MD, PhD, associate professor of pathology, and Joe Celano, senior software developer, have been working on the tool for the past year and recently introduced it to Yale pathologists on a trial basis.

A pathologist preparing to finalize a report can use the tool to check their report against the patient’s records. If, for example, the patient has a diagnosis on the right side but the report refers to the left side, the tool will pick up this discrepancy.
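
The article does not describe the tool’s internals, but the kind of side-of-body check mentioned above can be sketched in a few lines of Python. Everything below is an illustrative assumption rather than Yale’s implementation: the function names, the keyword-based extraction, and the sample texts are invented for the example.

```python
import re

# Laterality terms to look for in clinical text (illustrative, not exhaustive).
LATERALITY_TERMS = {"left", "right", "bilateral"}

def extract_laterality(text: str) -> set[str]:
    """Return the set of laterality terms mentioned in a piece of clinical text."""
    words = re.findall(r"[a-z]+", text.lower())
    return {w for w in words if w in LATERALITY_TERMS}

def check_laterality(report_text: str, record_text: str) -> list[str]:
    """Flag a discrepancy when the draft report and the medical record disagree on side."""
    report_side = extract_laterality(report_text)
    record_side = extract_laterality(record_text)
    if report_side and record_side and report_side != record_side:
        return [f"Laterality mismatch: report says {sorted(report_side)}, "
                f"record says {sorted(record_side)}"]
    return []

# Example: the record documents a right-sided biopsy, but the draft report says left.
issues = check_laterality(
    report_text="Core needle biopsy, left breast: invasive carcinoma.",
    record_text="Operative note: ultrasound-guided biopsy of the right breast.",
)
print(issues if issues else "No discrepancies detected.")
```

A production system would need far more robust language understanding than keyword matching; the sketch shows only the shape of the comparison.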

“What we envisioned is a program that checks every report and compares it against the medical record, then sends a notification to the pathologist if an error is detected,” says Gibson, director of Quality and Patient Safety at Yale Pathology. By finding errors before the report is finalized, the tool can prevent incorrect reports from being released.

Method report

Overall, reporting errors are rare, which is why they are hard for humans to identify. A corrected or amended report is one that has been modified and re-issued after its initial release: the report was finalized, someone later noticed an error, and the error had to be corrected. The rate of corrected or amended reports is a standard measure of quality in a pathology laboratory.

“There’s a huge risk to patient care when that happens because this amended report may not reach the person treating the patient,” Gershkovich says. “The doctor may be working on assumptions from the old report. This system can easily pick up errors that are often missed, for example, a mismatch between the patient’s site and the laterality of a biopsy sample. Humans may not notice this type of error because they are not paying specific attention to it.”

Gershkovich and Gibson developed the concept for the tool and approached Celano, who works in Pathology Informatics, which Gershkovich directs. Celano says it took him about three months to build the initial version of the tool, then another six months to develop it into a usable application.

“We built feedback into the system,” Celano says. “When they run the application, it checks the case to see if everything matches: does the requisition form match the operating notes, and do the operating notes match the final diagnosis? It catches things like laterality: someone wrote left side, but was it supposed to be right? The tool runs its analysis, and there is a section for the user to provide feedback. Users enter comments and evaluate how well the tool is working, and the Informatics Team then fine-tunes the application to address user concerns.”
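
Celano’s description implies a simple pairwise pipeline with a feedback channel attached. A minimal sketch of that shape follows, reusing the same crude side detection as above; the class and function names, the document pairings, and the sample case are assumptions made up for illustration, not the actual application.

```python
from dataclasses import dataclass, field

@dataclass
class CaseDocuments:
    """The documents Celano describes comparing for a single case (illustrative)."""
    requisition: str
    operative_notes: str
    final_diagnosis: str

@dataclass
class CheckResult:
    """Outcome of one automated check, with room for the pathologist's feedback."""
    discrepancies: list[str]
    user_feedback: list[str] = field(default_factory=list)

def sides_mentioned(text: str) -> set[str]:
    """Crude stand-in for real language understanding: which sides does the text name?"""
    return {side for side in ("left", "right") if side in text.lower()}

def run_case_check(case: CaseDocuments) -> CheckResult:
    """Compare each pair of documents Celano names and record any mismatches."""
    pairs = [
        ("requisition vs. operative notes", case.requisition, case.operative_notes),
        ("operative notes vs. final diagnosis", case.operative_notes, case.final_diagnosis),
    ]
    discrepancies = [
        f"{label}: sides disagree ({sorted(sides_mentioned(a))} vs. {sorted(sides_mentioned(b))})"
        for label, a, b in pairs
        if sides_mentioned(a) and sides_mentioned(b)
        and sides_mentioned(a) != sides_mentioned(b)
    ]
    return CheckResult(discrepancies=discrepancies)

# The pathologist reviews the result and leaves a comment, which the
# Informatics Team can later use to tune the application.
result = run_case_check(CaseDocuments(
    requisition="Right breast core biopsy requested.",
    operative_notes="Ultrasound-guided core biopsy, right breast.",
    final_diagnosis="Left breast, core biopsy: benign fibroadenoma.",
))
result.user_feedback.append("Correct catch: diagnosis header had the wrong side.")
print(result.discrepancies)
```

The feedback list mirrors the comment section Celano describes: the pathologist’s notes accumulate alongside each result and give the Informatics Team concrete cases to tune against.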

‘Adding a level of scrutiny’

Gershkovich says the tool is another layer of security. “We are not replacing anyone or anything; we are just adding a level of scrutiny to the report release process that allows us to catch errors with visible irregularities.”

He says the tool “picks up errors as well as it can, but it doesn’t pick up noise, so it’s both sensitive and specific. It allows the user to correct errors and ignores minor things that are not affecting the pathological diagnosis.” He describes it as “an evolving system intended to be improved and validated through built-in mechanisms.”

Although the tool is still being perfected, Gershkovich says the current product “can be used and should be used beyond Yale.” At present, the tool is integrated into the workflow, and pathologists can click an icon to generate the AI report check. Celano says it will be adjusted to check each report against all known records automatically, with no click required.

There’s a lot of complex information in a patient’s chart, Gibson notes, which can lead to an oversight by the person reading the chart. “There’s so much data that unless you know what you are looking for, you may miss something,” she says. “The AI tool can do it within seconds.”

The tool is already generating positive feedback in the pilot phase.

“I use the AI tool diligently and it’s reassuring when it says no errors,” says Uma Krishnamurti, MD, PhD, professor of pathology and director of breast pathology. She says the tool has already prevented the release of a report with the incorrect laterality of a breast biopsy sample. Had the AI tool not found this discrepancy, the report would have been released with an error, requiring an amendment for correction.

Celano says it makes him proud to know that software he helped develop can ensure a higher standard of patient care.

“If you can tell a software developer that people are going to use the software you’re developing, that really gets them motivated,” Celano says. “So it’s nice to see people using it. And the other thing is seeing that you can actually make a difference in people’s lives.”

Author

Terence P. Corcoran
Associate Communications Officer
