Applying an artificial intelligence algorithm to medical records could improve the care and treatment decisions for patients with a form of head and neck cancer, research suggests.
The AI engine was able to stage oropharyngeal cancer by extracting information from clinical, radiology, and pathology reports in electronic health records, bypassing laborious manual procedures by expert reviewers.
The natural language processing system was able to distinguish between localized and advanced oropharyngeal squamous cell carcinoma (OPSCC) with high accuracy, researchers from Toronto report in JAMA Otolaryngology–Head & Neck Surgery.
“The true value of AI in the management of OPSCC is in the potential to improve patient experience by allowing clinicians to focus on the human aspect of care and less so on the technical aspect of care by reducing workload in areas like staging,” commented Antoine Eskander, a head and neck oncologist and reconstructive surgeon at Sunnybrook Health Sciences Center, and co-workers.
Cancer staging is crucial for determining the extent of disease and providing optimal patient treatment, yet data relating to this can be inaccurate or incomplete.
Although cancer centers employ trained experts to document staging, maintaining these records is time consuming, expensive, and ties up valuable human resources.
In addition, staging data cannot inform treatment planning if they only become available months after treatment begins.
In an attempt to streamline the process, Eskander and team tested the value of applying natural language processing through the AI engine Darwen.
Specifically, they studied its ability to extract staging information from clinical, radiology, and pathology reports in OPSCC electronic health records.
They also examined whether it could assign tumor, nodal, and metastatic stages in an automated manner according to the American Joint Committee on Cancer eighth edition cancer staging guidelines.
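The automated step amounts to mapping extracted tumor (T), nodal (N), and metastatic (M) categories to an overall stage group. The study's actual 135-rule book is not public, so the sketch below is purely illustrative, loosely following the AJCC eighth edition stage groupings for p16-positive oropharyngeal cancer:

```python
# Hypothetical sketch of automated stage grouping; the function name and
# the exact rule set are assumptions for illustration, not the study's code.

def group_stage_p16_positive(t: int, n: int, m: int) -> int:
    """Map extracted T, N, M categories to an overall stage group,
    loosely following AJCC 8th edition p16-positive OPSCC groupings."""
    if m == 1:
        return 4          # any distant metastasis -> stage 4
    if t == 4 or n == 3:
        return 3          # locally very advanced disease -> stage 3
    if t <= 2 and n <= 1:
        return 1          # small tumor, limited nodal spread -> stage 1
    return 2              # everything in between -> stage 2
```

A real rule book would also have to handle missing fields, p16-negative tumors (which stage differently), and clinical versus pathological staging, which is where much of the 135-rule complexity likely comes from.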
The retrospective study included 806 OPSCC patients, with a mean age of 63.6 years, who presented to a single tertiary care center from January 2010 to the beginning of August 2020.
Overall, just over 80% were male, and a little over half were positive for human papillomavirus.
A ground truth cancer stage dataset and comprehensive staging rule book were developed, consisting of 135 rules encompassing p16 status and tumor, nodal, and metastatic stage.
Following this, four distinct models were trained: one for anatomical location and invasion state, one for lesion size, one for metastasis detection, and one for p16 status.
To validate the findings, the results were compared against ground truth established by expert reviewers.
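A comparison of this kind reduces to per-field agreement between the model's output and the expert-reviewed ground truth. A minimal sketch, with field names and record structure assumed for illustration:

```python
# Hypothetical per-field accuracy check against expert ground truth;
# the dict-based record format and field names are assumptions.

def field_accuracy(predictions: list[dict], ground_truth: list[dict],
                   field: str) -> float:
    """Fraction of records where the model's value for one staging
    field matches the expert-reviewed ground truth."""
    matches = sum(
        1 for pred, truth in zip(predictions, ground_truth)
        if pred[field] == truth[field]
    )
    return matches / len(ground_truth)

# Toy usage: model agrees with reviewers on 2 of 3 records for T stage.
preds = [{"T": 1}, {"T": 2}, {"T": 3}]
truth = [{"T": 1}, {"T": 2}, {"T": 1}]
print(field_accuracy(preds, truth, "T"))
```

Repeating this per field (tumor, nodal, and metastatic stage, and p16 status) yields exactly the kind of per-category figures the team reports.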
The researchers report that accuracy was “fair to good” for tumor and nodal staging, at 55.9% and 56.0%, respectively.
Performance was better for metastatic stage and p16 status, which the team deemed “excellent” at 87.6% and 92.1%, respectively.
Accuracy in differentiating between localized cancer, categorized as stages 1-2, and advanced cancer, categorized as stages 3-4, was 80.7%.
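This binary split is a simple threshold on the overall stage group; a one-line sketch (function name assumed):

```python
def is_advanced(stage_group: int) -> bool:
    """Binary split used for localized-vs-advanced reporting:
    stages 1-2 count as localized, stages 3-4 as advanced."""
    return stage_group >= 3
```

Because errors within each half of the split do not count against it, this dichotomy can score well (80.7% here) even when fine-grained T and N staging accuracy is lower.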
Reporting their findings, the researchers noted that the model requires refinement and external validation with electronic health records at different institutions to improve algorithm accuracy and clinical applicability.
Nonetheless, they added: “In all, to provide optimal treatment to patients with oropharyngeal cancer, complete staging must be documented before or at approximately the start of treatment, which AI technology has the potential to do efficiently and accurately in a timely manner.”