Researchers develop new prostate cancer prediction tool: Study
New York: Researchers, including one of Indian origin, have developed a novel machine-learning framework that can distinguish between low- and high-risk prostate cancer with more precision than ever before, according to a new study.
The study, conducted by the Icahn School of Medicine at Mount Sinai and the Keck School of Medicine at the University of Southern California (USC), showed that the framework is intended to help physicians -- in particular, radiologists -- more accurately identify treatment options for prostate cancer patients, lessening the chance of unnecessary clinical intervention.
Presently, the standard methods used to assess prostate cancer risk are multiparametric magnetic resonance imaging (mpMRI), which detects prostate lesions, and the Prostate Imaging Reporting and Data System, version 2 (PI-RADS v2), a five-point scoring system that classifies lesions found on the mpMRI.
However, current tools used to predict prostate cancer progression are generally subjective in nature, leading to differing interpretations among clinicians.
The findings, published in Scientific Reports, showed that by combining machine learning with radiomics -- a branch of medicine that uses algorithms to extract large amounts of quantitative characteristics from medical images -- the researchers were able to classify patients' prostate cancer with high sensitivity and an even higher predictive value. The new framework has been proposed to remedy the subjectivity of current tools.
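As a loose illustration of the general idea (not the authors' actual pipeline), radiomics reduces an image to quantitative features, which then feed a machine-learning classifier. Everything in the sketch below -- the synthetic "lesion" patches, the three first-order features, and the logistic-regression model -- is invented for illustration:

```python
# Sketch only: radiomics-style features from synthetic image patches,
# fed to a machine-learning classifier. Not the study's method.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

def radiomic_features(img):
    """Extract simple first-order quantitative features from an image patch."""
    return np.array([img.mean(), img.std(), img.max() - img.min()])

# Synthetic data: "high-risk" lesions made brighter and more heterogeneous.
low = [rng.normal(0.3, 0.05, (16, 16)) for _ in range(50)]
high = [rng.normal(0.6, 0.15, (16, 16)) for _ in range(50)]
X = np.array([radiomic_features(patch) for patch in low + high])
y = np.array([0] * 50 + [1] * 50)  # 0 = low risk, 1 = high risk

clf = LogisticRegression().fit(X, y)
print(clf.score(X, y))  # training accuracy on the synthetic set
```

A real radiomics pipeline would extract hundreds of texture and shape features from segmented mpMRI lesions and validate on held-out patients; this sketch only shows the feature-extraction-then-classify structure.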
"By rigorously and systematically combining machine learning with radiomics, our goal is to provide radiologists and clinical personnel with a sound prediction tool that can eventually translate to more effective and personalised patient care," said Gaurav Pandey, Assistant Professor at the Icahn School of Medicine at Mount Sinai.
"The pathway to predicting prostate cancer progression with high accuracy is ever improving, and we believe our objective framework is a much-needed advancement," the study noted.