A York University-led team has developed an AI method that reads advanced MRI scans to tell whether a brain lesion is active cancer or radiation damage. The approach could help doctors choose safer, more effective treatments for patients with brain metastases.
When a patient with brain cancer returns for a scan after radiation treatment, doctors often face a critical question: Is the spot on the MRI a growing tumor or harmless scar-like damage from the radiation itself?
A new study by a professor in the Lassonde School of Engineering at York University suggests artificial intelligence may finally help answer that question with far greater confidence.
The researchers, working with clinicians and imaging scientists at Sunnybrook Health Sciences Centre, have developed an AI-based technique that can distinguish between progressive brain tumors and radiation necrosis on advanced MRI scans more accurately than current methods.
The team reports that its model correctly differentiated between the two conditions in more than 85% of cases, using a specialized form of MRI called chemical exchange saturation transfer, or CEST. By comparison, standard MRI scans are typically right only about 60% of the time, and even more advanced MRI techniques alone reach about 70% accuracy.
The work marks an important step toward more precise, personalized care for patients with brain metastases, which are cancers that have spread to the brain from elsewhere in the body.
“The study shows, for the first time, that novel attention-guided AI methods coupled with advanced MRI can differentiate, with high accuracy, between tumour progression and radiation necrosis in patients with brain metastasis treated with stereotactic radiosurgery,” senior author Ali Sadeghi-Naini, a York Research Chair and associate professor of biomedical engineering and computer science, said in a news release.
Brain metastases are becoming more common as cancer treatments improve and people live longer with their disease. One of the main tools to control these tumors is stereotactic radiosurgery, or SRS, which delivers highly focused, high-dose radiation to cancer spots in the brain while sparing as much healthy tissue as possible.
SRS can be very effective, but it comes with a trade-off. In some patients, the tumor continues to grow despite treatment. In others, the tumor is controlled, but the surrounding healthy brain tissue is damaged by the radiation, a condition known as radiation necrosis. That damage can cause swelling and neurological symptoms, and on a standard MRI, it can look almost identical to a growing tumor.
That diagnostic uncertainty creates a serious dilemma for doctors and patients.
“Timely differentiation between tumour progression and radiation necrosis after radiotherapy in brain tumours is a crucial challenge in cancer centers, since these two conditions require quite different treatment approaches,” Sadeghi-Naini added.
If the lesion is active cancer, patients may need more aggressive treatment, such as additional radiation, chemotherapy or even surgery. If it is radiation necrosis, however, more radiation could make things worse. Those patients are often better managed with close monitoring and medications that reduce inflammation.
“Differentiating tumour progression and radiation necrosis is very important — one needs more anti-cancer therapies and may need to be aggressively treated with more radiation, sometimes surgery. The other may require observation, anti-inflammatory drugs, so getting this right is crucial for patients,” Sadeghi-Naini said.
To tackle this problem, the York and Sunnybrook team turned to deep learning, a type of AI that excels at spotting patterns in complex data such as medical images.
They built a three-dimensional AI model that analyzes CEST MRI scans, which capture subtle chemical information about tissues that standard MRI cannot see. The model incorporates two advanced “attention” mechanisms, computational tools that help the AI focus on the most informative parts of the image, much like a radiologist might zoom in on suspicious regions.
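The study does not publish its architecture in this article, but the general idea of attention-guided pooling over a 3D volume can be sketched in a few lines. The sketch below is purely illustrative: the volume is random data standing in for a CEST MRI scan, and the "learned" parameters are random vectors, not the team's trained model. It shows how attention scores each voxel, converts the scores to weights, and pools the volume into features for a classifier.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for a CEST MRI volume: 8x8x8 voxels, 4 feature channels.
# (Real CEST data encodes chemical-exchange contrast; values here are random.)
volume = rng.normal(size=(8, 8, 8, 4))

def softmax(x):
    x = x - x.max()          # subtract max for numerical stability
    e = np.exp(x)
    return e / e.sum()

# Illustrative spatial-attention step: score every voxel against a vector
# that stands in for learned attention parameters, softmax the scores into
# weights, then pool the volume into one attention-weighted feature vector.
score_vec = rng.normal(size=4)               # hypothetical learned parameters
scores = volume.reshape(-1, 4) @ score_vec   # one score per voxel
weights = softmax(scores)                    # non-negative, sums to 1
pooled = weights @ volume.reshape(-1, 4)     # shape (4,)

# A linear classifier head on the pooled features would then yield a
# probability of "tumor progression" vs. "radiation necrosis".
logit = float(pooled @ rng.normal(size=4))   # hypothetical classifier weights
prob_progression = 1.0 / (1.0 + np.exp(-logit))
```

In a trained model the score and classifier vectors would be learned from labeled scans; the softmax weighting is what lets the network concentrate on the most informative voxels, analogous to a radiologist zooming in on suspicious regions.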
Published in the International Journal of Radiation Oncology, Biology, Physics, the study drew on imaging data from more than 90 patients whose cancers had spread to the brain and who were treated with stereotactic radiosurgery at Sunnybrook. By training and testing the AI on this real-world dataset, the researchers were able to show that the attention-guided model could reliably tell tumor progression from radiation necrosis at a level that outperforms the human eye alone.
While the work is still in the research stage, the findings point toward a future in which AI tools are integrated into the radiology workflow. In that scenario, a patient’s advanced MRI scan could be processed by the AI model, which would provide a probability that a lesion is active cancer or radiation necrosis. Clinicians could then use that information, alongside their own expertise and other clinical data, to make more confident treatment decisions.
More accurate diagnoses could spare some patients from unnecessary and potentially harmful treatments, while ensuring that those with true tumor progression receive timely, aggressive care. It could also reduce the need for invasive brain biopsies, which carry their own risks.
Beyond brain metastases, the approach highlights how AI and advanced imaging can work together to solve longstanding problems in cancer care. As imaging technologies become more sophisticated and datasets grow, researchers expect AI to play an expanding role in interpreting complex scans and predicting how tumors will respond to treatment.
Source: York University

