We’ve been hearing about how big data, genetics and personalized medicine are going to radically transform health care. But as Cedric Manlhiot, who co-leads the computational biomedicine program at the Ted Rogers Centre for Heart Research, points out, there’s still a yawning gap between all the medical data being gathered and the ability to translate this into the best therapies for individual patients.
This may soon change, thanks to work he and his colleagues are doing at the centre. In the not-too-distant future, says Manlhiot, a U of T professor of medicine, physicians will draw on all of a patient's relevant information, including their genome, the details of their condition, their medical history and aspects of their environment, to devise the best possible treatment for them.
As he sees it, there are three big challenges to getting there. First, most of the information being gathered isn't collected and stored in a format that can be shared. Second, there are no direct links between patients' health records and the high-performance computing facilities needed to make predictions about the best treatment. Third, researchers still need to study how doctors will use computer-generated predictions to treat patients, including the fundamental questions of whether and how these predictions will improve patients' health. The centre is working to solve all of these problems with a computational medicine platform that would be rolled out first in Toronto, then perhaps more broadly.
This may seem like a daunting task, but medicine is decades behind banks and credit card companies, which already use AI-driven predictive analytics to detect fraud, says Manlhiot. "We're trying to catch up – and fast."