According to the Alzheimer’s Association, an estimated 5.7 million Americans of all ages were living with Alzheimer’s dementia in 2018. One in ten people aged 65 and older has Alzheimer’s dementia. Alzheimer’s, which has no cure, is the sixth leading cause of death in the country.
In 2010 Congress passed the National Alzheimer’s Project Act, with the aim of finding a way to prevent the condition or treat it effectively by 2025. Several scientific approaches, all using machine learning, may be getting closer to realizing that aim.
University of California, San Francisco
Researchers at the University of California, San Francisco (UCSF) have used artificial intelligence (AI) to interpret brain imaging more accurately and predict Alzheimer’s.
Scientists know that the disease is linked to metabolic changes, revealed by glucose uptake in certain regions of the brain, but these changes are difficult to pinpoint by eye in brain images. The researchers therefore trained a deep learning algorithm on a specialized imaging technology, positron emission tomography (PET), which captures glucose uptake directly.
Learning algorithms of this kind need huge data sets to train on before they can notice the tell-tale signs of a condition, in this case Alzheimer’s. The researchers had access to a huge database from the Alzheimer’s Disease Neuroimaging Initiative (ADNI), a multi-site study that supports clinical trials on the prevention and treatment of Alzheimer’s.
The researchers trained their algorithm on more than 2,100 of these brain images, drawn from 1,002 patients. Through deep learning, the algorithm taught itself to recognize metabolic patterns in the images that correspond to Alzheimer’s disease.
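The UCSF team’s actual pipeline is a deep neural network and is far more complex than anything that fits here, but the core idea, a classifier learning a decision boundary from labeled examples, can be sketched in a few lines. The following toy model, a logistic-regression classifier trained on two entirely invented “glucose uptake” features, is an illustration of that idea only, not the researchers’ method.

```python
import math
import random

random.seed(0)

def sigmoid(z):
    z = max(-30.0, min(30.0, z))  # clamp for numerical safety
    return 1 / (1 + math.exp(-z))

def make_scan(has_disease):
    # Two invented features standing in for regional glucose-uptake
    # measurements; real inputs would be full 3-D image volumes.
    base = [0.8, 0.3] if has_disease else [0.4, 0.6]
    return [b + random.gauss(0, 0.05) for b in base], int(has_disease)

data = [make_scan(i % 2 == 0) for i in range(200)]

# Plain stochastic gradient descent on the logistic loss.
w, b, lr = [0.0, 0.0], 0.0, 0.5
for _ in range(200):
    for x, y in data:
        err = sigmoid(w[0] * x[0] + w[1] * x[1] + b) - y
        w = [wi - lr * err * xi for wi, xi in zip(w, x)]
        b -= lr * err

predict = lambda x: sigmoid(w[0] * x[0] + w[1] * x[1] + b) > 0.5
accuracy = sum(predict(x) == y for x, y in data) / len(data)
print(f"training accuracy: {accuracy:.2f}")
```

The same principle scales up: replace the two hand-made features with raw PET voxels and the logistic unit with many stacked layers, and the model can learn subtle patterns no one programmed in explicitly.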
Want to know the best part of this story? When the algorithm “looked at” a completely new set of images, it had a 100% success rate in spotting the presence of the disease six years before final diagnosis.
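A “100% success rate in spotting the presence of the disease” corresponds to a metric statisticians call sensitivity: the fraction of true disease cases the model catches. A minimal sketch of how such figures are computed on a held-out test set, using invented labels and predictions:

```python
# Hypothetical illustration; these labels and predictions are invented.
truth       = [1, 1, 1, 0, 0, 0, 1, 0]  # 1 = disease present
predictions = [1, 1, 1, 0, 1, 0, 1, 0]  # model output on held-out scans

true_pos  = sum(t == 1 and p == 1 for t, p in zip(truth, predictions))
false_neg = sum(t == 1 and p == 0 for t, p in zip(truth, predictions))
true_neg  = sum(t == 0 and p == 0 for t, p in zip(truth, predictions))
false_pos = sum(t == 0 and p == 1 for t, p in zip(truth, predictions))

sensitivity = true_pos / (true_pos + false_neg)  # every case caught -> 1.0
specificity = true_neg / (true_neg + false_pos)  # false alarms lower this
print(sensitivity, specificity)  # → 1.0 0.75
```

Note that perfect sensitivity says nothing about false alarms; specificity, measured separately, captures how often healthy patients are wrongly flagged.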
Case Western Reserve University School of Medicine
Rong Xu, an associate professor at the Case Western Reserve University School of Medicine, is building a data set drawn from tens of millions of published research articles, FDA profile documents, and other sources. Xu will use computational ranking methods to prioritize the findings.
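Xu’s ranking methods are not described here, but the general idea of scoring documents in a corpus and prioritizing the most relevant can be illustrated with a deliberately simple keyword-frequency ranker. The article snippets and keyword list below are invented; real literature-mining systems use far richer signals than raw term counts.

```python
# Toy sketch: rank invented article snippets by how many
# dementia-related keywords they contain.
articles = {
    "article-1": "amyloid plaques and tau tangles in alzheimer disease",
    "article-2": "statin use and cardiovascular outcomes",
    "article-3": "tau pathology and cognitive decline in dementia",
}
keywords = {"alzheimer", "dementia", "tau", "amyloid", "neurodegeneration"}

def score(text):
    # Count keyword hits in the text; a crude stand-in for relevance.
    return sum(word in keywords for word in text.split())

ranked = sorted(articles, key=lambda a: score(articles[a]), reverse=True)
print(ranked)  # → ['article-1', 'article-3', 'article-2']
```

At the scale of tens of millions of documents, the scoring function becomes the hard part, which is why such projects lean on machine learning rather than fixed keyword lists.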
All this data will then be used to create a comprehensive database of the factors related to Alzheimer’s and other forms of dementia. Xu received $5 million for this and another project, both of which will use big data to learn more about Alzheimer’s and related dementias.
“Vast amounts of data from seemingly unrelated sources present opportunities to researchers who aim to extract information that would help develop drugs or treatments,” says Xu. She adds that this is especially true for diseases and conditions that may involve multiple genetic variations and that also have social or environmental influences.
Indeed, her second project will try to find new candidate genes that might be implicated in various forms of neurodegeneration. It will also explore larger genetic regions and biochemical pathways to determine what warrants further study. All of this data will eventually be fodder for machine learning to train on and, hopefully, help scientists discover the mechanisms behind Alzheimer’s and other neurodegenerative diseases.