December 7, 2016
By Joshua C. Kendall
Despite a growing conviction among researchers that lifestyle choices matter, Americans still view Alzheimer’s as a predominantly inherited disease.
The concept of cognitive reserve dates back to the early 1990s, when it was coined by Yaakov Stern, a professor of neuropsychology at Columbia University’s Taub Institute for Research on Alzheimer’s Disease and the Aging Brain. In 1994, Stern published a groundbreaking paper in the Journal of the American Medical Association on a sample of about 600 patients aged 60 and older, which documented a clear link between educational and occupational attainment and a decreased risk of Alzheimer’s.

“At first, the article got a lot of flak,” Stern says. “I received a letter from the wife of a Nobel Prize winner who called it idiotic because her husband suffered from Alzheimer’s. But I wasn’t suggesting that being brilliant means you won’t get the disease.”

“What I have been trying to get across,” adds Stern, who has since published a stream of papers on the notion of cognitive reserve, including neuroimaging studies that connect it to regional cerebral blood flow, “is that experiences acquired over a lifetime can stave off dementia — often for several years.”
Today the concept of cognitive reserve is no longer quite so controversial, and neuroscientists have also documented how it mitigates various risk factors, including the presence of the Alzheimer’s susceptibility allele, APOE4. In a study published in JAMA Neurology in 2014, Prashanthi Vemuri, an assistant professor of radiology at the Mayo Clinic, reported that for carriers of APOE4 who also had what were considered to be high levels of intellectual enrichment, the onset of cognitive impairment occurred nearly nine years later than for APOE4 carriers with low lifetime intellectual enrichment.