October 27, 2014

by Dr. Bradley Buchsbaum, Scientist, Rotman Research Institute at Baycrest

I recently attended a global workshop in Toronto on behalf of Baycrest, convened by the influential Organisation for Economic Co-operation and Development (OECD). A group of around 50 scientists, engineers, policy-makers, business executives, government representatives, economists and philanthropists gathered at the Carlu to discuss how to leverage “big data” to solve the mystery of Alzheimer’s disease and other forms of dementia.

The diversity of the attendees at Monday’s meeting reflected the scale of the challenge: more than 35 million people worldwide live with some form of dementia, and that number is rising rapidly as populations age – the so-called silver tsunami.

What is big data and why do we need it?

Traditionally, scientists have solved nature’s puzzles through a painstaking process in which individuals or small teams of researchers design and conduct careful experiments that test a hypothesis about one aspect of a problem at a time. Progress is made slowly through the steady accumulation of knowledge about a disease, a biological mechanism, or a physical process. Occasionally, a radical insight—for example, the discovery of the double helical shape of DNA—finally solves a scientific puzzle with a breathtaking air of elegance and finality.

After decades of searching for “neat solutions” to brain diseases like Alzheimer’s, however, the scientific community has realized that the problem is so complex and multifaceted that we cannot rely on the incremental advances of small, isolated teams, nor on the inspiration of a lone genius, to make substantive progress. There is not a single answer or mechanism that “explains” dementia; rather, there is almost certainly a constellation of interacting factors—genetic, demographic, environmental, lifestyle—that conspire in various ways to cause, or provide protection from, neurodegenerative diseases. Unravelling this deep complexity requires a new approach, one that uses supercomputers to gather, store, and mine vast quantities of data in search of the breakthrough findings that will help us tackle dementia. A decade ago such an approach would have been unthinkable: the computing infrastructure, the scientific culture, and the mechanisms for funding research were simply not in place.

Baycrest researchers are key players

As one conference attendee pointed out: unless big data is transformed into coherent knowledge, the enterprise will fail. Researchers such as Drs. Randy McIntosh and Stephen Strother of Baycrest’s Rotman Research Institute have known this for years and have pioneered statistical methods that can take datasets with millions of variables and boil them down into a small number of “latent factors” that summarize the most important aspects of an enormous dataset. Baycrest’s researchers and others in the field of neuroscience are actively developing tools, in collaboration with the Ontario Brain Institute led by Dr. Donald Stuss, to organize and analyze genetic, neuroimaging and other variables, and discover the informational needles in the big data haystack.

Ultimately, if big data is going to help us conquer dementia, it will have to be a global effort supported by governments, private industry, and scientists from all over the world working in concert. The OECD event at the Carlu was a strong step towards making this vision a reality in the coming years. Baycrest’s Rotman Research Institute is excited to be a part of this global effort.
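To make the idea of “latent factors” a little more concrete, here is a minimal sketch of how a very wide dataset can be boiled down to a handful of summary components. It uses scikit-learn’s PCA on synthetic data purely as an illustration; the actual methods developed at the Rotman Research Institute, such as partial least squares, are considerably more sophisticated, and the sizes and variable names below are hypothetical.

```python
# Illustrative sketch only: reduce a very wide dataset to a few latent factors.
# Uses scikit-learn's PCA on synthetic data; all names and sizes are hypothetical.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)

# Imagine 200 participants, each described by 10,000 variables
# (brain measures, genetic markers, lifestyle questionnaires, and so on).
n_participants, n_variables = 200, 10_000
data = rng.standard_normal((n_participants, n_variables))

# Boil the 10,000 variables down to 5 latent factors that capture
# the strongest shared patterns of variation across participants.
pca = PCA(n_components=5)
factors = pca.fit_transform(data)

print(factors.shape)                   # (200, 5): one small profile per participant
print(pca.explained_variance_ratio_)   # fraction of variance each factor explains
```

In real studies the same principle applies, but with far richer data and with methods designed to link those latent factors back to genetics, brain imaging, and behaviour.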
