New Microscope Views and Manipulates Neurons in Live Animals

A still image snapped by the microscope at the University of California, Berkeley. Image courtesy of Hillel Adesnik.

Though neuroscientists make almost daily strides in cracking the brain’s complex circuitry, there is still much to be learned about how the brain processes sensory perception. Now, researchers at the University of California, Berkeley have developed a powerful new microscope that not only can home in on a small number of neurons in a living animal’s brain, but can also manipulate them with light, a technique known as optogenetics. The results of this research were presented in April at the American Association of Anatomists Annual Meeting.

This is no high school science class microscope, but a massive instrument about half a room in size that uses two-photon lasers to create a 3D image of the neurons under its beams in real time. The lasers pass through a device called a spatial light modulator, similar to a conventional digital projector, which lets the microscope aim light at any point in three dimensions. “The idea here is to create a hologram, a three-dimensional patterning of light,” Hillel Adesnik, Ph.D., assistant professor of neurobiology at UC Berkeley, who led the research team, tells mental_floss. “Three dimensions are important because the brain is three-dimensional.”

The device allows them to do both imaging and photostimulation at the same time, he says. To do this, they implanted small glass windows into the skulls of mice that had been genetically modified so that more of their neurons are sensitive to light. They then tracked and recorded the brain activity associated with specific movements, like a mouse wiggling its whisker or touching an object of a particular shape.

In other tests they trained the mice to discriminate between different objects primarily using their whiskers, which are as sensitive as, if not more sensitive than, human fingertips. “Then we record the brain activity while they touch those objects, and play them back under our microscope and try to fool them into thinking they’ve actually touched a cube instead of a sphere, or vice versa,” Adesnik says.

Adesnik, who primarily studies sensory perception, says his goal is to understand how we perceive the world through our senses, and to identify the neural signatures of such perceptions: “If we think of the language of the nervous system as a series of these electrical events we call action potentials that occur in neurons in space and time, millions per second, we want to understand that language as we do any language.”

He likens this to the story of the Rosetta Stone, the inscription that let scholars decipher Egyptian hieroglyphs because the same text appeared alongside a language they already knew. In his research, the goal is likewise to gather enough basic information to crack the neural code of a specific activity, in this case a specific sensory perception. “What we’ve done in my lab is to be able to write in the activity at the same spatial and temporal scale that the underlying neural circuits actually operate at,” he says.

While the implications of this technology are mostly for research purposes, Adesnik does envision it one day being used to understand and treat neurological disease, to build implantable devices that control neurons for a variety of functions, or to aid in brain surgery.