AI decoding what mice see may enhance future BCIs

An AI tool that decodes what mice see could improve future brain-computer interfaces, according to a new study.

The system, called CEBRA, was developed by researchers at EPFL, a university in Switzerland. Its goal? To uncover hidden connections between the brain and behavior.

To test CEBRA (pronounced “zebra”), the team tried to decipher what a mouse sees when it watches a video.

First, the researchers collected freely available neural data from rodents watching movies. Part of the brain activity had been measured with electrode probes in the visual cortex of a mouse. The rest came via optical probes from genetically modified mice engineered so that their neurons glowed green when activated.

All this data was used to train the base algorithm in CEBRA. As a result, the system learned to associate brain activity with specific frames in a video.
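For readers curious what that training step looks like in code, here is a minimal sketch using the open-source CEBRA Python package. The file names, array layout, and hyperparameters are illustrative assumptions, not the study's actual settings.

```python
import numpy as np
from cebra import CEBRA

# neural_data: (timesteps, neurons) activity traces; frame_index: which video
# frame was on screen at each timestep. File names are hypothetical.
neural_data = np.load("mouse_visual_cortex.npy")
frame_index = np.load("video_frame_index.npy")

# Hyperparameters are illustrative, not the study's exact settings.
model = CEBRA(
    model_architecture="offset10-model",
    output_dimension=8,          # dimensionality of the learned embedding
    max_iterations=5000,
    batch_size=512,
    device="cuda_if_available",
    verbose=True,
)

# Label-guided mode: the frame index serves as the auxiliary variable, so
# neural activity recorded while similar frames were on screen lands close
# together in the learned embedding space.
model.fit(neural_data, frame_index.astype(float))
embedding = model.transform(neural_data)   # shape: (timesteps, 8)
```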

Next, the team applied the tool to another mouse watching the same video. After analyzing the data, CEBRA was able to accurately predict what the mouse saw based on its brain signals alone.
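The decoding step can be approximated with a similarly short sketch. In the study, the learned embedding is combined with a decoder to predict frames; below, a k-nearest-neighbour classifier from scikit-learn stands in for it, and the cross-animal alignment details are glossed over.

```python
from sklearn.neighbors import KNeighborsClassifier

# Embed the second animal's recordings with the model trained above.
# (Assumes matching input dimensions; the real pipeline also supports joint
# multi-animal training, which this sketch skips.)
test_neural = np.load("second_mouse_visual_cortex.npy")   # hypothetical file
test_embedding = model.transform(test_neural)

# Fit a simple frame classifier on the training embedding, then predict which
# video frame the second mouse was seeing from its brain activity alone.
decoder = KNeighborsClassifier(n_neighbors=10, metric="cosine")
decoder.fit(embedding, frame_index)
predicted_frames = decoder.predict(test_embedding)
```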

The team then reconstructed the clip from the neural activity.

Unsurprisingly, the researchers aren’t just interested in rodents’ movie-watching habits.

“The aim of CEBRA is to uncover structures in complex systems. And since the brain is the most complex structure in our universe, it’s the ultimate testing ground for CEBRA,” said Mackenzie Mathis of EPFL, the study’s principal investigator.

“It can also give us insight into how the brain processes information and could be a platform for discovering new principles in neuroscience by combining data across animals and even species.”

Nor is CEBRA limited to neuroscience research. According to Mathis, it can also be applied to numerous datasets with temporal or joint information, including animal behavior and gene expression data. But perhaps the most exciting application is in brain-computer interfaces (BCIs).

As the movie-loving mice demonstrated, even the primary visual cortex, often thought to handle only fairly simple visual processing, can be used to decode video in BCI fashion. An obvious next step for the researchers is to use CEBRA to improve neural decoding in BCIs.

“This work is just a step towards the theory-based algorithms needed in neurotechnology to enable powerful BMIs [brain-machine interfaces],” Mathis said.

You can read the full study paper in Nature.
