A team of researchers at EPFL has developed an AI tool that can interpret rodent brain signals in real time and recreate the video a mouse is watching.
As a reminder, we also covered how Strange Enthusiasts Asked ChaosGPT to Destroy Humanity and Establish World Domination, and how Microsoft’s VALL-E AI Is Able to Imitate a Human Voice in a Three-Second Pattern.
The media also reported that the UN is calling for a moratorium on the use of AI that threatens human rights, but it seems the apocalypse can no longer be stopped 😉
A machine learning algorithm called “CEBRA” was trained to map neural activity to specific frames of a video. The algorithm can then predict and reconstruct the video clip a mouse is watching.
The mice were shown a 1960s black-and-white film clip of a man running to a car and opening the trunk. The footage CEBRA reconstructed turned out to be nearly identical to the original film.
In their study, the scientists measured and recorded mouse brain activity using electrode probes inserted into the visual cortex, as well as optical probes in mice genetically engineered so that their neurons glow green when transmitting information.
CEBRA was trained on the movies the mice watched and on their brain activity in real time. From this data, CEBRA learned which brain signals correspond to which frames of the film.
When CEBRA then processed new brain activity that was not in the training dataset (from a mouse watching a different clip), it was able to predict in real time which frame the mouse was viewing, and the researchers rendered those predictions as a movie of their own.
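The train-then-decode loop described above can be sketched in a few lines. The snippet below is an illustrative toy, not the study's actual pipeline: it fakes neural recordings with synthetic data and swaps CEBRA's learned embedding for a simple nearest-centroid decoder, so every name and number in it is an assumption.

```python
# Toy sketch of a "decode which frame the mouse is watching" pipeline.
# Synthetic stand-in data; nearest-centroid decoding instead of CEBRA's
# learned embedding. All sizes and noise levels here are made up.
import numpy as np

rng = np.random.default_rng(0)

n_frames = 60    # frames in the training film (hypothetical)
n_neurons = 100  # recorded units (hypothetical)
reps = 20        # repeated viewings of the film

# Pretend each frame evokes a characteristic activity pattern plus noise.
frame_patterns = rng.normal(size=(n_frames, n_neurons))
frame_labels = np.tile(np.arange(n_frames), reps)
activity = frame_patterns[frame_labels] + 0.3 * rng.normal(size=(len(frame_labels), n_neurons))

# "Train": average the activity recorded for each frame into a template.
centroids = np.stack([activity[frame_labels == f].mean(axis=0) for f in range(n_frames)])

# "Decode": held-out activity from a later viewing — pick the nearest template.
test_activity = frame_patterns + 0.3 * rng.normal(size=(n_frames, n_neurons))
dists = ((test_activity[:, None, :] - centroids[None, :, :]) ** 2).sum(axis=-1)
predicted_frames = dists.argmin(axis=1)

accuracy = (predicted_frames == np.arange(n_frames)).mean()
print(f"decoded {accuracy:.0%} of frames correctly")
```

Stringing the predicted frame indices back together, frame by frame, is what yields a reconstructed movie like the one the researchers produced.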