DeepMind’s AI Rat Decodes the Brain, Paving the Way for More Capable Robots

Scientists from Google DeepMind and Harvard University have collaborated to develop a groundbreaking virtual model that could help unravel how the brain controls movement and how those mechanisms might be applied in robotics. The project, which blends advances in artificial intelligence and neurobiology, opens new avenues for research in both fields.

The digital brain of a virtual rodent, built from artificial neural networks and tuned against recordings from live rats, not only predicts the brain activity of real animals but also reproduces their behaviors, such as running and rearing on their hind legs, capturing genuinely complex movement.

To create the model, researchers placed rats in a specialized arena equipped with six cameras that captured their every movement. The animals were enticed to move by Cheerios cereal scattered throughout the arena.
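The study has its own tracking pipeline, but the basic idea behind multi-camera motion capture can be illustrated with standard multi-view triangulation: each calibrated camera sees a body landmark in 2D, and the views are combined to recover its 3D position. The sketch below uses the generic direct linear transform (DLT) with made-up camera matrices; it illustrates the technique, not the researchers’ actual code.

```python
import numpy as np

def triangulate(projection_matrices, image_points):
    """Least-squares 3D position of one landmark from >= 2 calibrated views.
    projection_matrices: list of 3x4 camera matrices P_i (assumed known)
    image_points: list of (x, y) pixel coordinates, one per camera."""
    rows = []
    for P, (x, y) in zip(projection_matrices, image_points):
        # Each view contributes two linear constraints on the homogeneous point X.
        rows.append(x * P[2] - P[0])
        rows.append(y * P[2] - P[1])
    A = np.stack(rows)
    _, _, vt = np.linalg.svd(A)
    X = vt[-1]
    return X[:3] / X[3]  # homogeneous -> Euclidean coordinates

# Minimal usage with two synthetic cameras observing the point (0.1, 0.2, 2.0).
P1 = np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = np.hstack([np.eye(3), np.array([[-0.5], [0.0], [0.0]])])
point = np.array([0.1, 0.2, 2.0, 1.0])
uv1 = (P1 @ point)[:2] / (P1 @ point)[2]
uv2 = (P2 @ point)[:2] / (P2 @ point)[2]
print(triangulate([P1, P2], [uv1, uv2]))  # ~[0.1, 0.2, 2.0]
```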

Over the course of the experiment, 607 hours of video footage were recorded, along with neural activity data collected via 128-channel electrode arrays implanted in the rats’ brains.

An essential component of the digital brain is an algorithm called an inverse dynamics model. It continuously tracks the position of the virtual body in space and, from that information, predicts the motor commands needed to reach a specific goal.
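The authors’ implementation is not reproduced here, but the concept of an inverse dynamics module can be sketched as a network that maps the current body state plus a desired future pose to the motor commands expected to produce that transition. Everything below, including the layer sizes and input/output dimensions, is an illustrative assumption rather than the study’s architecture.

```python
import torch
import torch.nn as nn

class InverseDynamicsModel(nn.Module):
    """Illustrative inverse dynamics module: given the current body state and a
    reference (desired) future pose, predict the motor commands (e.g., joint
    torques) expected to realize that transition. Dimensions are assumptions."""

    def __init__(self, state_dim: int = 148, ref_dim: int = 74, action_dim: int = 38):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(state_dim + ref_dim, 512),
            nn.Tanh(),
            nn.Linear(512, 512),
            nn.Tanh(),
            nn.Linear(512, action_dim),
        )

    def forward(self, state: torch.Tensor, reference_pose: torch.Tensor) -> torch.Tensor:
        # Concatenate the current proprioceptive state with the target pose
        # and map the pair to an action vector.
        return self.net(torch.cat([state, reference_pose], dim=-1))

# Example: one control step for a batch of 16 simulated bodies.
model = InverseDynamicsModel()
state = torch.randn(16, 148)           # current joint angles, velocities, etc.
reference_pose = torch.randn(16, 74)   # desired pose a few frames ahead
action = model(state, reference_pose)  # predicted motor commands, shape (16, 38)
```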

Remarkably, the virtual rat could transfer learned motor skills to unfamiliar situations, estimating the forces needed to move through environments it had never encountered.

A comparison of brain activity in real and virtual rats showed that, across a range of physical tasks, the artificial network’s activity tracked the neural signals of real animals more accurately than earlier computational models did.
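One common way to make such a comparison is to fit a linear map from the model’s internal activations to each recorded neuron’s activity and score it with cross-validated R². The sketch below shows that generic approach; the data arrays are random placeholders, and the exact metric used in the study may differ.

```python
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.model_selection import cross_val_score

# Illustrative comparison of model activity to recorded neural activity:
# regress each recorded channel's firing rate onto the virtual rat's network
# activations and report cross-validated R^2. Arrays are random stand-ins.
rng = np.random.default_rng(0)
network_activity = rng.normal(size=(5000, 256))  # time bins x network units
firing_rates = rng.normal(size=(5000, 128))      # time bins x recorded channels

scores = []
for neuron in range(firing_rates.shape[1]):
    r2 = cross_val_score(Ridge(alpha=1.0),
                         network_activity,
                         firing_rates[:, neuron],
                         cv=5, scoring="r2").mean()
    scores.append(r2)

print(f"median cross-validated R^2 across channels: {np.median(scores):.3f}")
```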

The virtual rat serves as a valuable tool for studying movement in a digital setting. Researchers can manipulate the ‘neural connections’ of the virtual rodent to see how changes in its brain networks affect the resulting behavior.
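A ‘virtual lesion’ of this kind can be sketched, in general terms, by silencing a chosen subset of units in a trained network and measuring how much the motor output changes. The toy network, the lesioned unit indices, and the change metric below are all hypothetical; they illustrate the idea rather than the study’s procedure.

```python
import torch
import torch.nn as nn

# Illustrative "virtual lesion": silence a subset of hidden units in a small
# policy network and compare its output before and after the lesion.
torch.manual_seed(0)
policy = nn.Sequential(nn.Linear(100, 256), nn.Tanh(), nn.Linear(256, 38))

lesioned_units = torch.arange(0, 64)  # hypothetical subset of hidden units

def silence_units(module, inputs, output):
    # Zero the "lesioned" activations; returning a tensor replaces the output.
    output = output.clone()
    output[:, lesioned_units] = 0.0
    return output

state = torch.randn(1, 100)
baseline_action = policy(state)

handle = policy[1].register_forward_hook(silence_units)  # hook after the Tanh
lesioned_action = policy(state)
handle.remove()

print("mean |change| in motor output:",
      (baseline_action - lesioned_action).abs().mean().item())
```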

Dr. Bence Ölveczky of Harvard University remarked that the collaboration with DeepMind has been exceptionally productive.

Besides movement, the digital rodent also offers a way to study other aspects of brain function, including vision, perception, and higher cognitive functions such as reasoning. This approach can be far faster than traditional laboratory experiments, which often span weeks or even months.

In robotics, the approach gives artificial intelligence an embodied, physical dimension, potentially leading to smarter, more adaptable robots that operate effectively in diverse conditions.

Researchers now plan to go further, testing the virtual rat on more challenging tasks alongside its live counterparts.
