Scientists have created an artificial intelligence model that analyzes video much as the human brain does. Movienet, newly developed by researchers at Scripps Research, shows high accuracy in recognizing complex moving scenes.
Unlike traditional AI models that focus on static images, Movienet can identify and interpret complex changes as they unfold over time. The findings are published in the Proceedings of the National Academy of Sciences.
In creating Movienet, the scientists drew on studies of tadpole brains, examining how neurons in the visual region piece together visual information into cohesive sequences lasting roughly 100 to 600 milliseconds.
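The study itself does not publish code here, but a minimal sketch can illustrate the idea of grouping video frames into brief temporal windows of the kind described above. The function and parameter names are hypothetical, and the 300 ms window and 100 ms stride are simply example values inside the 100 to 600 millisecond range the researchers mention:

```python
# Illustrative sketch only, not the authors' implementation: split a clip
# into short, overlapping temporal windows so that later stages can reason
# about brief visual "events" rather than isolated frames.
import numpy as np

def frames_to_windows(frames: np.ndarray, fps: float,
                      window_ms: float = 300.0,
                      stride_ms: float = 100.0) -> np.ndarray:
    """Split a (T, H, W) frame stack into overlapping temporal windows.

    window_ms and stride_ms are hypothetical parameters chosen to fall
    inside the 100-600 ms integration range mentioned in the article.
    """
    frames_per_window = max(1, int(round(window_ms / 1000.0 * fps)))
    frames_per_stride = max(1, int(round(stride_ms / 1000.0 * fps)))
    windows = [
        frames[start:start + frames_per_window]
        for start in range(0, len(frames) - frames_per_window + 1, frames_per_stride)
    ]
    return np.stack(windows)  # shape: (num_windows, frames_per_window, H, W)

if __name__ == "__main__":
    # Synthetic 2-second clip at 30 fps with 64x64 frames.
    clip = np.random.rand(60, 64, 64)
    windows = frames_to_windows(clip, fps=30.0)
    print(windows.shape)  # (18, 9, 64, 64)
```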
In testing, Movienet distinguished normal from abnormal tadpole swimming behavior with 82.3% accuracy, 18% better than human observers and 10% better than Google's GoogLeNet model.
An important advantage of Movienet is its energy efficiency. By reducing incoming data to its core sequences, the model needs fewer computing resources than traditional AI systems while maintaining high performance.
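Again as a rough, hypothetical sketch rather than the published Movienet method: one simple way to "simplify data to core sequences" is to keep only the stretches of a clip where something actually changes, so downstream processing handles far fewer frames. The change_threshold value below is an arbitrary illustration, not a figure from the study:

```python
# Hedged illustration: drop frames that barely differ from the previous one,
# keeping only the "core" motion-bearing sequences of a clip.
import numpy as np

def keep_core_sequences(frames: np.ndarray, change_threshold: float = 0.05) -> np.ndarray:
    """Keep frames whose mean absolute difference from the previous frame
    exceeds change_threshold (a hypothetical cut-off, not from the paper)."""
    diffs = np.abs(np.diff(frames, axis=0)).mean(axis=(1, 2))
    keep = np.concatenate([[True], diffs > change_threshold])  # always keep the first frame
    return frames[keep]

if __name__ == "__main__":
    # Mostly static clip with a brief burst of motion in the middle.
    clip = np.zeros((30, 64, 64))
    clip[12:18] = np.random.rand(6, 64, 64)
    core = keep_core_sequences(clip)
    print(len(clip), "->", len(core))  # far fewer frames survive
```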
The technology opens up new possibilities in medicine. Movienet can aid early diagnosis of neurodegenerative diseases by detecting changes in patients' motor activity that are invisible to the naked eye, and it can track how cells respond to drugs during development.
The potential uses of Movienet extend to autonomous driving, where such AI could enhance safety by accurately recognizing changes in the road environment and the movements of pedestrians.
The developers aim to extend Movienet to handle more complex scenarios and to apply it in other fields, such as environmental monitoring and wildlife tracking.