Astronomers from the University of Hull have developed a groundbreaking new method for detecting AI-generated images by analyzing reflections in the eyes. The research was presented at the Royal Astronomical Society's National Astronomy Meeting.
The innovative technique, adapted from tools typically used to study galaxies, allows researchers to pinpoint discrepancies in the light reflected in a subject's eyes. The method was developed by student Adejumoke Owolabi under the guidance of Professor of Astrophysics Kevin Pimbblet, who led the study.
The premise of the method is fairly straightforward: a pair of eyes illuminated by the same light sources should exhibit near-identical reflections. Many generative systems, however, deviate from this norm, producing mismatched reflections between the two eyes.
Pimbblet explained that the newly developed technique automatically detects the reflections in the eyes and compares their morphological characteristics between the left and right eye. This analysis revealed significant differences between the left- and right-eye reflections in deepfakes and other AI-generated images.
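To illustrate the general idea, rather than the team's actual pipeline, a comparison of this kind might look something like the following Python sketch. It assumes the two eye regions have already been cropped out as grayscale NumPy arrays; the brightness threshold and the area/centroid features are illustrative choices, not details from the study.

```python
import numpy as np

def reflection_mask(eye_crop, percentile=99):
    """Isolate the brightest pixels in an eye crop as a crude
    stand-in for the specular highlight on the cornea."""
    crop = np.asarray(eye_crop, dtype=float)
    return crop >= np.percentile(crop, percentile)

def reflection_features(eye_crop):
    """Summarize a reflection by its relative area and its centroid,
    normalized to the size of the crop."""
    mask = reflection_mask(eye_crop)
    ys, xs = np.nonzero(mask)
    h, w = mask.shape
    area = mask.sum() / mask.size
    centroid = (ys.mean() / h, xs.mean() / w) if ys.size else (np.nan, np.nan)
    return area, centroid

def compare_eyes(left_crop, right_crop):
    """Return simple left/right differences; large values hint that
    the two eyes were not lit by the same physical scene."""
    area_l, (cy_l, cx_l) = reflection_features(left_crop)
    area_r, (cy_r, cx_r) = reflection_features(right_crop)
    return {
        "area_diff": float(abs(area_l - area_r)),
        "centroid_diff": float(np.hypot(cy_l - cy_r, cx_l - cx_r)),
    }
```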
To quantify the reflections in the eyes, the research team employed established astronomy methods, including the Gini coefficient, a measure commonly used to gauge how light is distributed across a galaxy image. A value closer to 0 signifies light spread evenly over the pixels, while a value closer to 1 indicates light concentrated in a single pixel.
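For readers curious about the mechanics, below is a minimal sketch of the standard galaxy-morphology form of the Gini coefficient applied to an array of pixel values; the function is written for illustration and is not code from the study.

```python
import numpy as np

def gini(pixels):
    """Gini coefficient of a set of pixel fluxes, as used in galaxy
    morphology: 0 means the light is spread evenly over the pixels,
    1 means all of the light sits in a single pixel."""
    values = np.sort(np.abs(np.ravel(pixels)).astype(float))
    n = values.size
    if n < 2 or values.sum() == 0:
        return 0.0
    ranks = np.arange(1, n + 1)
    return float(((2 * ranks - n - 1) * values).sum()
                 / (values.mean() * n * (n - 1)))
```

In a detection setting, one would presumably compute this value for each eye's reflection and flag images in which the two coefficients disagree markedly.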
While exploring the parallels between studying eye reflections and measuring galaxies, the researchers also experimented with the CAS parameters (concentration, asymmetry, smoothness), another set of measures used to characterize the light distribution of galaxies, but found this approach less effective for distinguishing AI-generated images.
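For context, the "A" in the CAS trio quantifies how much an image differs from its own 180-degree rotation. The sketch below shows one common formulation from galaxy morphology, with the usual background-correction term omitted; it is included purely to illustrate the kind of measure the researchers tried, not their exact computation.

```python
import numpy as np

def asymmetry(image):
    """Rotational asymmetry: compare an image with itself rotated by
    180 degrees. 0 means perfectly symmetric; values grow as the
    light distribution becomes more lopsided."""
    img = np.asarray(image, dtype=float)
    rotated = np.rot90(img, 2)  # 180-degree rotation
    total = np.abs(img).sum()
    if total == 0:
        return 0.0
    return float(np.abs(img - rotated).sum() / (2 * total))
```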
Although the technique of analyzing eye reflections shows promise, its efficacy may diminish as AI models begin incorporating physically accurate reflections. The method also requires high-quality close-up images of the eyes to work effectively, and it can yield false positives when genuine differences in reflections arise from varied lighting conditions or post-processing edits.
Despite these limitations, eye reflection analysis has the potential to serve as a valuable tool in the fight against deepfakes, complementing existing methods such as hair texture analysis, anatomical scrutiny, skin detail examination, and background consistency checks. Pimbblet emphasized that while the method is not flawless, it lays the groundwork for a comprehensive system for detecting digital forgeries.