Modern generative AI tools have made creating deepfake pornography accessible to anyone. A 2023 report from Home Security Heroes, a company that reviews identity-protection services, found that producing a 60-second deepfake pornographic video requires only one clear image of a face and less than 25 minutes, using freely available tools.
Public attention was drawn to the problem in January 2024, when graphic deepfakes of Taylor Swift spread across social networks, gaining 47 million views before being removed. Similar cases have affected not only entertainment-industry celebrities, such as Korean pop stars, but also ordinary people. According to the report, 99% of the victims of such videos are women and girls.
Reaction and new technologies
This situation has prompted women to take action. Startup founder Nadia Lee said: "If security technologies do not develop at the same pace as AI, we are headed for disaster." Despite research into deepfake detection, these tools are not keeping pace with the technologies used to create deepfakes. Moreover, their effectiveness depends on platforms' willingness to combat deepfakes, and most of these videos are posted on specialized sites.
Lee's startup, That'sMyFace, develops image-recognition tools for corporate clients so that their logos, uniforms, or products do not appear in pornography. In the future, the company plans to build a tool that scans the Internet for deepfake images or videos featuring users' faces.
Personal experience and new solutions
Breeze Liu, founder of the startup Alecto AI, became a victim of deepfake pornography in 2020, discovering more than 800 links to a fake video. The police were powerless, and she had to track down and remove the videos on her own. Liu decided to fight AI with AI and created an application that checks whether users' images are being used on major social platforms.