Instagram users were confronted with an unexpected stream of graphic content in the Reels section due to a malfunction in the platform's algorithm. Footage of violence, animal cruelty, and death appeared in their feeds.
On Reddit, users discussed the incident and shared their experiences. One wrote: “I just saw at least 10 people die in Reels.” Others mentioned videos in which a person is crushed by an elephant, someone falls into boiling oil, or a man is cut apart by helicopter blades. Many posted screenshots showing numerous “sensitive content” warnings.
The site 404 Media reported that one user's feed included scenes of a man being set on fire and a cashier being shot point-blank, as well as a video from an account called PEOPLEDEADDALY. The user himself, meanwhile, was only interested in cycling.
Meta (designated as an extremist organization and banned in Russia), which owns Instagram, said the problem has already been fixed. “We have fixed an error that caused some users to see content in their recommendations that should not have appeared there. We apologize for this mistake,” a company representative said.
The failure came against the backdrop of changes to Meta's content moderation policy, including a reduction in the number of fact-checkers and a loosening of censorship. However, the company maintains that the incident is not related to the updated rules.
In the UK, the Online Safety Act requires social networks to protect minors from harmful content, including through algorithmic filtering. The online safety group Molly Rose Foundation has demanded an explanation from Instagram. According to its head, Andy Burrows, the lowering of Meta's moderation standards could make such incidents more frequent.