Meta has been forced to apologize to Instagram users and assure them that a “bug” has been fixed — one that allowed inappropriate content to slip into the personalized recommendations of Reels, the platform’s short vertical video feed.
Image source: Nimi Diffa / unsplash.com
“We fixed a bug that caused some users to see content in their Instagram Reels feed that was not supposed to be recommended. We apologize for the error,” CNBC quotes a Meta representative as saying. Earlier, Instagram users had begun complaining en masse on other platforms that “not safe for work” and violent content was appearing in their recommendations — and it continued to show up even with the strictest moderation settings enabled. The complaints described graphic and violent imagery, along with the comments posted under such content. Many of these posts were already labeled “Sensitive Content.”
Meta says it employs more than 15,000 reviewers who search for inappropriate content on its platforms, assisted by AI tools that detect and remove “the vast majority of prohibited content” before users can report it. Meta also says it works to keep content that is “low-quality, objectionable, sensitive, or unsuitable for younger viewers” out of its recommendation algorithms. In early January, the company announced plans to relax its moderation policies, aiming to reduce the number of enforcement errors that sometimes made moderation look more like censorship.
Despite the latest incident, the Reels short-form video section looks like a promising direction for Meta. Instagram is considering launching a standalone Reels app, the platform’s head Adam Mosseri told staff this week, The Information reported. And in January, Instagram released a separate video editor called Edits — an analogue of CapCut, the editor from TikTok’s parent company ByteDance.