Social media giant Meta has admitted that a bug led to the spread of misinformation on Facebook between October and March, the Daily Mail reported.
Facebook's parent company acknowledged that the bug caused a 'surge of misinformation' and other harmful content to appear in users' News Feeds.
According to an internal document, engineers at Meta failed to remove posts from ‘repeat misinformation offenders’ for nearly six months.
The document said that Facebook's systems also failed to demote or remove content related to nudity, violence and Russian state media during the war in Ukraine.
Talking to The Verge, Meta spokesperson Joe Osborne said that the company detected inconsistencies in downranking on five separate occasions, which correlated with small, temporary increases to internal metrics.
He added that Meta traced the root cause to a software bug and applied the needed fixes, and that the bug has not had any meaningful, long-term impact on the company's metrics.
The Verge reported that instead of removing and blocking posts from repeat misinformation offenders that had been reviewed by the company's network of outside fact-checkers, the News Feed instead distributed those posts.
Facebook accounts designated as repeat 'misinformation offenders' saw their views spike by as much as 30%.
According to the internal document, the issue was finally solved three weeks ago, on March 11.