Meta’s decision not to remove three posts relating to the Southport stabbings that sparked UK-wide riots is being investigated by the company’s independent Oversight Board.
In the hours after three girls were murdered in Southport while attending a Taylor Swift-themed dance class, rumours spread online that the murderer was a Muslim asylum seeker who had arrived in the UK by boat.
It later emerged that the suspect in the case, Axel Rudakubana, was born in Cardiff in Wales to a Christian family.
The three posts being investigated referred to migrants as terrorists, contained AI-generated images of Muslim men being chased, and shared protest gathering times.
The social media giant’s Oversight Board is an independent body of experts that makes binding decisions on how Instagram and Facebook should moderate their content.
It has now opened an investigation into the decisions to keep those three posts online and wants to hear from the public.
The first post called for mosques to be smashed and buildings to be set on fire “where scum are living” and referred to “migrants, terrorist”.
It argued that without the riots, the authorities wouldn’t listen and put a stop to “all the scum coming into Britain”.
The second post showed what appeared to be an AI-generated image of a giant man in a Union Jack T-shirt chasing several Muslim men.
The post shared a time and place to gather for one of the protests and included the hashtag “EnoughIsEnough”.
The third post was another likely AI-generated image, showing four Muslim men running after a crying blonde toddler in a Union Jack T-shirt.
One of the men waves a knife while, above, a plane flies towards Big Ben.
The image is accompanied by the caption: “Wake up.”
All three posts were reported to Facebook by users, but each remained online after automated assessments.
Even after users appealed against those decisions, the content stayed on the site and was never reviewed by a human.
Once the Oversight Board took on the investigation, Facebook deleted the first post but the two others remained online.
Meta said that when dealing with posts around protests, it favours maximum protection for “voice”, or freedom of speech.
As part of the investigation, the results of which will be published in around 90 days, the Oversight Board is calling for comments from the public about how social media impacted the riots, and any links between online hate speech and violence or discrimination.