Meta blames ‘technical error’ for mass removal of inoffensive LGBTQ+ posts
Meta – the social media giant that owns Facebook, Instagram, WhatsApp and Threads – has admitted to removing inoffensive LGBTQ+ posts, claiming this was due to a “technical error.”
In November, many Facebook administrators and members of the LGBTQ+ community reported that their innocent posts had been removed without explanation.
As reported by Australian LGBTQ+ news outlet QNews at the time, posts by the publication and fellow queer publisher OutInPerth were affected.
Black Pride Western Australia and Out South West also had promotional posts for LGBTQ+ counselling service QLife Australia removed.
‘Technical error’ to blame for post removals
The LGBTQ+ people and organisations who had their posts pulled had received warnings that they had “shared or hosted malicious software” that “goes against Community Standards on cybersecurity.”
Meta has now confirmed that this was not the case; instead, the posts were removed due to a “technical error.”
QNews added that a Meta spokesperson apologised for the inconvenience but did not detail what led to the issue – or what Meta will do to stop it from happening again.
Meta’s Transparency Centre states that its platforms use AI and machine learning tools to help remove “violating content.”
‘We’d definitely love to hear from Meta’
OutInPerth editor Graeme Watson told ABC: “Meta has given us no information beyond a little pop-up that says you’re a problem, we’ve taken away your posts, stop being a problem.
“We’d definitely love to hear from Meta. There’s a massive number of community groups and organisations who are all wanting an answer.”
In October last year, Meta’s oversight board criticised the company for its failure to remove a graphic video which showed two men in Nigeria who appeared to have been attacked for being gay.
Meta’s oversight board found that allowing the video – which violated four community standards – to remain on Facebook for five months posed “a risk of immediate harm to the men by exposing their identities, given the hostile environment for LGBTQIA+ individuals in Nigeria.”
The user who posted the video included an English caption mocking the men. Months later, it had attracted 3.6 million views. The video was reported several times and reviewed by human moderators, who decided it did not violate the platform’s rules.
In response, Meta said the post was “left up in error.”