Meta’s independent Oversight Board has called on the company to update its rules around the presentation of nudity, particularly as it relates to transgender and non-binary people, as part of a new ruling over the removal of two Instagram posts that depicted models with bare chests.
The case relates to two separate posts, made by the same Instagram user, both of which featured images of a transgender and non-binary couple posing bare-chested, with their nipples covered.
The posts aimed to raise awareness of one member of the couple seeking to undergo top surgery, but Meta’s automated systems, and subsequent human review, eventually removed both posts for violating its rules around sexual solicitation.
The user appealed the decision to the Oversight Board, and Meta did restore the posts. But the Oversight Board says that the case underlines a key flaw in Meta’s current guidelines as they relate to transgender and non-binary users.
As per the Board:
“The Oversight Board finds that removing these posts is not in line with Meta’s Community Standards, values or human rights responsibilities. These cases also highlight fundamental issues with Meta’s policies. Meta’s internal guidance to moderators on when to remove content under the Sexual Solicitation policy is far broader than the stated rationale for the policy, or the publicly available guidance. This creates confusion for users and moderators and, as Meta has recognized, leads to content being wrongly removed.”
The Board notes that Meta’s original removal of these posts stemmed from a flawed interpretation of its own rules, which largely comes down to how those rules have been written.
“This policy is based on a binary view of gender and a distinction between male and female bodies. Such an approach makes it unclear how the rules apply to intersex, non-binary and transgender people, and requires reviewers to make rapid and subjective assessments of sex and gender, which is not practical when moderating content at scale.”
The Board further notes that Meta’s enforcement of its nudity rules is often ‘convoluted and poorly defined’, and could create greater barriers to expression for women, trans, and gender non-binary people on its platforms.
“For example, they have a severe impact in contexts where women may traditionally go bare-chested, and people who identify as LGBTQI+ can be disproportionately affected, as these cases show. Meta’s automated systems identified the content multiple times, despite it not violating Meta’s policies.”
The Board has recommended that Meta update its approach to managing nudity on its platforms, by defining clearer criteria to govern its Adult Nudity and Sexual Activity policy.
“[That will] ensure all users are treated in a manner consistent with human rights standards. It should also examine whether the Adult Nudity and Sexual Activity policy protects against non-consensual image sharing, and whether other policies need to be strengthened in this regard.”
It’s an interesting ruling, in line with evolving attitudes around depictions of nudity, and the significance of the messages that such imagery can convey. With societal attitudes shifting in this area, it’s important that Meta also develops its policies in kind, in order to broaden acceptance and push these key conversations forward.
The Oversight Board continues to be a valuable project for Meta’s policy enforcement efforts, and a good example of how external oversight could work for social media apps on content decisions.
Which Meta has been pushing for, with the company continuing to call on global governments to develop overarching policies and standards, to which all social platforms would then have to adhere. That would take a lot of the more complex and sensitive moderation decisions out of the hands of internal leaders, while also ensuring that all platforms are operating on a level playing field in this respect.
Which does seem like a better way to go, though developing universal, international standards of this kind is a complex proposition, one that will take a lot of cooperation and agreement.
Is that even possible? It’s hard to say, but again, Meta’s Oversight Board experiment underlines the need for external checks to ensure that platform policies evolve in line with public expectations.