Meta’s Oversight Board, which independently evaluates the company’s difficult content moderation decisions, has overturned its removal of two posts depicting the bare chests of a transgender and non-binary couple. The case represents a failure of a convoluted and impractical nudity policy, the board said, and it recommended that Meta take a serious look at revising it.
The decision involved a couple who were fundraising for top surgery (generally the reduction of breast tissue), which one of them hoped to undergo. They posted two images to Instagram, in 2021 and 2022, both showing bare chests with the nipples covered, and included a link to their fundraising site.
These posts were repeatedly flagged, by both AI systems and users, and Meta eventually removed them as violations of its “Sexual Solicitation Community Standard,” mainly because they combined nudity with a request for money. While that policy is plainly intended to prevent solicitation by sex workers (an entirely separate issue), it was applied here to remove completely innocuous content.
When the pair appealed the decision and brought it to the Oversight Board, Meta reversed the removal, calling it a “mistake.” But the board took up the case anyway because “the removal of these posts is inconsistent with Meta’s Community Standards, values, or human rights responsibilities. These cases also reveal fundamental issues with Meta’s policy.”
The board wanted to take the opportunity to point out how impractical the policy is as written, and to recommend that Meta seriously consider whether its approach actually reflects its stated values and priorities.
The restrictions and exceptions to the rules on female nipples are extensive and confusing, especially as they apply to transgender and non-binary people. Exceptions to the policy range from protests to scenes of childbirth to medical and health contexts, including top surgery and breast cancer awareness. These exceptions are often complicated and ill-defined. In some contexts, for example, moderators must assess the extent and nature of visible scarring to determine whether certain exceptions apply. The ambiguity inherent in this policy creates uncertainty for users and reviewers alike, and makes it unworkable in practice.
Essentially, even if this policy represented a humane and appropriate approach to nudity moderation, it is not scalable. For that reason alone, Meta should change it. The executive summary of the board’s decision can be found here and includes a link to a more complete discussion of the issues. (When I asked about past times the board had challenged this policy, it pointed to this 2020 breast cancer awareness case.)
The obvious threat Meta’s platforms face, should they relax their nudity rules, is porn. Founder Mark Zuckerberg has said in the past that a clear stance on sexualized nudity is necessary to make his platforms suitable for everyone. You’re allowed to post sexy stuff and link to your OnlyFans, but please, no hardcore porn in Reels.
But the Oversight Board says this “public morality” stance deserves a second look as well (this excerpt from the full decision has been lightly edited for clarity):
Meta’s rationale of protecting “community sensitivity” deserves further examination. This rationale has the potential to align with the legitimate aim of protecting “public morals.” That said, the Board notes that the aim of protecting “public morals” has sometimes been improperly invoked by government regulators to violate human rights, particularly those of members of minority and vulnerable groups.
In addition, the Board is concerned about the known and recurring disproportionate burdens on expression experienced by women, transgender people, and non-binary people as a result of Meta’s policies…
The Board received public comments from many users expressing concern about the presumptive sexualization of female, trans, and non-binary bodies, when no similar assumption of sexualization is applied to images of cisgender men.
The board has taken the bull by the horns here. There’s no point dancing around it: a policy that treats some bodies as inherently sexually suggestive and others as not is simply untenable in the context of Meta’s ostensibly progressive stance on such matters. Meta wants to have its cake and eat it too: pay lip service to trans and non-binary people like the couple who brought this case to the fore, while also deferring to the more restrictive morals of conservative groups and pearl clutchers worldwide.
Those Board members who support a sex- and gender-neutral adult nudity policy recognize that international human rights standards, as applied to states, allow distinctions to be made on the basis of protected characteristics when they rest on reasonable and objective criteria and serve a legitimate purpose. They do not believe the distinctions within Meta’s nudity policy meet that standard. They further note that, as a company, Meta has made human rights commitments that are inconsistent with an approach restricting online expression based on the company’s perception of sex and gender.
Citing various reports, internationally negotiated definitions, and broader trends, the board’s decision suggests that a new policy should be forged, one that abandons the current structure of categorizing and removing images and replaces it with an approach grounded in more modern definitions of gender and sexuality. Of course, the board warns, this could open the door to things like the posting of non-consensual sexual imagery (much of which is currently flagged and removed automatically, something that could change under a new system), or an influx of adult content. The latter, however, can be tackled in ways other than a total ban.
When reached for comment, Meta noted that it had already reversed the removal and welcomed the board’s decision. It added: “We know more can be done to support the LGBTQ+ community, and that means working with experts and LGBTQ+ advocacy groups on a range of issues and product improvements.” I’ve asked for specific examples of organizations, issues, or improvements and will update this post if I hear back.