I’ve got a bad feeling about this.

Among the various moderation challenges facing X’s reduced staff pool, child protection has become a key concern, with X CEO Linda Yaccarino set to appear before Congress next week to explain the platform’s ongoing efforts to combat child sexual exploitation (CSE) material.

With that in mind, X has today reiterated its evolving strategies to combat CSE, while it’s also announced a plan to build a new “Trust and Safety center of excellence” in Texas, in order to improve its responsiveness on this front.

As reported by Bloomberg:

“[X] aims to hire 100 full-time content moderators at the new location, according to Joe Benarroch, head of business operations at X. The group will focus on fighting material related to child sexual exploitation, but will help enforce the social media platform’s other rules, which include restrictions on hate speech and violent posts, he added.”

Which is good. Addressing CSE should be a priority, and more staffing dedicated to this and other harmful content is obviously important.

On one hand, this could be seen as a proactive response to reassure lawmakers, while also improving X’s appeal to ad partners, but I have a sneaking suspicion that another, more controversial plan could be at play in this case.

Back in 2022, Twitter explored the possibility of enabling adult content creators to sell subscriptions in the app, in an effort to tap into OnlyFans’ $2.5 billion creator content market.

Adult content is already prevalent on X, and readily accessible, so the logical step to make more money for the platform was to monetize it, leaning into that activity rather than simply turning a blind eye to it.

So why didn’t Twitter go through with it?

As reported by The Verge:

Before the final go-ahead to launch, Twitter convened 84 employees to form what it called a “Red Team.” The goal was “to pressure-test the decision to allow adult creators to monetize on the platform, by specifically focusing on what it would look like for Twitter to do this safely and responsibly”. What the Red Team discovered derailed the project: Twitter could not safely allow adult creators to sell subscriptions because the company was not – and still is not – effectively policing harmful sexual content on the platform.

As you may have guessed, the most concerning elements raised as a result of this exploration were child sexual exploitation and non-consensual nudity.

Because Twitter could not adequately police CSE, enabling the monetization of porn was a major risk, and a portion of big-name advertisers would also likely bolt if the platform leaned into more risqué material. So Twitter management opted not to go in this direction, despite the belief that it could net the company a significant revenue windfall if it did.

But maybe now, with X’s ad revenue still down 50%, and big-name advertisers already pausing their ad spend, X is reconsidering that plan, and could be gearing up to expand into adult content subscriptions.

The signs are all there. X recently signed a new deal with BetMGM to display gambling odds in-stream, another controversial element that other social apps have steered clear of in the past. It’s also now pitching itself as a “video first platform” as it moves towards Elon Musk’s “everything app” vision.

An everything app would logically incorporate adult content as well, and despite the additional cost of assigning a new team to police CSE violations, maybe X sees a way to offset that outlay with an all-new monetization avenue, by enabling adult content creators to reach many millions more people with their work.

X certainly needs the money more now than it did when it first considered the proposal back in 2022.

As noted, X’s main ad income stream is still well down on previous levels, while Musk’s purchase of the app has also saddled it with annual debt repayments of around $1.5 billion. So despite Musk’s massive cost-cutting, X is still unlikely to break even, let alone make money. And with advertisers still avoiding the app due to Musk’s controversial remarks, it needs new pathways to build its business.

Spending millions on a new moderation center has to have a direct benefit, and while appeasing advertisers and regulators is important, I don’t think that CSE, at this stage, is what’s keeping ad partners away.

It’s also worth noting that X specifically highlighted this usage stat in its announcement:

While X is not the platform of choice for children and minors – users between 13-17 account for less than 1% of our U.S. daily users – we have made it more difficult for bad actors to share or engage with CSE material on X, while simultaneously making it simpler for our users to report CSE content.

It seems like something else is coming, and that X is preparing for another push. I would not be surprised at all if it’s revisiting its adult content plan.

This is, of course, speculation, and only those inside X know its actual strategy moving forward.

But given X’s freedom of speech push, and its need for more money, don’t be surprised if it takes a step in this direction sometime soon.