Pinterest has published its latest Transparency Report, which outlines all of the content removals and other enforcement actions it took throughout the first half of 2022.
And there are some pretty significant shifts in these numbers – take a look at this chart, which measures the number of Pins removed in each category over the past year.
Some pretty erratic shifts.
Specific notes:
- Removals for adult sexual services were way up in Q1 2022, but then normalized in Q2. Why? Pinterest says this was due to the ‘hybrid deactivation of a small handful of images, which account for almost two-thirds of Pins deactivated in Q1 for violating this policy’.
- Child sexual exploitation removals were way up in Q2 2022, which Pinterest says was due to an update in its detection systems.
- Conspiracy theory removals are way down, due to a mass clean-up in 2021.
- Dangerous goods removals are also way down, following a ‘sweeping clean-up’ in 2021.
- Graphic violence and threat removals are up this year – ‘in part because of the content related to the war in Ukraine.’
- Medical misinformation removals were way up in Q2 2022.
- Self-injury and harmful behavior removals are way, way up, as a result of Pinterest’s ongoing work to improve its detection and reporting processes.
- Spam and harassment removals were relatively steady, with seasonal peaks.
As you can see, the numbers fluctuate a lot based on different approaches and updates, and it’s interesting to consider what this means for overall Pin activity, and how Pinterest is working to protect users.
Does that mean there’s long been more child exploitation and self-injury content in the app, which Pinterest has only recently been able to detect? It’s good that Pinterest is improving its systems, but it also suggests that a lot of this material had been active on the platform for some time.
Pinterest says that the majority of violative material is removed before anyone ever sees it – and again, it’s critical that Pinterest continues to improve on this front. But it’s worth noting that at least some of the worst kinds of content had been viewable in the app for some time before these more recent updates.
Indeed, back in May, the Daily Dot reported that users had discovered various child grooming accounts on Pinterest, which may have sparked the platform’s extra attention on this element. Pinterest also recently issued an apology to the family of UK schoolgirl Molly Russell, an avid Pinterest user who had viewed self-harm material in the app. Russell, 14, took her own life after being exposed to harmful content online.
Again, it’s a positive that Pinterest has responded to such cases and upped its enforcement in these areas, but it could also suggest that there are other areas where Pinterest isn’t as stringent, because they haven’t yet attracted media attention.
But that’s also speculative. It does seem like media pressure has spurred Pinterest to take more action on certain elements, but it’s impossible to know the full extent of each issue without additional oversight.
So, on one hand, it’s good that Pinterest is taking more action, but on the other, it may still have work to do.
But we can only go on the data provided, and it does appear that Pinterest’s systems are improving, helping to protect users from exposure to the worst types of content, while also reducing the amplification of such material.