Daniel Oberhaus | August 20, 2021

The Scunthorpe Edition

On Dycks, Butts, and the eternal challenges of online content moderation


Daniel Oberhaus (DMO) is a science journalist turned content marketer based in New York City. His first book, Extraterrestrial Languages, is about the art, science, and philosophy of interstellar communication. He previously wrote the WITI battery edition and the WITI salvia edition.

Daniel here. Earlier this year, Facebook’s moderation bots deleted the page for a small town in northeastern France for ostensibly violating the social media platform’s terms and conditions. The town’s crime? Using its real name for its Facebook page: Ville de Bitche.

Shortly after it was taken down, the Bitche admins tried to contact the company to get the page reinstated, but their pleas fell on deaf ears. It was only after the story of Bitche’s plight went viral that Facebook acknowledged the error and restored the township’s page. 

In case it isn’t obvious what happened, Bitche bears a close resemblance to a rude epithet in English, which would violate Facebook’s T&C if it were used as the name of an official page for a city. So the platform’s automated moderation system flagged it for removal and swept it into a digital dumpster with the thousands of other rude, lewd, and occasionally illegal posts it scrubs from its platform every day.

Why Is This Interesting? 

This wasn’t Bitche’s first run-in with the Facebook police. The town ran into a similar issue when it first set up its page in 2016. And it’s hardly the first town to be censored by a well-meaning algorithm. In fact, the issue is so common that computer scientists have a name for it: the “Scunthorpe problem.”

Scunthorpe is a mid-sized industrial town in the UK, about an hour’s drive from Leeds, and in 1996 it exposed an intractable challenge at the heart of automated content moderation. Like Bitche, Scunthorpe’s name contains a naughty word, which prevented its residents from signing up for AOL using their real location.
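To see just how blunt this instrument is, here’s a minimal sketch of the kind of naive substring filter behind these incidents. The blocklist, function name, and test strings are illustrative assumptions, not AOL’s or Facebook’s actual code:

```python
# A naive profanity filter: flag any input that merely *contains*
# a blocked word as a substring. Blocklist entries are illustrative only.
BLOCKLIST = {"bitch", "hoe"}

def is_obscene(text: str) -> bool:
    lowered = text.lower()
    return any(bad in lowered for bad in BLOCKLIST)

print(is_obscene("Ville de Bitche"))  # True  -- a real town, wrongly flagged
print(is_obscene("horseshoe"))        # True  -- "hoe" lurks inside "horseshoe"
print(is_obscene("Leeds"))            # False
```

Scunthorpe fell to exactly this kind of check: the offending string sits in the middle of the town’s name, and the filter has no way of knowing a town is speaking.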

Although AOL soon fixed this specific problem for Scunthorpe’s residents, the general challenge of mistaken obscenity hasn’t gone away. It’s still a very real problem for people with last names like Weiner, Medick, Butts, Dyck, or Schmuck. And paleontologists giving virtual talks on pubic bones. And history buffs headed to Plymouth Hoe.
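The obvious patch is to match whole words instead of raw substrings, which does rescue Scunthorpe and Bitche. But it can’t rescue Mr. Dyck or Plymouth Hoe, because there the flagged word and the legitimate name are one and the same. A hedged sketch of that word-boundary variant, again with illustrative entries:

```python
import re

# Same illustrative blocklist, but matched only as whole words
# using \b word boundaries.
BLOCKLIST = {"bitch", "hoe"}
PATTERN = re.compile(
    r"\b(?:" + "|".join(map(re.escape, sorted(BLOCKLIST))) + r")\b",
    re.IGNORECASE,
)

def is_obscene(text: str) -> bool:
    return PATTERN.search(text) is not None

print(is_obscene("Ville de Bitche"))  # False -- word boundaries clear the town
print(is_obscene("horseshoe"))        # False -- the substring false positive is gone
print(is_obscene("Plymouth Hoe"))     # True  -- here the place name IS the word
```

Each refinement trades one class of false positives for another; telling a surname from a slur ultimately requires context, which is exactly what a string matcher doesn’t have.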

On the surface, these are funny gaffes that allow us to feel superior to machines for a little while longer and revel in the timeless humor of punny nomenclature. Ben Dover. Mr. Glascock. Ville de Bitche. These will still elicit giggles from the back of the classroom long after the machines have taken over.

But the Scunthorpe problem is an instance of the very serious issue of ceding responsibility—and power—to algorithms. It’s fundamentally a problem of mistaken identity that occurs when a machine lacks the context to differentiate between a legitimate input and a malicious one. In every case, the result is the same: a user is wrongfully denied access to a tool, platform, or information. And in many cases, the user doesn’t have recourse to fix the problem. Bitche’s admins discovered this firsthand when they contacted Facebook and were stonewalled. 

The usual excuse is that the Scunthorpe problem is a necessary evil. Facebook manages a platform used by 2 billion people, and it must balance user safety with user freedom on an unprecedented scale. It has legions of human moderators working around the clock to scrub the most heinous content from the platform—child porn, gore, animal abuse, you name it—and they can’t possibly catch everything, even with an assist from an army of bots. So sometimes, as in the case of Bitche, the algorithms err a bit too far on the side of caution and remove content that only appears to break a rule.

But this is a dangerous abdication of accountability from organizations that have taken it upon themselves to police online spaces. By Facebook’s own admission, more than 1 in 10 posts it removes from its platform have not violated any rules—that shakes out to about 300,000 wrongfully deleted posts per day. Of course, most of these posts are not instances of the Scunthorpe problem, which was only a bellwether for a far larger problem resulting from the increasing automation and corporatization of the modern web. Facebook gets the most scrutiny due to its size, but the problem is found everywhere bots act as gatekeepers. And it’s only getting worse.

Nearly a quarter of a century after the Scunthorpe problem was given a name, we still lack an adequate solution. Ask the technologists and they’ll tell you we need smarter AI that can understand human content in context. Ask the Luddites and they’ll tell you content moderation is a job only fit for humans. Ask the content moderators and they’ll tell you they’re sick of doing a robot’s job. Before we find a solution, we need to acknowledge the urgency of the problem. We stand on the cusp of a future where “computer says no,” but this time Carol is nowhere to be found. (DMO)

Pun of the Day

When the US 100th Infantry Division came from South Carolina and liberated Bitche, France, they adopted the nom de guerre “Sons of Bitche.” Image via Ville de Bitche (DMO)


Thanks for reading,

Noah (NRB) & Colin (CJN) & Daniel (DMO)

Why is this interesting? is a daily email from Noah Brier & Colin Nagy (and friends!) about interesting things. If you’ve enjoyed this edition, please consider forwarding it to a friend. If you’re reading it for the first time, consider subscribing (it’s free!).

© WITI Industries, LLC.