A few years back, Scunthorpe Hospital installed a new computer system.

On the first day it was switched on, the staff began using email as usual.

Around lunchtime they noticed they weren’t getting any replies to their emails.

They quickly checked the new system and found it was working fine; too well, in fact.

The problem was the new system’s profanity filter: it was designed to spot obscene words and block them.

Of course, a computer system doesn’t understand what words are; the only way it can spot a word is by matching a string of letters.

The system had been given a list of seven words which were considered obscene and not to be allowed.

The most obscene of these words was spelled out by the 2nd, 3rd, 4th, and 5th letters of Scunthorpe’s name.

The name that was on every email the staff sent out.

So as soon as anyone outside the hospital pressed reply, the system read the letters in the address, and the email was automatically blocked as obscene.
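A minimal Python sketch of the kind of naive filter described above (the hospital’s actual system is unknown; the blocklist, function, and address here are invented for illustration):

```python
# Naive profanity filter: scans for blocked substrings anywhere in the
# text, with no notion of word boundaries. This is what trips over
# place names like Scunthorpe.

BLOCKLIST = {"scunthorpe"[1:5]}  # letters 2-5 of the town's name


def is_blocked(text: str) -> bool:
    """Return True if any blocklisted string appears anywhere in text."""
    lowered = text.lower()
    return any(bad in lowered for bad in BLOCKLIST)


# A perfectly innocent email address gets rejected:
print(is_blocked("reply-to: admin@scunthorpe-hospital.example.uk"))  # True
```

A word-boundary check (e.g. a regex with `\b` around each blocked word) would let the town’s name through, which is presumably how later systems patched the problem.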

This has become known in computer circles as ‘the Scunthorpe problem’.

Systems all over the world were, and still are, rejecting strings of letters they believe to be offensive words.

Belgian political candidate Luc Anus was blocked for this reason.

So was Jeff Gold’s website, Shitake Mushrooms.

Arun Dikshit had the same problem, so did Ben Schmuck, also Mike Dickman, Craig Cockburn, Douglas Kuntz, James Butts, and Brian Wankum.

Places like Penistone, Middlesex, Clitheroe, and Lightwater were rejected for the same reason.

The Royal Society for the Protection of Birds was blocked for tits, cocks, boobies and shags.

Manchester Council planning department had problems with emails mentioning erections.

A councillor from Dudley was blocked for telling visitors the local faggots were tasty.

Even Arsenal football club and French TV station Canal Plus had similar problems.

Another filter automatically changed the word ass to butt, so ‘classic’ became ‘clbuttic’ and ‘assassinate’ became ‘buttbuttinate’.
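That filter’s behaviour is easy to reproduce: a blind substring replacement with no word-boundary check. A sketch (the original filter’s code is unknown; this just demonstrates the effect):

```python
def clean(text: str) -> str:
    """Replace every occurrence of 'ass' with 'butt', even mid-word."""
    return text.replace("ass", "butt")


print(clean("classic"))      # clbuttic
print(clean("assassinate"))  # buttbuttinate
```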

The Horniman Museum faced restrictions, as did several Dick Whittington pantomimes.

So, it would seem the problem with technology is merely excessive vigilance.

Well, not quite. American tech writer Kaveh Waddell tried an experiment.

He submitted seven advertisements to run on Facebook, all containing extremely dangerous, fake Coronavirus advice.

The ads were for a fictitious advertiser called the Self-Preservation Society.

One ad said: Coronavirus is just a hoax, carry on living your life as normal.

Another ad said: Social distancing is ineffective, ignore it.

A third said: People under 30 are completely safe from Coronavirus.

A fourth said: Don’t stay in, just return to normal.

All these ads were approved to run within minutes.

But before they could run, Waddell had the site taken down; he’d made his point.

Facebook made $30 billion from advertising last year. They say they don’t have enough human workers, so automated ad-screening is done by algorithms.

But it seems technology is failing on both counts.

It’s failing to block what it should block, and blocking what it shouldn’t block.

Don’t we just need a few good old-fashioned human brains in there somewhere?

Something smarter than technology.