You want us to think of the children? Couldn’t agree more
Opinion If your cranky uncle were this fixated on anything, you’d always be somewhere else at Christmas. Yet here we are again. Europol has been sounding off at Meta for harming children. Not for the way it’s actually harming children, but because – repeat after me – end-to-end encryption is hiding child sexual abuse material from the eyes of the law. “E2EE = CSAM” is the new slogan of fear.
Encryption as the enemy of the people has been the manic state preachers’ sermon of choice ever since Bill Clinton was influenced by the NSA in 1993. His administration’s attempt to enforce an official, mandated backdoor, the Clipper chip, fell apart under its own contradictions, and it set the trifecta of touchstones by which this sort of nonsense has been judged ever since: weakening end-to-end encryption can’t work, trying to make it work will be actively harmful, and there’s no evidence that it’s a good idea.
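To be concrete about what that trifecta defends, here is what end-to-end encryption means in practice: only the endpoints hold the keys. A minimal sketch, using the PyNaCl library – the key names are ours, and the escrow step at the end is the Clipper idea in miniature, not anything any messenger actually ships:

```python
# E2EE in miniature with PyNaCl (pip install pynacl).
from nacl.public import PrivateKey, Box

# Each endpoint generates its own keypair; private keys never leave the device.
alice_key = PrivateKey.generate()
bob_key = PrivateKey.generate()

# Alice encrypts to Bob: only Bob's private key can open this.
a_to_b = Box(alice_key, bob_key.public_key)
ciphertext = a_to_b.encrypt(b"meet at noon")

# The relaying server sees only ciphertext; without bob_key it cannot decrypt.
b_from_a = Box(bob_key, alice_key.public_key)
assert b_from_a.decrypt(ciphertext) == b"meet at noon"

# A Clipper-style escrow scheme means also encrypting a copy to a
# mandated third key (hypothetical here). The secrecy of every message
# in the system now depends on that one key never being stolen or leaked.
escrow_key = PrivateKey.generate()
escrowed_copy = Box(alice_key, escrow_key.public_key).encrypt(b"meet at noon")
```

The last two lines are the whole argument: the moment a mandated third key exists, the confidentiality of every conversation rests on it never leaking – ever, to anyone.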
These points were emphasized by Robin Wilton of the Internet Society in an interview with The Register last week, where he noted that this tired old beat on a clapped-out drum presents no evidence, suggests no secure alternative, and answers none of the objections raised over years of this debate. Indeed, as Wilton pointed out, if increased use of E2EE were hindering CSAM seizures, it would show in the stats. It doesn’t.
Thinking of the children is not a good enough excuse to destabilize the entire internet. Ironically, the victims of this sort of simplistic, panicky, magical thinking are young people themselves. They are a high-value target in all this, and weakening their own transactional security is as bad an idea as it sounds. Another attack on them is going on in plain sight. As a recent report from Dublin City University shows, the TikTok and YouTube Shorts algorithms are devastatingly effective at aiming streams of misogynistic and male-supremacist material at their users. The most abundant, and most vulnerable, of those users are teenagers. This is not a neutral act.
All this feeds into the newest battle for the souls of the children. There is a crisis in young people’s mental health, the argument goes, and it’s the fault of social media. That comes via mobile phones, so let’s stop them using the devices.
It’s unclear how this last bit will be done. Sound familiar? There is talk of banning mobile phones for under-16s, which is unmatched in stupidity when most homes have six or seven of the things sitting unused in the e-waste drawer. Also, there are about 37 million teenagers in the US and UK – good luck with that.
Another proposal is to emulate the Chinese government and impose strict time limits on phone usage and video games – China hasn’t got the message that the video game moral panic was so 2018. How this would be done without the state mechanisms of authoritarian social and industrial control is glossed over.
As for the presumption that there’s a mental health crisis among young people in the first place, this is used to call for all sorts of things short of adequate mental healthcare for them. There are people who say that there is such a crisis and that it is the fault of screen time. They write books and are favorably quoted by politicians and activists. One such is Jonathan Haidt, an American psychologist whose The Anxious Generation is very much in vogue and quoted as evidence that something indeed must be done. Other psychologists find his methodology and groundwork less than convincing, but why would that bother anyone?
There are a lot of unhappy kids around, yes, and adults too. The pandemic was tough for us, twice as tough for them. For those coming into the responsibilities of adulthood, the world looks highly uncertain: jobs, money, houses, war. One of the few certainties is that everyone’s lives will be lived to a great extent online. Not letting young people learn those skills would be truly destructive.
None of this matters to the “something must be done” chums, for whom uncertain evidence, impossible cures, and worse outcomes are, as we have noted, no barriers at all.
Very well then: if our industry doesn’t want to be used as an excuse for moronic, well-meaning over-reaction, let’s show them what can be done. Fix the toxic algorithms, and create standard, easy-to-use, cross-platform, secure, and powerful parental controls for children’s devices – ones that let parents decide what’s right and tell the children what’s going on, along the lines sketched below.
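What might that look like? A purely hypothetical sketch in Python – none of these field names belongs to any shipping product, and real controls would need platform enforcement underneath – but the shape is the point: the parent sets the rules, and the child can read them:

```python
# Hypothetical cross-platform parental-control policy. Every field name
# here is an assumption for illustration, not an existing API.
from dataclasses import dataclass, field

@dataclass
class ParentalPolicy:
    child_device_id: str
    daily_screen_minutes: int = 120          # parent-chosen, not state-chosen
    blocked_categories: list[str] = field(
        default_factory=lambda: ["gambling", "adult"]
    )
    bedtime_lockout: tuple[str, str] = ("21:30", "07:00")
    visible_to_child: bool = True            # no silent surveillance

    def explain(self) -> str:
        """Render the policy in plain language for the child to read."""
        return (
            f"Screen time: {self.daily_screen_minutes} minutes a day. "
            f"Blocked: {', '.join(self.blocked_categories)}. "
            f"Locked from {self.bedtime_lockout[0]} to {self.bedtime_lockout[1]}."
        )

policy = ParentalPolicy(child_device_id="example-tablet")
print(policy.explain())
```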
The home is where those decisions should be made. If the industry commits to doing this to the best of its abilities, in consultation with lawmakers, activists, parents, and children, then far worse options will be avoided for the good of all.
Oh, and did we say? Stop those toxic algorithms. ®