
Google accuses you of child abuse? Impossible! Right?

The European legislature is working on a proposal that could force companies to scan all the messages we exchange for child sexual abuse material. The goal is noble, of course, but things can very easily go wrong. And when they do, you might suddenly find yourself accused of sexually abusing children.

All photos safe in the cloud

The New York Times reported a month ago, in "A Dad Took Photos of His Naked Toddler for the Doctor. Google Flagged Him as a Criminal.", about a father in the U.S. whose child was sick. The boy's penis was swollen and painful. Their general practitioner asked the father to send pictures of the boy's genitalia. The father had taken those pictures with his phone. The phone was set up to automatically upload every photo to Google's cloud as well. Handy, in case he lost his phone. But Google's computers use artificial intelligence to analyze the photos of all users. If a photo points to possible sexual abuse of children, all alarm bells go off. And that is what happened here, with all its consequences: Google deleted the father's account and alerted the police.

Human as fail-safe

Whoops. There was no child abuse at all in this story. But because the photos are evaluated automatically, such errors are unavoidable, in part because computers are much worse than humans at assessing the context in which a photo was taken. That is why policymakers who want to use this type of technology often propose a human fail-safe: before any action is taken, a person, not a machine, must first have looked at the image. But that is a poor "solution" for a system that we cannot get to work flawlessly. Because when you think about it, that way of checking means asking ordinary people at ordinary companies to look at images of child abuse. These are not people who are trained for this, or who receive the much-needed psychological support for it, as the people working at the special victims unit of the police do. And to make matters worse, these companies have a particularly bad reputation when it comes to supporting their moderators.
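To see why errors are unavoidable at this scale, a rough back-of-the-envelope calculation helps. The sketch below uses purely illustrative numbers (the upload volume, the prevalence of abusive material and the error rates are our assumptions, not figures published by Google): even a scanner that wrongly flags only 0.01% of innocent photos produces a flood of false accusations, simply because innocent photos vastly outnumber abusive ones.

```python
# Back-of-the-envelope: false accusations from automated scanning at cloud scale.
# All numbers below are illustrative assumptions, not figures from Google.

photos_scanned_per_day = 1_000_000_000  # assumed daily upload volume
abuse_prevalence = 1e-6                 # assumed fraction of photos that are actually abusive
false_positive_rate = 1e-4              # assumed: scanner wrongly flags 0.01% of innocent photos
true_positive_rate = 0.99               # assumed: scanner catches 99% of actual abuse

actual_abuse = photos_scanned_per_day * abuse_prevalence
innocent = photos_scanned_per_day - actual_abuse

true_hits = actual_abuse * true_positive_rate    # abusive photos correctly flagged
false_hits = innocent * false_positive_rate     # innocent photos wrongly flagged

print(f"correct flags per day:   {true_hits:,.0f}")
print(f"false accusations/day:   {false_hits:,.0f}")
print(f"share of flags that are wrong: {false_hits / (true_hits + false_hits):.1%}")
```

With these assumed numbers, roughly ninety-nine out of every hundred flags would point at an innocent photo: about a hundred thousand false accusations every day, each one supposedly to be checked by a human.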

A major tech company is accusing you (wrongly!) of child sexual abuse. Very troubling.

Mistakes continue to occur

In addition, in the Netherlands, viewing images of child sexual abuse is punishable by law. So you would be forcing companies to hire employees to do something illegal. In response, policymakers came up with a new proposal: an independent body would be set up to assess such images. Companies would forward your photos to it without looking at them themselves. But that practice is, of course, also deeply problematic, because it puts the confidentiality of your photos in even more peril. What parent would want to see sensitive photos of their child shared even further?

In the eyes of some policymakers, a human fail-safe could fix the problem of computer errors. The New York Times story shows that human control at Google did not work so well: Google reported its user to the police after its own human check. And even after the police confirmed that nothing illegal had happened, Google stuck to its own truth.

False accusation

Take a moment to think about that. A major tech company is accusing you (wrongly!) of child sexual abuse. Very troubling. And then it reports you to the police. Even more troubling. The police then start an investigation into you. For the father from the New York Times story, it meant that the police requested all available data about him from Google, such as his location data, search history, photos and files. That may be understandable, but it is just as troubling. And that father, ironically, is a developer of the very type of software that labelled him a criminal, so he probably knows relatively well how to fight back. But the vast majority of the population is less resilient and less familiar with the small print...

The legislator wants technology companies to monitor communications that ought to be protected with end-to-end encryption.

The handling of the situation described above could easily be dismissed as the voluntary (and foolish!) action of a single tech company. Or as an isolated incident. But it isn't. There are more examples, including ones involving Dutch users.

Europe wants to make this the norm

But there is more to consider. The European legislature is now discussing a bill (see "European Commission wants to eliminate online confidentiality") that could force companies to scan all messages from all customers for child sexual abuse. They now want to enforce a practice that has already been proven ineffective and harmful at Google and other technology companies. So we are going to see stories like this one more often.

Confidentiality of communication is indispensable for everyone, including children and victims of sexual abuse.

It is even more concerning that the European Commission's proposal endangers confidentiality on the Internet. It wants technology companies to monitor communications that right now are still protected with end-to-end encryption. But a message that is properly end-to-end encrypted cannot be read along the way: to comply, companies would have to weaken that encryption or inspect messages on your device before they are encrypted. And the confidentiality of communication is indispensable for everyone, including children and victims of sexual abuse. The proposal must therefore be dropped. Bits of Freedom and our European partners are working very hard to achieve exactly that. Will you help us? Donate now!

This article was translated from Dutch by Celeste Vervoort and Philip Westbroek.
