What the YouTube and Facebook statistics aren’t telling us
- 18 April 2019
- Photo: Martin Kenny
After the terrible attack in New Zealand, the large social media platforms published figures on their efforts to limit the spread of the video of the attack. What do those figures tell us?
Attack on their reputation
Terrorism, such as the attack in New Zealand, presents a challenge for all of us - and therefore also for the dominant platforms in our digital communications. They had to work hard to limit the spread of the attacker's live stream on their platforms, if only to limit the reputational damage.
And that, of course, is why companies like Facebook and YouTube published statistics afterwards. All of that was meant to show that the problem was very complex, but that they had done their utmost. YouTube reported that a new version of the video was uploaded every second during the first hours after the attack. Facebook said that it blocked one and a half million uploads in the first 24 hours.
Figures that are virtually meaningless
Those figures may play well in the media, but without a whole lot more detail, they are virtually meaningless. The figures say nothing about the effectiveness of the effort, and even less about its unintended consequences. Both platforms had very little to say about the videos they missed and therefore did not remove. Those are exactly the videos that were able to reach a sizeable audience. We encountered them as well: just last week, we notified Facebook of a video of the attack that had already been on the platform for a few days.
In violation of their own rules
Something else that the figures do not show: how many videos have been wrongfully removed? Facebook says, for example: "Out of respect for the people affected by this tragedy and the concerns of local authorities, we're also removing all edited versions of the video that do not show graphic content." So this is content that apparently does not violate the platform's own rules, but that is blocked out of deference to the next of kin.
However empathetic that might be, it also shows how much our public debate depends on the whims of one commercial company. And what happens to videos from media outlets reporting on the events? Or a video by a relative of a victim, who uses footage from the moments preceding the attack in a commemorative video of their own? In short, it is very problematic for a dominant platform to make such decisions.
Blind to the context
It is equally important to dwell on the following example. Between 2012 and 2018, YouTube took down more than ten percent of the videos on the Syrian Archive account. The footage documented the attacks in Syria, as well as their terrible consequences. YouTube saw only 'violent extremism' and took down the videos. Apparently, YouTube's filters didn't properly recognize the context. Publishing such a video can be intended to recruit others to armed conflict, but can just as well be documentation of that armed conflict. Everything depends on the intent of the uploader and the context in which it is placed. The automated filters have no regard for that intent, and are blind to the context.
Anything but transparent
Those automated filters usually work on the basis of a mathematical summary (a fingerprint) of a video. If the summary of an uploaded video is on a list of summaries of terrorist videos, the upload is refused. The dominant platforms work together to compile this list, but they're all very secretive about it. Outsiders do not know which videos are on it. Of course, that starts with the definition of terrorism: it is often far from clear whether something falls within that definition.
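In simplified terms, such a filter works roughly like the sketch below. This is a minimal illustration under our own assumptions, not the platforms' actual implementation: the shared list of fingerprints is not public, and real systems use perceptual hashes that still match re-encoded or lightly edited copies, whereas this example uses an ordinary cryptographic hash for brevity.

```python
import hashlib

# Illustrative, made-up blocklist of video fingerprints ("summaries").
# The real list is compiled and shared by the platforms and is not public.
BLOCKED_FINGERPRINTS = {
    "3f786850e387550fdab836ed7e6dc881de23001b",  # placeholder value
}

def fingerprint(video_bytes: bytes) -> str:
    # A plain SHA-256 hash is used here only for illustration; in practice a
    # perceptual hash would be needed so slightly altered copies still match.
    return hashlib.sha256(video_bytes).hexdigest()

def allow_upload(video_bytes: bytes) -> bool:
    # Refuse the upload if its fingerprint appears on the shared blocklist.
    return fingerprint(video_bytes) not in BLOCKED_FINGERPRINTS
```

The weaknesses discussed in this article follow directly from this design: the match is made on the file itself, not on the intent of the uploader or the context of the publication.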
And then, of course, the definition differs between the countries in which these platforms are active. The Americans use a definition of terrorism that is different from ours. And Erdogan's idea of terrorism is probably different again. That makes the list even more difficult to use, because platforms have little regard for national borders. Even if (if!) such an automatic filter were to function properly, it would still block too much in one country and too little in another.
Objecting can be too high a hurdle
As we mentioned, the published figures say nothing about the number of videos that were wrongfully removed. Of course, that number is a lot harder to measure. You could ask platforms to also report how often someone objected to a decision to block or remove content. But those figures would also say little, given that the procedure for such an objection is often cumbersome and lengthy. Often enough, uploaders will simply decide it's not worth the effort, even if the process would eventually have allowed them to publish their video.
One measure cannot solve this problem
No, we do not believe that this can be solved with better computers or more human moderators. It is simply not possible to serve the whole world with one interface and one moderation policy. It is problematic that we have to communicate in an online environment dominated by so few platforms, platforms that can never scan all those communications and reliably detect potentially objectionable forms of expression.
Users should once again have something to choose from. We have to move to a world where platforms are focused on the type of conversation, its participants and the region in which it takes place. By making ourselves dependent upon a few juggernauts, we are selling ourselves short. Much too short.
The translation of this article was provided by volunteers of Bits of Freedom, amongst them Jeroen Brakkee.