Last week, Meta, the owner of the Facebook app, allowed users in some countries to call for violence against Russians and Russian soldiers in the context of the Ukraine invasion. Specifically, users can call for the death of the Russian invaders of Ukraine, but not of prisoners of war or of Russian civilians. An exception is President Putin: Meta allows you to call for his death. And, for good measure, for the death of the president of Belarus as well. However, you are not allowed to specify how or where Russian invaders, President Putin, and President Lukashenko should meet their death.
Also, whereas praise for the white-supremacist, neo-Nazi Azov special operations detachment of the Ukrainian National Guard was previously banned from Facebook, it is now allowed to praise them as real heroes and patriots who defend Ukraine against the Russian invader. But it is not allowed to praise them as followers of Hitler or as defenders of Ukraine’s white-nationalist heritage, not even in the context of the Russian invasion.
Statistical delivery responsibility
We see here a new kind of responsibility in operation, different from the editorial responsibility of newspapers, which I call statistical delivery responsibility. Meta makes a statistical estimate of what type of content will increase users’ engagement (screen time) and delivers that kind of content to them. It does not produce that content and does not take responsibility for it. But it does make delivery decisions that statistically maximize screen time.
The Facebook app is not a broadcasting channel but an intelligent communication channel. Its algorithm looks at the content of a message, the characteristics of the sender, the potential receivers, and their news feeds, and then decides who will receive which message. Users are responsible for writing the message; Meta is responsible for delivering it. Users are responsible for content; Meta is responsible for choosing the users who will see that content.
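Schematically, the delivery decision can be pictured as follows. This is a purely illustrative sketch: Meta’s actual ranking system is proprietary, and every name in it, including predict_engagement, is invented for the example.

```python
# Illustrative sketch only: Meta's real ranking system is proprietary.
# predict_engagement stands in for whatever statistical model estimates
# the screen time a message will generate for a given user.
def deliver(message, candidate_receivers, predict_engagement, threshold=0.5):
    """The sender writes the message; the platform picks the audience
    for which delivery is predicted to maximize engagement."""
    return [user for user in candidate_receivers
            if predict_engagement(message, user) > threshold]
```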
Section 230 of the Communications Decency Act says that Meta is allowed to block some messages but is not responsible for the messages that it does not block. In other words, Facebook can decide not to deliver a message to anyone based on its content, but it is not responsible for the content of the messages that it decides to deliver. This gives it the freedom to use message content to maximize screen time without worrying about any other impact of the message.
But there is a problem. Senders have come to view the ability to broadcast what they write as their right: they see it as an obligation of Meta to deliver their messages to whoever matches their content. Receivers, at the same time, have come to view the messages they receive as carrying Meta’s stamp of approval. After all, Meta decided to deliver these messages to them.
Meta is held to account by its users if it fails to deliver on these expectations. It responds by formulating rules to block messages that satisfy certain criteria, such as the ones about Russian invaders. It will only deliver messages that comply with its policy.
Statistical censorship
And here we enter the world of statistics. Facebook’s nearly 2 billion daily active users share about 100 billion messages per day. Let’s assume that a moderator can check a message in the absurdly short average time of 2.5 minutes that Facebook gives them. Then Meta would need 500 000 000 moderators working 8 hours straight every day to check all messages delivered to Facebook users. That is the only way Meta could take responsibility for its decisions to deliver these messages, and no others, to a user.
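As a back-of-envelope check, here is the arithmetic behind that headcount; the message volume and review time are the assumptions stated above, not official Meta figures.

```python
# Back-of-envelope estimate of the moderator headcount needed to review
# every delivered message. All inputs are the text's assumptions.
messages_per_day = 100_000_000_000   # ~100 billion messages shared per day
minutes_per_message = 2.5            # assumed average review time per message
workday_minutes = 8 * 60             # one moderator working 8 hours straight

moderators_needed = messages_per_day * minutes_per_message / workday_minutes
print(f"{moderators_needed:,.0f}")   # 520,833,333 -- roughly 500 000 000
```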
As of June 2020, Meta employed 15 000 moderators. By now that number should have at least doubled. But it comes nowhere near the 500 000 000 moderators needed to take responsibility for all of Meta’s delivery decisions.
To shrink the moderation problem, Meta only moderates messages that are reported by users. In 2020, 3 000 000 messages were reported per day. With an error rate of 10%, 300 000 messages were falsely blocked or falsely delivered each day.
And AI won’t help. Even at a ridiculously, indeed unattainably, low error rate of 0.1% applied to all 100 billion daily messages, Meta would still make 100 000 000 delivery mistakes per day.
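The same arithmetic shows why even tiny error rates are overwhelming at this scale; again, the volumes and rates are the assumptions above.

```python
# Daily moderation mistakes at the error rates discussed in the text.
reported_per_day = 3_000_000               # user-reported messages per day (2020)
print(f"{reported_per_day * 0.10:,.0f}")   # human error rate 10%: 300,000 per day

messages_per_day = 100_000_000_000         # if an AI screened every message
print(f"{messages_per_day * 0.001:,.0f}")  # AI error rate 0.1%: 100,000,000 per day
```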
The solution chosen by Meta is to discharge its delivery responsibility by statistical censorship. Meta defines rules to deliver messages that increase engagement and to block messages that decrease engagement. If there is widespread demand for messages with a particular kind of content, delivering those messages generates more engagement. If 2 billion flies recommend eating shit, then deliver the message to eat shit.
Meta cannot guarantee that the rules are unambiguous, nor that they are always followed. But thanks to Section 230 it is not responsible for the content of the delivered messages, even though it uses that content to make the delivery decision.
Delivering death wishes
What does this mean for the rules about messages on the Russian invasion of Ukraine? Do these rules reflect the desire of users in a number of countries to spread such messages? Does Meta think this contributes to its mission to make the world more open and connected?
Whatever the case, the rules are hard to understand, let alone for a moderator to apply in an average of 2.5 minutes. And how are the criteria to be translated into other languages? The rules only apply to messages sent in Armenia, Azerbaijan, Estonia, Georgia, Hungary, Latvia, Lithuania, Poland, Romania, Russia, Slovakia, and Ukraine. More than 11 languages are spoken in these countries.
Clearly, this is a mess. Meta does not take responsibility for the death wishes it delivers, because it is protected by Section 230. And even if it were not so protected, it could not take responsibility for the death wishes it delivers, because that is statistically impossible.
There are solutions to this: Move to a subscription model with a base fee and a fee for usage. Limit the number of receivers of a message. Take editorial responsibility for the messages you deliver with advertisements.
These and other known solutions can be realized with current technology. But they would cause a drastic reduction of Meta’s revenue (US$118bn in 2021, up from US$86bn in 2020). The solutions don’t “scale”, meaning that they do not generate outsize profits. Rather, they would downscale Meta’s revenue. But if that is what it takes to prevent a private company from making decisions about the audience of our death wishes, then I propose to Meta: bite the bullet. Downscale to a business model you can take responsibility for. You will still be well off, and the world will be a better place.