The new boss of Twitter fantasizes about absolute freedom of expression. That vision risks colliding with the broader trend of purging platforms of illicit content, argues Julie Owono, director of the Content Policy & Society Lab and a member of Meta's Oversight Board.
Some controversies loom large in Europe but not at all in the United States. You work in the heart of Silicon Valley. Do you see the same agitation there around the takeover of Twitter?
This time, paradoxically, the perceptions are quite close. Many media outlets analyze this takeover of Twitter through the lens of freedom of expression. We can see a desire, as users and citizens, to have more say in how our online expression, especially on Twitter, is moderated. That is the common point. What differs is that the American political class is deeply divided over how to respond to the problems certain content on social networks can pose, such as online harassment and hate speech. Here, that debate crystallizes around a single text: Section 230 of the Communications Decency Act, a body of law governing, among other things, communication on online services. That provision, adopted more than twenty-five years ago, entrenched the regime under which platforms are not liable for content created and shared by their users. On one side, Democrats worry that Twitter will stop moderating disinformation, particularly Republican disinformation in the run-up to the midterm elections, and are debating what course of action to take. On the other, Republicans, especially the hardest wing of the party, are delighted that Twitter might become the public square Elon Musk envisions, one where, as such, the First Amendment can find new life. So yes, we find the same debate about the impact on freedom of expression, but with the partisan coloring that divides the Democratic and Republican lines on the question.
To practice moderation is to receive sometimes very contradictory orders from different States.
You are a member of Meta's Oversight Board (Facebook and Instagram), the body responsible for deciding what may circulate on those platforms and what must be removed. Do you understand the concerns of organizations that defend freedom of expression?
We cannot read the future, but what is certain is that there is a long history between the press and Elon Musk. Let's say that the questions certain journalists have raised about the management of his companies have not always been well received. There is also another troubling precedent: coordinated harassment campaigns by his followers on the platform. We saw another example very recently when Elon Musk criticized certain Twitter teams, in particular those in charge of security issues. Those teams went through what others, such as journalists, had experienced before them: finding themselves assailed by insults and attacks. That said, I want to give him the benefit of the doubt. First, because we must remember that these questions of freedom of expression are age-old, yet very recent in the history of the Internet. And second, because anyone can be wrong. We can decide to do certain things without necessarily intending to harm others. But you have to be able to learn from your mistakes, and I think Elon Musk would benefit greatly from that humility. Especially since even the most permissive platforms, like Parler (Editor's note: the platform of the American far right), have had to face these realities: there are things you can let pass and others you cannot. Child sexual abuse material, coordinated attacks, insurrections that threaten democratic institutions: these are not acceptable. To claim otherwise would be to misread the First Amendment, which does not allow you to say whatever you want.
As powerful as he is, the boss of Tesla is not above the law. He must respect each country's legislation. Europe, moreover, is in the process of strengthening its legislative arsenal to better regulate the Web…
He will have to understand that today, indeed, practicing moderation means receiving sometimes highly contradictory orders from different states, many of which have adopted laws regulating online content. Case in point: in Brazil, the Bolsonaro administration wants to pass a law that would prevent disinformation from being moderated. The European Union, by contrast, has decided to equip itself with a legal arsenal through the Digital Services Act (DSA), which imposes duties of vigilance and precaution on platforms. From now on, when you plan new features, you must explain to the European regulator and the national regulators of each Member State what risk analysis has been carried out and why it suggests the new feature or product is safe to launch. Another obligation: the European Union will now require you to explain your algorithms, which should suit Elon Musk, since he has said he intends to do so. But transparency also means accepting that outsiders can access your algorithm and study it. So a major question remains: will Elon Musk invest in the cybersecurity of his users, by which I mean public cybersecurity grounded in clear rules and procedures? The success of the platform he has just acquired may depend on it.
Moderation sometimes leads to absurd situations, such as when Meta censored the historic "Napalm Girl" photograph, works of art depicting nudity, or photos of breastfeeding women. Conversely, conspiracy theories and racist and sexist discourse abound…
From the moment three billion humans are on Meta's platforms, such theories circulate and always will… But you should know that content moderation is only the tip of the iceberg; there is a whole submerged part. That is why I created the Content Policy & Society Lab: to develop content policy, meaning the full set of rules that moderation must obey. For the moment, in most cases, we are navigating by sight, which is extremely dangerous in crisis situations, such as the war in Ukraine, that have to be managed on the fly. In democratic nations, we should instead be in a position where clearly defined principles guide moderation. We must also dare to ask fundamental questions, such as: in a democratic society, can we live with conspiracy theories?