Substack's approach to content moderation

The recent dispute between the streaming platform Spotify and the Canadian singer-songwriter Neil Young has revived, in the United States and elsewhere, a debate that has been running for several years about the limits of freedom of expression on social media and the responsibility of platforms run by large Internet companies for the spread of disinformation. In short, leaving aside the more complex reasons behind the disagreement, Young had all of his music removed from Spotify because he disapproved of the company's decision to host a hugely popular podcast that, in his view, spreads false information about vaccines.

An original contribution to the debate on the role of platforms in moderating misleading or otherwise problematic content came in recent days from the founders of the newsletter service Substack. Read in light of the dispute between Young and Spotify, which they did not explicitly mention, their contribution would seem to support the legitimacy and usefulness of Spotify's choice not to censor the controversial podcast, economic considerations aside.

In a post on the company's own blog, Substack founders Chris Best, Hamish McKenzie and Jairaj Sethi described people's loss of trust as a more pressing and urgent problem for societies than disinformation, which they consider a consequence of the former. And they pointed to the model of content moderation least prone to censorship as the preferable one, since Substack considers censorship a choice that increases the negative consequences of disinformation rather than limiting them.

The question of trust, understood as people's ability to trust not only the most reliable sources of information but also public authority itself, had already emerged several times during the pandemic in the less superficial and more in-depth analyses of the so-called no vax phenomenon. The American anthropologist Heidi Larson, director of the Vaccine Confidence Project, which works to counter disinformation about vaccines, and a longtime collaborator of UNICEF on vaccination programs around the world, has been among the experts most often cited and interviewed by newspapers and news programs.

– Read also: It's easy to say “no vax”

According to Larson, the success of vaccination plans rests largely not on the correct communication of the “facts,” which many no vax generally reject, but on the solidity of a “social contract” between people, one that tends to wear thin in a context like the current one, dominated by sentiments of anti-globalization, nationalism and populism. Feelings that “will not go away if you close Facebook tomorrow: they will move,” Larson said in a 2020 interview with the New York Times quoted by the founders of Substack, who share her point of view.

Substack is a popular newsletter creation and management service whose success has grown considerably during the pandemic. It offers a free platform that lets anyone write a newsletter and ask readers for money to fund it on a monthly or annual basis, and it sustains itself by retaining 10 percent of each newsletter's total revenue. Each author can also decide, issue by issue, which editions of the newsletter to make free and readable by anyone and which to reserve for paying subscribers.
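As a rough illustration of the revenue model described above, here is a minimal sketch of how the split might be computed. Only the 10 percent platform fee comes from the article; the payment-processing fee is an assumption added for realism, and all names and figures are hypothetical.

```python
# Minimal sketch of the revenue split described in the article:
# the platform retains 10 percent of a newsletter's subscription revenue.
# The processing_fee parameter is an assumption (not stated in the article),
# and all figures below are hypothetical.

PLATFORM_FEE = 0.10  # 10 percent cut, per the article


def author_payout(monthly_price: float, subscribers: int,
                  processing_fee: float = 0.0) -> float:
    """Return the author's monthly earnings after fees."""
    gross = monthly_price * subscribers
    return gross * (1 - PLATFORM_FEE - processing_fee)


# Hypothetical example: 500 readers paying $5/month.
print(author_payout(5.0, 500))           # 2250.0 with the 10% platform fee only
print(author_payout(5.0, 500, 0.029))    # 2177.5 with an assumed ~2.9% processing fee
```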

The service's growing popularity has forced Substack to contend with larger and more frequent requests to intervene on content that some people may find questionable, offensive or even dangerous: a fairly inevitable situation for any platform whose user base (newsletter authors and commenters, in Substack's case) keeps growing. “Our answer remains the same: we make decisions based on principles and not on public relations, we will defend freedom of expression, and we will stick to our hands-off approach to content moderation,” wrote Best, McKenzie and Sethi.

There are, of course, limits to freedom of expression on Substack as well, defined by a set of guidelines intended to safeguard the platform's respectability and prevent it from becoming a space or a tool for extremism, and to preserve the reputation that places it in a completely different league – even technically – from platforms such as Reddit, 8chan and the like, whose defining feature is the near-total absence of moderation.

– Read also: The New York Times no longer wants to manage its Facebook group on cooking

Within the bounds of those guidelines, Substack favors an approach that places responsibility primarily with readers and writers, not with those who manage the platform – an approach also very different from that of platforms that filter and rank their content through algorithms. It also views censorship “as a last resort, because we believe open debate is best for writers and for society,” wrote Best, McKenzie and Sethi, even though this can mean tolerating on the platform material that many people consider wrong or offensive, but that others freely choose to read.

This approach “is a necessary precondition for building trust in the information ecosystem as a whole,” according to Substack, because the more institutions attempt to control what can and cannot be said publicly, the more people become “ready to create alternative narratives about what is ‘true’”. Those people, as many cases in our societies already show, end up being drawn toward the conspiracy theories that proliferate in conditions of low trust in both traditional and social media and of restricted debate.

Substack also considers the polarization of debate to be both an effect and, in a vicious circle, a cause of the loss of mutual trust between people, as well as of trust in institutions. Conspiracy theories, in fact, are often reinforced on social networks by the widespread tendency to defend an argument merely to signal belonging to a group, even at the cost of being hyperbolic or intellectually dishonest. And this in turn leads every group to see excesses in other groups and never in its own, a behavior that deepens mutual distrust and progressively narrows the range of viewpoints and opinions acceptable within each group.

The dynamic is always the same: «It is always the other side that is insane, dishonest and dangerous. It is the other side that does not accept criticism because it knows it cannot win the argument. It is they who do not care about the truth». In this way each group grows ever more exasperated by the misdeeds of the others and ever more blind to its own. And if it is true that our current information systems did not create this mistrust, according to Substack, it is also true that those systems – starting with social media – have helped to amplify it and to accelerate certain dynamics.

– Read also: We should better study the effects of social networks on collective behavior

In particular, one effect of the advent of social media has been to make the competition for attention at all costs – in which the traditional press, television, radio, podcasts and other media all take part – even more crowded and fierce. From this point of view, the very cautious approach to content moderation adopted by Substack could, according to its founders, prove more useful in building an “information economy” that rewards the strength and quality of relationships developed over time, as opposed to the “attention economy,” which grows by exploiting basic impulses.

But relationships in the information economy are all the more solid, and all the more founded on trust, the less writers and readers feel deceived, flattered or “coddled.” They hold up when those people perceive that they are on a platform that defends freedom of expression and offers stronger guarantees against opaque manipulation of information at its source. “Put quite plainly: the censorship of bad ideas makes people less inclined, not more inclined, to trust good ideas.”

The way to make these relationships work on Substack, according to its founders, is to give the people who write newsletters a degree of freedom and autonomy in their relationships with readers – a condition clearly favored by a system built on subscriptions rather than advertising, and one that sets the bar for outside intervention by the platform's administrators at an “extremely high” point.

To those who disapprove of this approach and instead call for more intervention from platform managers – an increasingly widespread demand of late – Best, McKenzie and Sethi pose a question: «How is that going?». Substack's idea is that defending freedom of expression on the platform does not solve the problem of disinformation, but it does keep it from worsening the problem of lost trust. Because trust grows out of respect for relationships built over time, “it cannot be won with a press release or a ban on social media,” nor can it be strengthened by refusing difficult conversations.
