Once again, Mark Zuckerberg, Jack Dorsey and Sundar Pichai had to answer questions from the US Congress. The CEOs of Facebook, Twitter and Google were summoned to testify about disinformation and extremism on their platforms. But this time the summons felt personal: on January 6, 2021, a mob of misinformed people stormed the Capitol, terrorizing members of Congress, in protest of alleged electoral fraud that they believed had cost Donald Trump the presidency. The purpose of the hearing was to assess what role the platforms had played in promoting the insurrection.

 

The stakes are not low: the underlying question is whether Congress should reform Section 230 of the Communications Decency Act. This law gives platforms broad immunity: they are not liable for what their users publish, nor are they liable for their decisions to moderate content. For this reason, both the members of Congress who want platforms to exercise more control and those who criticize them for blocking content or accounts have an interest in amending it.

 

Of course, the pandemic also came up. Although the platforms have taken steps to try to control harmful content, misinformation about the coronavirus still lives on them. On this topic, we recommend reading The new normal?, the latest CELE paper, which examines the platforms' response to disinformation about the virus.

 

As usual, the hearing disappointed. Here's the good, the bad, and the ugly of the hearing. Also, Facebook's proposal for how regulation should work.

 

The good

  • When asked about possible regulation, the three CEOs agreed that platforms need to be more transparent about their content moderation processes.
  • Without hesitation, the CEOs acknowledged that the harmful content circulating on their platforms has a more serious effect on minorities and African-American communities.

 

The bad

  • The topic proposed for discussion was the role of platforms in promoting disinformation and extremism. However, as usual, the questions wandered in every direction: what are the platforms doing to control misinformation, suicide content, illegal drugs, bullying, hate speech, anti-Asian sentiment, illegal human trafficking at the border with Mexico, and so on? And also: how do they ensure racial diversity in the workplace, how do they protect data, how do they ensure free competition? The members of Congress tried to cover too much and, as a result, pressed on very little.
  • The format did not work well. Each member of the committee had three witnesses and five minutes to question them. In an effort to extract absolute conclusions in a short time, they insisted on yes-or-no questions, while the CEOs insisted on giving complex answers to complex issues. Communication frequently broke down, and the hearing was largely unproductive.

 

The ugly

Some members of Congress did not do their homework and, as usual, asked inappropriate questions. One asked Facebook CEO Mark Zuckerberg how he and his children use YouTube. Another asked all three whether they had seen the documentary The Social Dilemma. Another, whether they had already been vaccinated against the coronavirus.

 

What does Facebook propose for regulation?

Mark Zuckerberg got the most screen time, with much of the congressmen's attention focused on Facebook. Although all three CEOs spoke of regulation aimed at transparency, it was Zuckerberg who offered the most elaborate proposal:

 

  • Platforms should have an obligation to control "clearly illegal" content (bullying, for example, is not clearly illegal; child sexual exploitation and terrorism are).
  • Platforms should retain their immunity, but earn it by meeting “best practice” standards, that is, by demonstrating that they have systems in place to identify and remove illegal content. He clarified, however, that platforms should not be liable when a particular piece of content slips past those systems.
  • These obligations should be proportional to company size, so that the requirements do not prevent new and smaller players from competing in the market.

 

The proposal, which at first glance might seem sensible, has drawn criticism: ultimately, the company is proposing regulation tailored to its own needs. When it comes to pinning down industry "best practices," Facebook can argue that its own are the best. What's more, as Mike Masnick points out, if Facebook's measures become the “industry standard,” other companies would have an incentive to copy its strategy and stop developing their own solutions to the problem of disinformation.

To learn more about how social media platforms currently handle misinformation, we recommend visiting letrachica.digital, the project through which CELE tracks changes in the platforms' community standards.

 

Written by Luisa Isaza · Twitter: @luisaza
