Once again. The bosses of Google, Facebook and Twitter faced members of the House of Representatives on Thursday, who criticized the role of social media in the attack on the Capitol by pro-Trump extremists on January 6. Some lawmakers were also alarmed by the massive spread of disinformation surrounding the Covid-19 epidemic and the vaccination campaign. Still others lamented the harmful effect of social networks on the mental health of children and adolescents.
The attack on the Capitol “started and was fed on your platforms,” charged Mike Doyle, the Pennsylvania Democrat who chaired the virtual hearing. The three bosses “have shown, on several occasions, that their promises of self-regulation do not work,” added Jan Schakowsky, who chairs the consumer protection subcommittee. “They must be held accountable for the disinformation that spreads on their platforms, poisons our public debate and threatens our democracy.”
Lawmakers on both sides of the aisle were annoyed by the evasive tone of the three bosses, and in particular by their refusal to answer questions with a simple yes or no. But while nearly all of them strongly criticized the role of the platforms, Democrats and Republicans raised different, even opposing, concerns.
The former unanimously called for greater moderation efforts from the platforms, while many Republicans took issue with the ban on Donald Trump, kicked off Facebook and Twitter for having incited his supporters to storm Congress. Bob Latta, a Republican representative from Ohio, accused the three social networks – Facebook, Twitter and YouTube – of “censoring conservative voices” and of serving a “radical progressive agenda”.
At the heart of the debate is the reform of Section 230. This 1996 law enabled the growth of the platforms by shielding social networks from lawsuits when their users share illegal or hateful content. Most lawmakers agree on the need to reform the legislation, but cannot agree on the main measures to put in place. Torn between increased moderation and the defense of free expression, the fight against harassment and the protection of privacy, the very shape of such a reform remains unclear for now.
Large platforms and young start-ups
Amid the reproaches, the bosses sometimes seized the opportunity to state their preferences for the coming legislation. That was notably the case for Mark Zuckerberg, who published a detailed written statement before the hearing began. In it, he argues for a reform of Section 230 that would require companies to set up effective content-moderation systems, without, however, holding them liable when some messages escape their vigilance.
The Facebook boss also insisted on the need for transparency. During the hearing, he reminded lawmakers that his network publishes, every three months, detailed reports on the types of content that have been removed or flagged with warnings.
His plan has drawn criticism: since Facebook has already built such a system, the rule could stifle competition by making it harder for new entrants to get off the ground. To counter this objection, Mark Zuckerberg suggested distinguishing between “large platforms”, including Facebook, which would face strict requirements, and young start-ups, which would be largely exempt.
Asked about the merits of this plan, Sundar Pichai and Jack Dorsey did not oppose it, while asking for more details before committing. At the start of the hearing, the Google boss defended Section 230, which “allows companies to take decisive action to limit disinformation and control actors who try to circumvent the rules”. Twitter's founder, for his part, declared himself in favor of more transparency, while stressing that it would be difficult to distinguish “big” players from small ones.