Twitter CEO Jack Dorsey appeared to take a break in the middle of a congressional grilling Thursday to tweet simply “?”
The midhearing tweet was in the form of a poll, with the available answers being “yes” or “no.”
Dorsey’s tweet summarized the most memorable and frustrating aspect of the joint hearing before two House Energy and Commerce subcommittees meant to press chief executives of Facebook, Google and Twitter about those companies’ roles in promoting disinformation and extremism online.
But the CEOs seemed to evade more questions than they answered, at times offering clear indications of their own frustration with some of the proceedings.
Just before Dorsey’s tweet, Rep. Billy Long, R-Mo., asked each of the tech leaders whether they knew “the difference between the word ‘yes’ and ‘no.’” They all answered in the affirmative, to which Long responded that he had won a steak dinner from a colleague for getting a straight answer.
In the hourslong hearing, members of Congress questioned Facebook CEO Mark Zuckerberg, Google CEO Sundar Pichai and Dorsey over a wide range of concerns including Covid-19 misinformation, child exploitation, racial bias, targeted advertising, hate speech, harassment, algorithmic amplification of disinformation, and the radicalization of Americans including rioters who took part in the Capitol attack.
The CEOs testified remotely. Wearing suits and ties, Zuckerberg and Pichai appeared from office spaces, surrounded by plants and pottery. Dorsey, wearing a suit jacket, testified from a kitchen, framed by a shelf loaded with dinnerware and carafes.
Clearly better briefed than in previous hearings, members asked questions that disinformation experts called informed and critical.
“Many of the questions have been focused, nuanced, and go beyond ‘content’ toward design, business model, and incentives,” tweeted Graham Brookie, director of the Atlantic Council’s Digital Forensic Research Lab.
But the tech leaders also adeptly avoided giving any meaningful answers. Members repeatedly pressed the CEOs to answer “yes or no,” a request that was met with long-winded answers that skirted members’ questions.
“It’s irritating all of us,” Rep. Anna Eshoo, D-Calif., said to Zuckerberg after one question. “No one seems to know the word ‘yes’ or the word ‘no.’”
The hearings come at a time when Congress is considering what legislative actions to take against the companies in light of the Jan. 6 Capitol riot and the prevalence of Covid-19 misinformation online. Several policies are reportedly being considered, including reforming Section 230 of the Communications Decency Act of 1996, which shields tech companies from liability for content posted to their platforms.
“Self regulation has come to the end of its road,” said Rep. Jan Schakowsky, D-Ill., who said she is introducing Section 230 reform legislation focused on consumer protection.
Though the tech leaders have faced two similar interrogations on Capitol Hill over the last year, this was the first hearing since the Jan. 6 riot. Several lawmakers focused on social media’s responsibility in creating conditions that led to the violence.
“The spread of disinformation and extremism has been growing online particularly on social media, with little to no guardrails in place to stop it,” Rep. Frank Pallone, D-N.J., said in his opening remarks. “Unfortunately, this disinformation and extremism doesn’t just stay online, it has real-world, dangerous and even violent consequences and the time has come to hold online platforms accountable for their part.”
When pressed by Schakowsky on whether the Capitol rioters had organized on Facebook, Zuckerberg seemed to walk back earlier remarks from Chief Operating Officer Sheryl Sandberg that laid blame on other platforms.
“Certainly, there was content on our services, from that perspective, I think there’s further work that we need to do,” Zuckerberg said. He also said fault for the attack should be attributed to the rioters alone.
When asked by Rep. Mike Doyle, D-Pa., whether their platforms bore some responsibility for the attack, both Zuckerberg and Pichai declined to provide the requested clear answer. Dorsey obliged.
“Yes,” Dorsey said, sneaking in an addendum, “but you also have to take into consideration the broader ecosystem.”
Congress members’ questions also zeroed in on algorithms — the secretive, ever-changing rules that determine what content appears prominently in users’ feeds — and how users are pushed toward certain behaviors on the platforms. These algorithms, many of which are designed to get people to spend more time on a website or a platform, play an important role in deciding what users see, such as a tweet at the top of a timeline, a video suggestion from YouTube, or a political group post in a Facebook News Feed. Algorithms guide the companies’ powerful recommendation machines, which members of Congress reiterated push users toward more extreme content and dangerous groups.
The partisan divide was not dominant, but still present during questioning from Republican members, several of whom asked about the claim that social media platforms are biased against conservatives, a claim that has been consistently debunked by researchers.
Zuckerberg sought to pre-empt talk of Section 230 reform that might negatively affect Facebook by suggesting his own plan, one that requires companies to have systems in place to address “unlawful content,” but that stops short of holding them liable if those systems fail.
Pichai warned that Section 230 reform would have “unintended consequences” for free expression and the nimbleness of platforms to moderate content and respond to evolving threats.
The lawmakers’ concerns also extended to harmful effects of social media on children, with Rep. Cathy McMorris Rodgers, R-Wash., and others grilling the CEOs on the increase in teen depression and suicides linked to online activity.
“Congress is fired up and there are so many issues here,” said Joan Donovan, research director at Harvard University’s Shorenstein Center on Media, Politics and Public Policy. “It boils down to finding out what these companies are willing to do regarding hate, harassment, disinformation and incitement now, and how much of their profits they are willing to dedicate to fixing the problems created by their products.”
In the run-up to Thursday’s hearings, critics from all sides turned up pressure on the social media giants. Advocacy groups published reports this week on Facebook’s role in the Capitol attack and the continued spread of Covid-19 misinformation across platforms. Twelve state attorneys general signed a letter urging Facebook and Twitter to remove anti-vaccination misinformation from their platforms.
“The questions were specific and well-researched,” said Claire Wardle, co-founder of First Draft, a nonprofit that provides research and training on misinformation for journalists. “But overall, these hearings are simply not useful spaces for the kind of nuanced discussion that needs to happen. The CEOs are perfectly trained to deflect the questions, each politician only has 5 minutes, and the result was frustrating exchanges around the meme-able catchphrase ‘just give me an answer, yes or no.’ But as the politicians become more aware of the complexity of these issues, maybe there is some hope that we will start to see political action around harmful content online.”