Hearing Examined How Platforms’ Designs Lead to the Increased Spread of Provocative Content, Including Extremist Ideologies
WASHINGTON, D.C. – U.S. Senator Gary Peters (D-MI), Chairman of the Homeland Security and Governmental Affairs Committee, convened a hearing to examine how social media platforms continue to prioritize increased user engagement and revenue over safety and security. During the hearing, the Committee heard from two separate panels. In the first panel, former executives from social media companies confirmed to Peters that platforms are designed to keep users engaged and that, as a result, the platforms’ recommendation algorithms and other products end up amplifying dangerous and radicalizing extremist content. Peters also called a second panel of current chief product executives from Meta, YouTube, TikTok, and Twitter to testify for the first time before Congress about how their companies balance the pursuit of increased user engagement and revenue against the risk of funneling people toward violent and dangerous content. Peters pressed the executives on whether they build safeguards against the spread of harmful content into products as they develop them, and whether trust and safety performance is a key factor in product development employees’ compensation and promotion. The hearing also examined how the proliferation of this extremist content, which can include white nationalist and anti-government ideologies, presents a serious national security threat.
“From the 2017 neo-Nazi ‘Unite the Right’ rally in Charlottesville, Virginia that was organized using a Facebook event page, to the violent January 6, 2021 attack on the U.S. Capitol spurred to action in part by ‘Stop the Steal’ content that repeatedly surfaced online, to the shooter who livestreamed as he massacred Black shoppers at a Buffalo supermarket, there is a clear connection between online content and offline violence,” said Peters during his opening statement. “The central question is not just what content the platforms can take down once it is posted, but how they design their products in a way that boosts this content in the first place, and whether they build those products with safety in mind to effectively address how harmful content spreads.”
Addressing the chief product executives, Peters continued: “Whether users are fully aware of it or not, the content they see on your platforms shapes their reality. And the business decisions you make are one of the main driving forces of that phenomenon…This extremist content can spread like wildfire, amplified by the recommendation algorithms and other tools your teams build to increase your companies’ audiences and profits.”
To watch video of Senator Peters’ opening remarks from Panel 1, click here. For text of Peters’ opening remarks from Panel 1, as prepared, click here.
To watch video of Senator Peters’ questions from Panel 1, click here.
To watch video of Senator Peters’ opening remarks from Panel 2, click here. For text of Peters’ opening remarks from Panel 2, as prepared, click here.
To watch video of Senator Peters’ questions from Panel 2, click here.
To watch video of Senator Peters’ closing remarks, click here.
The hearing follows previous efforts by Peters to press the Chief Executive Officers of Facebook, Twitter, YouTube, and TikTok for more information on the relationship between extremist content and the platforms’ recommendation algorithms and targeted advertising tools that generate significant revenue for the companies.
During the first panel, Peters and the witnesses discussed how platforms’ recommendation algorithms, because they are designed to amplify content that increases user engagement, end up quickly spreading extremist content, which can include racist and dangerous ideologies that stoke real-world violence. They also discussed what actions social media companies can take to prevent the rapid proliferation of these harmful ideologies, and how the actions taken by platforms’ trust and safety teams are in direct competition with the goals of the product development teams. The former social media executives also affirmed that companies often prioritize user engagement and growth over the safety of their users in the product development process.
The second panel provided an opportunity for members to hear directly from social media companies’ chief product officers about how their companies make business decisions, including the extent to which they consider extremist content and related challenges in their product design processes. Peters pressed the witnesses on how they allocate their companies’ personnel, including engineers, and other resources to product development compared with trust and safety. He also grilled the witnesses on whether they consider trust and safety metrics before launching products or features, and whether trust and safety performance is considered in employee compensation and promotions. Peters questioned whether the companies’ emphasis on “the average user” in responding to questions masked the extent of extreme content that actually circulates online before it is taken down. Despite prior requests for concrete data, the executives were unprepared to provide key information. Peters will continue to press for the data.
The hearing builds on Peters’ work to investigate the rise of domestic terrorism, including white supremacist and anti-government violence. Peters previously convened a hearing with independent social media experts to discuss how the spread of extremist content on social media platforms translates to real-world violence. In June, he convened a hearing with outside experts focusing on the threat of white supremacist extremism, including violence inspired by racist ideologies such as Great Replacement Theory. Peters also convened a two-part hearing with experts representing faith-based, civil rights, and academic and policy research organizations on the continued rise of domestic terrorism, including white supremacist and anti-government violence. Peters also secured $250 million in funding for the Nonprofit Security Grant Program, which supports the needs of houses of worship and other nonprofit organizations that want to secure their facilities against potential terrorist attacks. In 2019, Peters helped convene the Committee’s first domestic terrorism hearing with a focus on white supremacist violence.
###