Report: Major social media platforms and the FBI are not doing enough to address the threat of domestic terrorism from white supremacists and anti-government extremists.
In a 128-page report, the Senate Homeland Security Committee alleged that federal agencies, including the FBI and the Department of Homeland Security, as well as the major social media platforms, are not doing enough to address the threat of domestic terrorism from white supremacists and anti-government extremists.
The FBI itself came under scrutiny when it emerged that the bureau had the Colorado Springs gunman on its radar just weeks before the shooting, after he had been arrested for threatening to kill family members. Agents closed the case just a few weeks later.
On the social media front, each of the major platforms, including Meta (formerly known as Facebook), TikTok, and YouTube, has released its own statement. While a Meta spokesperson pointed to the company's Community Standards Enforcement Report, which showed a low prevalence of terror and organized hate content, Nick Clegg, a top executive, offered a different view. Last year, Clegg stated, "The reality is, it's not in Facebook's interest — financially or reputationally — to continually turn up the temperature and push users towards ever more extreme content."
"These companies point to the voluminous amount of violative content they remove from their platforms, but the investigation found that their own recommendations, algorithms and other features and products play [a role] in the proliferation of that content in the first place," the report said. "Absent new incentives or regulation, extremist content will continue to proliferate on these platforms and companies' content moderation efforts will continue to be inadequate to stop its spread."
The Committee's investigation also found that the four major social media companies use the same business model of maximizing user engagement, growth, and profits, which effectively incentivizes extremists and their content, and that their algorithms push users into so-called rabbit holes by suggesting groups and people that could lead to the radicalization of users.
You can read the entire report by clicking here.