Facebook’s controversial algorithms protect its users from being exposed to extreme content, hate speech and misinformation, the beleaguered company’s vice president for policy and global affairs claimed in interviews on Sunday.

Nick Clegg defended Facebook against allegations from whistleblower Frances Haugen that its algorithms push clickbait and extreme content — but insisted the company would never be able to entirely eliminate misinformation and hate speech from its platforms.

“If you remove the algorithms… the first thing that would happen is that people would see more, not less, hate speech — more, not less misinformation,” Clegg told Dana Bash on CNN’s State of the Union. “These algorithms are designed precisely to work almost like giant spam filters to identify and deprecate bad content.”

“For every 10,000 bits of content, you’d only see five bits of hate speech,” he said. “I wish we could eliminate it to zero. We have a third of the world’s population on our platforms. Of course we see the good, the bad and the ugly of human nature on our platforms.”

Clegg insisted to Bash that Facebook’s algorithms did not play any special role in the lead-up to the Capitol riot on Jan. 6. On NBC’s “Meet The Press,” he told Chuck Todd that Haugen’s claim that Facebook lifted measures intended to tone down user feeds after the 2020 presidential election was “simply not true.”

“We in fact kept the vast majority of them right through to the inauguration, and we kept some in place permanently,” Clegg told Todd, adding that some of the changes were “one-off.”

He said the company rolled back “blunt tools” — such as reducing the circulation of videos, civic engagement opportunities and political ads — that had been inadvertently “scooping up a lot of entirely innocent legitimate legal playful enjoyable content.”

[Photo caption: Facebook executive Nick Clegg denies that the Jan. 6 Capitol riots were fueled by the platform’s algorithms.]
[Photo caption: Facebook has also been scrutinized for its lack of moderation on drug cartels, human traffickers and pedophiles.]

“We did that very exceptionally,” Clegg said. “We just simply let perfectly normal content just circulate less on our platform. That’s something we did because of the exceptional circumstances.”

Clegg told Todd the onus is on Congress to “create a digital regulator” and set rules for data privacy and content moderation.

“I don’t think anyone wants a private company to adjudicate on these really difficult trade-offs between, you know, free expression on the one hand, and moderating or removing content on the other,” he said. “There is fundamental political disagreement. The right thinks we… censor too much content, the left thinks we don’t take down enough.”

[Photo caption: Facebook executive Nick Clegg argues “civic engagement opportunities and political ads” were damaging the company’s reputation.]

Clegg told ABC’s George Stephanopoulos it was “extremely misleading” to analogize Facebook’s reported knowledge of the harm its products cause children and society to tobacco companies’ awareness of the danger of cigarettes.

“In the ’80s and ’90s there were analogies that watching too much television was like alcoholism, or arcade games like Pac-Man was like, you know, drug abuse,” he said. “We can’t change human nature. You’ll always see bad things online.”
