Facebook: we were too slow to recognise our 'corrosive' effect on democracy

Social network hiring 10,000 more staff to combat spread of fake news, harassment and use of Facebook as a weapon in cyberwar

Facebook has admitted to being “too slow to recognise” Russian election interference, but says that social networks simply reflect human intent, “good and bad”.

In a blogpost, Facebook product manager Samidh Chakrabarti says that “at its best, [social media] allows us to express ourselves and take action. At its worst, it allows people to spread misinformation and corrode democracy.

“I wish I could guarantee that the positives are destined to outweigh the negatives, but I can’t,” Chakrabarti adds.

Chakrabarti’s post catalogues situations in which Facebook has been accused of aiding or empowering opponents of democracy, admitting the site was initially “far too slow to recognise how bad actors were abusing our platform.”

On Facebook’s use as an “information weapon” by Russian state actors during the 2016 US Presidential election, he says: “Russian interference worked in part by promoting inauthentic Pages, so we’re working to make politics on Facebook more transparent.

“We’re making it possible to visit an advertiser’s Page and see the ads they’re currently running. We’ll soon also require organisations running election-related ads to confirm their identities so we can show viewers of their ads who exactly paid for them. Finally, we’ll archive electoral ads and make them searchable to enhance accountability,” Chakrabarti said.

“It’s abhorrent to us that a nation-state used our platform to wage a cyberwar intended to divide society,” he added.

On fighting “False News” on the site, Chakrabarti cited the enrolment of third-party fact-checkers, and the creation of “trust indicators” to “help people sharpen their social media literacy”.

Chakrabarti lists Facebook’s attempts to counter numerous other accusations commonly laid at its feet, including that it “creates echo chambers where people only see viewpoints they agree with”, puts women off commenting on politics online and enables harassment of politicians and citizens.

In response, he says, Facebook is “hiring over 10,000 more people this year to work on safety and security”, but warns that it is hard to do that sort of moderation “at a global scale … since it is hard for machines to understand the cultural nuances of political intimidation.”

Facebook is trying to fight the creation of echo chambers online with a feature called “related articles”, which shows people other articles about the news stories they’re reading.

Chakrabarti’s post is part of Facebook’s “Hard Questions” outreach effort, in which the site directly addresses some of the most damaging allegations levelled at it. A previous post in the series looked at whether social media presents a mental health risk – and concluded it only did if users didn’t post enough.

In a post also published by Facebook, Harvard Law School professor Cass Sunstein argued that social media platforms are “very much a work in progress”, acknowledging problems with polarisation and the echo chamber effect. But, he argued, they “are not merely good” for democracy: “they are terrific”.

The admissions of failure to act by the social network come after Mark Zuckerberg announced in January that his personal goal for 2018 was to “fix” Facebook, acknowledging that the site makes “too many errors enforcing our policies and preventing misuse of our tools”.

Facebook has been directly blamed for damaging the democratic process in a number of countries beyond just the US. In October, it trialled a move to remove news content from the news feed in six smaller nations around the world, including Guatemala, Cambodia and Slovakia.

The trial was criticised as “downright Orwellian” by one of the publishers who saw more than two thirds of their readers disappear. “The Facebook explore tab killed 66% of our traffic. Just destroyed it … years of really hard work were just swept away,” said Dina Fernandez, a journalist and member of the editorial board at Guatemalan news site Soy502, at the time. “I’m worried about the impact of Facebook on democracy.”

As the trial went on, reports from a number of the nations involved suggested that some of Fernandez’s fears may have been accurate. The New York Times reported that Cambodia, Slovakia, Bolivia and Sri Lanka all saw massive reductions in the amount of trusted news content shared on the site, with little corresponding reduction in the low-quality, politically inflammatory memes that continued to spread like wildfire across the network.

In January, Facebook announced that a change similar to the one trialled in the smaller nations would be rolled out worldwide, deprioritising news in general on the site, a move that critics have called bad for democracy.

Source: www.theguardian.com/technology/2018/jan/22/facebook-too-slow-social-media-fake-news-hiring?CMP=fb_gu
