Facebook culls ‘tens of thousands’ of fake accounts ahead of UK election

Facebook has revealed that it has purged “tens of thousands” of fake accounts in the U.K. ahead of a general election next month.

The BBC reported this non-specific figure earlier today. Facebook also said it is monitoring for the repeated posting of the same content and for sharp increases in messaging, and flagging accounts that display such activity.

Providing more detail on these measures, Facebook told us: “These changes help us detect fake accounts on our service more effectively — including ones that are hard to spot. We’ve made improvements to recognize these inauthentic accounts more easily by identifying patterns of activity — without assessing the content itself. For example, our systems may detect repeated posting of the same content, or an increase in messages sent. With these changes, we expect we will also reduce the spread of material generated through inauthentic activity, including spam, misinformation, or other deceptive content that is often shared by creators of fake accounts.”
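Facebook has not published the mechanics behind these checks, but a content-agnostic heuristic of the kind it describes — looking only at patterns of activity, never at what the posts actually say — might look something like the sketch below. The thresholds, function names and example data here are purely illustrative assumptions, not Facebook's actual system.

```python
from collections import Counter

def is_suspicious(post_texts, messages_today, baseline_daily_messages,
                  duplicate_threshold=20, spike_multiplier=10):
    """Flag an account from activity patterns alone, without assessing content."""
    # Repeated posting of the same content: count exact-duplicate posts.
    most_common = Counter(post_texts).most_common(1)
    repeated_content = bool(most_common) and most_common[0][1] >= duplicate_threshold

    # Sharp increase in messages sent, relative to the account's own baseline.
    message_spike = messages_today >= spike_multiplier * max(baseline_daily_messages, 1)

    return repeated_content or message_spike


# A toy example: an account that posts the same link 50 times and sends 400
# messages against a baseline of 5 per day would be flagged for review.
print(is_suspicious(["win big http://example.com"] * 50, 400, 5))  # True
```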

Facebook has previously been accused of liberal bias by demoting conservative views in its Trending Topics feature — which likely explains why it’s so keen to specify that systems it’s built to try to suppress the spread of certain types of “inauthentic” content do not assess “the content itself.”

Another fake news-related tweak Facebook says it has brought to the U.K. to try to combat the spread of misinformation is to take note of whether people share an article they’ve read — the rationale being that if a lot of people read something but don’t go on to share it, that may be because the information is misleading.

“We’re always looking to improve News Feed by listening to what the community is telling us. We’ve found that if reading an article makes people significantly less likely to share it, that may be a sign that a story has misled people in some way. In December, we started to test incorporating this signal into ranking, specifically for articles that are outliers, where people who read the article are significantly less likely to share it. We’re now expanding the test to the UK,” Facebook said on this.
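Facebook hasn’t said how this signal is calculated, but a toy version of the idea — demote articles whose readers go on to share them far less often than is typical — could be sketched as follows. Every number and name below is an assumption made for illustration, not a disclosed detail of News Feed ranking.

```python
def adjusted_score(base_score, reads, shares_after_read,
                   typical_share_rate=0.05, outlier_fraction=0.2,
                   demotion=0.5, min_reads=1000):
    """Demote articles that readers are significantly less likely to share."""
    if reads < min_reads:
        return base_score  # not enough data to call the article an outlier
    share_rate = shares_after_read / reads
    # Outlier: readers share this article at under 20% of the typical rate.
    if share_rate < typical_share_rate * outlier_fraction:
        return base_score * demotion
    return base_score


# An article read 10,000 times but shared only 30 times afterwards (a 0.3%
# share rate against a typical 5%) gets its ranking score halved in this model.
print(adjusted_score(1.0, 10_000, 30))  # 0.5
```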

The company has also taken out adverts in U.K. national newspapers displaying tips to help people spot fake news — having taken similar steps in France last month prior to its presidential election.

In a statement about its approach to tackling fake news in the U.K., Facebook’s director of policy for the country, Simon Milner, claimed the company is “doing everything we can.”

“People want to see accurate information on Facebook and so do we. That is why we are doing everything we can to tackle the problem of false news,” he said. “We have developed new ways to identify and remove fake accounts that might be spreading false news so that we get to the root of the problem. To help people spot false news we are showing tips to everyone on Facebook on how to identify if something they see is false. We can’t solve this problem alone so we are supporting third party fact checkers during the election in their work with news organisations, so they can independently assess facts and stories.”

Fakebook?

A spokesperson told us that Facebook’s “how to spot” fake news ads (pictured below) are running in U.K. publications, including The Times, The Telegraph, Metro and The Guardian.

Tips the company is promoting include being skeptical of headlines; checking URLs to view the source of the information; asking whether photos look like they have been manipulated; and cross-referencing with other news sources to try to verify whether a report has multiple sources publishing it.
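The URL-checking tip is the one that lends itself most readily to automation. As a purely illustrative sketch — the list of “genuine” domains and the spoofed address below are invented for this example — a lookalike-domain check might work like this:

```python
from urllib.parse import urlparse

# Domains of outlets the reader already trusts; edit to taste.
GENUINE_DOMAINS = {"theguardian.com", "bbc.co.uk", "telegraph.co.uk"}

def looks_like_spoof(url):
    """Flag hostnames that imitate a known outlet without actually being it."""
    host = (urlparse(url).hostname or "").lower()
    if host.startswith("www."):
        host = host[4:]
    if host in GENUINE_DOMAINS:
        return False
    # A hostname that merely contains a known outlet's name (e.g. a lookalike
    # such as theguardian.com.co) is a red flag worth double-checking.
    return any(domain.split(".")[0] in host for domain in GENUINE_DOMAINS)


print(looks_like_spoof("http://theguardian.com.co/politics"))   # True  (lookalike)
print(looks_like_spoof("https://www.theguardian.com/uk-news"))  # False (genuine)
```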

 

Facebook does not appear to be running these ads in the U.K. newspapers with the largest readerships, such as The Sun and The Daily Mail, which suggests the exercise is mostly a PR drive — an attempt to be seen taking very public steps against fake news, a political hot potato.

The political temperature on this issue is not letting up for Facebook. Last month, for example, a U.K. parliamentary committee said the company must do more to combat fake news — criticizing it for not responding fast enough to complaints.

“They can spot quite quickly when something goes viral. They should then be able to check whether that story is true or not and, if it is fake, blocking it or alerting people to the fact that it is disputed. It can’t just be users referring the validity of the story. They have to make a judgment about whether a story is fake or not,” argued select committee chairman Damian Collins.

Facebook has also been under growing pressure in the U.K. for not swiftly handling complaints about the spread of hate speech, extremist material and illegal content on its platform — and earlier this month another parliamentary committee urged the government to consider fining it and other major social platforms for content moderation failures, in a bid to enforce better moderation standards.

Add to that Facebook’s specific role in influencing elections, which will face fresh scrutiny later today when the BBC’s Panorama program screens an investigation into how content spread via Facebook during the U.S. election and the U.K.’s Brexit referendum — including how much money the social networking giant makes from fake news.

The BBC is already teasing this spectacularly awkward clip of Milner being interviewed for the program, where he is repeatedly asked how much money the company makes from fake news — and repeatedly fails to provide a specific answer.

Facebook declined to comment when we asked about the program’s claims.

Safe to say, there are some very awkward questions for Facebook here (as there have been for Google too, recently, relating to ads being served alongside extremist content on YouTube). And while Milner says the company aspires to reduce “to zero” the money it makes from fake news, it’s clearly not yet in a position to say it does not financially benefit from the spread of misinformation.

And while it’s also true that some traditional media outlets have benefited, or can benefit, from spreading falsehoods — earlier this year, for example, The Daily Mail was itself effectively branded a source of fake news by Wikipedia editors, who voted to exclude it as a source for the website on the grounds that the information it contains is “generally unreliable” — the issue with Facebook goes beyond having an individually skewed editorial agenda. It’s about a massively scalable distribution technology whose core philosophy is to operate without any preemptive editorial checks and balances at all.

The point is, Facebook’s staggering size, combined with the algorithmic hierarchy of its News Feed, which can create feedback loops of popularity, means its product can act as an amplification platform for fake news. And for all The Daily Mail’s evident divisiveness, it does not control a global distribution platform that’s pushing close to two billion active users.

So, really, it’s Facebook’s unprecedented reach and power that is the core of the issue here when you’re considering whether technology might be undermining democracy.

No other media outlet has ever come close to such scale. And that’s why this issue is intrinsically bound up with Facebook — because it foregrounds the vast power the platform wields, and the commensurate lack of regulation in how it applies that power.

Ads in national newspapers are therefore really best viewed as Facebook trying to influence politicians, as lawmakers wake up to the platform’s power. So maybe there should be an eleventh tip in Facebook’s false news advert: Consider the underlying agenda.

In the U.K., Facebook says it is partnering with the local third-party fact-checking organization Full Fact, and with the Google News Lab-backed First Draft organization, to work with “major newsrooms to address rumors and misinformation spreading online during the UK general election” — echoing the approach it announced in Germany in January, ahead of the German elections this September… although the effectiveness of that approach has already been questioned.

Facebook says full details of the U.K. initiative will be announced “in due course.” The U.K.’s surprise General Election — called by Prime Minister Theresa May late last month, despite her previously stated intent not to call an election before 2020 — presumably caught the company on the hop.

With just one month to go until polling day in the U.K. it remains to be seen whether May’s election U-turn also caught the fake political news spreaders on the hop.