Fake News – FUMBLY PLAY

By now, anyone who has watched global news or been active on social media will have heard the constant reminders that the Russians maliciously interfered in the 2016 U.S. presidential election. It is shocking that foreign actors could have infiltrated a democracy as old and revered as America's, and that should worry the rest of us. But more than the act itself, it is the manner in which the Russians pulled it off that is shocking.

The first revelations showed that the Russians had hacked into the email servers of the Democratic National Committee and the Clinton campaign. They released some damning emails to the American public with the goal of souring views of Clinton. But that is not what interests me. As we have learned over the past month, the more effective intrusion came subtly, through social media. The simplicity of the technique is astounding. I write this as a report on how they executed it, and as a guide on how the Bhutanese can avoid being duped as the Americans were.

Human thinking suffers from a great many biases – flaws that separate how we actually think from rational reasoning. Of the many cognitive biases we suffer from, the Russians relied heavily on the frequency bias. Humans tend to believe that a view is true (or acceptable) if they hear it often enough – the frequency with which a view appears in public is taken as a valid measure of its truthfulness.

The Russians created hundreds of fake profiles on social media, from Facebook to Twitter. These profiles worked to decontextualize videos and photos, add real accounts as friends and followers, hype up fake news, and, most importantly, grant legitimacy to fake stories by saturating the news feeds of real users. A good example of this happening in the U.S. involves Texas.

You see, Texas is a proud state with a history as an independent republic (hence the nickname "Lone Star State") and a long tradition of secessionist talk. In reality, secession has always been a fringe movement, becoming somewhat noticeable only in the last decade (it is still pretty small, with only about 25% of Texans supporting the idea). Left to themselves, the people who hold these views would be drowned out by other, more popular ones. That is the natural way of filtering out 'crazy' ideas. But thanks to social media, it is easier than ever to find likeminded people and form an echo chamber. When members of these echo chambers 'share' their views with non-members, the latter perceive the idea as common and accept it. The Russians were so successful at exploiting this that a page they created, 'Heart of Texas,' became the most popular pro-secession page on Facebook (with close to 300,000 likes before being shut down), and they even managed to draw their followers out of their homes to rally for secession.

People who had marginally held the fringe view are the first victims of an exploit like this – in this case, the Texans who already agreed that their state should secede. But as the idea spreads, those who had been undecided begin to waver. They see that more 'people' now hold what was once an unpopular view, and begin to accept the atypical as typical. Moreover, even when we discover that a story is fake, once it has spread it is almost impossible to undo the damage. To understand how that works, look at a recent Bhutanese example. The story about cabinet ministers frequently using government helicopters broke from private, suspicious Facebook accounts – no news agency claimed it. I first saw it posted (not shared, mind you) to a forum on Facebook by an account that everyone accepts as fraudulent. Within the next few hours, multiple real accounts had shared the story, and those in their friend networks followed suit. Because of the frequency of the story on social media, even the more educated and experienced in my friend network had started to believe it.

The government responded within a few hours of the story going viral – with the actual numbers. But by then, it was too late. Because the government's defense was not as sexy as the accusations made against it, fewer people shared it. Of those who saw the defense, a few became more entrenched in their belief that the first story was true. This is the backfire effect: the defense against a fake news story only ends up confirming the story for some.

At that point, the government was seen as trying to save its behind. In the case of the Americans, this phenomenon was so common and widespread that it popularized an acronym – TPTB (The Powers That Be) – referring to an authoritative force that tries to control the public discourse.

Imagine the power stories like this have to marginalize. The fake story of helicopter usage by ministers has already damaged public opinion of the government. Among those whose views were affected are people who never saw the defense and people who dismissed the defense as false. As more and more stories like this arise, more and more people become marginalized in their view of the government. And the target is immaterial to the success of the attack; it can be anyone. This is why we must keep our guard up, and I recommend the following:

Take classes in media literacy – the Bhutan Center for Media and Democracy organizes them regularly.

Only accept news stories reported by credible news media or journalists. Whatever your view of journalists, remember that they are bound by ethics and duty-bound to ensure the credibility of any story they break.

Don’t accept stories just because you like them – that is, stories that flatter your side or disparage your opponent’s. Both are equally harmful, according to most, though I am of the opinion that false flattery is much worse.

Generally, disregard fake social media profiles that preach anything political or religious.

Google the source to check for a story’s validity.

Report any suspicious story to Business Bhutan – they’ll clear it up!