Your Fake News problem is not just “them”; it’s you.

On 9th August, Time magazine published a really great article titled “How Your Brain Tricks You Into Believing Fake News”. I can highly recommend reading it. It dives below the surface into the heart of the problem and is not simply another article having a bit of a moan about it all.

I have one huge criticism of it, though. It makes reference to many things, yet cites or contains no links to the various bits of research by the subject matter experts it draws upon. I get it: Time is a print publication, so the piece needs to work in a printed context, and yet there is scope for the online copy to be enriched with such links.

OK, let’s fix that. I’ll quote mine parts and cite references to some of the key bits of research that they refer to.

Minor Side Note: the term “Fake News” is not being used here in the Trump sense of coverage that might indeed be wholly factual but negative. Instead it describes the distribution of disinformation that is designed to manipulate you (for example, almost anything that Trump says).

How Your Brain Tricks You Into Believing Fake News

The Time article opens with a mini story of a highly intelligent professor buying into fake news.

Key Point: being fooled by Fake News does not mean that you are stupid. That is what the opening story drives home.

Sitting in front of a computer not long ago, a tenured history professor faced a challenge that billions of us do every day: deciding whether to believe something on the Internet.

On his screen was an article published by a group called the American College of Pediatricians that discussed how to handle bullying in schools. Among the advice it offered: schools shouldn’t highlight particular groups targeted by bullying because doing so might call attention to “temporarily confused adolescents.”

Scanning the site, the professor took note of the “.org” web address and a list of academic-looking citations. The site’s sober design, devoid of flashy, autoplaying videos, lent it credibility, he thought. After five minutes, he had found little reason to doubt the article. “I’m clearly looking at an official site,” he said.

What the professor never realized as he focused on the page’s superficial features is that the group in question is a socially conservative splinter faction that broke in 2002 from the mainstream American Academy of Pediatrics over the issue of adoption by same-sex couples. It has been accused of promoting antigay policies, and the Southern Poverty Law Center designates it as a hate group.

Trust was the issue at hand. The bookish professor had been asked to assess the article as part of an experiment run by Stanford University psychologist Sam Wineburg. His team, known as the Stanford History Education Group, has given scores of subjects such tasks in hopes of answering two of the most vexing questions of the Internet age: Why are even the smartest among us so bad at making judgments about what to trust on the web? And how can we get better?

Key Reference 1: OK, let’s pause there for our first key reference. Here is a link to the Stanford History Education Group. There you will find a highly valuable educational resource that can be used with students to help them learn online critical thinking. It’s all free. The group is run by Stanford University with the goal of improving education; it is popular and, more importantly, worth knowing about.

The Time article also makes reference to this …

A 2016 Pew poll found that nearly a quarter of Americans said they had shared a made-up news story.

Key Reference 2: You can find that Pew poll here. Under the tag “News Media Ethics and Practices” you will often find insightful survey results from Pew. They have no specific political axe to grind; they are wholly nonpartisan and do not take policy positions. The name comes from the primary sponsor, The Pew Charitable Trusts, which was founded in 1948 by the Pew family with the specific goal of serving the public interest by “improving public policy, informing the public, and stimulating civic life”.

The Time article also drops this reference …

In his experiments, MIT cognitive scientist David Rand has found that, on average, people are inclined to believe false news at least 20% of the time.

Key Reference 3: Over on his web page you can find references to all of David Rand’s research. Here, for example, is one of his very latest research papers: “Lazy, not biased: Susceptibility to partisan fake news is better explained by lack of reasoning than by motivated reasoning”.

New Term: Deep Fake.

The term “deep fake” actually derives from “deep learning”, the AI technique used to fabricate such videos, though it also resonates with the popular right-wing cultural paranoia around the “deep state”. Either way, via that Time article we have a new phrase that is powerfully descriptive …

Our inability to parse truth from fiction on the Internet is, of course, more than an academic matter. The scourge of “fake news” and its many cousins–from clickbait to “deep fakes” (realistic-looking videos showing events that never happened)–have experts fearful for the future of democracy.

Very Important Observation: Don’t trust Google.

I really like this observation; it is something that is not often appreciated …

Instead of working harder, we often try to outsource the job. Studies have shown that people assume that the higher something appears in Google search results, the more reliable it is. But Google’s algorithms are surfacing content based on keywords, not truth. If you ask about using apricot seeds to cure cancer, the tool will dutifully find pages asserting that they work. “A search engine is a search engine,” says Richard Gingras, vice president of news at Google. “I don’t think anyone really wants Google to be the arbiter of what is or is not acceptable expression.”

In other words, if you search for crap, then crap is exactly what Google will serve up.
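To make that point concrete, here is a toy Python sketch. It is my own illustration under a simplifying assumption, not Google’s actual algorithm: a naive scorer that ranks pages purely by how often the query terms appear, with no concept of whether a page is true.

```python
# Toy illustration (an assumed naive scorer, NOT Google's real algorithm):
# rank pages purely by how often the query terms appear in them.

def keyword_score(query: str, page_text: str) -> int:
    """Count occurrences of each query term in the page text."""
    words = page_text.lower().split()
    return sum(words.count(term) for term in query.lower().split())

# Hypothetical pages: one keyword-stuffed quack site, one sober review.
pages = {
    "quack-cures.example": "apricot seeds cure cancer apricot seeds cure cancer",
    "sober-review.example": "there is no evidence that apricot seeds affect cancer",
}

query = "apricot seeds cure cancer"
ranked = sorted(pages, key=lambda site: keyword_score(query, pages[site]), reverse=True)
print(ranked)  # ['quack-cures.example', 'sober-review.example'] -- relevance, not truth
```

Real ranking systems are vastly more sophisticated than this, but the underlying point stands: relevance to your query is not the same thing as accuracy.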

Discovering who is better at this: who does not get fooled, and why?

Going back to Wineburg’s research (ref 1 above), the Time article lays down a path to a rather good observation. First we have the build-up to it …

Wineburg, an 18-year veteran of Stanford, works out of a small office in the center of the palm-lined campus. His group’s specialty is developing curricula that teachers across the nation use to train kids in critical thinking. Now they’re trying to update those lessons for life in a digital age. With the help of funding from Google, which has devoted $3 million to the digital-literacy project they are part of, the researchers hope to deploy new rules of the road by next year, outlining techniques that anyone can use to draw better conclusions on the web.

His group doesn’t just come up with smart ideas; it tests them. But as they set out to develop these lessons, they struggled to find research about best practices. “Where are the studies about what superstars do, so that we might learn from them?” Wineburg recalls thinking, sitting in the team’s office beneath a print of the Tabula Rogeriana, a medieval map that pictures the world in a way we now see as upside-down. 

… and here it comes …

Eventually, a cold email to an office in New York revealed a promising model: professional fact-checkers.

Fact-checkers, they found, didn’t fall prey to the same missteps as other groups. When presented with the American College of Pediatricians task, for example, they almost immediately left the site and started opening new tabs to see what the wider web had to say about the organization. Wineburg has dubbed this lateral reading: if a person never leaves a site–as the professor failed to do–they are essentially wearing blinders. Fact-checkers not only zipped to additional sources, but also laid their references side by side, to better keep their bearings.

In another test, the researchers asked subjects to assess the website MinimumWage.com. In a few minutes’ time, 100% of fact-checkers figured out that the site is backed by a PR firm that also represents the restaurant industry, a sector that generally opposes raising hourly pay. Only 60% of historians and 40% of Stanford students made the same discovery, often requiring a second prompt to find out who was behind the site.

Another tactic fact-checkers used that others didn’t is what Wineburg calls “click restraint.” They would scan a whole page of search results–maybe even two–before choosing a path forward. “It’s the ability to stand back and get a sense of the overall territory in which you’ve landed,” he says, “rather than promiscuously clicking on the first thing.” This is important, because people or organizations with an agenda can game search results by packing their sites with keywords, so that those sites rise to the top and more objective assessments get buried.

So the point is that they can leverage such insights to teach these same skills … and then make those lessons available for free.

That’s an important discovery. We need not simply shout at the wind when the storm of disinformation blows, but can instead create a shelter in the minds of those facing the assault by giving them the vital skills needed.

Lessons You Can Learn as an Individual

What can you do?

Well here is a good start …

  • Learn how to reverse-search an image, to make sure a photo really portrays what someone says it does (see the sketch after this list).
  • Practise using neutral queries … or at least look for criticism by adding words such as “debunked”, “skeptic” or “rebuttal” (also covered in the sketch below).
  • Be willing to question your prejudices, and to second-guess information you might like to believe
  • Develop an awareness of “digital pollution”. Forwarding stuff that isn’t true is polluting. Clean air and clean water matter, and so does a clean digital environment. We are now generally socially aware enough to know that tossing trash out of a car window is not good; we need to teach Internet users, ourselves included, to become equally aware in a digital context. Today we are choking on the fumes of digital pollution, so we need to clean up our act.
  • Up-skill and develop fact-checking prowess.
  • When you find an “interesting” website, jump out of that bubble to a new tab and find out what others say about the organisation, and also who is funding it.
  • Practise “click restraint”. Scan all the search results returned, and perhaps the next page or two; search results can be manipulated by those who pack web pages with heaps of keywords.
  • Ask yourself “Who is behind this?”
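As promised above, here is a minimal Python sketch covering the first two items. Treat it as a starting point under stated assumptions: the TinEye, Google and DuckDuckGo query-URL formats used below are informal conventions rather than documented APIs, and may change, and the example image URL and claim are purely hypothetical.

```python
# Minimal sketch for the first two bullets above. The query-URL formats
# are informal conventions (assumptions), not documented APIs, and may change.
import webbrowser
from urllib.parse import quote_plus

def reverse_image_search(image_url: str) -> None:
    """Open reverse image search pages for a suspect image URL."""
    encoded = quote_plus(image_url)
    webbrowser.open(f"https://tineye.com/search?url={encoded}")
    webbrowser.open(f"https://www.google.com/searchbyimage?image_url={encoded}")

def criticism_queries(claim: str) -> list[str]:
    """Pair a claim with words that surface rebuttals, not just agreement."""
    return [f"{claim} {suffix}" for suffix in ("debunked", "skeptic", "rebuttal")]

# Hypothetical example values:
reverse_image_search("https://example.com/suspect-photo.jpg")
for query in criticism_queries("apricot seeds cure cancer"):
    webbrowser.open(f"https://duckduckgo.com/?q={quote_plus(query)}")
```

The design point is simply that these checks are cheap and repeatable: a photo and a claim can each be cross-examined in a few seconds, which is exactly the “lateral reading” habit the fact-checkers displayed.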

Bottom Line

You can be fooled, but you can also learn new skills that will greatly reduce the possibility of that.

It’s not just about better technology solutions that “they” need to fix; much of the problem rests within our own heads. You are awash with cognitive biases that can be leveraged by those with a malicious intent to manipulate you.

The solution is to take a defensive posture and improve our digital literacy and fact-checking skills.
