Talk by Cecy Correa
- Presented at Refresh Austin, October 2017.
- Presenting at Keep Ruby Weird 2017
Understanding the Psychology of Fake News
“You are entitled to your own opinion, but you are not entitled to your own facts.” — Daniel Patrick Moynihan
“...the Internet was done so well that most people think of it as a natural resource like the Pacific Ocean, rather than something that was man-made. When was the last time a technology with a scale like that was so error-free?” — Alan Kay
The tech that powers fake news:
- URL unfurling (link previews) on Facebook
- Algorithmically selective feeds on Facebook and Twitter
- Blocking
- Together, these create an echo chamber
- There is a whole economy based on fake news: it’s easier than ever to set up a fake news site:
- https://www.buzzfeed.com/craigsilverman/how-the-hyperpartisan-sausage-is-made
- http://www.npr.org/sections/alltechconsidered/2016/11/23/503146770/npr-finds-the-head-of-a-covert-fake-news-operation-in-the-suburbs
- There are even fake news generators: http://breakyourownnews.com/
- Publications also carry their own biases and slants, which are reflected in what they publish
But to say it’s “the internet’s fault” is not entirely accurate
Study: http://www.cjr.org/analysis/breitbart-media-trump-harvard-study.php
Pull quotes:
“While concerns about political and media polarization online are longstanding, our study suggests that polarization was asymmetric. Pro-Clinton audiences were highly attentive to traditional media outlets [...] but pro-Trump audiences paid the majority of their attention to polarized outlets”
“Over the course of the election, this turned the right-wing media system into an internally coherent, relatively insulated knowledge community, reinforcing the shared worldview of readers and shielding them from journalism that challenged it”
“Our analysis challenges a simple narrative that the internet as a technology is what fragments public discourse and polarizes (sic) opinions, by allowing us to inhabit filter bubbles or just read “the daily me.” If technology were the most important driver towards a “post-truth” world, we would expect to see symmetric patterns on the left and the right.”
So is it tech’s fault?
- Yes and no
- Let’s start with the no — no, it’s not tech’s fault because
- Fake news has always been around
- We are “wired” to believe fake news (more on this later!)
- Yes because our tools facilitate this cycle more than ever
Fake news through history: yellow journalism (1890s)
How are we wired to think this way? — The psychology of fake news:
- System 1 / System 2 thinking (Thinking, Fast and Slow, Daniel Kahneman)
- System 1 thinking: fast, automatic, intuitive (beliefs, learned behavior such as riding a bike)
- System 2 thinking: analytical / reason
- Beliefs are hard to change because they are part of our System 1 thinking. Changing a belief is as hard as unlearning how to ride a bike or drive a car.
- The Knowledge Illusion, Philip Fernbach and Steven Sloman
- “As people invented new tools for new ways of living, they simultaneously created new realms of ignorance; if everyone had insisted on, say, mastering the principles of metalworking before picking up a knife, the Bronze Age wouldn’t have amounted to much. When it comes to new technologies, incomplete understanding is empowering.”
- “As a rule, strong feelings about issues do not emerge from deep understanding”... but conviction deepens as more people hold the belief together
- When asked to elaborate on their views, most people realize how shallow their understanding is, and hold their beliefs less vehemently
- Information Diet, Clay Johnson
- Consuming information we agree with is like eating junk food: it tastes delicious to our “brain” — it appeals to System 1 thinking.
- Consuming information that is challenging is like eating healthy food you don’t like — it’s good for you, but it is not satisfying to the brain in the way fake / sensationalistic information is — involves System 2 thinking, which is harder for us to do.
Powerful research groups understand this psychology:
“A Facebook ‘like’, he said, was their most ‘potent weapon’. ‘Because using artificial intelligence, as we did, tells you all sorts of things about that individual and how to convince them with what sort of advert. And you knew there would also be other people in their network who liked what they liked, so you could spread. And then you follow them. The computer never stops learning and it never stops monitoring.’”
Understanding the psychology behind behavior is key.
Okay, so how do you ‘fight’ fake news or misinformation?
Human solutions:
- Teaching tech and media literacy to the masses (easier said than done)
- Careful, as facts may actually hurt you: http://www.businessinsider.com/sociology-alternative-facts-2017-2
- Daryl Davis, How to argue: http://loveandradio.org/2017/02/how-to-argue
- Talking more about psychology and behavior as it relates to the technology we build
Tech:
- Perspective API http://www.perspectiveapi.com/
- “Outside the Bubble” feature on BuzzFeed
- Blue Feed / Red Feed on WSJ http://graphics.wsj.com/blue-feed-red-feed/
- Facebook algorithm and fact checking of fake news http://adage.com/article/digital/facebook-automates-effort-flag-fake-news-fact-checking/310016
- Facebook finally analyzing propaganda claims https://newsroom.fb.com/news/2017/09/information-operations-update/
- Knight Foundation projects to tackle fake news https://www.knightfoundation.org/articles/20-projects-will-address-the-spread-of-misinformation-through-knight-prototype-fund
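To make the first tool above concrete: the Perspective API scores a piece of text for attributes such as toxicity via a `comments:analyze` request. A minimal sketch of building that request in Python (endpoint URL and field names are taken from Google’s public docs and are an assumption here; an API key would be required for a real call):

```python
import json

# Perspective API endpoint (assumption: per Google's public documentation).
# A real call would POST the body below to this URL with ?key=YOUR_API_KEY.
ANALYZE_URL = "https://commentanalyzer.googleapis.com/v1alpha1/comments:analyze"


def build_analyze_request(text, attributes=("TOXICITY",)):
    """Build the JSON body for a Perspective comments:analyze request.

    `text` is the comment to score; `attributes` names the scores we want
    (e.g. TOXICITY). The field layout follows the documented request shape.
    """
    return {
        "comment": {"text": text},
        "requestedAttributes": {attr: {} for attr in attributes},
    }


body = build_analyze_request("You are entitled to your own opinion.")
print(json.dumps(body, indent=2))
```

In a real integration, the response would carry the score under `attributeScores.<ATTRIBUTE>.summaryScore.value`, which a comment platform could use to flag or down-rank inflammatory content before it spreads.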
But can tech companies care about this? What does it look like?
Companies that care about both profit and purpose: https://medium.com/@sexandstartups/zebrasfix-c467e55f9d96
For the future:
In a recent study by Google on Generation Z, they found this generation wants tech makers to “...live up to the standard GenZ has set [...] and continue to inform, inspire, and create products and marketing that facilitate the world which they want to live.”
Source: http://storage.googleapis.com/think/docs/its-lit.pdf
I think this is important — let’s build tools that inspire and facilitate, not divide.