
Hack the Media

tl;dr

A media literacy guide that highlights ways that our media sources can give us a flawed view of the world (also there is a technology industry-specific media guide and a financial media guide).

Images: Puppet cutting its strings (Male); Puppet cutting its strings (Female)

These images are available to use freely (CC BY 4.0). Free stickers can be requested here; the original SVGs are here and here.

Background

There are many ways for motivated parties to “hack” humans by influencing their media sources and fellow community members.

This document collects examples of intentional and unintentional hacks of media. The hope is to unbias these sources and provide antibodies to us all. (This originally started as a general media guide for software engineers, with lessons specific to the tech industry now collected here; there is also a media guide specific to financial media coverage).

Contributions are welcome (see our guidelines), especially when they include vivid and relevant examples. Follow me on Twitter (@nemild) to see more examples of media manipulation.

General Notes

  • Key questions to ask with any media:
    • Why are they writing about this topic, and how do they and/or their sources personally benefit? You always need to understand the personal motivations of the writer and their sources
    • Why is the media distributor (social network, news organization) putting this piece of content in front of me? How do they benefit from having me see this content?
    • Do I "need to know" the topic this content is covering? Would investigating it be a productive use of my time, or could I spend that time investigating other topics?
  • Reader Interest is King: When something is heavily covered, it is often due to large reader interest — not due to the importance of the event for you; media sources often focus on what audiences want to hear, not what they need to hear
    • Editors, commentators, and especially social media algorithms focus on engagement because it maximizes interest and revenue; this is also a key reason for confirmation bias and filter bubbles (read about the coverage of Cecil the Lion, or the US coverage of the war in Yemen, to see what stories sell to an American audience)
    • Journalists are under immense pressure by media owners and investors to provide content that is highly read/watched/shared and therefore most profitable (see this hilarious video clip of Sam Zell, former CEO of Tribune)
    • Journalists focus on "newsworthy" content, which many believe is the content their audience needs to make good decisions. In practice, newsworthy content is what their audience wants to consume (and is also what is most read and most profitable for their publication)
    • Rather than thinking of news as a mirror of what's going on in the world, think of it as a mirror of what people want to read/hear; this "invisible hand of the reader" dictates what content is created and how widely it is distributed
    • Example:

Jon Stewart: “What ...is the role for CNBC? … There were literally shows called Fast Money.”

Jim Cramer: "There's a market for it and you give it to them ... I too like you want to have a successful show ...I'm trying to bring in younger people who really don't want to hear about the stuff ..." (link)

  • Selective facts are everywhere: Selective facts are “true” facts that only tell us part of the story. Partisan news and social media algorithms give us the facts that confirm our beliefs and purposefully exclude important facts that would give us the full picture (for more, see my article on selective facts)

  • Issues with Online Democracy: Social network algorithms and voting mechanisms generally treat each of us equally, which diminishes how much voice is given to experts; this is often an issue on deeply technical topics

  • Empathy Gap: Media and social media forums often have little empathy for the “other” side, as a given audience prefers to see themselves in the best light — and it is more engaging to see the most easily caricatured/dismissed voices on the other side (see opposing political tribes). Newsfeed algorithms and friendship-based social graphs encode this bias based on the user choices they see. Journalists also create content to cater to this demand.

  • Access and the Turning of Journalism into PR: Journalists are sometimes incentivized or subtly threatened to write positively about select topics, as one tech journalist writes:

“It’s a game of access, and if you don’t play it carefully, you may pay sorely. Outlets that write negatively about gadgets often don’t get pre-release versions of the next gadget. Writers who ask probing questions may not get to interview the C.E.O. next time he or she is doing the rounds. If you comply with these rules, you’re rewarded with page views and praise in the tech blogosphere. And then there’s the fact that many of these tech outlets rely so heavily on tech conferences. “If you look at most tech publications, they have major conferences as their revenue,” Jason Calacanis, the blogger and founder of Weblogs, told me. “If you hit too hard, you lose keynotes, ticket buyers, and support in the tech space.”” (link)

  • The Power of Narrative: Humans generally value stories with certain narrative notes (e.g., good vs evil, David vs Goliath, a new technology/startup will make the world a better place); Hollywood writers and journalists face similar incentives to cater to this demand
  • Attention Economy: All media sources are in constant competition against each other for the 'attention' of their audience, in the hopes that they can later exploit that attention (examples: branding, propaganda, donation solicitation, ad monetization). Attention is a very scarce commodity (only 24 hours in a day), so the competition for your eyeballs can reach a fever pitch.
    • John Stankey, chief executive of Warner Media:

I want more hours of engagement. Why are more hours of engagement important? Because you get more data and information about a customer that then allows you to do things like monetize through alternate models of advertising as well as subscriptions ... (link)

Content that sells

Running a social network, a news channel, or a newspaper has some similarities to running a convenience store. Store owners stock and feature the products that sell, not necessarily what their audience needs. Even the most thoughtful media barons struggle with this tension: how do you balance catering to what the market demands against what the market needs?

The “junk food” of the media industry evokes our base emotions, and is the easiest for media sources to sell:

  • Fear
    • Fear is profitable; “If it bleeds, it leads” is a famous saying about the newspaper business; vivid deaths get featured because they maximize engagement — and increase revenues of media organizations and social networks
    • Focusing on these vivid deaths alone often doesn't make sense for good decision making (see my data analysis on death coverage in the NY Times)
  • Outrage
    • A common tactic is to have an article showing an extremist — in a tribe that opposes your own — saying/doing something crazy; audiences often mistakenly use this as a shorthand for the views of the entire opposing tribe, and these are highly shared to highlight the craziness of the other side
    • Outrage-based journalism is a common technique by partisan media sources eager to stoke anger, maximize clicks, and meet the demands of their audience

Media Sources and Hacks

Social Media

Common Issues Across Social Media Platforms

Example Hacks

  • Selective facts (intentional): Share partial facts with your followers (i.e., factual coverage that only shows part of the picture - and conveniently ignores facts that don't support a pre-existing view) (Example: Partisan news organizations; for more background, see my article on selective facts)
  • Fake News: False content that comports with readers’ views (example: fake news)
  • Glurge: False content that aims to "inspire" readers; because it doesn't advocate for a political position, it tends to attract less scrutiny than "fake news" (example: Snopes' page on "glurge")
  • Comment Filtering: When an organization controls the Facebook page or website, comments can be filtered to highlight a favored view (see the sketch after the excerpt below)
    • Example at Missouri Electric Cooperatives (link):

Our plan will be to promote the feel-good activity and news from the event. Comments that are positive will be liked and possibly shared. Comments that are derogatory and/or abusive will be hidden from public view. Commenter receives no notification this hiding has happened.
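
As a concrete illustration, the moderation policy in that excerpt amounts to a simple filter. The sketch below is a hypothetical rendering of it in Python: the Comment type, the sentiment field, and the like() helper are assumptions for illustration, not any platform's actual API.

```python
from dataclasses import dataclass

@dataclass
class Comment:
    author: str
    text: str
    sentiment: float  # assumed output of some sentiment/abuse classifier, from -1.0 to 1.0

def like(comment: Comment) -> None:
    # Stand-in for the platform's "like"/"share" action; illustration only.
    print(f"Boosting comment by {comment.author}")

def moderate(comments: list[Comment]) -> list[Comment]:
    """Hypothetical page-admin policy from the memo above:
    boost positive comments, silently hide negative/abusive ones."""
    visible = []
    for c in comments:
        if c.sentiment < 0:
            continue        # hidden from public view; the commenter is never notified
        if c.sentiment > 0.5:
            like(c)         # "liked and possibly shared" to raise its visibility
        visible.append(c)
    return visible
```

The effect is that readers of the page see a comment section that looks uniformly positive, even if the underlying reaction was mixed.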

Other Issues

  • Selective facts (unintentional): Algorithms favor factual coverage that only shows part of the picture, as this maximizes engagement; this is a key issue with relying on like-minded friends and algorithms that focus primarily on engagement to dictate content
  • Extreme “other”: Lack of empathy for the other side, as it is more engaging to see the most extreme actions of the other side and to ignore the poor actions of the most extreme people on your own side (example: CIA deaths versus limited coverage in the US of CIA actions elsewhere)
  • Clickbait: The decision to click is made on the title alone, creating incentives for clickworthy titles and easily explained content

Reddit and Hacker News

How it works

  • Anyone can join and vote in the community, with everyone's vote counted equally
  • For new posts, the number of upvotes soon after posting determines placement in the front-page ranking (see the ranking sketch after this list)
  • Comments are similarly upvoted, with early comments generally advantaged over later comments
  • The goal is to favor content that is likely to be heavily upvoted by the community of upvoters
  • (Popular input source in startup communities and for junior engineers)
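
To make the first two points concrete, here is a minimal sketch of the kind of time-decayed, vote-based ranking these sites are commonly described as using; the GRAVITY constant and the exact formula are illustrative assumptions, not either site's actual production code.

```python
import time
from dataclasses import dataclass, field
from typing import Optional

GRAVITY = 1.8  # illustrative decay constant; the real sites tune (and keep private) their parameters

@dataclass
class Post:
    title: str
    upvotes: int
    posted_at: float = field(default_factory=time.time)

def rank_score(post: Post, now: Optional[float] = None) -> float:
    """Time-decayed score: early upvotes dominate because the post's age is still small."""
    now = time.time() if now is None else now
    age_hours = max((now - post.posted_at) / 3600.0, 0.0)
    return post.upvotes / ((age_hours + 2) ** GRAVITY)

def front_page(posts: list[Post], size: int = 30) -> list[Post]:
    """The 'front page' is simply the top-N posts by decayed score."""
    return sorted(posts, key=rank_score, reverse=True)[:size]

# A post with 50 votes in its first hour outranks one that collected 200 votes
# over two days -- which is exactly the window an upvoting ring targets.
fresh = Post("New framework announcement", upvotes=50, posted_at=time.time() - 1 * 3600)
old = Post("Thoughtful retrospective", upvotes=200, posted_at=time.time() - 48 * 3600)
print([p.title for p in front_page([fresh, old])])
```

Because the denominator grows quickly with age, a small coordinated group voting in the first hour can outweigh a much larger organic audience arriving later, which is why the hacks below focus on that early window.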

Example hacks

  • Upvoting Ring: Asking your friends and supporters to upvote; this can work despite social network countermeasures
  • Allied Commenters: Get your allies to be the first commenters (which your friends will then upvote and bubble to the top), subtly shaping the views of everyone who reads the content
  • Confirmatory Content: Creating content that justifies the pre-existing views or financial incentives of a subreddit's members; see what the popular views are beforehand, and ape them (example: cryptocurrency subreddits that promote their own currency and discredit competing currencies)

Other issues

  • Tribalism: Tribal behavior by key influencers can determine how certain topics are received (examples: though HN was quite negative toward MongoDB, what would the reaction have been if MongoDB were a Y Combinator company? How does one cryptocurrency subreddit treat another?)
  • No more experts: No distinction is made between experts and others; one layman has the same voting power as the world's most thoughtful expert (example: a non-engineer vs. the world's most thoughtful database expert on MongoDB posts); readers may also not take the time to understand the background/expertise of the writer
  • Militant Minority: The upvoting and posting community is likely small compared to the readership, giving a small group of motivated users a lot of power; motivated users are often people who personally benefit from a post

Facebook and Twitter Feeds

How it works

  • Algorithms take newly posted content from (often like-minded) friends/followers and decide what to feature so that user engagement (clicks, likes, shares/retweets) is maximized; a sketch follows this list
  • Unlike the Reddit model, the algorithm focuses on maximizing engagement for each individual user, not for a broader, more diverse community
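
As a rough contrast with the community-wide ranking above, here is a minimal sketch of a per-user, engagement-ranked feed. The feature names, weights, and predict_engagement logic are assumptions for illustration, not any platform's actual algorithm.

```python
from dataclasses import dataclass

@dataclass
class Candidate:
    author: str
    topic: str
    is_outrage_bait: bool  # crude stand-in for "evokes a strong emotional reaction"

@dataclass
class UserHistory:
    followed: set[str]              # often like-minded friends/followers
    clicked_topics: dict[str, int]  # how often the user engaged with each topic before

def predict_engagement(user: UserHistory, post: Candidate) -> float:
    """Toy score for how likely this particular user is to click/like/share the post."""
    score = 0.0
    if post.author in user.followed:
        score += 1.0                                       # content from people you already follow
    score += 0.2 * user.clicked_topics.get(post.topic, 0)  # more of what you clicked on before
    if post.is_outrage_bait:
        score += 0.5                                       # outrage reliably drives engagement
    return score

def build_feed(user: UserHistory, candidates: list[Candidate], size: int = 10) -> list[Candidate]:
    # Ranked per individual user, not by a broader, more diverse community of voters.
    return sorted(candidates, key=lambda c: predict_engagement(user, c), reverse=True)[:size]
```

Nothing in this objective rewards accuracy, representativeness, or usefulness for decisions; it only rewards predicted engagement, which drives the confirmation-bias loop described under "Other issues" below.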

Other issues

  • Confirmation Bias: Extreme degree of confirmation bias
    • We click on content that justifies our individual views without interacting with the full body of research on a topic; newsfeed algorithms see these choices and give us more of the same
    • This gives us an irrational confidence that we know what “reality” is, because all the evidence we see justifies a certain view - and the only opposing voices that break through are the most extreme, outrageous ones, which can be easily dismissed

Potential Antibodies

  • Engagement is not reality: Realize that media sources (both content creators and distributors, like social networks) focus on maximizing "engaging" content; this often means extreme or outrageous events, which don't give you a good sense of what is really going on

  • Wants vs Needs: Social feed algorithms focus on user "wants", but there is huge value in determining your "needs"

  • Realize huge degree of confirmation bias on Twitter/Facebook — and echo chambers in every social network

    • Ari Paul, CIO Blocktower:

“When I tweet anything positive about cryptocurrency it gets 10x the likes/shares as anything negative…” (link)

This example shares a lot of similarities with publication bias in the sciences. Reader interest influences what content is produced ("the invisible hand of the reader") and distributed on social networks. It also influences what research scientists do, as their research choices can be influenced by virality, clicks, and getting papers published.

  • Avoid trolls (see prettydiff's great guide)
  • Realize that it is treated as newsworthy/relevant to see the excesses of the opposite side, but not the excesses of your own side (this leads to an empathy gap); this applies across religions, countries, and political tribes
    • Noah Smith: “There are always a handful of people out there doing any stupid, crazy, or annoying thing you can imagine. And the media has an incentive to find those people and shove their excesses in your face.” (link); social media algorithms especially value these extreme events, as they help them maximize engagement
    • My research on death coverage in the NY Times (part 1, part 2): NY Times Terrorism Coverage vs. Deaths

Journalism

How it works

  • Trained (and untrained) journalists research various topics and work with editors to publish on blogs and in print news
  • Today, media organizations most often make money from ads; in some cases, they make money from subscription fees (in the past, classifieds and subscription fees were more important funding sources)
  • At its best, goal of journalism is to give the "facts [we] need to make good decisions"
    • A baser goal is to maximize eyeballs and the number of paying subscribers by providing content that audiences want to read
    • A news site can have some similarities to a convenience store that determines product placement to maximize sales (maximizing viewership is a key reason for the journalistic saying "If it bleeds, it leads")

Example Hacks

Other Issues

  • Engagement is king: Reader interest and social media algorithms prize engaging content, rather than information that leads to good decisions for the audience (examples: the feverish coverage of Cecil the Lion; why atrocities in Yemen are not well covered in the US press)

    • Coverage is a function of reader interest; fundamental problem with using media to make decisions

      • What’s covered is not the same as what’s important for the decisions we make each day, though the latter is a key purpose of journalism
      • Journalists will say they're covering important, "newsworthy" stories. Social media product designers will say they're surfacing "relevant" content. Both techniques surface the content readers want to read - and what is profitable for the media organization. (most journalists don't have the luxury to distinguish between "newsworthy" and the facts their readers need for good decisions)
      • Focus on newsworthiness and relevance leads to substantial “sampling bias.” An old journalistic saying is "If it bleeds, it leads." Even though they don't give you a representative view of what's going on in the world, vivid deaths get the most engagement from readers and attract an audience: New York Times Coverage of Terrorism and Homicides
    • Opportunity to create fake content that meets reader interest (example: Jim Cramer talking about how easy it is to create fake news that Apple’s original iPhone isn’t selling well, allowing him to profit off a fall in the stock price, 2016 US Election)

    • Important content that we need but that isn't widely read/shared won't be produced (example: David Attenborough criticizing the BBC for a lack of art programs because they don't attract an audience)

  • Getting past PR: PR steers journalists in favored directions and sometimes implicitly threatens things journalists value, like access to a CEO (example: public relations, marketing, and advertising at Theranos)

  • Access: Some journalists may trade favorable coverage for access/tips; businesses/politicians favor journalists that "toe the line"

  • Journalists:

    • try to maintain good source relationships (e.g., with investors) and preserve access to future information
    • manage substantial story deadlines with limited time (example: Paul Graham’s post on maximizing PR and the importance of making stories easy for journalists)
    • need to show strong engagement metrics to ensure job security (example: Greta Van Susteren at MSNBC)
    • can be fired or influenced to prevent unfavorable coverage, even on seemingly innocuous beats like business or technical reporting (example: a Newsweek firing for critical coverage of Newsweek’s parent company) (there are obviously much worse things that happen to journalists; my focus here is tech reporting)
    • use “three’s a trend” as a common heuristic, despite the fact that this would be laughable to most statisticians (examples: the Great Clown Scare of 2016, anecdotes vs. data)
    • often overweight single data points to make overly broad generalizations (e.g., the first Tesla fatality calling into question whether autonomous driving will ever be safer; the low usage of a cryptocurrency collectible game implying the entire use case will never be successful)
    • are susceptible to narrative - and confirmation bias - which then influences their follow-up coverage (example: the "Facebook is bad" narrative in early 2018 meant that all of Facebook's decisions were suspect, even those with a lot of nuance)
    • have a bias to cover subjects (entrepreneurs, companies, technologies) that will “make the world a better place”, as this is an affirming message their readers value; this motivates parties that want coverage to stress these elements, no matter how unrealistic they sound or how unclear their impact is (example: Theranos's early media message and the fawning coverage it received)
    • cover events less as they become more common, even though that may not be right for the best decision making (example: shooting deaths vs. terrorism deaths in the US)
    • (though this may seem critical of journalists, it is primarily critical of the readership and the financial incentives many journalists face; it also suggests that statistics needs to be more widely taught in journalism)
  • Blacklisting criticism: Editors sometimes lock out writers who thoughtfully critique their publication's work, reducing the likelihood that journalists criticize publications they want to be published in (At the Washington Post, "the paper could not take such writers who supported strong condemnation of [The Post's] work")

  • Media frenzy: A singular event is reported, and a frenzy of media coverage ensues on stories that fit this narrative; the precipitating event:

    • validates that readers/viewers care about this issue and that, for news editors, there are impressions and profits to be made (see Cecil the Lion’s coverage at the Washington Post for one editor's thought process about what to cover)
    • encourages insiders/critics to leak more information and to seek out receptive journalists looking for it (example: one journalist searching for more leaks after the FB/Cambridge Analytica issues)
    • makes journalists invest more in this topic, uncovering new issues (this also has a risk of confirmation bias, where stories are reported that fit the narrative, see previous FB example; those that don’t are poorly reported, deemed un-newsworthy, or simply ignored by the audience)
    • Targeted groups (startup founders, employees of a company, a politician) get deeply defensive, since a media frenzy is not always fair. These targeted groups fixate on the stories that are inaccurate or unfair, which then (inappropriately) leads them to dismiss most criticism, even critiques that a thoughtful observer would consider fair.
    • Examples: Facebook and Cambridge Analytica -> Facebook is bad, WSJ story about Theranos -> Theranos firestorm

Potential antibodies

  • When something is heavily covered, that is substantially due to large reader interest, not due to the importance of the event for you; there is a risk of confirmation bias and sampling bias if you don't adjust for this
    • Paul Graham: “The number of news stories about a problem is not a sign of how serious it is, but of how much demand there is for stories claiming so” (link)
    • Tren Griffin: "The journalistic formula of 2018 so far seems to be: I found a few bat shit crazy people in region X doing Y. Therefore the practice of doing Y is widespread in that region X. A few anecdotes are not data establishing something is a widespread practice." (link)
    • My research on death coverage in the NY Times and risk assessment (link)
  • Realize vivid stories are powerful for user engagement, while lots of drier data and stories that inform good decisions are less monetizable (example: Shooting of Australian in US - and media needs/pre-existing views of Australian readers)
  • When you see something covered, ask yourself who is motivated to have it covered this way
    • Especially valuable in laudatory profiles on a company or person
    • In leaks, who could have leaked it and what was their motive? What important information might be unleaked?
  • When a PR-like piece is shared (A24 in GQ, IBM Watson, a laudatory profile), ask what the covered party’s motivation is to get the word out now (recruiting, sales, corporate branding)
  • Distinguish between guest written or sponsored pieces, and something written by the staff
  • Rather than looking to journalists to understand journalistic incentives, look to statements from media business owners and editors
  • Read All the News That’s Fit to Sell (Stanford Professor James Hamilton) and Public Opinion (Walter Lippmann) for more on how events are turned into news
