<h1>Fake News Is a Marketing Feature, Not a Hack: Part 1</h1>
<p><em>Published November 12, 2020</em></p>
<p>Truth is the arbiter of reality: a sacred, unbiased, and unwavering lens through which we view and understand the universe.</p>
<p>At least, truth used to hold this esteemed responsibility.</p>
<p>In the digital world, unearthing “truth” is no longer simple or reliable. Search results, social media, legitimate and illegitimate news organizations, and paid advertising overflow with misinformation and disinformation.</p>
<p>This phenomenon is summed up in two words: fake news.</p>
<p>Fake news is the most powerful and socially destructive marketing technique of the 21st century. The fake news pandemic is global and unyielding, and we are all susceptible to its infection. Our widespread vulnerability is exactly why it’s crucial for marketers to understand why and how disinformation is created, spread, and—most importantly—combatted.</p>
<p>This article is the first in a three-part series about the relationship between fake news and marketing. This post lays the groundwork needed to ensure we’re all on the same page about what fake news is, why it’s a problem, and how it relates to marketing. The second article discusses how disinformation affects our brains and manipulates our behaviors. I wrap up the series by examining how the surveillance-based business model perpetuates fake news and what can be done about it.</p>
<p>In the simplest terms, fake news is optimized disinformation.</p>
<p>Optimized disinformation has a veneer of legitimacy and commonly rewrites “the truth” using advertising, fabrication, manipulation, political satire, and propaganda.</p>
<p>Usually, this tactic is employed to manipulate the beliefs, motivations, and actions of like-minded people. The strategy is also used to sow confusion around polarizing topics, stymie constructive public discourse, and erode trust in traditional paragons of truth, like scientists, journalists, and healthcare officials. In some egregious cases, optimized disinformation is used exclusively as a marketing and money-making tool, such as the InfoWars Sandy Hook conspiracy.</p>
<p>Although disinformation polymorphs into many nebulous disguises with equally shadowy goals, few types of disinformation are more sinister—or more effective—than junk news, which exploded in popularity during the 2016 U.S. presidential election.</p>
<p>Because junk news is so popular and spreads so effectively on social media, it is the type of fake news I reference most in this series.</p>
<p>In the research paper “Disinformation Optimised: Gaming Search Engine Algorithms to Amplify Junk News,” Samantha Bradshaw, a doctoral researcher at the Oxford Internet Institute, defines junk news as a website that misleads people by employing at least three of the five deceptive practices identified in the study.</p>
<p>Junk news websites that the study evaluated include InfoWars, Breitbart, Zero Hedge, CNS News, Raw Story, The Daily Caller, and The Federalist. There are also more than 450 hyper-partisan websites that often get labeled as fake news, but these groups usually peddle inflammatory misinformation rather than blatantly optimized disinformation.</p>
<p>Granted, at times the distinction is fairly thin.</p>
<p>And unfortunately, Google doesn’t do a great job of distinguishing between legitimate and junk websites.</p>
<h2>How Does Disinformation Relate to Marketing?</h2>
<p>Disinformation campaigns maliciously leverage every nuance of the surveillance-based business model that search engines and social networks are built around.</p>
<p>I’ll dive into the nuts and bolts of this topic in the third article of this series, but here are a couple of examples until then.</p>
<p>First, let’s look at junk news and advertising.</p>
<p>In 2019, the Global Disinformation Index (GDI), a UK nonprofit that rates websites’ trustworthiness, analyzed programmatic advertising rates among 1,700 junk news websites. The analysis shows that 70 percent of these websites ran programmatic advertising and earned $235 million from those ads.</p>
<p>Several of the household-name brands mentioned in the GDI report that inadvertently bankrolled junk news sites include Audi, Sprint, Honda, Office Max, American Airlines, Casper, and Oxford University.</p>
<p>Now, let’s take a gander at organic search.</p>
<p>In 2016, Google’s search algorithms failed miserably at providing accurate information for an extremely serious question: did the Holocaust happen? At the time, Google’s answer was “no.”</p>
<p>As The Guardian reported, the top result was a link to the article “Top 10 Reasons Why the Holocaust Didn’t Happen,” published by stormfront.org, a neo-Nazi site. The algorithmic failure didn’t stop there.</p>
<p>The third result was the article “The Holocaust Hoax; IT NEVER HAPPENED.” The fifth position belonged to “50 Reasons Why the Holocaust Didn’t Happen.” The seventh position was a YouTube video, “Did the Holocaust Really Happen?” And the ninth result was “Holocaust Against Jews is a Total Lie – Proof.”</p>
<p>After this event sparked global outrage, Google tweaked its algorithm to change the search results and prevent similarly optimized disinformation from ranking for the term.</p>
<p>The algorithm changes had a noticeable effect on four junk news websites with significant organic keyword growth (InfoWars, Zero Hedge, Daily Caller, and Breitbart), the Oxford Internet Institute report shows. Since August 2017, the report states, all four top-performing domains have appeared less frequently in top positions for non-branded Google searches, based on the keywords they were optimized for.</p>
<p>Despite the progress, Google’s algorithms still have a long way to go.</p>
<p>For example, take the phrase “climate change hoax.” Ahrefs shows the phrase gets 1,000 monthly searches. As of Oct. 26, 2020, three of the top 10 results were disinformation.</p>
<h2>The Consequences of Fake News</h2>
<p>Before the 2016 election, I was naive about the insidious reach and power junk news websites and fake news have across the world. Here are three significant consequences of fake news that we’ve seen unfold in the past few years.</p>
<h3>Conspiracies</h3>
<p>Pizzagate, QAnon, birtherism, climate change denial, anti-vaxxers, Holocaust denial, COVID-19 being fake … the list of new, widely supported conspiracies is nearly endless.</p>
<p>For years, algorithmic failures and a lack of gatekeeping at tech giants like Google, Facebook, and Twitter perpetuated these shared delusions and allowed conspiracies to flourish.</p>
<p>Some companies are taking action against conspiracy-related fake news, such as Twitter and Facebook shutting down QAnon accounts in July of 2020, but these reactionary measures often come too late.</p>
<p>Unfortunately, the fake news marketing tactics conspiracy pushers use have already proved successful and influenced their target audiences’ beliefs.</p>
<p>As an example, let’s look at how fake news marketing amplified the absurd QAnon conspiracy, which is associated with a string of violence and which the FBI labeled a domestic terrorist threat in 2019.</p>
<p>The conspiracy, in case you’ve remained blissfully unaware, spawned on 4chan in late 2017 around the notion that Donald Trump is secretly fighting a “deep state” cabal of cannibalistic child sex-traffickers and satanic cultists. Since its inception and rise in popularity, the conspiracy’s ideology has become more malleable and adopted other popular delusions, such as promoting the lie that COVID-19 does not exist.</p>
<p>Ironically, the COVID-19 pandemic spurred exponential growth in QAnon. In March of 2020, membership in the largest public QAnon Facebook groups grew by 700 percent, the BBC reported in July. The report ties the growth in popularity to increased internet use and greater exposure to junk news and social media disinformation during quarantine.</p>
<p>An October report by CBS and Wired highlights how data collection techniques, marketing tools, and content recommendation algorithms from Facebook, Twitter, YouTube, and Google create a self-fulfilling prophecy and “rabbit-hole” for users who search for QAnon content or who have demographic markers associating them with users who participate in these conspiracy groups.</p>
<p>These systems start pushing advertisements, videos, hashtags, trending content, sponsored content from junk news sites, people to follow, and online communities that feed users more and more conspiracy content and confirmation.</p>
<p>The conspiracists went from having a negligible amount of political power to earning enormous political capital in an extremely short amount of time.</p>
<p>As of September 2020, a Daily Kos/Civiqs poll shows that 86 percent of Americans have at least heard of the QAnon conspiracy, compared to 65 percent in 2019.</p>
<p>These conspiracy zealots are now actively shaping the landscape of U.S. politics. There are currently 24 congressional candidates who publicly support and advocate for QAnon conspiracies and are on the ballot for House races in the 2020 election.</p>
<h3>Election Interference</h3>
<p>As the 2016 and 2020 presidential elections show us, election interference and widespread voter misinformation are the bread-and-butter outcomes of fake news campaigns. Enormous amounts of content have been written about this subject, so I’m not going to rehash that information or harp on it. Instead, I’ll quickly highlight just how easy voters’ behaviors make these tactics to execute.</p>
<p>July and September reports by the Pew Research Center show that 26 percent of U.S. adults get their news from YouTube, and 18 percent of U.S. adults say social media is their primary source for political and election news.</p>
<p>Of the YouTube news crowd, 23 percent get their news from independent YouTube channels, and 14 percent of those channels publish videos primarily dedicated to conspiracy theories.</p>
<p>For the traditional social media crowd, only 17 percent can answer at least eight out of nine questions correctly about foundational political knowledge, such as which party supports certain policy positions.</p>
<p>These statistics soften the blow of the findings that follow.</p>
<p>During the 2016 election, Twitter users shared as much “junk news” as professionally produced news about politics, the Oxford Internet Institute reports. When fake news gets into the hands of its target demographics, users do an excellent job of spreading the disinformation without being any the wiser.</p>
<p>And when fake news spreads, it does so extremely fast.</p>
<p>“True news, for example, took about six times as long as false news to reach 1,500 people on average—and false political news traveled faster than any other kind of false news, reaching 20,000 people almost three times as fast as other categories reached 10,000,” Vice reported in 2019.</p>
<p>When this false news is politically inflammatory and has the KPI of manipulating and intimidating voters, the outcome can sway elections and invalidate votes.</p>
<p>As NPR reported in October, “One false rumor circulated in Texas that bar codes on mail-in ballot envelopes can reveal personal information, including whether the voter is a Republican or a Democrat.” After receiving several ballots with blacked-out bar codes, Tarrant County Elections Administrator Heider Garcia took to Twitter and posted a video warning voters that this could lead to their ballots being rejected.</p>
<p>Although the people who succumb to these disinformation campaigns may be a minority of voters, sometimes that’s all it takes. For example, in the 2000 presidential race, 0.01 percent of votes swung the election. In the 2016 election, 0.72 percent of Pennsylvania voters decided who won the state.</p>
<h3>Radicalism</h3>
<p>Extremists have a long history of using disinformation and propaganda to gather recruits and sway minds. But, as Cambridge Analytica’s 2016 disinformation campaign showed, the sheer reach and pinpoint user targeting of paid advertising and social media have spurred growth in these ideologies.</p>
<p>In the European Union’s 2019 case study, “Understanding Citizens’ Vulnerabilities to Disinformation and Data-Driven Propaganda,” researchers analyzed how social networks became platforms where disinformation spreads exponentially fast. They determined that fake news, often political propaganda, would surge in popularity and then be uncritically picked up and redistributed to an even larger audience by traditional media outlets and junk news websites.</p>
<p>The EU report’s findings demonstrate that today’s society is increasingly vulnerable to disinformation operations. The vulnerability stems from “information overload and distorted public perceptions produced by online platforms algorithms built for viral advertising and user engagement.”</p>
<p>When content creators and bad-faith actors push content that spews actionable polarization through these disinformation channels, such as the George Soros conspiracy theories or the COVID-19 disinformation infodemic, the results spread radical ideologies and behaviors to larger user groups, some of whom eventually take action.</p>
<p>A 2019 report by PBS shows how paid advertisements, content amplification options, and more “have rejiggered the landscape of content visibility on social media websites” to inspire more radical behavior.</p>
<p>But how does somebody go from reading a false blog article or seeing a politically charged meme to becoming a domestic terrorist?</p>
<p>It’s all about how these stories are made and marketed.</p>
<h2>How Fake News is Made</h2>
<p>The marketing strategy behind fake news—despite being entirely unethical, dangerous, and socially destructive—is brilliantly executed.</p>