How ‘non-facts’ contaminate issue discourse

Tony Jaques
RMIT University


One of the most contentious areas of societal discourse revolves around what is and is not a ‘fact’. And the evolution of social media has further blurred that crucial distinction in what is becoming known as a post-truth society. But beyond the question of what is and is not a fact lies a worrying phenomenon which might be called a ‘non-fact’ – something which is not true yet has come to be accepted as true. Such non-facts can contaminate legitimate discussion on important issues of the day. The focus here is not on high-profile publicly contended matters such as climate change denial, or whether windfarms cause health effects, but on the more insidious process by which non-facts ‘become’ facts and are regarded as the norm, accepted beyond contention.

A vivid example of the operation of a ‘non-fact’ concerns how society managed the turn of the last century. Experts such as the Royal Observatory at Greenwich and the US Naval Observatory were unanimous that the new millennium began on January 1, 2001. But popular belief was that it began on January 1, 2000. That false idea effectively ‘became’ a fact, and governments and communities around the world spent hundreds of millions on massive millennium celebrations on the ‘wrong day’. Needless to say, the few brave souls who dared to point out the error were accused of being pedants and curmudgeonly spoilsports. As the fantasy fiction author Terry Pratchett (2008) has written: ‘Just because things are obvious doesn’t mean they’re true.’

Such things happen for a variety of reasons, including lazy journalism, the decline of basic fact-checking, and the echo chamber of social media, which is then reflected in mainstream news and commentary. Moreover, the outcome is not always as innocuous as society celebrating the millennium a year early. But this case captures the two underlying mechanisms by which non-facts become facts: firstly, persistent repetition; and secondly, the fact that people would like the claim to be true – or, in some cases, that a specific stakeholder or activist group wants it to be seen as true (as we will see shortly).

Unlike modern myths, which often have an element of impish misinformation, non-facts typically have no such underlying ambiguity and are unquestioningly used and cited in serious and even scholarly sources. Take a simple example from World War II (acknowledging, of course, that ‘truth is the first casualty of war’). It has become little more than a cliché that the summer of 1940, during the Battle of Britain, was one of the warmest and clearest spells of weather for years. The claim has appeared for decades in innumerable books, documentaries and movies. Yet experts who have studied the meteorological records confirm that the summer of 1940 was no different from the usual weather pattern over southern England. While it is not clear how and when this non-fact began, it is reasonable to assume it was part of the romantic narrative of brave young Spitfire pilots in the summer sky saving the country (even though the majority of aircraft involved in the campaign were Hurricanes, not Spitfires – another common misconception).

Such examples highlight that non-facts are quite distinct from lies, rumours, hoaxes, urban legends, falsehoods, or what is now called ‘fake news’. Indeed, there is no agreed language to define the non-fact. One man who tried was the American writer Norman Mailer, who coined the useful term ‘factoid’. He intended it to describe an item of information which sounds as though it could be true, and which becomes accepted as fact even though it is not actually true. Examples include ‘Eskimos have a hundred words for snow’ or ‘The Great Wall of China is the only man-made structure which can be seen from space’. Or, as Mailer (1973) put it: ‘… facts which have no existence before appearing in a magazine or newspaper.’ These days we can add social media to that list.

Extraordinarily, the term ‘factoid’ has now also come to have exactly the opposite meaning, being commonly applied to those ‘strange but true’ snippets much loved by newspaper subeditors with an awkward space to fill, or by compilers of online listicles. As a lover of language, Mailer would doubtless have enjoyed the irony that the Oxford English Dictionary now accepts both contradictory meanings.

In this same context, respected dictionaries have more recently accepted another new word to define a non-fact, this one created by American comedian Stephen Colbert in 2005. In his comedic television persona as a right-wing media commentator, Colbert introduced the term ‘truthiness’, defined as an argument or assertion a person claims to know intuitively ‘from the gut’, or because it ‘feels right’, without regard to evidence, logic, intellectual examination or facts. The word generated extensive media coverage and discussion at the time, not only because it tapped into a legitimate concern but also because it filled an apparent gap in modern vocabulary. Indeed, although Marc Peyser (2006) in Newsweek described ‘truthiness’ as ‘a fake word by a fake newsman’, it was selected by both the American Dialect Society and the Merriam-Webster Dictionary as their respective ‘Word of the Year’.

While Colbert’s concept of truthiness was originally used for satirical effect, it illustrates a central aspect of the non-fact which bears directly on modern issue discourse, namely the idea that non-facts not only seem like facts, but that we would like them to be true, and that it would suit us if they were true. As Colbert himself explained: ‘Truthiness is sort of what you want to be true as opposed to what the facts support’ (Steinberg, 2005).

The same idea was evident in the 2016 Trump election campaign, which was widely criticised for its promotion of untruths. Indeed, Trump promoted his own description of the technique, which he called ‘truthful hyperbole’. His bestselling book The Art of the Deal (Trump and Schwartz, 1987) said: ‘People want to believe that something is the biggest and the greatest and the most spectacular. I call it truthful hyperbole. It’s an innocent form of exaggeration – and it’s a very effective form of promotion.’ New Yorker magazine (Mayer, 2016) described it as an artful euphemism, needed to ‘put an acceptable face on Trump’s loose relationship with the truth’. Moreover, ghostwriter Tony Schwartz, the man who coined the euphemism, later disavowed the term as ridiculous. But Trump’s ‘innocent exaggeration’ evidently struck a chord with the electorate and helped propel the candidate into the White House. In the wake of the US election and the UK Brexit referendum, Oxford Dictionaries announced ‘post-truth’ as its 2016 ‘Word of the Year’.

Moving from the intellectual conceit of Mailer, the satiric intent of Colbert and the ambition of politicians, we see the effort by modern stakeholder and activist groups to frame and influence societal understanding. Added to the reality that people actually want certain ideas to be true, the first foundation of a non-fact is constant repetition, which has long been identified as a potent mechanism. The observation that ‘a lie told often enough becomes the truth’ is widely attributed to Vladimir Lenin, and the technique of ‘the big lie’ is associated with Nazi propagandist Joseph Goebbels. Repetition helps build credibility and can quickly turn a casual observation, an unwarranted assumption or even an outright lie into a seemingly credible non-fact. A distressing example of the extraordinary persistence of some non-facts is the claim that childhood vaccination can cause autism. This belief began with a now-discredited paper published in The Lancet in 1998 by Dr Andrew Wakefield and his colleagues, most of whom later withdrew their support for its conclusions (Wakefield et al., 1998). After an exhaustive official investigation, the paper was declared to be fraudulent and was retracted, and Dr Wakefield was struck off the medical register for deliberate falsification.

Yet this dangerous non-fact continues to thrive, driven by anti-vaccination activists around the world and aided by the power of the internet and the participation of high-profile campaigners such as former Playboy model and TV celebrity Jenny McCarthy. And unlike some non-facts, this one has a very real impact. The non-fact that vaccination is a health risk has ‘become’ an actual health risk, with thousands of parents who refuse or delay vaccines exposing their own children, and other children, to life-threatening diseases and triggering outbreaks of some easily prevented illnesses.

As in this case, the process of constant repetition in creating such non-facts is further accelerated when celebrities become involved, regardless of the truth or of their credentials. One of the first modern celebrity interventions in a serious scientific issue came in 1989, when actress Meryl Streep was recruited as part of an activist PR blitz against the chemical Alar, used in the production of apples (Jaques, 2011). Thousands of media reports across the United States repeated wildly unfounded allegations against the chemical. Although most experts did not believe Alar posed a credible health concern, the campaign caused nationwide panic among American consumers and government regulators, who forced the product off the market and temporarily brought the apple industry to its knees. The American Apple Producers’ PR advisor Frank Mankiewicz said at the time: ‘We got rolled. When you’re dealing with a nutritionist named Meryl Streep, you haven’t got a chance’ (cited in Patterson, 2005, p. 110).

Health is certainly a fertile breeding ground for non-facts, and the Internet is a major engine of conspiracy theories. For example, a 2014 survey conducted through the internet research company YouGov found that almost half of Americans believe in medical conspiracy theories, with 20 percent believing that cell phones cause cancer and that large corporations are keeping health officials from doing anything about it (Shute, 2014). The study, published in the journal JAMA Internal Medicine, concluded that people who are firm believers in medical conspiracies are less likely to get regular medical checks and are more likely to buy organic food, shun flu shots and sunscreen, and use vitamins and herbal supplements.

Both the groundbreaking Alar case and the later anti-vaccination crusade have a strong anti-corporate tone, taking us to the second foundation of the non-fact: when a particular stakeholder or activist group wants something to be seen as true and has a vested interest in promoting that view. This is not to suggest that all activist groups are dishonest or deliberately misleading. But many do operate on a different set of rules, often based on the belief that the end justifies the means. Just as political revolutionaries may argue that ‘treason against tyranny is no crime’, some activists firmly believe that breaking the law to oppose ‘corporate tyranny’ is acceptable or even desirable. Examples would include trespassing onto a research farm to uproot genetically modified crops, or breaking into a laboratory to ‘liberate’ test animals. The same approach can be used to justify the promotion of non-facts in support of what is seen as a just cause.

Unlike corporations, activist organisations have a certain freedom in that they are most often not answerable to investors or corporate regulators, they usually don’t have to reveal their sources of income or affiliations, and typically don’t have valuable assets at risk in the event of litigation. The distinction between corporate and activist roles and responsibilities was addressed by well-known commentator Paul Holmes (2002) in the magazine PR Week.

Another great thing about running an NGO is that credibility is not contingent upon competence. Think of Greenpeace, which experienced a surge of support after it successfully defeated Shell’s plans to sink the Brent Spar oil rig in the North Sea – despite scientific consensus that sinking it was the environmentally-friendly solution. Activists can be on the wrong side of the scientific debate and still emerge with their reputation enhanced.

In addition to identifying a recognised contradiction, this commentary is notable for an unintended reason, namely that Holmes himself had fallen victim to the activists’ promotion of a non-fact. The actual plan in 1995 was to tow the Brent Spar oil storage platform from the shallow North Sea for disposal in the deep Atlantic, not to sink it in the environmentally-sensitive North Sea.

However, Greenpeace actively encouraged this false perception. Challenged in a BBC documentary as to whether Greenpeace deliberately misled the public, organiser Jochen Vorfelder of Greenpeace Germany replied: ‘If you are a political pressure group you have to be naughty. That’s fine with me. If you talk about the principle of saving the seas, it doesn’t matter’ (BBC, 1995). Though perhaps it did matter in Germany, where belief that the North Sea was at risk led protesters to fire-bomb some Shell petrol stations. The Greenpeace strategy built around this simple non-fact about the proposed disposal of the Brent Spar was certainly successful: the false claim was repeated not only by an experienced communication expert like Paul Holmes, but regularly elsewhere, including in respected textbooks.

Staying with the oil industry, the notorious Exxon Valdez oil spill along the Alaskan coast in 1989 was consistently described as the worst, or one of the worst, oil spills in history. Some environmentalists and the media had good reasons to promote this idea, but it was a classic non-fact. It may have been one of the most highly publicised spills, but the volume of oil which leaked from the tanker was exceeded by 30 or more bigger spills around the world. In 2010 the Exxon Valdez was overtaken in public perception by the Deepwater Horizon disaster in the Gulf of Mexico, which then became the ‘world’s worst oil spill’. But again, this is a frequently repeated non-fact. It was certainly the biggest accidental spill, but it involved less than half as much oil as was deliberately released into the Persian Gulf in 1991, when Iraqi forces attempted to prevent American forces from landing in Kuwait during the Gulf War by opening valves at an offshore oil terminal and dumping oil from tankers. At one level such labels might be deemed unimportant, but they demonstrate that some stakeholders have very obvious reasons to promote non-facts, and why these false statements find such an easy audience with the news media. They also demonstrate the self-evident reality that statistics are a rich source of non-facts.

Professor Joel Best of the University of Delaware has explored the activist use of what he calls ‘social statistics’ and their non-fact cousins, ‘mutant statistics’, which are hard to retract once they are in circulation. Best (2001) instances the very widely circulated statistic that an estimated 150,000 American women die each year from anorexia. He traced this number back to its source, which actually said that 150,000 young women suffer from anorexia, which can lead to death. The actual number of deaths is about 70 per year, but Best said feminist activists mutated the number of cases into the number of deaths, which produced a dramatic and memorable non-fact. He also points out that this mutant statistic should have been easily disproved, as anorexia typically affects younger women and a much smaller total of females aged between 15 and 44 – about 55,000 – die from all causes in the United States each year (Best, 2001, p. 64). Best believes such mutant statistics are not necessarily evidence of dishonesty. ‘Many advocates are perfectly sincere, yet innumerate,’ he says. ‘However, there is also deliberate manipulation, conscious attempts to turn statistical information to particular uses. Whether they are sincere or cynical, advocates prefer dramatic statistics, numbers that make the problem seem as serious – and the need as urgent – as possible’ (p. 94). He concluded that the more dramatic a number’s implications, the more likely it is to be repeated, and that ‘drama ensures repetition, while innumeracy discourages critical thinking.’
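Best’s sanity check is, at bottom, simple arithmetic: a cause-specific death toll can never exceed deaths from all causes in the affected group. The short sketch below is a minimal illustration of that upper-bound test, using only the approximate figures quoted above; the numbers come from Best’s example, but the code itself is not a method he describes.

```python
# Plausibility check for a quoted statistic, using the approximate
# figures cited by Best (2001). A cause-specific death toll can never
# exceed deaths from ALL causes in the affected group, so the all-cause
# total acts as a hard upper bound.

claimed_anorexia_deaths = 150_000  # the widely circulated 'mutant statistic'
all_cause_deaths = 55_000          # approx. annual US deaths, women aged 15-44
recorded_anorexia_deaths = 70      # approx. deaths actually recorded per year

# Upper-bound test: if the claim exceeds all-cause deaths, it is impossible
if claimed_anorexia_deaths > all_cause_deaths:
    print(f"Impossible: {claimed_anorexia_deaths:,} claimed deaths exceed "
          f"{all_cause_deaths:,} deaths from all causes in the group")

# How far the mutant statistic overstates the recorded figure
factor = claimed_anorexia_deaths / recorded_anorexia_deaths
print(f"The claim overstates recorded deaths by a factor of about {factor:,.0f}")
```

Any statistic that fails such a test can be dismissed without further research – exactly the critical step that, as Best observes, innumeracy discourages.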

Such ‘drama and repetition’ is illustrated more recently in the debate over the impact of plastic waste, which has frequently featured the claim that 500 million plastic straws a day are used in the United States. This appealing number was first proposed in 2011 by nine-year-old Milo Cress and has appeared many times in some of the world’s leading media outlets. However, market research firms put the actual figure between 170 million and 390 million (Chokshi, 2018). The symbolic use of such ‘social statistics’ is acknowledged by Cress, now aged 17: ‘The precise number is less important than the waste. We use far more straws than we need to, and really almost any number is higher than it needs to be.’

Beyond statistics lies another activist method used specifically against target organisations, based not just on single non-facts but on a multipronged, swarm variant which can be called the ‘Black Cloud Strategy’. Its purpose is to drive a public issue simply by raising doubt, justified or otherwise. Unlike conventional non-facts, the emphasis here is on creating a cloud of suspicion and concern rather than establishing ‘proof’. A hypothetical example of the technique might be when critics of a target pharmaceutical allege that it hasn’t been properly tested on children; that it has unknown side-effects; that adverse testing results have been suppressed; that research was cut short to speed the path to market; and that the price was artificially increased to subsidise other failed products. None of the allegations needs to be proved or true, just sufficiently credible to suggest there are ‘questions about this product’. Opponents of the product can then rely on belief in ‘no smoke without fire’ and misapplication of the ‘precautionary principle’ to achieve their purpose.

The success of the black cloud strategy was observed in late 2014, when the state of New York officially banned the controversial practice of ‘fracking’ to exploit deep deposits of shale gas after Governor Andrew Cuomo accepted the recommendation of Health Commissioner Howard Zucker. Zucker admitted there was still a lack of hard data about the effects of fracking on public health, but said a high-profile six-year campaign against fracking by hundreds of celebrities, such as Yoko Ono and Lady Gaga, ‘raises serious questions’ sufficient to warrant a ban (Goldenberg, 2014).

In a similar case, a massive wind-farm project at Bald Hills in Victoria, Australia, was blocked because of a widespread belief that wind turbines are a danger to native birds, especially the threatened orange-bellied parrot. The decision was eventually reversed when it became clear that the statistical risk to the little parrot was one death every 600–1,000 years (Minchin, 2006). And this trend is hardly new: in an analysis of American cases involving endangered species, Greg Broderick (2004) concluded that species conservation policy is increasingly determined by activists and judges rather than scientists.

In deploying strategies such as the black cloud or mutant statistics, one of the ways non-facts become facts is not just through repetition but through constant citation, often morphing from a hypothesis, an opinion or an unverified conclusion into an apparently demonstrated fact. Academics call it the ‘Woozle Effect’, based on the children’s story by A. A. Milne in which Winnie-the-Pooh, believing he is tracking an imaginary Woozle, discovers he is following his own footprints. The Woozle Effect begins when one investigator reports a finding, often with qualifications. A second investigator then cites the first study’s data, but without the qualifications. Others then cite both reports, and the formerly qualified data gains the status of an unqualified, generalisable truth – often a non-fact.

Although non-facts arise in many ways, and there are different mechanisms by which they become facts, they remain a constant threat to productive discourse, and a constant challenge for businesses attempting to manage issues when their product, service or reputation is at risk. In the heat of a high-profile issue campaign it is not possible to respond to each and every attack. However, fundamental non-facts which undermine the core issues should be challenged and not allowed to stand.

The process by which non-facts become facts has parallels in the process by which trademarks become generic. Companies usually act promptly and firmly to prevent their registered trade names from becoming generic terms. Similarly, they must also act to prevent non-facts from becoming facts, though with the caveat that the focus should remain on the business issue and never become personal. As issue management pioneer Rafael Pagan (1987) advised: ‘Deal with the issue, not the activist critics’ (p. 439).

While corporations must make decisions about their own issues, the sad reality is that objective truth is getting harder to find. It has been argued that facts are like currency. A banknote gets passed from hand to hand, with both the giver and the receiver accepting its value even though it has no intrinsic worth, because a stable monetary system relies on unquestioning acceptance of its ‘value’. Likewise, facts get passed from hand to hand without question until someone challenges whether they might not be true. In earlier times gold coins were literally worth their face value. They could be weighed for size and tested for authenticity. Then clippers began to chip away at the edges of coins, removing some of the gold and hoping the receiver would not notice. In response, mints introduced milled edges to prevent clipping and to provide both giver and receiver assurance that coins actually contained as much precious metal as stated on the face. In the same way that the exchange of currency demands trust on the part of both giver and receiver, we need to develop a way to ‘mill the edges’ of facts, so that we can be objectively certain they represent their purported ‘value’ before passing them on to others.

While some objective sources do tell the truth, many modern ‘facts’ are sadly non-facts, and some of those passing them on simply don’t care. Indeed, some knowingly and deliberately ‘devalue’ the currency of legitimate facts. Look no further than the spoof news sites which daily create imaginary stories, often mischievous or satirical, some of which get picked up by mainstream sources and unwittingly reprinted as if they were true.

More seriously, just as speculators sometimes deliberately undermine the value of a currency for their own benefit, some special interest groups and activists devalue facts and promote non-facts for the same reason. While the former is illegal, or at least immoral, the latter unfortunately seems to be accepted and even applauded.

For society to enjoy rational debate on important issues, and maybe to avoid the worst mistakes, just as we value currency and try to avoid counterfeits, we need to properly value objective facts and actively combat non-facts.

References:

Best, J. (2001). Damned Lies and Statistics: Untangling numbers from the media, politicians, and activists. Los Angeles, CA: University of California Press.

British Broadcasting Corporation. (1995). The battle for Brent Spar. [Transmitted September 2, 1995]. BBC2 Television.

Broderick, G. T. (2004). Towards Common Sense in ESA Enforcement: Federal Courts and the Limits on Administrative Authority and Discretion under the Endangered Species Act. Natural Resources Journal, 44(1), 77-124.

Chokshi, N. (2018, July 19). How a 9-Year-Old Boy’s Statistic Shaped a Debate on Straws. New York Times. Retrieved from https://www.nytimes.com/2018/07/19/business/plastic-straws-ban-fact-check-nyt.html

Goldenberg, S. (2014, December 18). New York State to ban fracking over ‘red flags’ to public health. The Guardian. Retrieved from https://www.theguardian.com/environment/2014/dec/17/new-york-state-fracking-ban-two-years-public-health

Holmes, P. (2002, September 23). Monsanto’s world-hunger solution can’t be implemented until its PR problems are solved. PR Week. Retrieved from http://www.prweek.com/article/1233536/paul-holmes-monsantos-world-hunger-solution-cannot-implemented-until-its-pr-problems-solved

Jaques, T. (2011). Managing issues in the face of risk uncertainty: Lessons 20 years after the Alar controversy. Journal of Communication Management, 15(1), 41-54.

Mailer, N. (1973). Marilyn: A biography. New York: Grosset and Dunlap.

Mayer, J. (2016, July 25). Donald Trump’s Ghostwriter Tells All. New Yorker. Retrieved from https://www.newyorker.com/magazine/2016/07/25/donald-trumps-ghostwriter-tells-all

Minchin, L. (2006, July 26). Minister ignored parrot advice. The Age. Retrieved from http://www.theage.com.au/news/national/minister-ignored-parrot-advice/2006/07/25/1153816182617.html

Pagan, R. D. (1987). Corporate strategies for effective crisis management: Corporate decision making and corporate public policy development. In S.P. Sethi, & C. M. Falbe (Eds.), Business and Society: Dimensions of Conflict and Cooperation (pp. 432-449). Lexington, MA: Lexington Books.

Patterson, P. (2005). Role of public relations in the Alar scare. In P. Patterson & L. Wilkins (Eds.), Media Ethics: Issues and Cases (pp. 108-112). Boston, MA: McGraw-Hill.

Peyser, M. (2006, February 12). The truthiness teller. Newsweek. Retrieved from http://www.newsweek.com/truthiness-teller-112951

Pratchett, T. (2008). Wyrd Sisters. London, UK: Random House.

Shute, N. (2014, March 19). Half of Americans believe in medical conspiracy theories. National Public Radio. Retrieved from http://www.npr.org/sections/health-shots/2014/03/19/291405689/half-of-americans-believe-in-medical-conspiracy-theories

Steinberg, J. (2005, December 25). In a word: Truthiness. New York Times. Retrieved from http://query.nytimes.com/gst/fullpage.html?res=9F0CE6D81530F936A15751C1A9639C8B63

Trump, D. J., & Schwartz, T. (1987). Trump: The art of the deal. New York: Ballantine Books.

Wakefield, A. J., Murch, S. H., Anthony, A., Linnell, J., Casson, D. M., Malik, M., . . . Walker-Smith, J. A. (1998). RETRACTED: Ileal-lymphoid-nodular hyperplasia, non-specific colitis, and pervasive developmental disorder in children. The Lancet, 351(9103), 637-641. doi:10.1016/S0140-6736(97)11096-0.

About the author

Dr Tony Jaques is a lecturer in the School of Media and Communication at RMIT University. He has a PhD in issue management from RMIT and has worked for more than 30 years in corporate issue and crisis management communications, mainly in the Asia-Pacific. He runs his own management training and consulting company, Issue Outcomes. A prolific writer on the topic, Tony has also written a number of books, including Issue and Crisis Management: Exploring issues, crises, risk and reputation (2014) and Crisis Proofing: How to save your company from disaster (2016).
Email: tony.jaques@rmit.edu.au
