The Mess of Information and the Order of Doubt

Jonathan Paul Marshall
University of Technology Sydney (UTS), Australia


Abstract

The failure of news media to adequately report ‘accurate’ information is frequently noted. However, information seems widely distorted in other situations as well, and it may not be possible to explain this distortion fully by factors such as inevitable reinterpretation, or political and pro-corporate propaganda. This paper argues that magnified distortion and inaccuracy can be traced to fundamental factors within the ‘information society’ itself, such as ‘information overload’ and ‘information-grouping’. In this society, doubt becomes a political tool with which our sense of status quo, meaning, group membership and personal identity is defended when under threat. Doubt is an ongoing part of information’s reception in the ‘information society’ and helps filter what I call ‘information mess’. The paper elaborates some ‘information mess principles’ (IMPs), to explain these features.

Introduction

Information society theory tends to imply that information is the equivalent of useful knowledge, and that learning, mastery and improvement are central features of life. However, structural features, or patterns (which I call ‘information mess principles’ or ‘IMPs’), operating in this ‘information society’ lead to ongoing destabilisation of information, or ‘information mess’, particularly in news media. Disinformation, misinformation and ‘paranoia’ are as vital to information society dynamics as any possible ‘accuracy’. Consequently, doubt and scepticism act as stabilising factors allowing the maintenance of group loyalties and identity.

Information society, knowledge society

Contemporary society can be classified in many different ways. Common academic and official classifications include the ‘information society’ and its variants: ‘knowledge society’, ‘information economy’, ‘information revolution’ and so on. There is the ‘World Summit on the Information Society’, OECD reports on the information society, and politicians pronounce the importance of ‘this knowledge-driven economy’, claiming that “Knowledge and the ability to innovate … are the raw materials of this revolution” (Blair, 1999). Before the mining boom, there was a dream of Australia becoming a ‘clever country’ (Lowe, 1998). In business, discussion of ‘knowledge management’, ‘information management’, ‘information science’, ‘knowledge ecologies’, ‘knowledge brokers’ and ‘working smarter not harder’ is common. Information theory is even used to explain the cosmos (Gleick, 2011). Common ideas like the ‘digital divide’ presume the benefits of access to technology and knowledge, with concepts like ‘information poverty’ (Britz, 2004) or ‘information inequity’ (Lievrouw & Farb, 2003) implying that access to information is beneficial.

Economists Linde and Stock write that ‘knowledge society’ and ‘information society’ are “viewed as more or less quasi-synonymous” (2011: 3). Likewise Britz mentions the “globalized information-driven economy, also referred to as the knowledge economy” (Britz 2004: 192). There has been discussion of the knowledge industry since Fritz Machlup (1962) stressed the importance of R&D, education, and information services in the new economy, arguing that “all information in the ordinary sense of the word is knowledge, though not all knowledge may be called information” (1962: 15). This position was crucial to many formulations about the new society:

What has now become decisive for society is the new centrality of theoretical knowledge, the primacy of theory over empiricism, and the codification of knowledge into abstract systems of symbols.

(Bell, 1971: 4-5)

Similarly, “The end product of all information service markets is knowledge. An information market enables the consumer to know something that was not known beforehand” (Porat, 1977: 22). Castells’ influential trilogy refers to the ‘network society’, but carries the overall title of The Information Age. He writes:

What is specific to the informational mode of development is the action of knowledge on knowledge … A virtuous circle of interaction between the knowledge sources of technology and the application of technology to improve knowledge generation … [I]nformationalism is oriented towards technological development, that is to the accumulation of knowledge and towards higher levels of complexity in information processing …

(Castells, 2000: 17)

Elsewhere Castells argues that industrialism failed because it “could not manage the transition to knowledge-based productivity growth by using the potential unleashed by information and communication technologies” (2004: 15). On the other hand, the basic work of informationalism is for workers to “find the relevant information, recombine it into knowledge, using the available knowledge stock, and apply it in the form of tasks oriented towards the goals of the process” (ibid: 26).

Linde and Stock write:

The knowledge society is concerned with all kinds of knowledge, but scientific and technical knowledge reaches a particular significance … as production is heavily driven by scientific-technological results

(2011: 82)

Since Peter Drucker’s 1959 book Landmarks of Tomorrow, ‘knowledge workers’ have been hailed as the rising and possibly pre-eminent class, due to their technical or professional proficiency. They produce knowledge, technology or ‘culture’, and engage in continuous education.

Information in action

Despite this supposed importance of knowledge, the failure of journalism to adequately present ‘accurate’ information is frequently noted, and often reduced to political or corporate bias (Herman & Chomsky, 1998; Boehlert, 2006) or political spin (Castells, 2009: 196ff., 204ff., 224ff.). However, available information seems widely ‘messy’ in other situations as well. We cannot fully explain this distortion by deliberate intent. Distortion and inaccuracy can be traced to fundamental factors within this supposed ‘information society’ itself: its structures, its means of communication, and the ‘information mess principles’, some of which I detail later. In this society, information is often coupled with doubt, and this becomes a tool by which information is filtered, and our status quo and understanding are defended when under threat from information overload. Doubt helps us retain loyalties to what I call our ‘information-groups’, or those imagined communities gathered around information sources, values, loyalties and conflicts. Doubt allows us to engage in acts for which there is little positive information, but which are propelled by other social needs, and helps us make sense of the world. People doubt climate change, ex-President George W. Bush’s intelligence and honesty, that Iraq under Saddam Hussein was, or was not, an immediate threat, and so on.

In English-speaking information societies, doubt connects to a general distrust of the accuracy of information (or even to doubt about whether information can ever be relatively accurate). According to the Gallup Organisation,

Americans’ distrust in the media hit a new high this year, with 60% saying they have little or no trust in the mass media to report the news fully, accurately, and fairly

(Morales, 2012)

This is a rise from 53% in 1998. Those who had some trust in the media (a great deal to a fair amount) declined from 53% to 40% in the same period. In the 1970s, up to 72% of people had such trust (Morales, 2012).

The majority of Americans (60%) also continue to perceive bias, with 47% saying the media are too liberal and 13% saying they are too conservative

(Morales, 2011)

This perhaps needs explanation, as it might be expected that media run by the corporate sector would easily be seen as pro-corporate and hence ‘conservative’.

Socio-caricature

The problem in portraying and analysing disorder and mess is to make it perceptible and explicable without over-regularising it. While constructing an ‘ideal type’, I do so recognising that ideal types generally delete disorderly elements so as to give those types the regularity, reality and explanatory force they possess. Therefore, rather than pretending this sketch expresses absolute reality, although still claiming there is something to be discussed, I present a socio-caricature generated by ‘mess principles’ and use a number of vignettes as examples.

In socio-caricature, a distortion may explain and summarise, but because the distortion is marked the model is unlikely to be taken as a ‘real thing’, unlike some other concepts (such as say, ‘postmodernity’, ‘social structure’ or ‘class’). It is a deliberately partial ‘truth’, pointing at reality, in a particular moment. It is precisely a rude sketch rendering the informational mess created by the information society in stark outline. Charitable readers might think of this paper as resembling a Hogarth or Rowlandson cartoon in which the bodies of those drunk with over-information give birth to imps which torture those bodies and drive the process of intoxication and distortion along.

Social Knowledge Surveyed

As an example of the nature of information in the USA, a Harris Poll taken in March 2010 stated that:

Republicans believe that President Obama:

  • — Is a socialist (67%)
  • — Wants to take away Americans’ right to own guns (61%)
  • — Is a Muslim (57%)
  • — Wants to turn over the sovereignty of the United States to a one world government (51%); and
  • — Has done many things that are unconstitutional (55%)

Large numbers of Republicans also believe that President Obama:

  • — Resents America’s heritage (47%)
  • — Does what Wall Street and the bankers tell him to do (40%)
  • — Was not born in the United States and so is not eligible to be president (45%)
  • — Is a racist (42%)
  • — Wants to use an economic collapse or terrorist attack as an excuse to take dictatorial powers (41%)
  • — Is doing many of the things that Hitler did (38%).

Even more remarkable perhaps, fully 24% of Republicans believe that “he may be the Anti-Christ” and 22% believe “he wants the terrorists to win”

(Harris Interactive, 2010)

Earlier research by Hofstetter et al. (1999) and Kuklinski et al. (2000) suggested that, rather than being badly informed or recognising their own ignorance, the US citizenry was strongly misinformed. For example, most people polled thought the US spent more on foreign aid than defence. We may explain this result by suggesting it arises from ‘unprincipled propaganda’ emitted by right wing think tanks and news-sources such as Fox News. However, the research referred to above implies that such misinformation was not present in the media at the time of research, whatever might be the case now. We also have to explain why this propaganda works so effectively, especially given the argument that an abundance of information should allow people to check ‘facts’ and decide in favour of information that is more probable. If information as knowledge is important, then improbable distortions should not have such widespread authority. Republicans promulgating bad information should be discredited by its use, rather than being relatively triumphant.

This does not deny the existence of propaganda. Readers are undoubtedly familiar with the process of casting doubt on climate change, and any other science that challenges corporate profit (Hoggan & Littlemore 2009; Oreskes & Conway 2010). People of leftish persuasion can blame doubt on a conspiracy of energy companies, sponsored scientists, and right wing media. Those calling themselves ‘sceptics’ can blame a conspiracy of normal scientists and left wing media for people’s fear of climate change. However, again we may wonder why these supposed conspiracies are so successful when the science is relatively easily available to people, and why the discrediting of ‘sceptic’ positions (or scientific positions if you so choose) seems to have little effect on the arguments of, or the credibility of, those making them. We might wonder why people claiming to have researched the science for themselves generally rely on particular groups for their access to that science. Doubt towards ‘the other’ seems the first response, followed by arguments which depend on a person’s group allegiances for their effectiveness. To accept the improbabilities revealed by the Harris Poll, people have to doubt everything about Obama’s ‘front’ and what he says, and what his supporters claim. They have to strongly not identify with him and his associated groups.

The Role of Doubt

Even if interested, concerted and deliberate disinformation explained all of the above, distortion is not limited to politicians and corporations acting against information which might be risky for themselves. In an email exchange, journalist John Elder drew my attention to two suggestive cases brought forward on the ABC TV programme Media Watch. The first case (ABC Media Watch, 2010a) demonstrated that the magazine Woman’s Day not only fabricated coverage of local TV star Kate Ritchie’s wedding, but also used Photoshop to engineer pictures of the wedding. According to other reports: “The couple shunned offers of up to $500,000 for a television and magazine deal to keep their nuptials at the historic Quamby Estate private” (The Mercury, 2010). This implies that unless a story is exclusive, or fed to the media, ‘newsworthy’ people are fair game for distortion but, again, what does this say about the audience and their reception?

When a reader’s letter congratulating Woman’s Day on the story won $50 as that magazine’s letter of the week, Media Watch contacted the reader asking whether “she minded that her favourite mag had deceived her”. She replied “It wouldn’t bother me that much actually because it’s media so, yeah, I’m not that stupid as to realise everything’s true in the magazine” (ABC Media Watch, 2010b).

Not only does this statement imply a general sense of doubt, but as Elder pointed out, it suggests a degree of collusion, or bonding, between reader and media in accepting a pleasing fantasy. At least some of both audience and media prefer a good story to a relatively accurate story. According to this Woman’s Day reader, doubt is part of her expected reading. Doubt does not lead to a re-evaluation of this media outlet, as all media is supposedly inaccurate. The reader does not protest, and apparently still identifies as a Woman’s Day reader. It is unlikely that many readers of the magazine became aware of the inaccuracy, but that failure is not incidental to the information environment. It is part of the way that environment operates.

The second story mentioned by Elder (ABC Media Watch, 2010c), showed that Channel 7 television’s current affairs show, Today Tonight, faked a story about how their reporter was able to smuggle a bomb into the Commonwealth Games arena in Delhi, avoiding police security. Media all over the world took up this story. Members of the media, like the Woman’s Day reader, seemed to want a good story more than they worried about its accuracy. Doubt about the story was suspended, while its acceptability was enhanced by real doubts about security in India at that time; these doubts could have led to Today Tonight ‘making’ the story in the first place. Many athletes reportedly decided it was too dangerous to attend the Games (Chamberlain 2010). Channel 7 increased the sense of danger, giving further validity to the news, which further justified the story. We might assume that the audience reacted like the Woman’s Day reader and, rather than being alienated, largely remained loyal through deploying generalised doubt.

Other, less commercial, instances of information mess and the responses to it further help our understanding. An article in the Sydney Morning Herald (Laube, 2011) reported that after the 2011 tsunami in Japan, people posted fake images of the waves and devastation on the internet. More interesting than the fakery is the generally dismissive online response to this article. For example, one person wrote:

These sorts of stories are not news anymore. Fake photos on the internet, facebook party invites go crazy, imposters saying outrageous things on the internet, etc. etc. …

It’s not news. Those that don’t understand the environment of the internet nowadays are the minority

Another person:

You can’t believe everything you read/see. That’s not new.

There are some sick puppies out there. That’s not new.

The only thing that is new is that half the planet can now broadcast anything to everyone.

The funny thing is some of the reality is way stranger than fiction. Why make things up?

Another sees it as vicarious participation in events:

I think people do it because they can, because [they] are thinking about and imagining what the disaster was like, and they upload these images to see if they can be read as real.

People borrow other people’s stories all the time

Quite a number of people criticised the paper for publishing examples of the faked images and encouraging fakers, even though no such people were named or promoted.

By showing these photos here, the SMH is just as guilty of feeding the monster and giving these digital nitwits extra kudos. Stop feeding the monster.

Why is smh.com putting the fake photo as the main photo on their homepage. Why even give these ridiculous things the time of day.

Some readers complained that the images presented were not faked pictures of the tsunami, but fakes of other events. The paper was also criticised for using metaphors instead of literal truth and for being a hypocritical news organisation.

Don’t forget to point the finger at news agencies such as Reuters who have been outed for their own photographers doctoring images.

A minor dispute arose about the values of the ‘internet generation’, with most commentators being in favour of that ‘generation’ and identifying with it. In general, commentators seemed: a) fairly relaxed about the issue of fakery; b) annoyed with the message bearer; or c) expressive of general doubt about all news/information. Being able to doubt such images and be relaxed about it marked people’s identity as savvy members of the internet generation. Permanent, but perhaps retrospective, doubt was considered a positive attribute.

The Mess of Information

Elsewhere, I have presented a theory of communication (particularly online communication – see Marshall, 2007), arguing that communication is affected by the structures and organisation of communication, and by the social patterns of interpretation. Here I briefly assert that information involves communication, and communication is not about transmitting a message down a conduit from one mind to another passively receptive mind – rather, it involves active interpretation in a generally unstable context. One important factor providing context is the placing of the ‘emitter’ in a socially recognised group by the interpreters. This placement then tells interpreters the likely function and content of the message, although neither this placement nor the attributed meaning of the placement may be consistent across a population. As experience, feeling, context and meaning are not completely shared, communication tends towards divergence in meaning, within limits. Information is largely evaluated in terms of what it does in a wide sense for a socially-embodied interpreter, so there is no necessary correspondence between information and empirical conditions (whatever they might be) for different people and groups – again within limits. Being hit by a speeding train will generally damage naked people, whatever their theories. Similarly, while almost every literary critic may present a different reading of Hamlet, few persuasively argue it is about the mating habits of elephants. However, while it is hard to demarcate these limits or present rules for determining accuracy, information distortion, misunderstanding, suspicion and information failure appear to be marked features of information society.

Ambiguities and uncertainties are present from the beginning. ‘Information’, itself, is a term covering many different things/events with different properties and applications (Levy, 2008; Floridi, 2010). ‘Information’ is not a coherent category and generates disordering effects if everything classified as information is treated as being ‘information’ in the same way. Information ‘is’ information in many different ways.

The vagaries of the term ‘information’ imply that ‘information society’ is unlikely to be a coherent category either. Information, in information capitalism, is also (dis)ordered by ‘information mess principles’ (IMPs) which arise from that organisation of communication, destabilise the relation between information and relative accuracy, and generate doubt as a social fact. Some IMPs overlap considerably, and mutually reinforce each other. The order of presentation does not imply order of importance.

IMP 1: Ease of information production generates information overload.

‘Information overload’ or ‘data-smog’ (Shenk, 1997) is an everyday feature of life in this society (Levy, 2008). Data smog occurs because the easier it is to create information, the more information will be created and distributed and the more overload or smog will be in effect. Information overload derives from relative ease of access to contemporary communication technology and its powers of distribution and storage.

Worldwidewebsize.com (February 7, 2013) estimates the size of the World Wide Web as “at least 13.32 billion pages”. Eric Schmidt (2005) of Google cited a study indicating there were “roughly five million terabytes” (i.e. roughly 5×10¹⁸ bytes) of information in the world, and he estimated that only 170 terabytes, or 0.0034%, was searchable. Hilbert and Lopez (2011) argued that, in 2007, humankind was able to store 2.9×10²⁰ “optimally compressed bytes”. The US Library of Congress Web Archiving FAQ (2013) states:

As of January 2013, the Library has collected about 385 terabytes of web archive data (one terabyte = 1,024 gigabytes). The web archives grow at a rate of about 5 terabytes per month.

The more information that can be collected and stored, the more analysis of that information can occur, the more detail can be uncovered, the more commentary on commentary can be made. Hence information keeps increasing. Hilbert and Lopez (2011) estimate the increase to be currently about 23% per year. Mark Ware estimates that academic journals have increased in number by about 3% per year over the last 200 years, nowadays with journals typically publishing more articles, and a potential publishing explosion in Asia (2009: 5, 21). He says that in

UK universities, 102 million full text articles were downloaded in 2006/07, an average of 47 for every registered library user, with an annual rate of growth of about 30%

(Ware, 2009: 24)

The pressure on academics to publish to keep their positions and to progress in their careers has been seen as generating distortion (De Caterina et al., 2011). Researchers attach their names to papers they have not researched, or issue redundant and repetitive papers and so on. There is also less time to check, referee, or replicate research (Matías-Guiu & García-Ramos, 2010; Ware, 2009: 22, 25ff.). The drive to produce information lessens accuracy.

This quantity of information produces a Borgesian situation in which it is possible to eventually find information confirming almost any kind of position. Equally, information can easily be overshadowed by other information depending on its access route. Quantity of information does not equal quality of information, and indeed opens people to confusion, or to constant delay. There is always more information to be found, analysed and issued. People need to filter.
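To give a rough sense of what the growth rates cited above imply, the sketch below is a hypothetical illustration (written in Python, not drawn from the cited studies) which treats the reported rates as constant compound growth and converts them into approximate doubling times, alongside the searchable fraction implied by Schmidt’s figures.

```python
import math

def doubling_time(annual_rate):
    """Years for a quantity growing at a constant annual rate to double."""
    return math.log(2) / math.log(1 + annual_rate)

# Rates as reported above, treated here as constant compound growth
# purely for illustration (Hilbert & Lopez, 2011; Ware, 2009).
cited_rates = {
    "stored information (~23% per year)": 0.23,
    "journal titles (~3% per year)": 0.03,
    "UK full-text downloads (~30% per year)": 0.30,
}

for label, rate in cited_rates.items():
    print(f"{label}: doubles roughly every {doubling_time(rate):.1f} years")

# Schmidt's (2005) figures: ~5 million terabytes in existence, ~170 TB searchable.
print(f"searchable fraction: {170 / 5_000_000:.4%}")  # about 0.0034%
```

On these assumptions, stored information doubles roughly every three and a half years, and downloads roughly every two and a half, which underlines why filtering, rather than mastery, becomes the default response to overload.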

IMP 2: Information depends on other information.

People judge and interpret information by what they already know, or by the purposes they have for that information. Other ways of phrasing this statement are: a) knowledge is structured by existing knowledge and its social purpose; or b) knowledge guides observation and evaluation of information. The strong version of this position suggests that, as knowledge directs both the perception and acceptance of other knowledge, the more ardently knowledge is held, the more it produces ignorance.

‘Social usage’ may include whether the information tells a good story, has the required emotional effect, generates group togetherness, or generates the required activity such as magazine/information purchase. Hence the response of the reader who liked the wedding story in Woman’s Day. It was a good wedding story about people she cared about (because of the media), gave the desired feelings, and thus served its purpose for her. She appraised the quality of the information by the expected forms and usages of socially important narratives. The ‘real story’, with media peeking over walls or spying from helicopters (and failing to perceive much), may not have been as appealing.

The greater the level of information overload, the less anyone can command the data and the more likely they are to judge or filter it by what they already know or require. Simplified information is always inaccurate, yet fully detailed information is often unable to be processed; completeness cannot be reached. It becomes easier to generate bad information than to explore complex realities. For example, security in Delhi could have been lax, but it was easier and quicker to manufacture the point than to do the work of investigating; and probably much safer than actually smuggling a bomb. Today Tonight could easily doubt that security was as good as the Indians claimed, as information can always be doubted and doubt makes a story. Doubt framed or reinforced an existing ‘knowledge’, or suspicion, that security was not good. Doubt allowed action.

Afterwards ‘the story’ creates the reality/knowledge, which then justifies the story. The perception that security was slack is reinforced, and there is little the Indian authorities can do about that. If a security scare had occurred, then Today Tonight would have felt vindicated, and used that scare to justify the story which would have gained further credibility and effect – we warned you and were ignored. If nothing happened, then this could be explained by alleging the Indians upgraded their security because of the story. Other media took up the story because its dramatic appeal attracted attention, and perhaps because it made terrorism real again. It was another expected knowledge-narrative with a ready audience and transmitters. If it proved untrue then those other outlets have probably ‘always’ doubted the story and could issue a quick unnoticeable correction if forced.

IMP 3: Communication and information involve power relations.

Information and communication are nearly always rhetorical and political, aiming to persuade or produce an effect; such as getting someone to do something, or perceive the world in a certain way. Information is, therefore, always embedded in relations of power and persuasion (Peckham, 1979), and hence in group membership and hierarchies.

Information, in this sense, becomes a strategic tool. The further apart the ‘sides’, the more this is so. If information allows ‘our’ side to further their position or causes the others to blink or lose support, then it has served its purpose, accurate or not. Think of the reiterated, and false, claims that Australia is “going it alone on climate change” (Crook, 2012). The Today Tonight story was intended to have an effect: to cause alarm, make the Games look badly run as part of an ongoing India-Australia media slanging match, get the show and the reporters mentioned, and quite possibly make security better – not all political intentions are ‘bad’.

IMP 4: Information becomes commoditised.

Information is used to make money. This is basic to the economics of information, although obviously not the whole of it (Linde & Stock, 2011). Commoditising information means that information is valued by ‘appeal’ or profit, rather than by accuracy. If the story generated more comment, audience and revenue than it generated costs, then the media outlet has profited. If people enjoy the wedding story, buy the magazine and it pleases potential advertisers, this is more important to the media corporation than accuracy. Information is validated by economic criteria.

‘Appeal’, however, may not be purely financial. If Channel 7 has people using its story all over the world, this promotes them. Worldwide, people now know that Today Tonight can ‘break a story’ which attracts audiences, and the journalists may gain career-furthering recognition.

Also vital to an information economy is that information has to be ‘fenced off’ to become private and sellable. So ‘good information’ becomes restricted. The more value it appears to have, the more it is restricted (Marshall & da Rimini, forthcoming). When profit connects to restriction of information, then information overload, and inevitable deficiencies in information checking, can result in the promotion and sale of goods that have not been independently tested, with their unpleasant side-effects unrecognised. Companies primarily generate, and supply to the media, information that supports profit. This is allegedly particularly prevalent in the pharmaceutical industry (Healy 2012: 99ff.), but could be expected to occur elsewhere.

IMP 5: Workers in the information industry are insecure.

In information society, ‘news’ is manufactured by workers in an industry facing a decline in profit and workforce (Davis, 2008; Lanchester, 2010; newspaperdeathwatch.com). Cost cutting in news organisations to increase or preserve profit produces progressively less time for journalists to do original research or check facts. Journalists become more likely to rewrite press releases from public relations companies or from organisations hiding their role as fronts for public relations, so information produced for its political effect is widely circulated as ‘real’ (Davis, 2008). According to Jackson (2013) “[t]he long-term trend showed that growth in the PR industry was far outpacing journalism”.

Journalists also become increasingly likely to go with someone’s story if it strikes them as pleasing (having the right effect), plausible (meeting their preconceptions) and relatively risk free, to generate enough news to keep their job (Davis, 2008: 42ff.). These conditions may also provoke reporters to falsify, exaggerate, or anticipate what they think is the real truth, beyond the evidence, to make compelling publishable stories. This could promote journalistic cynicism, in which journalists doubt their work’s value, but doubt that anyone does differently. Here doubt acts as a salve to conscience: “I don’t really believe in what I do, I’m above it. It’s what you do to live. It’s the best I can do”. This is a soothing rebellion that leaves them within the system, where relatively few companies operate and few other jobs exist. Furthermore, with the decline in the number of companies reporting news, most news information comes from increasingly fewer sources, and hence errors are less likely to be discovered before being widely reported.

IMP 6: Information is tied to group identification.

Information marks loyalty to a group, a source and an identity. Information is more persuasive the more it appears to come from an exemplar of groups we identify with (Haslam et al., 1996; Hopkins & Reicher, 1997). The information a person accepts marks their wider social groups, ‘imagined communities’ and shared values, making what I call an ‘information-group’. As information marks loyalty, this can imply the demeaning of other information-groups. Hofstetter et al (1999: 355) remark that ‘hosts’, or information sources “rarely make bold counterfactual assertions, but more usually fill programming with the invectives of sarcasm, innuendo, and diatribe, repeatedly directed against targets”, foreshadowing the abuse should someone leave the group, while aiming to increase distrust of information-outgroups. David Frum, speechwriter for George W. Bush and coiner of the phrase ‘Axis of Evil’, was dismissed from his work in conservative think tanks after criticising the Republican Party following Obama’s victory. He suggests that cultivation of hostility and information-groupings also has a business function:

The business model of the conservative media is built on two elements: provoking the audience into a fever of indignation (to keep them watching) and fomenting mistrust of all other information sources (so that they never change the channel) …

(Frum, 2011)

As this implies, the economics of gaining an information audience leads to distortion (see Kitty, 2005 on Fox News for examples).

In a situation which can be characterised as ‘data smog’, only rarely can the full range of information available on a topic (whether this range can be quantified or not) be perceived by the audiences who potentially exist for that information. This intensifies information-group filtering as information’s political/membership function leads to the information being splintered by audience. If some audiences refute the information, those corrections will have little effect on other audiences as they are unlikely to perceive, or be able to find, the refutations, or process them without risking their group allegiance and identity. We may instance how regularly people claim that the world is cooling despite data to the contrary, and the ways in which receptivity to climate change data is marked by long-term political allegiance.

Loyalty to a purpose and to a valued source gives information authority and renders it less likely to be challenged within the group. Group identification leads to the valuing of both information and doubt. If a story issued from our group should appear to be wrong, then people can excuse themselves by claiming they didn’t believe that particular story, and the media in general, anyway. Or they can doubt the refutation by claiming it is biased coming from an ‘outgroup’. In either case, they can still stay with their particular authoritative media/source and their shared identity. Other media, marking allegiance to outsider information-groups, can be doubted even more strongly. People may come to disbelieve anything they don’t want to believe, being savvy members of the internet generation. They almost certainly don’t check the accuracy of stories they ‘instinctively’ believe, or which are in harmony with group preconceptions. Believing the refutation risks rejection from their information-group, as happened with David Frum (described above). Doubt distances them (‘I am not being manipulated, I know this is not true’), helps keep them in the same information-grouping (despite its occasional perceived inaccuracies), and helps maintain consistent values and expectations of the world in the face of information overload.

A paradox arising from information-group allegiance is that a real media elite, with the capacity to sway millions, can, by virtue of that popularity, pretend that other outlets, with small audiences and almost no sway, are elitist plots when those outgroup media disagree with them. The audience identifying with the source, or with an information-group using that source, easily doubts the motivations of those who criticise it or them. As the audience is not elite, and identifies with its source, someone else must be elite. This ambivalent disengagement and engagement maintains the status quo. The more stringent the information-group boundaries, the more hostile the members may become to other sources of information.

When information becomes political and attached to information-groups, then no referee appears neutral. In the 2012 American presidential elections, people set themselves up as fact checkers, to keep the debate honest, but found they were perceived, and treated, as part of the political contest (Corn, 2012). If they disagreed with an information-group’s sources and beliefs, then their information was doubted as they became defined as an information-outgroup. Sunstein (2009) points to a further reinforcing factor: when people of similar opinions talk to each other in a group, that group’s opinions become more similar and more opposed to those of their outgroups or opponents. As these groups are selected by views, information-group members tend to become less able to observe counter-evidence and process it as a group. They more strongly doubt their ‘others’.

IMP 7: Bad information confirming bias is distributed easily.

As motivation helps the spread of information, we can hypothesise that information which engages with people’s deepest feelings or survival impulses, whether of fear or anger, or which confirms their driving prejudices and stereotypes, and serves an immediate political purpose, will travel faster than information which encourages calm, reflection, or disconnection from prejudices or victory. Hence the spread of distorted information about Obama or climate change when it is tied to particular groups and politics, even without overt media promotion.

Again, it seems likely that more people spread the Today Tonight story or the Woman’s Day version of the wedding than the refutations. On internet groups, I have observed the way Fox or Rush Limbaugh stories travelled amongst well-educated Republican sympathisers, who did not check them against other sources, especially if they considered those other sources to be hostile. It often seemed that members of an information-group would suddenly nearly all have a similar opinion on some particular event, whether this opinion was normally part of their worldview or not. These stories sounded as if they should be true, expressed loyalty and hostility, and reinforced existing values, while usually condemning an information, or other, outgroup. If these stories are not true, well, don’t we doubt that anyone tells (or attains) ‘the truth’, anyway? The audience can always choose what they want to believe, or not to believe, depending on group loyalties and perceptions of refutational hostility towards ‘their’ group. Similarly, if fellow members of an information-group repeat a story which actually originates from the same source, it is verified in that act of repetition no matter how unreliable the source. It becomes a general rumour or general knowledge – the singleness of the originating source may be ignored. We don’t have the time, ability, or knowledge, to check all the information we encounter in the information overload, hence the source of the information (when known) and the mass of people close to us who repeat that information generate validity or likelihood.

While people who recognise their ignorance or are unsure about their knowledge have little to say, misinformed people can be active proselytisers. The more paranoid, emotive and engaged the information, the more ‘evidence’ the claimants probably have and the more urgent it appears to pass it on. If bad information drives out hesitation, then ill-informed people tend to spread that bad information.

If it is the case that the information spread and acted upon in information society tends to be information which meets information-group bias, carries high motivation, and maintains anger or other strong emotions, then democratic politics could become increasingly influenced by information tied to anger and hatred. Movement towards ‘rational’, ‘calm’ consideration of data and explanations in an ideal ‘Habermasian’ public sphere becomes less likely. Information society may tend to produce a politics of ‘information paranoia’, suspicion, and projected conspiracy. We doubt our opponents’ intentions and information; they are manipulative by definition; we are suspicious of them.

IMP 8: Bad information hangs around.

On the internet, bad information stays available even after decisive refutation, and can be rediscovered and spread again. Even if no one is interested in promoting it, bad, misleading or untruthful information remains available to give authority to positions whose holders cannot find, trust, or understand the refutation in the data smog, or do not want to find it. An example is when climate sceptics continue to reuse refuted data. Information is easily separated from its refutation. We can hypothesise that Republican voters’ beliefs and doubts about Obama arise from:

  1. their usefulness in explaining ongoing disquiet with events in the USA;
  2. the speed with which stories circulate and people reinforce each other within the information-group;
  3. the recurring encounter with ‘facts’ refuted elsewhere, without encountering the refutations in an acceptable context; and
  4. cultivated distrust of information-outgroups.

Information-groups act to limit or filter the information received. In the information society, the very plenitude of information produces a kind of Gresham’s law of information: ‘Bad information drives out good’.

IMP 9: Information affects status.

The value of information depends upon its source and the source depends on the value of its information. Status (and possibly income) can depend upon being perceived as authoritative. Consequently, inaccuracy tends to be denied, or repeated to make it truth, as when it is reiterated that Reagan was a good president (despite almost bankrupting the State, risking atomic war, funding and equipping the Taliban, arming Iran, introducing policies which cost many Americans their life savings and lowering people’s rights at work). In an information society, it is hard for a person to be recognised as both legitimately high status and generally wrong. Publicising a mistake compromises the value of the source. Saying Reagan was not that great would compromise the information-groups built around his greatness. Therefore, sources tend not to admit mistakes, or to obscure any admission they do make. Users of a repeatedly mistaken source could hunt for new sources but, rather than renounce their information-groups and values, they doubt the refutation, or the accuracy of all sources.

In information society, those with more status are supposed to have mastery of more information (however defined) than those receiving information from them (managers for example). Media outlets, as information originators, may encourage a patronising attitude towards their readers, as they are informing those readers. The more that information originates within the media, the higher the status the media attribute to themselves. High status journalists tend to be those who supposedly lead opinions: merely reporting as accurately as possible is neither ‘creative’ nor ‘value adding’. As having hidden information both marks status and is a resource (which can possibly be charged for), providers may worry even less whether the information sold approximates reality. The more they can apparently manipulate their audience, the less regard they will have for accuracy; they can simply doubt their audience’s capacity to understand the ‘complex’ truth.

If ‘information receivers’ suspect this attitude exists, then they may resent it and further dismiss/doubt media reports, especially those they disagree with. Information increasingly becomes a matter of momentary usefulness (or not), in terms of being entertaining, meaningful or confirming existing prejudices, doubts, or actions. In particular, this status contest arises should an outgroup media tell them that they are misled by their favourite sources. The ABC’s Media Watch can sometimes be said to look down on consumers as it uncovers what it perceives as manipulation and implies those being manipulated are not as high in information status as its own ‘host’. Given the status system, people usually reject the contention that they are being manipulated, and thus doubt the claims of those showing they are being manipulated. Everyone then claims to be aware of ongoing deception, or the general failure of ‘truth’; because being deceived renders a person low status and incapable. We defend our own knowledgeability, while doubting that we are truly misled.

IMP 10: Accurate information depends on admitting ignorance.

To obtain adequate information, people must know when they have inadequate information. This is a variant of the Dunning-Kruger argument that people who are incompetent in a domain lack the skills to evaluate competence in that domain, and hence are convinced they are competent, while being unable to recognise real competence in others. Conversely, competent individuals may not recognise how rare their competence is (Kruger & Dunning, 1999; Ehrlinger et al., 2008). The Dunning-Kruger papers almost entirely ignore social factors, such as information being validated by valued others. If valued people don’t agree that your shared information is inadequate, or that recognising incorrectness serves a useful function, then pointing out that inadequacy risks disrupting the information-group. Given that the information-group is bounded by loyalties to sources, plausibilities and values and exists to filter information, it becomes less probable that members will acknowledge the gaps and more likely that they will reinforce ignorance, and doubt everything that does not confirm those plausibilities and values. The combination of correctness with status means that leaders are even less likely to seek correcting knowledge. This is especially likely if it might appear to admit an information-outgroup is more knowledgeable. People defend themselves from recognising their own inability to judge informational accuracy by claiming to doubt it all. Doubt saves members the pain of teaching themselves what others might believe, and risking loss of identity. Processes of doubt and inaccuracy reinforce each other.

Conclusion

In this socio-caricature, I have argued that information in information society is subject to numerous social and technological forces that produce ‘information mess’, doubt and the embrace of misinformation. The factors leading to information mess arise in the social processes of the information society, and are not ‘failures’ of that social process. The failures come with the successes and are not separable from them.

The internet as the technological basis of the society allows the easy production, storage and distribution of information, producing information overload. Information supporting almost any position can be found, and hangs around even if refuted. The drive to produce quantities of information, especially for sale or success, also tends to lower accuracy or adequacy of that information.

People manoeuvre in the information mess via the information-group to which they belong or aspire. Information-groups are imagined communities bound by previous knowledges, loyalties to information sources, and conflicts with other information-groups. Information-groups act as filters for information allowing people to make sense of the world, and maintain their social position and beliefs, despite counter information. They provide a level of consensus as to what is actually happening, identify enemies and scapegoats and help people negotiate their lives, amidst complexity. They give the knowledge which allows interpretation of other knowledge. Members may also help and support each other more directly. Losing identification with an information-group could leave a person friendless and without conceptual anchor.

Information and news sources are often associated with corporations and exist to make a profit, which they do by producing information that has appeal, and that matches and motivates the social biases of the information-groups they aim to mobilise or to whom they sell information. Workers in the media are generally under pressure to produce pleasing items rather than check facts. Pressures on profit reinforce the need to tailor news to those existing biases. Information which seems truly valuable may be restricted in order to maintain its value. Sources likewise cast doubt on the information peddled by outgroup media, and disparage the outgroup itself, so as to bind their own information-group more firmly.

Information is more easily spread the more it motivates people to spread it within their information-group, and this will be information that matches that group’s biases and activates strong emotions. Counter-information is harder to discover without risking one’s place in an information-group, and only a few people in the same group will legitimise any sense of the group’s ignorance, especially as status is gained by being recognised as knowledgeable within the information-group’s parameters. In the information mess created by overload, and by endless claims and counter-claims, doubt becomes a general response that allows people to function in this society. It provides an order of scepticism that helps people to dismiss challenging and dislocating information, and maintain their loyalty to sources and information-groups which may appear to be inaccurate. As such, it also acts as an information filter and is essential to the processes of information mess.

References

Australian Broadcasting Corporation (2010a, 4 October). Woman’s Day’s Fairytale Exclusive. Media Watch, Episode 35. Retrieved from http://www.abc.net.au/mediawatch/transcripts/s3029143.htm

Australian Broadcasting Corporation (2010b). Bride’s Tale Revisited. Media Watch Episode 36. Retrieved from http://www.abc.net.au/mediawatch/transcripts/s3035307.htm

Australian Broadcasting Corporation (2010c, 27 September). Defusing an Explosive Story. Media Watch. Episode 34. Retrieved from http://www.abc.net.au/mediawatch/transcripts/s3023099.htm

Blair, T. (1999, 25 October). Why the Internet years are vital. The Guardian. Retrieved from http://www.guardian.co.uk/uk/1999/oct/25/5

Bell, D. (1971). Technocracy and politics. Survey: A Journal of East and West Studies 17(1): 1–24.

Boehlert, E. (2006). Lapdogs: How the Press Rolled Over for Bush. NY: Free Press.

Britz, J.J. (2004). To know or not to know: a moral reflection on information poverty. Journal of Information Science, 30 (3): 192–204

Castells, M. (2000). The Rise of the Network Society. Oxford: Blackwell.

Castells, M. (2004). The Theory of the Network Society. In Manuel Castells ed. The Network Society: A Cross cultural perspective (pp. 3-45). Cheltenham: Edward Elgar.

Castells, M. (2009). Communication Power. Oxford: OUP.

Chamberlain, G. (2010). Commonwealth Games 2010: Athletes warned of rising terrorism threat. Guardian/Observer. Retrieved from http://www.guardian.co.uk/sport/2010/sep/25/commonwealth-games-athletes-terrorism-threat

Corn, D. (2012). How to Beat the Fact-Checkers. Mother Jones, September/October. Retrieved from http://www.motherjones.com/politics/2012/09/factcheck-politifact-lying-politicians

Crook, A. (2012, August 22). Get Fact: is Australia ‘going it alone’ on pricing carbon? Crikey.com Retrieved from http://www.crikey.com.au/2012/08/22/get-fact-is-australia-going-it-alone-on-pricing-carbon/

Davis, N. (2008). Flat Earth News. London: Chatto & Windus.

De Caterina, R., Griffioen, A.W., Porreca, F. (2011). Fraud in biomedical research – The role of journal editors. Vascular Pharmacology 55, 119–120.

Ehrlinger, J., Johnson, K., Banner, M., Dunning, D., Kruger, J. (2008). Why the Unskilled Are Unaware: Further Explorations of (Absent) Self-Insight Among the Incompetent. Organizational Behavior and Human Decision Processes, 105(1), 98–121

Elder, J. (2011, July 31). Believe it or Not. The Age Online http://www.theage.com.au/national/believe-it-or-not-20110730-1i5fy.html

Frum, D. (2011, Nov 20). When Did the GOP Lose Touch With Reality? New York Retrieved from http://nymag.com/news/politics/conservatives-david-frum-2011-11/

Floridi, L. (2010). Information: a Very Short Introduction. Oxford: Oxford UP.

Gleick, J. (2011). The Information: A history, a theory, a flood. NY: Pantheon.

Harris Interactive. (2010). ‘Wingnuts’ and President Obama. Retrieved from http://www.harrisinteractive.com/NewsRoom/HarrisPolls/tabid/447/ctl/ReadCustom%20Default/mid/1508/ArticleId/223/Default.aspx

Haslam, S.A., McGarty, C. & Turner, J.C. (1996). Salient Group Memberships and Persuasion: the role of social identity in the validation of beliefs. In J.L. Nye & A.M. Brower (Eds.) What’s Social about Social Cognition? Research on Socially Shared Cognition in Small Groups. Thousand Oaks: Sage.

Healy, D. (2012). Pharmageddon. Berkeley: University of California Press.

Herman, E.S., Chomsky, N. (1998). Manufacturing Consent: The Political Economy of the Mass Media. NY: Vintage.

Hilbert, M. & Lopez, P. (2011). The World’s Technological Capacity to Store, Communicate, and Compute Information. Science 332 (1st April), 60-65.

Hofstetter, C.R., Barker, D., Smith, J.T., Zari, G.M., Ingrassia, T.A. (1999). Information, Misinformation, and Political Talk Radio. Political Research Quarterly, Vol. 52(2), 353-369.

Hoggan, J., Littlemore, R. (2009). Climate Cover-Up: The Crusade to Deny Global Warming. Vancouver: Greystone Books.

Hopkins, N. & Reicher, S. (1997). Social Movement Rhetoric and the Social Psychology of Collective Action: a case study of an anti-abortion mobilization, Human Relations, 50(3), 261-286.

Jackson, S. (2013, January 28). Job numbers rise despite big cuts at newspapers. The Australian. Available from http://www.theaustralian.com.au/media/job-numbers-rise-despite-big-cuts-at-newspapers/story-e6frg996-1226563047933

Kitty, A. (2005). Outfoxed: Rupert Murdoch’s War on Journalism. NY: Disinformation.

Kruger, J., Dunning, D. (1999). Unskilled and unaware of it: How difficulties in recognizing one’s own incompetence lead to inflated self-assessments. Journal of Personality and Social Psychology 77 (6), 1121–34.

Kuklinski, J.H., Quirk, P.J., Jerit, J., Schwieder, D., Rich, R.F. (2000). Misinformation and the Currency of Democratic Citizenship. The Journal of Politics 62(3), 790-816.

Lanchester, J. (2010). Let Us Pay. London Review of Books 32(24) 16 December Retrieved from http://www.lrb.co.uk/v32/n24/john-lanchester/let-us-pay

Laube, W. (2011, March 16). Sickening tsunami of faked photos. Sydney Morning Herald Online Retrieved from http://www.smh.com.au/federal-politics/society-and-culture/sickening-tsunami-of-faked-photos-20110315-1bvuo.html

Levy, D. (2008). Information Overload. In K.E. Himma & H.T. Tavani (Eds.) The handbook of information and computer ethics (pp.497–515). Hoboken: John Wiley.

Library of Congress (2013). Web Archiving FAQs. (10th Feb, 2013). Retrieved from http://www.loc.gov/webarchiving/faq.html

Lievrouw, L.A. & Farb, S.E. (2003). Information and equity. In: B. Cronin (Ed.), Annual Review of Information Science and Technology 37: 499-540

Linde, F. & Stock, W.G. (2011). Information Markets: A Strategic Guideline for the I-Commerce. Berlin: De Gruyter Saur.

Lowe, I. (1998). The Story: the Clever Country. ABC: The Slab. Retrieved from http://www.abc.net.au/science/slab/clever/story.htm

Machlup, F. (1962). The Production and Distribution of Knowledge in the United States. Princeton: Princeton University Press.

Marshall, J.P. (2007). Living on Cybermind: Categories, communication and control. NY: Peter Lang.

Marshall, J.P. & da Rimini, F. (forthcoming) Paradoxes of Property: Piracy and Sharing in Information Capitalism. In Tilman Baumgärtel (Ed.) The Pirate Book: Global Piracy and other Inadmissible Approaches towards Intellectual Property Rights Amsterdam: Amsterdam University Press.

Matías-Guiu, J.& García-Ramos, R. (2010). Fraud and misconduct in scientific publications. Neurología 25(1), 1-4.

The Mercury (2010, September 26). Kate Ritchie Wedding Photos. The Mercury. Available from http://www.themercury.com.au/article/2010/09/26/175141_entertainment.html

Morales, L. (2011). Majority in U.S. Continues to Distrust the Media, Perceive Bias. Gallup Politics. Retrieved from http://www.gallup.com/poll/149624/Majority-Continue-Distrust-Media-Perceive-Bias.aspx

Morales, L. (2012). U.S. Distrust in Media Hits New High. Gallup Politics. Retrieved from http://www.gallup.com/poll/157589/distrust-media-hits-new-high.aspx

Oreskes, N. & Conway, E.M. (2010). Merchants of Doubt: How a handful of scientists obscured the truth on issues from tobacco smoke to global warming. London: Bloomsbury Press.

Peckham, M. (1979). Explanation and Power: The control of human behaviour. New York: Seabury Press.

Porat, M.U. (1977). The Information Economy: Volume 1 Definition and Measurement. Washington: Office of Telecommunications, Department of Commerce. Retrieved from http://www.eric.ed.gov/PDFS/ED142205.pdf

Schmidt, E. (2005). Technology Is Making Marketing Accountable. Google Podium. Retrieved from http://www.google.com/press/podium/ana.html

Shenk, D. (1997). Data Smog: Surviving the Information Age. New York: Harper Collins.

Sunstein, C.R. (2009). Going to Extremes: How like minds unite and divide. New York: Oxford UP.

Ware, M. (2009). The STM Report: An overview of scientific and scholarly journal publishing. Oxford: International Association of Scientific, Technical and Medical Publishers.

WorldWideWebSize.com (07 February, 2013). The size of the World Wide Web (The Internet). http://www.worldwidewebsize.com [Updated regularly].

About the Author

Jonathan Paul Marshall is an anthropologist who was an ARC-supported QEII Fellow at the University of Technology Sydney. This paper grew out of the ARC-supported project entitled “Chaos, Information Technology, Global Administration and Daily Life”, which aimed to investigate the social dynamics of software failure and confusion. Marshall has edited Depth Psychology, Disorder and Climate Change (JungDownunder), co-edited an issue of Global Networks on ‘Networks of Disorder’, and is author of Living on Cybermind: Categories, Communication and Control (Peter Lang), which is a long-term ethnography of an internet mailing list. The co-authored book Disorder and the Disinformation Society is contracted with Routledge.

I wish to thank John Elder who provoked the paper, made suggestions and read the original version of this article. He subsequently wrote a story on a similar theme (Elder, 2011)
