Information Obesity: The web site

Resources for chapter 9: How organisations affect the way we think

9.1 Interpretations of common terms [pages 139-140]

First, do read Orwell's "Politics and the English Language" - perhaps for no other reason than that everybody should read this brilliant essay at least once. But also try to link it to ideas developed throughout IO, and particularly in this think task, regarding how ways of thinking can be "pushed" at us. This can take place because of the assumptions and values that get designed into the technologies and procedures which constitute organised activity - that is the main theme developed in Information Obesity. But it is Orwell's point that the very structures of language also "push" these ways of thinking (cognitive schema) at us. We can use words in ways that deflect conscious attention away from their meaning, particularly when trying to discuss abstract notions, as many political concepts are.

Try, therefore, to stimulate your critical faculties by reflecting on the following terms. Try to think about what they mean to you, but also, what the generally accepted definition is throughout society or, perhaps, in the organisation in which you work. What differences are there? What tensions arise as a result? Are these discrepancies things which actually prevent you from living your life in the way you wish? Obviously there are no "right" answers here, and it is likely that your definitions will be very different from those of other colleagues or takers of this "test" - but that is of course the point.

  • democracy
  • human rights
  • work
  • education
  • literacy
  • sexuality
  • drug use
  • privacy [see below]

Can you think of any more potentially divisive terms for which there are frequent attempts to impose one definition on a wide, divergent range of possibilities?

Also now think of how conformity to the socially accepted definition of each of these terms is "policed". If we violate certain norms, what sanctions can be taken against us? Are these justified? How are they justified - in other words, what mechanisms exist in society by which they are enforced and reviewed?


9.2 An example: "Privacy"

The ideas developed here are just an initial exploration of the issue and represent my own feelings, so do not consider this a definitive statement: I am, in effect, merely undertaking the think task as defined so far.

For me, privacy is:

  • a right to engage in whatever activity I see fit within the walls of my own house, as long as that does not harm others directly or indirectly;
  • a right to determine (in collaboration with my other family members) what comes into and out of this house, whether that be people, goods or information;
  • a right not to be placed under surveillance in my home;
  • drawn now more widely, a right to veto the publication of information about me in whatever form, and to opt out of certain public displays of that information for no reason other than personal choice (an example, though perhaps a trivial one, being the right not to have my home telephone number published in the telephone directory);
  • finally, a right to keep my personal life private from my employer, which should not claim for itself any right to sanction me for actions or behaviour undertaken away from the workplace.

However, it is perhaps already apparent that many of those personal definitions of privacy are not supported by the legal, economic and informational infrastructures of society - at least in the UK. I cited the example of having an ex-directory phone number partly because it is one of the few cases in which I do actually have the right to veto publication. In practice, if, say, a newspaper decided to violate my privacy in some way through publishing a story about me, my ability to counter this would be limited, and would probably involve me taking costly and lengthy legal action. I would also have to prove that the story was somehow false or violated my human rights. (Of course many people, for whatever reason, are often happy to have their privacy removed, by voluntarily seeking publicity.) Surveillance of, say, web browsing habits is now entrenched in British law (though I wonder how many people would be happy to have their regular mail opened by the authorities before it was delivered to them - this being, in effect, what is going on with electronic mail). The UK has not yet gone quite as far down the road as some countries with respect to allowing employers to investigate the private life of employees, but perhaps I do not notice this so much through working in a university, an organisation which traditionally has accorded its core employees a considerable amount of autonomy.

There remain certain sanctions against the violation of privacy, ones which could equally be applied against me if I were to violate the privacy of others. If anyone enters my home against my will they have committed a criminal offence: though we should also note that the state can mandate the violation of privacy through legal documents like search warrants (and indeed the Regulation of Investigatory Powers Act 2000, which is one means by which the UK government has given itself the power to monitor all web browsing). We probably think it justified that sanctions are taken against those who commit crimes, like violence, in the private sphere, but the question of how far into the private lives of the general population this should go is precisely the moral question that is intimately wrapped up in the seemingly simple idea of "privacy". The British state and media, while continuing to pay lip service to the meme that is "privacy", extend that right only so far, even when no criminal or civil offence is being committed within private spheres. And, in the name of economics, the system is also increasingly violating the idea that individuals should retain any control over how information gathered about them is "made public". There is also little sense that these institutionalised definitions of "privacy" are being validated at the community level. Indeed, we should be fair and recognise that it is often at the community level that privacy is hardest to secure; but perhaps that is partly what "community" is, a sense that things are shared, including, to an extent, one's private life.


9.3 Cognitive load and cognitive biases

This link allows you to view the brilliant "gorilla suit" experiment (which is © Daniel Simons). Remember that the point of this movie is to illustrate cognitive load - so if you have never seen it before, watch it first as it is meant to be watched: that is, count the number of passes made during the game.

Then watch it again, without paying attention to the count. Did you spot the intruder this time?

Or, try the Stroop test (PowerPoint presentation). Note that this works in any language. The people who are best at it are little children who cannot yet read, or people trying it in a language other than the one they speak. Try the Greek version, for illustration. (If, of course, you speak both Greek and English you might find both versions difficult, but you can probably switch to saying the Greek colour names for the English text and vice versa - this will probably help.)
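If you cannot open the PowerPoint file, the effect is easy to reproduce for yourself. The following short Python sketch (my own illustration, not part of the original materials) prints colour words in mismatched terminal colours using standard ANSI escape codes; try naming the ink colour of each word aloud, as quickly as you can, while ignoring what the word says.

```python
# A minimal terminal version of the Stroop test: each colour word is
# printed in a DIFFERENT ink colour, using ANSI SGR escape codes.
# Naming the ink aloud is harder when the word itself conflicts with it.

ANSI = {"red": "31", "green": "32", "yellow": "33", "blue": "34"}

def stroop_line(word, ink):
    """Return `word` (upper-cased) wrapped in the ANSI code for `ink`."""
    return f"\033[{ANSI[ink]}m{word.upper()}\033[0m"

words = ["red", "green", "blue", "yellow"]
# Shift each word one colour along the list so word and ink never match.
for i, word in enumerate(words):
    ink = words[(i + 1) % len(words)]
    print(stroop_line(word, ink))
```

Run it in any terminal that supports ANSI colours; your task is to say the ink colours, not read the words.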

Confirmation bias. The picture is of Manchester United lining up prior to a major European game. However, there are 12 people in the picture, not 11. The gentleman to the far left of the picture is Karl Power, who is a sort of professional hoaxer. He simply jumped out from the crowd as the team ran out and joined them for their picture. No one actually noticed: not the photographers, not the security staff at the stadium, and not even any of the players, at least not straight away. How on earth did he get away with this?

Image: Karl Power lining up with the Manchester United team

The simple answer is that he a) had the gumption to do it in the first place and b) - most importantly - was wearing the kit. Had he been dressed in normal clothes he would surely have been stopped long before the players got into position for the photograph. When the teams are running out onto the pitch the security staff are not going to look at everyone's faces and check them against some kind of mental model of what these 11 Manchester United players actually look like. In any case this assumes that people would be confident enough to make such a judgment in a split second (one can imagine the furore if a legitimate player were wrongly stopped from joining his team mates, which makes it extremely risky to challenge interlopers in this situation). Nor were the photographers any more likely to have challenged Power, concerned as they were with getting that all-important shot of the players. Instead, the confirmation bias encourages us to look for clear patterns that help us make decisions quickly. This is, of course, exactly why footballers wear distinctive kits: so they can (at least in theory) ensure a pass is directed towards a team mate. They would not have the time to scrutinise the faces of all 22 players on the pitch, but the kit reduces the amount of cognitive processing required.

In a more detailed sense the confirmation bias serves to validate our existing opinions. We are more likely to notice data that confirm them, and to fail to notice data that suggest our hypotheses, opinions, etc. are wrong. Remember the point made on p. 144 of IO, drawn from Peter Knight's excellent book Conspiracy Culture:

He discusses how elaborate conspiracy theories can be continually "confirmed", rather than refuted, by new evidence. With the Kennedy assassination for instance, conspiracy theorists: "claim that any new piece of information which would undermine existing theories or confirm rival ones might itself be a deliberate plant by the powers that be to lead investigators astray. Likewise the lack of evidence of a conspiracy can itself be taken as evidence of a conspiracy to deliberately withhold vital information. The infamous backyard photos of Oswald confirm that he was indeed the lone gunman? Then they must have been faked" (Knight, 2000: p. 98).

Affirmation bias. In 1995, Barings Bank, a venerable London financial institution, was brought down by the poor dealings of "rogue trader" Nick Leeson (pictured). Leeson lost enormous amounts of money on the financial markets but, like a compulsive gambler, rather than stop trading and/or change his strategy, he continued to try to "win it back" and thus dug himself and Barings into a deeper and deeper hole. Nor did the institution itself prove capable of recognising the danger it faced, and this led to its eventual collapse (see the Wikipedia page). The US corporation Enron was also brought down by a similar set of circumstances (see the movie, Enron: The Smartest Guys in the Room): a combination of poor decision making with an arrogant assumption that this was a proven strategy, and a reluctance to accept, or even see, the onrushing catastrophe until it was too late.

The affirmation bias causes us to:

  • overestimate our own abilities and knowledge
  • be over-confident about our reasoning and judgment
  • declare, with hindsight, that our powers of prediction are better than they were ("I always said that would happen")
  • take responsibility for successes, but blame failures on others.

All have been demonstrated by cognitive scientists. It is easy to be cynical, and we should remember that affirmation bias helps us join the world (instead of retreating into an angst-filled, depressed state, believing nothing we can do will be successful or make a difference). But:

it can also "impede our judgment" (Blaug, 2007) quite dramatically. We may simply fail to see when we are heading down a wrong path of thought, or activity, until it is too late. The more cognitive work we have invested in something, the less likely we are to abandon it. We embed prior decisions into sociotechnical systems that subsequently direct our action; thus institutionalising the affirmational bias. This is a very dangerous trap for activity, possibly leading organisations into negative feedback loops, where it becomes impossible to see the basic flaws in a strategy or set of values that are leading towards disaster. (IO, pp. 145-6)

Reification bias. Reification involves treating things which are social constructions as if they are "natural". I mention the following possible examples on p. 146:

  • that rising GDP is the inevitable aim of economic policy
  • that modern education must inevitably use more ICT than in the past
  • that certain minorities are naturally lazy and thieving
  • that women are naturally worse managers than men.

As you can see, the reification bias, in its stronger and less savoury forms, can also take the form of stereotyping, or of outright prejudice and discrimination.

THINK: what unspoken but nevertheless widely-held assumptions permeate your organisation and/or activity systems? And also - though I realise this will be difficult and perhaps uncomfortable for many people - be honest with yourself: what prejudices do you hold? What reasons do you have for holding them? Have a good look at these - are they things you have picked up from personal experience, or do you think they are media-generated?

The closing pages of part 3, which summarise the nature of information obesity and how education can - and cannot - help, are one of the 3 book extracts on this site.


All information on this site is © Andrew Whitworth 2009. Site design by Marilena Aspioti. Information on this site can be freely reproduced and used for educational and/or non-profit purposes. For commercial use, contact the copyright holder.