

Networking Democracy: IT and Radical Infrastructures

Drew Whitworth



Democratic moments in the history of IT

Though ‘IT’ and ‘the Internet’ are not synonymous, there is some justification for treating the Internet as the most significant and, for our purposes, most representative application of IT. One problem faced by computing scientists in the 1950s and 1960s was the duplication of work: a multitude of different systems had developed, each with ‘one-of-a-kind’ software programs akin to ‘original works of art’ [10]. Demands on computing resources were increasing rapidly, and the continued isolation of systems was not sustainable. Without some form of networking, therefore, IT could not have continued to evolve.

When seeking to link disparate entities, solutions tend to converge towards one of two poles. The first is to create a central body in which authority is invested. Backed up by legal, financial or moral sanctions, and possibly by the use of force, this executive body or steering committee imposes particular processes and standards to which all the network’s nodes have to conform. Contrast this with the more dynamic, less centralised solution, in which no such core is present (or created) and different individuals, organisations or systems try to reach consensus over the methods by which they will interact, in an atmosphere of free and fair deliberation and uncoerced agreement. Networking is therefore both a technical and a sociopolitical activity. Linking computers is habitually considered a technical problem, but one must still ask: who has access to the network? What can people do once they have access? Who makes policy decisions about the location and design of interfaces, and who defines valid uses of the network? Who participated in these decisions? Each of these answers can be judged against democratic criteria.

Of course, one may regard the ‘need’ for IT’s continued growth as evidence of its interconnections with capitalism and centralised bureaucracy, and there is a great deal of evidence to support this view. Like all technologies, IT was primarily designed not to challenge existing institutional structures but to preserve them [11]. Few authors suggest that IT achieved its dominant position as a result of social movements and/or democratic change. Indeed, IT is often applied in the sort of co-option and repression of democratic movements mentioned above, with surveillance being perhaps the most obvious example. More indirectly, but certainly more significantly, IT preserves and enhances the conditions under which capital (more precisely, rapidly-moving, hard-to-control ‘venture capital’) strengthens its domination over modern life. Rather than being a ‘new frontier’, the ‘space’ opened up by IT is used primarily to lubricate global capital flows [12]. Even where pockets of ‘cyberspace’ appear democratic, IT simultaneously opens up the possibility of their colonisation. Barlow observes how the large institutions which claim ‘hegemony’ over this virtual sphere will use ‘historically familiar’ tactics to assert that claim: securing the resources of the sphere into their possession, establishing tariffs, and defining cyberculture ‘strictly in terms of economy’ [13]. It is doubtful that these processes will be of much benefit to the autonomous individuals of the future.

Institutionalised locations of power sometimes acknowledge their deleterious effects, such as unemployment, material poverty, informational poverty, patriarchy or environmental crisis. When solutions are tried, however, they are often centralised technical fixes: introducing ‘electronic voting’ to combat ‘apathy’, for example, or Newt Gingrich’s crass proposal to give ‘laptops to the unemployed of America’ (International Herald Tribune, Jan. 7/8 1995: p. 3). It seems that for institutionalised power, IT’s main benefit is its potential to remove unpredictable and diverse ‘human factors’ and impose standardised fixes [14]. However, such solutions can be, and have been, resisted. In such resistance we can see alternative methods of coping with diversity: diversity is not repressed (directly, or through an implicit belief that the centralised, standardised solution must be the best or most efficient one) but embraced, and indeed used as the basis for the chosen technical solution.

Internet pioneers faced one such moment in the early 1980s, when the International Organization for Standardization (ISO) sought to impose a set of standards known as OSI upon the fledgling network. As Hafner and Lyon (1996: p. 247) put it, ‘[o]n the OSI side stood entrenched bureaucracy, with a strong we-know-best attitude…’ [15]. Despite OSI’s backing from governments in both America and Europe, the TCP/IP protocols prevailed. TCP/IP had been developed in a more collaborative fashion, via the Request for Comments (RFC) system, a semi-formal manifestation of the collaborative decision-making model. Individuals proposed protocols, software, systems and other solutions to particular problems; via e-mail or Usenet newsgroups, any interested party was then free to criticise, augment or otherwise comment upon the proposal. In contrast, OSI was considered an untried, abstract design, dreamt up by what the Net community regarded as an exclusionary group of bureaucrats who lacked genuine experience of the Net. TCP/IP also had the advantage of incumbency: users had both experience of it and a sense of ownership over it. As one (unnamed) scientist said, ‘Standards should be discovered, not decreed’ [16].

The RFC system was also used in defining the later set of protocols for the WWW and the related domain name system. After contributors had agreed on the structure and names of ‘top-level’ domains such as .com or .org, the assignment of domain names was turned over to the Internet Assigned Numbers Authority (IANA). Berners-Lee [17] observes that this ‘organisation’ was basically one man, the late Jon Postel. This may seem undemocratic, but Berners-Lee is not the only one to draw attention to Postel’s benevolence in this role, and to his resistance to the commercialisation of the system. It was only after Postel’s death that IANA’s functions were privatised and a free market in domain names was allowed to emerge. Political theory has never been comfortable with the ‘benign dictatorship’ as an organisational solution, but perhaps Postel came close to being one. Nevertheless, he did not act alone, but within the overall moral sensibility of a community which at the time was resistant to commercial exploitation of the Internet.

The WWW, too, arose as one man’s vision. But Tim Berners-Lee developed it within a dynamic environment containing multiple beliefs and disparate systems: CERN, the European Organization for Nuclear Research. To get his creation accepted and used, Berners-Lee had to accommodate the needs of others rather than impose standards upon them. In fact, Berners-Lee explicitly observes that CERN’s organisational form (a social network) influenced his solution to a technical networking problem [18]. As the WWW spread exponentially, Berners-Lee then helped create the World Wide Web Consortium (W3C) to oversee it and administer its standards. This too was conceived not as a dictatorial body but as ‘a place for people to come and reach consensus’ [19].

The open model of decision-making embraces different approaches and opinions rather than treating them as problems in need of a technical fix or centrally imposed standards. With Linux, the open-source operating system, users are treated as ‘co-developers’ and testers of innovations. Solutions to particular coding problems are still created by individuals; it is very difficult for a group as a whole to be creative and to develop solutions from scratch. But these spontaneous moments of creative innovation will, if tested and approved, be perpetuated within the Linux community as a whole.

None of the above events or models should be considered uncritically, and the following section will describe problems with some of them. But they do suggest that certain moments in the development of IT and the Internet (on which IT depends for its dominant position) can be judged favourably against democratic principles. Even a crude understanding of democracy suggests that the W3C is a more democratic way to shape the WWW than, say, allowing Microsoft to do so (directly or by default); or that the RFC model is closer to democratic software creation than a system of patents granted only to the large companies which can afford them. Yet at the same time it is difficult to see, in the above events, any direct contributions to a radical infrastructure. I suggest it is because of the absence of such contributions that the seeds planted by these relatively democratic moments have proven difficult to cultivate.



Footnotes

10. Hafner, K. and Lyon, M. (1996) Where Wizards Stay Up Late: The Origins of the Internet, New York: Touchstone, pp. 43-4.

11. R. Sharpe, quoted in Webster, F. (2002) Theories of the Information Society, 2nd ed., London: Routledge, p. 139.

12. Webster, op. cit., chapter 4; Bauman, Z. (2000) Liquid Modernity, Cambridge: Polity.

13. Barlow, J. P. (1995) Coming into the Country.

14. Capel, R. (1992) ‘Social Histories of Computer Education: Missed Opportunities?’ in J. Beynon and H. Mackay (eds) Technological Literacy and the Curriculum, London: Falmer Press, pp. 57-8.

15. Hafner and Lyon, op. cit., pp. 247-9.

16. Hafner and Lyon, op. cit., p. 255.

17. Berners-Lee, T. (1999) Weaving the Web, London: Sage, p. 137.

18. Berners-Lee, op. cit., pp. 8-10.

19. Berners-Lee, op. cit., p. 105.