From International Socialism 2:74, March 1997.
Copyright © International Socialism.
Copied with thanks from the International Socialism Archive.
Marked up by Einde O’Callaghan for ETOL.
The last few years have seen an extraordinary wave of enthusiasm for computers, particularly for the internet computer network. In Britain nearly a third of households have a computer. Computers are advertised as vital to children’s education. They appear in films from Jurassic Park to Goldeneye. And this, we are told, is only the beginning. Computer company Olivetti states that over the next 20 years computers will be involved in almost every aspect of people’s daily lives, from controlling laser guided vacuum cleaners to replacing visits to the doctor: ‘You may never have to go to a doctor again. You could simply have a tele-conference with your physician, using sensors to transmit vital information.’ [1] Bill Gates, multi-billionaire head of the software company Microsoft, predicts that a computer will soon be available so small that it can fit into a pocket:
It will display messages and schedules and also let you read and send electronic mail and faxes, monitor weather and stock reports, and play both simple and sophisticated games. At a meeting you might take notes, check your appointments, browse information if you’re bored, or choose from among thousands of easy-to-call-up photos of your kids.
The tiny computer will also take the place of money and of keys, and give appropriate traffic reports. [2]
It is not surprising that computer companies make great claims for their products but such claims are widely accepted. Most Americans, for example, believe that by 2000 cars will have computer controlled navigation systems, and that computer technology will mean you can watch any TV show you want at any time – and that by 2005 cash will be obsolete and home appliances will respond to spoken commands. [3] Indeed, no statement about the importance of computers seems too extreme. The blurb for one academic study of the internet, for example, begins as follows:
Multimedia, the information superhighway and the internet have changed our world almost beyond recognition. Electronic networks have revolutionised the human relationship to time and space, and have undermined national boundaries. [4]
Politicians of both the right and left accept the idea that computers are changing all our lives fundamentally. Deputy Prime Minister Michael Heseltine has claimed that society is:
… about to go through a revolution which is immensely exciting, basically a technological revolution of the superhighways ... People today have not fully grasped the effect it’s going to have on their lives, but it is, in my view, of incalculable consequence ... People will have more leisure and will have more wealth … it’s all very exciting, very positive. [5]
The Labour Party’s information technology policy also asserts that large scale social change is on the way:
New technologies … will bring fundamental change to all our lives.
The ways in which we do business, or study, or receive broadcast entertainment, or receive healthcare, or shop, or make use of public services, will be transformed in a host of innovative ways. [6]
These large social changes have, of course, political consequences. Theorists of all political colours agree that we are going through an ‘information revolution’. Just as the industrial revolution fundamentally changed the world to bring about modern industrial society, so the information revolution will create a new, information society. For the right, all this is a triumphant vindication of capitalism. Technology is constantly advancing, bringing ever more powerful products with no increase in price. Computers will assist in creating what Bill Gates describes as ‘friction-free capitalism’:
Capitalism, demonstrably the greatest of the constructed economic systems, has in the past decade clearly proved its advantages over the alternative systems. The information highway will magnify these advantages. It will allow those who produce goods to see, a lot more efficiently than ever before, what buyers want, and will allow potential consumers to buy those goods more effectively. Adam Smith would be pleased. More importantly, consumers everywhere will reap the benefits. [7]
Computers can even, apparently, solve capitalism’s social problems. On his second day as speaker of the US House of Representatives, right-wing Republican Newt Gingrich proposed that poor people, including the homeless, should be given tax credits so that they could buy a computer and so increase their skills and chances of employment. [8]
On the left most analysis of information technology starts from a wider ‘post-Fordist’ view of society, such as that put forward by Marxism Today in the 1980s, based on the claim that we are living in ‘New Times’:
Unless the Left can come to terms with these New Times, it must live on the sidelines ... At the heart of New Times is the shift from the old mass-production Fordist economy to a new, more flexible, post-Fordist order based on computers, information technology and robotics. But New Times are about much more than economic change. Our world is being remade. Mass production, the mass consumer, the big city, big-brother state, the sprawling housing estate and the nation-state are in decline ... [9]
The world has been changed so thoroughly by information technology, the argument runs, that we no longer live in the kind of industrial society established in Europe in the last century. It follows that a political theory like Marxism, which set out to analyse and change that society, no longer applies in the modern world. Marxism might, at best, have had something to say about a world of cotton mills and coal miners – but how can it hope to respond to the internet or to the work of computer programmers? Marxism is outdated and this is nowhere clearer than in the field of information technology.
Writers associated with this journal have repeatedly confronted the ideas of post-Fordism and it is not my intention here to rehearse those general arguments. [10] Rather, I want to examine the claim that information technology in particular is beyond the reach of Marxist analysis. To what extent have computers fundamentally changed the world? Is the computer industry really different from the capitalist industries of earlier periods? Is Marxism capable of understanding the technologically advanced world of the late 20th century, and of forming useful political strategies based on that understanding?
The expansion of the computer industry in the last 50 years has indeed been enormous. In 1947 one computer engineer predicted that six computers were all the United States would ever need. [11] In fact by 1994 there were some 82 million computers in the US and 200 million in the world as a whole. [12] Making computers, and the chips that control them, has become a major industry. For many white collar workers in the developed world computers have become as integral a part of their office equipment as phones or photocopiers. The trend is set to continue, with the world market for personal computers reportedly growing at 30 percent a year. [13] Communication between computers is one of the fastest growing areas – the number of computers connected to the worldwide internet computer network, for example, currently doubles about once every 12 months. [14]
As well as becoming more numerous, computers have become more accessible. The first computers were enormous machines which filled whole rooms, impossible to use without highly specialised knowledge. In the 1960s and 1970s computers became a little more widespread but they were still physically huge and extremely expensive by today’s standards. These machines often needed to be installed in special rooms, and contact with them was only possible via an elite of programmers and administrators. In the early 1980s this changed dramatically with the introduction of small machines such as the IBM Personal Computer – ancestor of most personal computers in use today – and the Apple Macintosh, selling for a few thousand dollars. As machines have become smaller and cheaper they have also become more powerful, and this means that they can run software which is much easier to use.
There is, then, a real basis to claims about the ‘information revolution.’ But for all this, information technology is available to a tiny minority of the world’s population. Most people in the world still have homes without electricity. According to the Labour Party’s ‘information superhighway’ policy, ‘half the people in the world have never made a phone call’. [15] Figures for the availability of telephones are worth looking at in more detail, since most people outside universities and large companies who access the internet do so over phone lines. The figures for the numbers of ‘internet hosts’ – large computers connected to the system – make much the same point as those for the availability of phones.
PHONES AND INTERNET HOSTS BY COUNTRY [16]

| Country | Phones per | Internet hosts |
|---|---|---|
| Developed countries | | |
| Australia | 481 | 309,562 |
| Belgium | 469 | 30,535 |
| Denmark | 869 | 51,827 |
| Germany | 545 | 452,997 |
| Italy | 440 | 73,364 |
| Japan | 512 | 269,327 |
| UK | 519 | 451,750 |
| USA | 483 | 6,053,402 |
| Pacific ‘Tigers’ | | |
| Hong Kong | 541 | 17,693 |
| Singapore | 388 | 22,769 |
| South Korea | 294 | 29,306 |
| Taiwan | 366 | 25,273 |
| Eastern Europe | | |
| Poland | 93 | 24,945 |
| Romania | 99 | 954 |
| Russia | 163 | 14,320 |
| Latin America | | |
| Brazil | 62 | 20,113 |
| Chile | 55 | 9,027 |
| Mexico | 70 | 13,787 |
| Africa and Asia | | |
| Angola | 4 | 0 |
| Bangladesh | 2 | 0 |
| China | 9 | 2,146 |
| Congo | 7 | 0 |
| Egypt | 10 | 591 |
| Ghana | 5 | 6 |
| India | 9 | 788 |
| Kenya | 9 | 17 |
| Morocco | 10 | 234 |
| Pakistan | ‘about 7 ...’ | 17 |
Only some 20 countries – such as the US, Japan and the European Union states – really have access to these technologies. In others, such as the Pacific ‘Tigers’, Eastern Europe or Latin America, some access is available. But in the African and Asian countries, where most people in the world live, telephones are a rarity – let alone computers and the internet. The exclusion of Africa and Asia from new technologies has been typical of capitalism for a century – in 1912, for example, 67 percent of the telephones in the world were in the United States, 26 percent in Europe, 1.3 percent in Asia and 0.3 percent in Africa. [17]
Access to information technology is denied not only to many countries but to many people in the richest countries. Some 43 percent of Americans have never used a computer, and only 31 percent of the population own one. [18] An estimated 15.7 million people in the US had access to the internet by the end of 1996 – only some 7 percent of the population, in the country with more computers than any other. Computer use is dominated by the middle class. Fully half of American internet users, according to one survey, have household incomes of over $50,000. [19] Another survey of users of the world wide web (www), one of the most popular services on the internet, found that:
- 25 percent of www users earn household income of more than $80,000 whereas only 10 percent of the total US and Canadian population has that level of income.
- 50 percent of www users consider themselves to be in professional or managerial occupations. In contrast, 27 percent of the total US and Canadian population categorise themselves as having such positions.
- 64 percent of www users have at least college degrees while the US and Canadian national level is 29 percent. [20]
Class is not the only factor influencing people’s use of computers. While 57 percent of Americans have used a computer, for example, only 16 percent of retired people have done so. [21] Men are twice as likely as women to use the world wide web, and this reflects the sexism prevalent in all areas of computing. [22] However, class is the most important factor in the simple sense that most people cannot afford a computer. When people who did not own computers were asked why they did not buy one, 72 percent replied that it was because they cost too much. Asked if they would learn how to use a computer if they received one as a gift, 92 percent of people who didn’t use computers replied that they would. [23]
Computers are increasingly used as an educational resource – the best selling encyclopaedia in the world, for example, is Microsoft’s Encarta software. Microsoft claim that Encarta sells five times as many copies as the best selling printed encyclopaedia, at a fraction of the price. [24] However, such resources are in practice available only to those children whose parents can afford a computer. Children without computers at home have to rely on schools to provide them – and government figures show that there is only one computer for every 18 children in British primary schools, and one for every ten children in secondary schools. Over half of the computers included in these figures are over six years old and, as such, probably obsolete. So in practice one modern computer is available per class.
The Labour Party, if elected to government, ‘would hope to persuade companies to donate or sponsor equipment’. [25] One of the highlights of Labour’s 1995 conference was Tony Blair’s announcement of a deal with British Telecom; in return for access to the lucrative video-on-demand market, BT would cable up every school. But there is no chance that business would donate the millions of computers necessary to give every child access to one. Even if every school were connected to computer networks for free, schools would still need funding for the call charges incurred whenever they used the new cabling to connect to computer systems. Teachers would still need training. Labour’s proposals would not change the basic position that very few working class children have proper access to computers.
Altogether, far from breaking from a society based on class, in which the developing countries have little access to technology, and workers in the developed world have far less access than the middle class, computers continue these trends inherent within capitalism. Indeed at every stage of their development computers have reflected the capitalist society that gave them birth, and the inability of the market to meet people’s needs.
The general principle that computer technology can only be understood in the context of capitalism, and that the market prevents the full development of that technology, goes back even to the 19th century prehistory of computers, when Charles Babbage attempted to develop his ‘Difference Engine’ and ‘Analytical Engine’. The roots of Babbage’s work lay in the political development of France after the revolution of 1789, and the economic development of Britain after industrialisation. It was probably in 1819, aged 28, that Babbage travelled to Paris and saw the mathematical tables of de Prony, which were to have an enormous influence on him. De Prony had been commissioned during the Republic to prepare an immense set of mathematical tables to celebrate the metric system, and by implication the rational nature of the new political order. The tables were the largest ever conceived and it seemed at first that they would be impossible to complete since too few people were available to do the required calculations inside a lifetime. However, de Prony chanced on a copy of Adam Smith’s The Wealth of Nations. Smith argues that the division of labour is central to efficient manufacturing. He gives the example of a pin factory – if one person carries out all the operations involved in making a pin, he argues, the factory is much less productive than when each operation is assigned to a separate worker and one rolls out the wire, another cuts it, and so forth. De Prony based the production of his tables on such a ‘division of labour’. He assembled three groups: the first included six of the best mathematicians in France, and they set out the overall plan for the project and the general form of the calculations to be used. These formulae were then handed to the second group, consisting of seven or eight competent mathematicians who transformed the general formulae into calculations involving actual numbers. 
These they handed on in turn to the third group, formed of 60 to 80 people, most of whom knew no more mathematics than addition and subtraction. The third group worked out the calculations which the second group had given them, and in this way the tables were completed. De Prony had shown that intellectual work could be automated like any other sort of work. [26]
The creation of mathematical tables was an important issue for British capitalism in the early 19th century. The development of commerce and banking made necessary millions of calculations. In the absence of any kind of calculator, people either worked these out in their heads, or referred to sets of tables. In particular, the development of British trade with the rest of the world made accurate navigational tables vital. However, there were inaccuracies in all the existing tables. This meant that ships were wrecked, and financial affairs miscalculated. Babbage asserted, for example, that the British government had lost between £2 and £3 million because of errors in tables used to calculate annuities. [27] Babbage planned to build a machine – the Difference Engine – which would produce tables automatically, with no possibility of error, and in 1823 he received funding from the government to do so. [28] Over the next 11 years, the government was to spend over £17,000 on the Difference Engine project – by far the largest government sponsored research project of the time. Part of the Difference Engine was completed and worked perfectly, but the whole machine was never made (it would have consisted of over 25,000 metal parts, and weighed several tons). [29] In the 1840s Babbage began theoretical work on an Analytical Engine, which he continued to develop on paper until his death in 1871. Though consisting of metal parts, the Analytical Engine shares many features with modern computers. Punched cards were to be used to input data and programs – a technology borrowed from the Jacquard loom, where punched cards controlled the patterns woven into the cloth. In the words of Ada Lovelace, a collaborator of Babbage’s and one of the first people to write computer programs, ‘The Analytical Engine weaves algebraic patterns just as the Jacquard-loom weaves flowers and leaves.’ [30]
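The principle behind both de Prony’s third group and Babbage’s Difference Engine can be shown in a few lines of code. The method of finite differences lets a polynomial be tabulated using nothing but repeated addition, which is exactly why de Prony’s least skilled assistants could do the work. The sketch below (the function name and structure are my own illustration, not Babbage’s notation) tabulates x² + x + 41, a polynomial Babbage is reported to have used when demonstrating his engine:

```python
def tabulate(initial_diffs, n):
    """Produce n table entries from an initial column of differences.

    initial_diffs[0] is the first table value; initial_diffs[k] is the
    k-th forward difference at the start. For a polynomial of degree d
    the d-th differences are constant, so after seeding the column,
    every further entry is produced by addition alone.
    """
    diffs = list(initial_diffs)
    table = []
    for _ in range(n):
        table.append(diffs[0])
        # 'Carry' each difference up the column, using only addition.
        for k in range(len(diffs) - 1):
            diffs[k] += diffs[k + 1]
    return table

# f(x) = x**2 + x + 41: values at 0, 1, 2 are 41, 43, 47, so the first
# differences are 2 and 4, and the second difference is constant at 2.
print(tabulate([41, 2, 2], 5))  # -> [41, 43, 47, 53, 61]
```

Note that once the three seed numbers are set, no multiplication is ever performed: the machine – or the unskilled computer – only adds.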
As well as his work on the Difference Engine, Babbage wrote widely about the development of capitalism in Britain. He campaigned unsuccessfully for the reform of scientific education, arguing that the state must support scientific research if Britain’s economy was not to fall behind those of countries with more interventionist governments, such as that of Germany. [31] Babbage’s experience demonstrates two aspects of the development of technology under capitalism. Firstly, capitalism has made possible technological advances which would have been unimaginable previously, as Marx and Engels noted in 1848 in the Manifesto of the Communist Party. [32] Babbage’s own work is testimony to capitalism’s technological vigour – that he attempted, with some success, to build machines resembling computers at a time when the most complicated mechanism most people had ever seen was a clock.
But Babbage’s work also demonstrates that the market cannot respond adequately to the potential for new technology which capitalism creates. However much Babbage’s machines would have increased the profitability of British capitalism as a whole, there was never any question of his work being funded by industrialists who could make massively greater profits in the short term from textiles and railways. In the absence of such funding he looked to the state.
Conservative politicians have argued for 20 years that the state must have a minimal role in the economy if it is to thrive. Nationalised industries have been sold off so as to shed the supposed dead weight of state bureaucracy, emerging as efficient and profitable competitors in the market. Such ideas have now been accepted by some on the left – Tony Blair’s rewriting of Clause Four of the Labour Party’s constitution, for example, involved rejecting a commitment to nationalisation and replacing it with a formulation which accepted the ‘dynamism of the market.’ However, the idea that the state has only the most minor role to play in the economy squares very badly with history. The development of capitalism saw major change in the nature of the state in every country – including revolution in Britain and France. Legal systems expanded massively to provide the necessary framework for business, including contracts, patents and copyrights. Permanent armed forces were created. State intervention in the economy took place at a local and national level – local government in Manchester, for example, was strongly committed to free trade and the development of capitalism, but by 1905 the local state had invested £7.4 million in water, £2.6 million in gas, £2.3 million in electricity supply, £2 million in tramways, and £5 million in building the Manchester Ship Canal. [33] National governments worldwide intervened in the development of railways – as Eric Hobsbawm comments:
Without exception the new railway systems were planned by governments and, if not actually built by them, encouraged by the grant of favourable concessions and the guarantee of investments. Indeed, to this day Britain is the only country whose railway system was built entirely by risk bearing and profit making private enterprise ... [34]
Babbage’s work thus reflects three elements of capitalism: the enormous acceleration of technological development; the inability of the market to harness that development; and the intervention of the state in support of the new technologies. These three elements have characterised the history of information technology from Babbage’s time to our own.
Far from stimulating further work, Babbage’s ideas fell into obscurity until the mid-20th century, after the invention of the computer at the end of the Second World War, when various machines were developed in response to the needs of the military. In Britain a machine called the Colossus was designed to decipher coded German messages. Further work after the war produced the Mark One, technically the first computer in the world, which was put into operation at Manchester University in June 1948. Central to work on both machines was the mathematician Alan Turing. Turing was brilliant, eccentric, naively honest and openly gay. The authorities were willing to overlook his sexuality during the war, but in 1952 Turing was convicted of ‘gross indecency’. He was punished by a year of ‘chemical castration’ – he was given female hormones, which rendered him impotent and caused him to grow breasts. In 1954 Turing killed himself. [35]
The Mark One was built as the prototype of a machine which could be mass produced by the Ferranti electronics company. But the British computer industry never competed effectively with its US rival. ENIAC, forerunner of the first American computer, was completed in 1945; it had been designed to calculate tables used in aiming artillery pieces, and was later used to do calculations on the first hydrogen bombs. By the early 1950s various different computers were in operation.
It was at this point that IBM began to make computers. The company had evolved from the Tabulating Machine Company, which sold calculating machines to the US Census Bureau. Now the Census Bureau began buying computers from IBM’s competitor Remington Rand. The more forward-looking IBM executives realised that the company was doomed if it continued to ignore computers. However, Thomas Watson Senior, who had run the company since 1914, was unconvinced and only agreed to make computers when the US government asked IBM to do so during the Korean War. [36] During the next 20 years much of the development of computers was funded by the US state, which needed ever smaller and more powerful machines – to control missiles, for example, and to guide spacecraft as part of the Apollo programme. [37] The US government invested $400 million in IBM, for example, to ensure it kept ahead in the technological arms race. [38] The first 20 years of computing, then, does nothing to support the idea that the market works best without the intervention of the state. The second 20 years, in the 1970s and 1980s, demonstrates that the market, far from making the best in new technologies rapidly available, has brought chaos at every turn.
IBM was generally considered to be one of the most stable and profitable companies in the world when it announced that in 1992 it had made a $5 billion loss – the largest in commercial history. In the first half of 1993 alone IBM lost a further $8.3 billion. [39] The company finally returned to profitability in 1994 – by the end of that year they had sacked some 35,000 people. [40] IBM’s fall has been matched by the rise of software company Microsoft. First created in 1975, Microsoft has risen to dominate the computer software industry, making its founder and chief executive, Bill Gates, the richest man in the world. How did such a turnaround happen?
At the height of IBM’s success more than 70 percent of the world’s computer installations were based on its equipment. The company had invested $5 billion in the early 1960s on the development of a range of computers called the 360. The 360 range was technically advanced and different sizes of machines were available – a company could start off with a small machine and easily upgrade to a bigger one. Developing the 360 was a gamble, since the technology might not have worked, and the 360 made obsolete all previously available IBM computers. But the gamble paid off and IBM became enormously profitable. [41] By the 1980s, IBM’s profits funded an immense bureaucracy. Writing in 1989, Chris Harman quoted the Wall Street Journal to the effect that IBM was:
[a] giant, calcified institution in desperate need of structural modernisation ... Even after slashing its workforce the colossus is one of the world’s most luxuriantly thick bureaucracies ... IBM budget planners write reports about coming reports. [42]
This was not an accident peculiar to IBM – Harman notes the same process taking place in the car industry, the direct result of the operation of the market. With consistent profit levels IBM had little need to innovate. The company became so large that different parts of it judged their work by purely internal criteria, regardless of how profitable it might be for the company as a whole. This became clear when IBM began to collaborate with Microsoft to write software. IBM measured how much work someone did by how much programming code they wrote. This sounds reasonable but the most efficient software uses as few programming instructions as possible for each task since this means it can run more quickly. In this case:
… a Microsoft developer took a piece of IBM code that required 33,000 characters of space and rewrote it in 200 characters ... This was considered rude. Other Microsoft developers then rewrote other parts of IBM’s code to make it faster and smaller. This was even ruder. IBM managers then began complaining that, according to their management system, Microsoft hadn’t been pulling its weight. Measured in lines of code, they said, Microsoft was actually doing negative work ... [43]
IBM’s conservatism and bureaucracy finally caught up with it when they came to produce their personal computer (PC) in 1981. IBM had been trying for several years to produce a successful small computer. In the end they succeeded by assembling a group of mavericks from throughout the company and giving them only a year to put a machine together – an extremely short time by IBM’s ponderous standards. The PC developers produced the computer on schedule by buying many of the machine’s components from outside suppliers, rather than producing them within IBM. At first the PC was an enormous success – in the first four months of the PC’s existence, sales reached $40 million. [44] But other manufacturers could buy the parts that made up a PC from the same suppliers as IBM and they did so, producing ‘clones’ – machines which worked just the same way as an IBM PC but cost less. For a while, both IBM and the clone manufacturers made impressive profits, which further reinforced IBM’s complacent belief that it would always dominate the computer industry.
After 1985 such complacency led to disaster. IBM, which made millions from leasing large computers to large companies, never considered that selling small computers might one day be the more important market. It was used to the longer development times which had worked with the big old mainframe computers, and so it didn’t update its PC as technology developed. Instead IBM tried to change the technology which the clone manufacturers were copying – they introduced a new computer called the PS/2 which worked differently, and stopped selling the PC which had been so popular. In this way it hoped to drive out the clones and completely control the market. But the new computer performed no better than the old ones. As one former IBM executive commented about his managers, ‘They still didn’t realise they were in a competitive world. They thought we could ram anything down customers’ throats.’ Meanwhile, clone manufacturers quickly improved their machines as new technology came along. When IBM’s new computer failed, it was forced to return to making PCs. But now, as one account of IBM’s decline puts it, ‘IBM was just another clone maker, but the one with the most pretensions, the biggest overhead, the highest prices, and a rapidly falling market share.’ [45] IBM’s share of the now enormous personal computer market fell from 50 percent in 1984 to 8 percent in 1995, and its profits fell with it. [46]
In retrospect, IBM seems foolish not to have recognised the importance of personal computers. But there are many new technologies which fail in the marketplace, from 8 track tape cartridges to electric cars, and there is no way of knowing which will become important. Technical superiority is no guarantee of success – Betamax was a technically better video system than VHS, and BSB’s satellite broadcasting system better than Sky’s. The only way of finding out which will work is to gamble – as IBM did successfully with the 360 and the PC, and unsuccessfully with the PS/2. In the circumstances it seems unsurprising that IBM tended to stick with products with a proven record of profitability rather than innovating. The market gave it no incentive to do so.
As IBM’s fortunes have declined, so those of Microsoft and Bill Gates have risen. This may look like a capitalist dream come true – that anyone with brains and who works hard can become rich – but the reality is rather different. Gates first got involved with computers at 13 years of age. The son of upper middle class parents, he attended a school rich enough to pay for the students to have use of a computer, which was exceptional in 1968. Gates then dropped out of university to work with computers with schoolmate Paul Allen. One popular history gives the following account:
Like the Buddha, Gates’s enlightenment came in a flash. Walking across Harvard Yard while Paul Allen waved in his face the January 1975 issue of Popular Electronics announcing the Altair 8800 microcomputer from MITS, they both saw instantly that there would really be a personal computer industry and that the industry would need programming languages. Although there were no microcomputer software companies yet, 19 year old Bill’s first concern was that they were already too late. ‘We realised that the revolution might happen without us,’ Gates said. ‘After we saw that article, there was no question of where our life would focus.’ [47]
We are to believe that Gates became rich by single mindedly devoting his life to computer software after one insight of genius. The reality is more complex and grubbier.
Microsoft’s success has been based for the last 15 years on software called MS-DOS, which was included with the IBM PC, and with all the clones made since. MS-DOS is an ‘operating system’, software which enables the different parts of a computer like the screen and keyboard to work together. IBM had originally hoped to buy an operating system from Microsoft’s then competitors Digital Research. But the head of Digital Research, Gary Kildall, didn’t even meet with IBM since he knew that all the small computers they had produced up to that point had failed, and he didn’t see why the PC should be any different. His wife and business partner Dorothy was a lawyer and she was horrified by the legal constraints IBM wanted to place on Digital Research as part of the deal. So IBM went to Microsoft, and asked if they could buy an operating system from them instead. Microsoft agreed to sell an operating system to IBM – though they didn’t actually have one to sell. Instead, they arranged with a neighbouring company to use one called QDOS, which they renamed MS-DOS and eventually bought outright. QDOS was all but copied from a third operating system, called CP/M and written by none other than Gary Kildall of Digital Research. The success of MS-DOS starts with a lucky break and a dodgy business deal, not with intelligence and hard work. [48]
As the story of MS-DOS went on, Gates continued to be lucky. IBM and each clone manufacturer paid Microsoft a fee for each computer they sold with MS-DOS, so the enormous sales of PCs and clones meant big profits. IBM, of course, was unhappy to pay Microsoft so much money for a product Microsoft was also selling to IBM’s competitors. They wanted to get control of the operating system back and so kill off the clones. With the PS/2, IBM introduced a new operating system, OS/2. Microsoft could only sell an inferior version of OS/2 to clone manufacturers. The full version was to be sold only by IBM and only worked on IBM computers. [49] But, when the PS/2 flopped, OS/2 flopped with it, and Microsoft went on profitably selling MS-DOS to clone manufacturers.
Gates’s success was based on his initial judgement that PCs would be immensely successful, and on his undoubted abilities and hard work. But it was also based on an entirely unlikely set of circumstances – that IBM would license an operating system from Microsoft for the PC, would make a success of the PC, and then hand that market over to their competitors. After all, if the PC hadn’t sold, neither would have MS-DOS. If the PS/2 had sold, MS-DOS would have become redundant. Rather than being based on Gates’s abilities – or the high quality of MS-DOS – Microsoft’s success in the 1980s was due to a series of lucky breaks. All this is very far from the theory of a self regulating market delivering the best goods at the lowest price.
The second piece of software which has been central to Microsoft’s profitability is Windows. Windows made computers easier to use. Earlier computer screens had been black with white characters – with Windows the screen was white with black writing, like paper. Rather than typing in commands, you moved a pointer about the screen with a device called a ‘mouse’. You used the mouse to choose what you wanted to do from lists called ‘menus’, which appeared when you needed them and vanished when you had finished with them, or you chose from little pictures called ‘icons’. Windows has been one of the most successful pieces of software ever sold. You might imagine that, when it came to market in 1990, Microsoft had just invented it. Yet the mouse was invented in the 1960s, and everything else that was distinctive about Windows had existed since 1973. [50]
In the early 1970s there had been much discussion about computers creating a ‘paperless office’ – documents would be created on computers, be edited and stored on computers, and sent to a person’s computer for them to read. At no point would the document exist on paper. This idea worried Xerox, whose business was photocopiers, and which was aware that its patents on the photocopying process would some day expire. It set up a research establishment called PARC to explore computers. By 1973, PARC had produced the Alto. The Alto had a black on white screen, had icons and menus, and had a mouse. Altos could be linked together using a technology called Ethernet, still used today, and printed documents using the world’s first laser printers – again, a technology still used today. In fact the Alto included all the technologies to be marketed so successfully by Microsoft 17 years later. However, Xerox took so little interest in the Alto that it didn’t even patent the technologies involved. It worked out that such a machine wouldn’t be profitable to produce, and then forgot it. [51]
In December 1979 Xerox bought $1 million worth of shares in a new computer company called Apple. As part of the deal a group from Apple were given a tour of PARC. Apple boss Steve Jobs was amazed that Xerox wasn’t exploiting the technology it had developed: ‘“Why aren’t you doing anything with this?” he bellowed. “This is the greatest thing! This is revolutionary!”’ [52]
Apple decided to produce a machine which worked in a similar way to the Alto, called the Macintosh. After years of development (the project was cancelled on several occasions) the Macintosh came on the market in 1984. Early sales were disappointing, but in 1985 new software and an Apple version of the ‘laser printer’ pioneered by Xerox PARC became available. These made the Macintosh capable of doing ‘desktop publishing’ – people with no specialist training could use their computers to do typesetting and graphic design. The Macintosh established a niche in the market which it has hung on to ever since. [53]
Apple had approached Microsoft in 1981 to produce software for the Macintosh. After they began working with Apple, Microsoft produced Windows, their own version of the Alto’s system. Bill Gates wanted, in the words of one Microsoft manager, a ‘Mac on a PC’ – in fact, Windows resembled Macintosh software to such an extent that Apple sued Microsoft for breach of copyright in 1988, finally losing the case in 1992. Perhaps the most accurate summary of the relationship between Windows, the Macintosh and the Alto is a comment Bill Gates made to Steve Jobs in 1983: ‘It’s … like we both have this rich neighbour named Xerox, and you broke in to steal the TV set, and you found out that I’d been there first and you said, “Hey, that’s not fair! I wanted to steal the TV set!”’ [54] Far from bringing new technology promptly to market, Microsoft made enormous profits from technology which was 17 years old, much of which had been developed by other companies.
The fall of IBM and the rise of companies like Microsoft and Apple is not a new phenomenon, but typical of the workings of capitalism. Marx and Engels wrote in the Communist Manifesto, ‘Constant revolutionising of production, uninterrupted disturbance of all social conditions, everlasting uncertainty and agitation distinguish the bourgeois epoch from all earlier ones.’ [55] Companies and whole industries grow and decline as part of the search for profit. However, personal computers began to develop in the wake of the radicalisation of the 1960s. The first such machines were developed by people who built them as a hobby, who met together and shared ideas freely. For people from this milieu it was easy to confuse fighting IBM politically by challenging capitalism as a system, with fighting IBM in the marketplace. The confusion was all the easier to make because the styles of the old and new companies were very different. IBM executives all wore the same blue suits. Microsoft employees slept under their desks when a project needed finishing. One coped with the exhaustion by learning to sleep standing up. [56] Designers of part of the Macintosh at one point ‘were staying up 58 hours straight, blasting Dead Kennedys records, gobbling Vitamin C like popcorn.’ One played video games to relax, while another ‘would just sit there and scream, top of his lungs’.
Apple founder Steve Jobs encouraged his staff to believe that this kind of thing showed that they were rebelling against the system:
One of Jobs’s slogans proclaimed, ‘IT’S BETTER TO BE A PIRATE THAN JOIN THE NAVY.’ Forget that they were employees of a billion dollar corporation – the Mac team was a raucous band of buccaneers, answering to no one but their Captain!
As the 1960s faded, their legacy was seen not in terms of collective struggle, but of individual fulfilment and self expression. This fitted with the idea that the new companies were successful because they recruited such clever, if eccentric, people. Certainly it is true that employment at Microsoft or Apple has been very different from most work under capitalism – the exceptional profitability of the companies meant that class struggle hardly existed. Many Microsoft employees, paid partly in company shares, became millionaires as the company grew. Extraordinary levels of identification with the employer were the norm – one ex-Apple manager remembered, ‘For a month after I left, I cried myself to sleep.’ This state of things was reinforced by the fact that many workers were hired straight from college – with no experience of work or domestic commitments, they were happy to work insanely hard for the chance to make big money – and if it got too much, they could sell their shares at 30 and leave.
For some workers, the new computer firms had been part of an idealistic, reformist project – to give people more power against large companies by making available small, cheap, powerful computers. In the words of an Apple staff member:
Very few of us were even thirty years old ... We all felt as though we had missed the civil rights movement. We had missed Vietnam. What we had was the Macintosh. [57]
Amid the glamour of high profits the ideal of changing the world became that of making a cosy niche in the world, with computers as tools for self expression – the ultimate aim of life of the new middle class of managers and professionals which many ex-students had by now joined.
Far from marking the end of industrial capitalism and the state, the internet has always depended on them. In 1957 the Soviet Union launched Sputnik, the first artificial satellite. There was considerable alarm among the US ruling class, which needed to constantly better Soviet technology so as to keep ahead in the arms race. The US government set up ARPA, an agency with the job of ensuring the US military had the most advanced technology possible. One problem which ARPA considered was that of maintaining communications systems in the event of nuclear war. They worried that the destruction of only a small number of cities could bring all communications to a standstill because the systems weren’t sufficiently flexible. If the phone lines from, for example, New York to Washington were blown up there was no way those cities could communicate. If New York were destroyed, all the communications systems centred in New York would be useless. In 1969 ARPA created a computer network, ARPANET, which addressed these problems. Messages were sent around the network by computers, which constantly informed each other about the state of the network. If one part of the network vanished, the computers told each other this, and messages were sent by routes which remained undamaged. [58]
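The routing idea described above can be illustrated with a toy sketch. This is only an illustration of the principle, not ARPANET’s actual protocol: the city network, the function name and the use of breadth-first search are all invented for the example. The point is simply that when a node is destroyed, the survivors can still find any route that remains.

```python
from collections import deque

# A toy network: each city lists the cities it has direct lines to.
network = {
    'New York':   {'Washington', 'Chicago'},
    'Washington': {'New York', 'Atlanta'},
    'Chicago':    {'New York', 'Denver'},
    'Atlanta':    {'Washington', 'Denver'},
    'Denver':     {'Chicago', 'Atlanta'},
}

def find_route(start, goal, destroyed=frozenset()):
    """Breadth-first search for a path that avoids destroyed nodes."""
    if start in destroyed or goal in destroyed:
        return None
    queue = deque([[start]])
    seen = {start}
    while queue:
        path = queue.popleft()
        if path[-1] == goal:
            return path
        for nxt in network[path[-1]]:
            if nxt not in seen and nxt not in destroyed:
                seen.add(nxt)
                queue.append(path + [nxt])
    return None   # the goal has been cut off entirely

# While the network is intact, Washington reaches Chicago via New York.
print(find_route('Washington', 'Chicago'))
# If New York is destroyed, traffic is rerouted via Atlanta and Denver.
print(find_route('Washington', 'Chicago', destroyed={'New York'}))
```

A centralised phone exchange has no equivalent of the second call: once the hub is gone, so is every route through it.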
ARPANET, used by military and academic institutions, grew to connect 562 computer systems in 1983, in which year the military left to set up their own network. The following year, the US National Science Foundation took over running the system, now known as the internet, and they did this until the system was privatised in 1994 – by this time over 2 million computer systems were connected. [59] As with all other areas of computing, then, the internet received massive initial funding from the state, particularly the military. The market was in no way involved – indeed, for most of the internet’s history commercial activity has been banned from the system.
The history of computers does nothing to support the idea that the free market delivers choice or efficiency. In fact, far from having developed beyond a Marxist analysis, computers show many of the features by which Marxists have characterised capitalism. Since Marx’s own time, capitalist economies have been prone to booms and slumps. The profitability of a particular area of the economy encourages capitalists to invest in it in ever greater numbers. After a certain point more of the commodities involved are being produced than can be sold. The price of the commodity falls, driving down profits and putting the weaker capitalists out of business. This leads to a shortage of the commodity, so that prices begin to rise and the whole cycle can start again. The making of semiconductors, the chips which control computers as well as CD players and many other electrical appliances, has followed just this pattern. In 1995 the world market for semiconductors was growing by over 30 percent a year. Net profits were as high as 50 percent. As a result around 50 semiconductor factories were planned around the world, each costing around $1 billion. However, in August 1995 the Financial Times reported that institutional investors were wary when Taiwanese electronics company Tatung planned to build such a plant: ‘The risk is that the planned semiconductor venture is mistimed and comes on-line when the industry, notorious for its punishing boom-and-bust profit cycles, has entered a downtrend.’
Indeed by 1996 more chips were produced than the market would bear, and prices fell – one chip which was selling at $46 at the start of the year cost only $11 by September. Investment decreased and new plants were put on hold as profits declined. [60]
Defenders of capitalism argue that the market leads to innovation and choice. But in fact the economy is increasingly dominated by a small number of firms. For example, the aerospace industry is dominated by three companies – Boeing, McDonnell Douglas and Lockheed. And, as this article was being written, the first two of these announced a merger. A few supermarket chains sell most of the groceries bought in Britain. Computers are no different. As we have seen already, the industry of the 1960s was completely dominated by IBM. In 1995 six companies made 46 percent of the personal computers sold, and their domination of the market was expected to increase in 1996, perhaps reaching 75 percent by the end of the decade. Most of these computers use chips made by Intel, which has an 80 percent share of the world market. [61] Much of the software for these machines is made by Microsoft, which completely dominates the market for operating systems with MS-DOS and Windows. Microsoft also supplies a majority of personal computer business software – databases, spreadsheets and word processing. Their nearest rival is Lotus, which recently became a subsidiary of IBM. Even markets which have only developed recently are dominated by one company. To use the world wide web system on the internet, you need software called a ‘browser’. Some 74 percent of use of the web takes place through the most popular browser, Netscape – the nearest competitor accounts for only 8 percent of use. [62]
Relations between the large companies which dominate the computer industry are far from the straightforward competition which market enthusiasts might expect. We have already seen the shifting pattern of co-operation and competition which existed in the 1980s between IBM, Microsoft and Apple. In 1995 Toshiba and IBM were planning to build a joint semiconductor plant in the US, while Toshiba, IBM and Siemens were collaborating on chip development in Germany, where Philips and IBM were also developing a joint plant. Japanese chip manufacturers were considering working together on some projects to stay ahead of foreign competition. [63] This network of partial co-operation is matched by a complex pattern of ownership between leading computer companies. Motorola, France Telecom and NEC, for example, each own 17 percent of Groupe Bull. NEC and Groupe Bull each own 19.99 percent of Packard Bell. [64] In 1995, ICL – 84 percent owned by Fujitsu – bought a controlling stake in Germany’s fifth largest computer manufacturer, while Amstrad bought Jarfalla, a Swedish company once owned by IBM. [65] All of these temporary alliances, and this jockeying for position, are entirely unlike the ‘rigour of competition’ which is supposed to characterise the market.
Billions have been spent on information technology on the assumption that it increases productivity. But there is remarkably little evidence that it actually gives value for money. A 1995 survey of financial institutions, for example, found that only 28 percent felt that information technology delivered the financial return required of it, 34 percent didn’t know, while 38 percent felt that it didn’t deliver financially. US economists Stephen Oliner and Daniel Sichel have claimed that between 1970 and 1992 computers added only 0.3 percent to the growth in economic output. [66] It seems remarkable that capitalists invest huge sums in technology which might actually be losing them money, but the market makes it inevitable. In March 1995, for example, Chase Manhattan Bank spent some $100 million on a new computerised trading floor. It was unlikely that the new technology would generate enough profit to pay for itself, the Financial Times reported, but the investment was necessary so that Chase Manhattan could compete with other banks, who were spending even more on information technology. [67] A similar logic lies behind much of the hype about the internet. An advert for Apple’s world wide web software puts it in these terms:
Let’s face it, the expansion of the internet is a phenomenon to be ignored at your peril. Thousands of companies are already on the world wide web, from fledgling start-ups to large international corporations – and the chances are, your competitors are already there.
The question is, if you’re not on the web today, where will your business be tomorrow? [68]
We have already seen that only a small number of people have access to the internet. For some companies, the costs of being ‘on the web’ will exceed the profits generated. None the less, those companies have to join the stampede for fear that their competitors will steal business from them. Market competition actually lowers levels of profitability.
In fact the logic of the market generates huge problems for the computer industry. One such problem is that posed by the year 2000. Much of the software governments and businesses currently use assumes that all years start with ‘19’ and only the last two numbers ever change. No one knows how such software will behave after the end of 1999. Changing all the systems involved is potentially a huge and expensive task – 16 percent of companies say it will cost them between £2 and £5 million, and 15 percent say it will cost over £5 million. However, many companies are ignoring the whole issue. Almost half have no strategy for dealing with the problem. Asked when their systems will be able to cope with the year 2000, 21 percent of companies say only in 1999 or 2000, while 5 percent say after the year 2000. Given the fact that information technology projects seldom meet deadlines, it is possible that many computer systems will grind to a halt after the end of 1999. According to the business publication Computer Weekly:
The current state of debate about the year 2000 is chillingly reminiscent of the last hours of the Titanic. The industry is split between those who are crying doom and disaster and those who are pooh poohing all the fuss as being wildly out of control.
The market makes it impossible to assess how severe the problem actually is. On the one hand, analysts and consultants are keen to stress the potential problems so as to get companies to pay them for sorting those problems out. On the other hand, managers dread taking on a project which will be a disaster if it doesn’t deliver on time and which will cost a lot of money with no prospect of a return. ‘The dilemma facing information technology managers is to steer a sensible path between the get-rich-quick doomwatch merchants who would obviously like to hustle everybody onto their books, and the head-in-the-sand ostriches oblivious of the tidal wave approaching.’ [69]
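The mechanics of the two-digit date problem are simple enough to show in a short sketch. The function here is invented for illustration, but it behaves like the software described above: it stores only the last two digits of the year and assumes the century is always ‘19’.

```python
# A minimal sketch of the year 2000 problem: software that stores
# years as two digits cannot tell the year 2000 from the year 1900.

def years_elapsed(start_yy, end_yy):
    """Elapsed years, in a program that assumes every year is 19xx."""
    return (1900 + end_yy) - (1900 + start_yy)

# Within the 20th century the assumption holds:
print(years_elapsed(65, 97))   # 1965 to 1997: 32 years

# But a record dated '00' is read as 1900, not 2000, so a customer
# born in 1965 appears to have been born 65 years in the future:
print(years_elapsed(65, 0))    # gives -65, not the correct 35
```

Every calculation built on such dates – interest, pension entitlements, expiry checks – inherits the same error, which is why fixing it meant hunting through entire systems rather than changing one routine.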
The market also ensures that vast sums of money are wasted on failed computer projects. Gloucestershire Social Services spent between £300,000 and £1 million on a computer system which came into operation in 1993. It was so slow and complicated to use that in January 1995 they were planning to spend up to £200,000 on a new one. Computer Weekly explains that computer companies have to promise the moon so as to win contracts:
The problem for suppliers in general is that if they pointed out all the risks associated with major projects...they would lose the business ... Consequently suppliers promise everything during the bidding process in the sincere hope that they will be able to deliver ... [70]
In addition to the claims of the free market enthusiasts that the state has no useful economic role to play, the 1990s have seen the growth of theories of ‘globalisation’. According to these, international trade has grown to a level where multinational companies can move production from country to country as they see fit. As a result, if governments impose high taxes on profits, or workers receive high wages, the companies involved will simply move to countries which don’t reduce their profits in this way. The state is powerless before the market. [71]
Computers would seem to be one of the industries most likely to be affected by globalisation. Thousands of chips are so small that they can be loaded on a plane and flown round the world at little cost – surely companies can manufacture them wherever they like? With wages for software engineers in India less than a quarter of those in Germany, why shouldn’t software companies do all their development in poorer countries and save money? Yet this isn’t what happens. As far as the location of a semiconductor plant is concerned, the Financial Times explains:
Basic requirements include an adequate labour force, reliable utilities, clean air and copious water supplies. Chipmakers also look for sites that are well-served by the suppliers of the chemicals and equipment used in semiconductor production … semiconductor manufacturers are not lured by low cost labour. Typically, labour accounts for less than 10 percent of the cost of running a semiconductor factory – with depreciation of the plant being a much bigger factor.
According to one manager from leading chipmaker Intel:
The politics and other financial inducements are much more important than the salary structure. If you get tax relief … or training grants or capital equipment grants, those are much more important than salaries. [72]
For example, in August 1995 Siemens announced that it would invest over $1 billion in a chip plant on Tyneside. Siemens chose to locate the plant in the UK against competition from Ireland and Austria, it was reported, after ‘intensive lobbying by Mr Michael Heseltine … and the personal intervention of Mr John Major’. All three governments had offered Siemens incentives packages – the UK government package was worth ‘close to £50 million’. According to Siemens, its reasons for choosing the UK included ‘the availability of a flexible labour force, good infrastructure and a proven track record in the industry and a sizeable domestic market for semiconductors ...’ [73]
Factors such as reliable infrastructure, experience in semiconductor production and a large domestic market are available in relatively few countries. For this reason, the United States has 33 percent of the world semiconductor market, followed by Japan with 29 percent, and then by South Korea, Germany and the UK. [74] Manufacturers do plan to expand into other countries – in the autumn of 1995, for example, there were seven chip manufacturing plants being planned in China. [75] But this is not because of lower wages, but because of the growth of the domestic market in a country where 700,000 personal computers were sold in 1994 and the market expands by some 30 percent a year. Compaq, the largest personal computer manufacturer in the world, follows a similar rationale – it has plants in Texas, Scotland and Singapore, and is planning new ones in China and Brazil ‘in preparation for the expected rapid growth of PCs in developing markets’. [76]
Manufacturers thus still depend on states to provide infrastructure such as transport and electricity supplies, and financial incentives to pay for new plants. Governments like the British Tories, acting in complete opposition to their free market rhetoric, use the state to sponsor industry. The state thus continues to be as central to capitalism as it always was and, as far as software goes, its role in policing copyright and patents is essential. Copying computer software is usually easy to do, and the copy is of exactly the same quality as the original. Illegal copying is estimated to cost the computer industry some £400 million a year. The UK software industry, for example, is demanding that British and European parliaments act to strengthen copyright law. The US government has threatened sanctions against China if action is not taken to reduce software piracy – it is estimated that some 94 percent of software in China is pirated, and the US company Microsoft is badly affected. [77]
The discrepancy between the rhetoric and the reality of capitalism concerning the state means that defenders of the system often end up contradicting themselves. Bill Gates, for example, reports with enthusiasm that in Singapore the state is forcing builders to construct a computer infrastructure:
Every developer will soon be required to provide every new house or apartment with a broadband cable in the same way he is required by law to provide lines for water, gas, electricity and telephone. When I visited with Lee Kuan Yew, the 72 year old senior minister who was the political head of Singapore from 1959 to 1990, I was extremely impressed with his understanding of the opportunity ...
But only three pages later, Gates is concerned that state intervention will be a bureaucratic drag on market efficiency:
In many countries nowadays, top political leaders are making plans to encourage highway investment [i.e. information superhighway, or computer communications, investment – CW] ... A government bootstrap could, in principle, cause an information highway to be built sooner than might happen otherwise, but the very real possibility of an unattractive outcome has to be considered carefully. Such a country might end up with a boondoggle, white-elephant information highway built by engineers out of touch with the rapid pace of technological development. [78]
Scott McNealy of Sun Microsystems gave the Financial Times an even more confused account of how capitalism works:
Microsoft is getting into all sorts of businesses … its dominant position is unhealthy for the market, says Mr McNealy ... It is stunning how few people understand market economics ... There is a huge amount of ignorance ... We have great empirical evidence that choice works, and that is why we have anti-trust and consent decrees to control the market. [79]
The free market works fine by itself, apparently, as long as the state regularly intervenes in it.
Computers have made an enormous difference to the jobs of many workers in the developed countries. Information technology spending per employee is highest in white collar jobs such as banking and finance (£6,243 per year) and insurance (£5,505), followed by central government (£4,324) and local government (£3,310) – though in engineering the figure is still high (£1,598). [80] Many would claim that the development of computers has reduced the numbers of jobs available, and reduced the power of those still in work to defend their jobs and conditions.
The workers who use computers today have not traditionally been seen as part of the working class. Fifty years ago bank workers, civil servants and local government white collar staff were thought to be in a relatively privileged position. Such white collar employment meant a ‘job for life’ on good pay. If such workers did join unions, they were often more like staff associations which had a cosy relationship with management and never went on strike. All this has changed. White collar work has been routinised, job security has vanished, pay has fallen, and unions such as UNISON, CPSA and BIFU have all been involved in strike action. However different their conditions of work, or their traditions, from such traditional working class figures as miners, dockers or shipbuilders, such workers have found they can only defend themselves by uniting through their unions to oppose management attacks. The fact that white collar workers form an increasingly large part of the working class, therefore, does not mean workers have less power.
In some industries, such as printing national newspapers, the introduction of computers has coincided with job losses. But job losses in the print were more the result of union leaders’ failure to stand up to employers’ attacks than a direct consequence of computerisation in itself. Other areas, such as local government, have introduced computers without large staff cuts. And if advances in technology have destroyed some jobs, they have created others – in 1995, for example, chip manufacturer Motorola became the largest industrial employer in Scotland. [81]
Some people worry that computers will make possible a society like that depicted in George Orwell’s Nineteen Eighty-Four, where the state is able to spy on every aspect of people’s lives and so make it impossible to fight back. For example, they argue that virtually all purchases will soon be made using computerised barcode systems and credit or debit cards, so that computers will constantly gather information about where people are, the kind of things they buy, and so on. Of course, people are right to be concerned that information held about them on computers should be accurate, private and so forth. But the nightmare of a computerised police state isn’t likely to happen. We have already seen that, far from being efficient and all knowing, computers are as badly made as any other commodity produced under capitalism. As for the state, the Department of Social Security spent £2.6 billion between 1984 and 1996 on a computer system designed to administer welfare benefits – in theory cutting costs and making it possible to cut over 20,000 staff. In fact, by 1996 staff numbers had risen by 2,000, and the computer system needed a further £750 million spent on it before it would work properly. [82] Meanwhile the Department of Health has been setting up computer systems to keep records on every person in the UK. But the DSS and DoH never discussed co-ordinating their computer projects, though such co-ordination would have saved hundreds of millions of pounds. [83] If the government can’t even use computers effectively to pay benefits or keep track of medical records, there is little chance that they could use them to constantly spy on everybody. Anyone who successfully avoided paying the poll tax, or who has received the wrong amount of benefit or none at all ‘because of a problem with the computer’, has a much better idea of the level of information technology available to the national and local state.
Nor is it true that workers in high technology companies have little power. We have seen that such companies can’t simply relocate to low wage countries if workers in the developed world fight for better pay or conditions. In the first place, such countries lack the necessary infrastructure. In the second, a company with a substantial investment in machinery – at the most, as we have seen, a chip plant costing $1 billion – can’t simply pick it up and move it elsewhere. Indeed, the size of such investments, and their rapid depreciation, means workers in such plants have immense power. The speed of technological development means that chip factories rapidly become obsolete. In the words of Intel management, ‘If you make $2 billion dollars capital investment, the bulk of capital investment is written off in four years, and on $2 billion that is $500 million a year depreciation.’ [84] Such a plant therefore has to make over $9 million a week simply to pay for its construction costs. If workers strike for even a short time they cause the company massive losses.
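The arithmetic behind that weekly figure is worth setting out. Taking the Intel manager’s numbers as given – a $2 billion plant with the bulk of the investment written off over four years – the sums run as follows:

```python
# Depreciation arithmetic for the chip plant described above,
# using the figures quoted from Intel management.
plant_cost = 2_000_000_000      # dollars of capital investment
write_off_years = 4             # 'the bulk ... is written off in four years'

annual_depreciation = plant_cost / write_off_years   # $500 million a year
weekly_depreciation = annual_depreciation / 52       # roughly $9.6 million a week

print(annual_depreciation, weekly_depreciation)
```

So before it earns a penny of profit, the plant must cover well over $9 million a week in depreciation alone – which is what gives a short strike its leverage.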
Other computer workers have similar potential power. In February 1995 Norwich Union sacked 93 information technology staff. The staff were called to a meeting where they were given their redundancy notices. Their identity cards were taken from them; they were accompanied to their desks by security staff while they packed, and then escorted from the building. They were given no time to say goodbye to other workers, or to make any phone calls. Computer Weekly explained:
This brutal procedure is becoming the standard way for companies to dispose of anyone with a sensitive job. The idea is to protect the business from sabotage by supervising employees’ speedy exit, preventing them from tampering with computers in a fit of revenge. [85]
These extreme precautions meant Norwich Union workers didn’t get a chance to use their power, but in March 1996 a group of civil servants did. Computer Weekly reported that:
Strike action by just eleven Courts Service computer staff will lead to £4 million in uncollected debts and 100,000 unissued summonses a month until the dispute is resolved.
Staff at the Courts Agency Computer Centre in Northampton began a two week strike on Monday. They are protesting over being forced to transfer out of the civil service to a private contractor ... [86]
Computer workers have the ability to cause widespread disruption in official and financial systems – and increasing unionisation and management attacks make it likely that they will use that power.
The claim that we live in a post-industrial information society, in which the free market delivers and Marxism is outdated, is false. Rather, the development of computers strengthens the Marxist case. For Marx, the development of the means of production was the bedrock on which social change rested:
At a certain stage of development, the material productive forces of society come into conflict with the existing relations of production or – this merely expresses the same thing in legal terms – with the property relations within the framework of which they have operated hitherto. From forms of development of the productive forces those relations turn into their fetters. Then begins an era of social revolution. The changes in the economic foundation lead sooner or later to the transformation of the whole immense superstructure. [87]
The transition from feudalism to capitalism, and the revolutionary upheavals it involved, occurred when social structures such as the medieval church and absolute monarchy came to prevent the development of the forces of production. They were replaced by social forms such as bourgeois democracy, wage labour and formal equality before the law, which made possible the development of productive forces to a level undreamt of under feudalism. But those social forms, progressive in their day, in turn become outmoded as the forces of production advance. The overthrow of capitalism now becomes necessary for humanity’s productive potential to be realised.
For example, there is today no technical reason why in each village and town on earth there should not exist a link to a worldwide computer network. From the network anyone would be able to obtain, in a few minutes, a copy of any book ever published, any piece of music ever recorded, any film or TV programme ever made. The educational and cultural opportunities which such a system would bring to billions of people are beyond imagining. Yet such a system could only be built in a socialist society. You can see why when you consider Bill Gates’s account of how people could use a computer network (’the highway’) to do this kind of thing under capitalism:
Record companies, or even individual recording artists, might choose to sell music in a new way. You, the consumer, won’t need compact discs, tapes, or any other kinds of physical apparatus. The music will be stored as bits of information on a server [a large computer – CW] on the highway. ‘Buying’ a song or album will really mean buying the right to access the appropriate bits … in any non-commercial setting, anywhere you go, you’ll have the right to play the song without additional payment to the copyright holder. In the same way, the information highway could keep track of whether you had bought the right to read a particular book or see a movie.
Computers that could be used to make the world’s culture universally available are to be used to exclude those who have not paid the fee. The system can only exist if it includes an immense sub-system for checking who has bought what – the computerised equivalent of keeping tabs on the record collection of everyone in the world. Concepts like private property and copyright, which once helped the development of production, have now become a hindrance to it. If you compare Gates’s vision of the future with the political and cultural achievements of capitalism in earlier periods, what is most remarkable is its staggering banality. For example:
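The sub-system Gates describes amounts to a register of payments consulted before every act of listening or reading. A hypothetical sketch of that logic – all names and items here are invented for illustration, not drawn from any actual system:

```python
# Hypothetical sketch of the "who has bought what" sub-system implied
# by Gates's scheme: every play of a song requires a lookup against a
# register of purchased access rights.
purchased_rights = {
    ("alice", "song:example_recording"): True,  # Alice has paid for this song
}

def may_play(user: str, item: str) -> bool:
    """Return True only if the user has bought access to the item."""
    return purchased_rights.get((user, item), False)

print(may_play("alice", "song:example_recording"))  # → True
print(may_play("bob", "song:example_recording"))    # → False: no payment, no access
```

The point of the sketch is the overhead: under this model, universal access to the world’s culture requires a universal register of payments standing between every listener and every recording.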
If you are watching the movie Top Gun and think Tom Cruise’s aviator sunglasses look really cool, you’ll be able to pause the movie and learn about the glasses or even buy them on the spot ... If the movie’s star carries a handsome leather briefcase or handbag, the highway will let you browse the manufacturer’s entire line of leather goods and either order one or be directed to a convenient retailer.
The queries Gates thinks we might want to put to such a computer system include, ‘List all the stores that carry two or more kinds of dog food and will deliver a case within 60 minutes to my home address,’ and ‘Which major city has the greatest percentage of the people who watch rock videos and regularly read about international trade?’ [88]
Such poverty of ideas is not so much the result of Gates’s personality as of the logic of capitalism. By the autumn of 1996, for example, four British supermarket chains had issued a total of 19 million loyalty cards. For each card issued, computers collect information about every tin of beans and box of teabags that customers purchase. British Gas, which is currently claiming that if it is forced to reduce prices it will have to lay off staff, has spent £150 million on a new billing system. The government has spent £70 million on a new jobcentre computer system, as part of cutting benefits to the unemployed. [89] Again, routine administrative tasks are ideally suited to computerisation – but it is just as computers come to be widespread that we also see a huge expansion in routine clerical work. At Computer Associates, the third largest software company in the world, management have turned off the electronic mail – the computer communications system – for much of the day because staff were responding too eagerly to it. The marketing director commented that ‘the e-mail would come in and they would drop everything to deal with it’, and explained to clients that ‘if it mattered that much, you’d have phoned’. [90] Time and again, technology which could improve people’s lives is used in a way which is either pointless or actually makes those lives worse.
Now, as in Marx’s time, capitalism is capable of enormous technical advances – the benefits of which it denies to all but a few. But information technology also makes clear how capitalism has outlived its ability to take human society forward. Instead, as computers and the internet develop, they give a new resonance to Marx and Engels’ claim that:
[after] the overthrow of the existing state of society by the communist revolution ... the liberation of each single individual will be accomplished in the measure in which history becomes transformed into world history … it is clear that the real intellectual wealth of the individual depends entirely on the wealth of his real connections. Only then will the separate individuals be liberated from the various national and local barriers, be brought into practical connection with the material and intellectual production of the whole world and be put in a position to acquire the capacity to enjoy this all-sided production of the whole earth ... [91]
References in italics starting ftp, gopher or http identify documents on the internet. In agreement with International Socialism, I have used internet-only references in the notes below, since so much information about the net is available only on the net itself. Internet references, however, are not generally acceptable in International Socialism articles. Unlike books, there is as yet no guarantee that the original reference will still be available in five or ten years’ time. And, without wishing to overstate the editorial or scholarly control exercised over the publication of some books, internet articles often do not go through even the minimum of preparation common in many publishing houses.
[Note by ETOL: The links below are those given in the original article – they have not been checked by ETOL and there is no guarantee that they will still work. A number of links to documents within ETOL and MIA have been added – for these there is no visible URL.]
1. The Guardian, 2 May 1996.
2. B. Gates, The Road Ahead (London 1995), pp. 74–6.
3. Microsoft/Intelliquest National Computing Survey, pp. 58–59. Available from http://www.microsoft.com
4. Back cover of D. Spender, Nattering on the Net: Women, Power and Cyberspace (North Melbourne 1995).
5. Computer Weekly, 18 April 1996.
6. Introduction, Communicating Britain’s Future, http://www.poptel.org.uk/labour-party/policy/info-highway/index.html
7. B. Gates, op. cit., p. 183.
8. Computer Weekly, 12 January 1995.
9. Marxism Today, October 1988.
10. For the economy, see C. Harman, The Myth of Market Socialism, International Socialism 42, The State and Capitalism Today, International Socialism 51 and Where is Capitalism Going? International Socialism 58 and 60. A. Callinicos, Against Postmodernism: a Marxist Critique (Cambridge 1989) addresses many of the philosophical questions raised by computer and particularly internet enthusiasts.
11. J. Palfreman and D. Swade, The Dream Machine: Exploring the Computer Age (BBC Books 1991), p. 8.
12. Financial Times, 2 May 1995.
13. Financial Times, 3 August 1995.
14. http://www.nw.com/zone/host-count-history.
15. Ensuring Social Use, Communicating Britain’s Future, op. cit.
16. Phones data from CIA World Factbook 1994, gopher://UMSLVMA.UMSL.EDU:70/11/LIBRARY/GOVDOCS/WF93/WFLATEST. [Note by ETOL: This link seems to be dead.] Internet hosts from Host Distribution by Top-Level Domain Name, January 1996, http://www.nw.com/zone/WWW/dist-bynum.html
17. E.J. Hobsbawm, The Age of Empire 1875–1914 (London 1989), p. 346.
18. Microsoft/Intelliquest National Computing Survey, op. cit., p. 120.
19. O’Reilly and Associates, Defining the Internet Opportunity, http://www.ora.com/survey/users/charts/pop-proj.html and http://www.ora.com/survey/users/charts/net-income.html. [No longer available online]
20. The Commercenet/Nielsen Internet Demographics Survey – see http://www.commerce.net/. [No longer available online]
21. Microsoft/Intelliquest National Computing Survey, op. cit., p. 10.
22. The Graphic, Visualization and Usability Center’s Fourth WWW User Survey, Georgia Institute of Technology, http://www.gatech.edu
23. Microsoft/Intelliquest National Computing Survey, op. cit., pp. 11–12.
24. B. Gates, keynote speech, Interactive Media Conference, 6 June 1995 – available from http://www.microsoft.com
25. Ensuring Social Use, Communicating Britain’s Future, op. cit.
26. A. Hyman, Charles Babbage: Pioneer of the Computer (Oxford 1982), pp. 43–44.
27. D. Swade, Charles Babbage and his Calculating Engines (London 1991), p. 2.
28. A. Hyman, op. cit., pp. 52–53.
29. Ibid., pp. 169–170; D. Swade, op. cit., p. 10.
30. A. Hyman, op. cit., p. 198.
31. Ibid., pp. 80–92.
32. K. Marx and F. Engels, Communist Manifesto, see the edition printed in Beijing (1965), p. 39.
34. A. Kidd, Manchester (Keele 1993), pp. 116, 154.
34. E.J. Hobsbawm, The Age of Revolution 1789–1848 (London 1973), p. 216. See also C. Harman, The State and Capitalism Today, International Socialism 51, p. 11.
35. A. Hodges, Alan Turing, The Enigma of Intelligence (London 1985), ch. 8.
36. J. Palfreman and D. Swade, op. cit., chs. 2 and 3.
37. Ibid., p. 81.
38. The Observer, 25 August 1996.
39. C.H. Ferguson and C.R. Morris, Computer Wars: The Fall of IBM and the Future of Global Technology (New York 1994), p. xi.
40. Financial Times, 12 January 1995.
41. C.H. Ferguson and C.R. Morris, op. cit., ch. 1.
42. The Myth of Market Socialism, International Socialism 42, pp. 19–20.
43. P. Carroll, Big Blues: The Unmaking of IBM (London 1994), p. 101.
44. C.H. Ferguson and C.R. Morris, op. cit., p. 28.
45. Ibid., p. 59.
46. Computer Weekly, 28 March 1996.
47. R.X. Cringely, Accidental Empires (London 1993), p. 52.
48. S. Manes and P. Andrews, Gates (New York 1994), chs. 11 and 12.
49. Ibid., p. 331.
50. J. Palfreman and D. Swade, op. cit., p. 89, which includes a photo of an early mouse.
51. S. Manes and P. Andrews, op. cit., pp. 165–166; R.X. Cringely, op. cit., pp. 82–83.
52. S. Levy, Insanely Great: The Life and Times of Macintosh, the Computer that Changed Everything (London 1995), p. 79.
53. Ibid., pp. 115, 186, 221.
54. S. Manes and P. Andrews, op. cit., pp. 181–189, 225, 255, 357–364, 438.
55. K. Marx and F. Engels, op. cit., p. 36.
56. S. Manes and P. Andrews, op. cit., p. 211.
57. S. Levy, op. cit., pp. 143, 164, 175, 203.
58. H. Reingold, The Virtual Community: Finding Connection in a Computerised World (London 1995), pp. 7, 71, 74.
59. B. Sterling, A Short History of the Internet, Internet Society at http://info.isoc.org:80/infosvc/index.html; Number of internet Hosts, http://www.nw.com/zone/host-count-history.
60. Financial Times, 3 August 1995, 15 August 1995, 9 August 1995, 18 July 1995, 5 September 1996, 12 September 1996.
61. Computer Weekly, 28 March 1996, 4 January 1996; Financial Times, 5 April 1995.
62. Computer Weekly, 4 April 1996.
63. Financial Times, 9 August 1995, 27 September 1995, 11 July 1995.
64. Financial Times, 5 August 1995, 6 August 1995.
65. Financial Times, 30 March 1995, 14 June 1995.
66. Computer Weekly, 4 January 1996, 26 January 1995.
67. Financial Times, 31 March 1995.
68. The Guardian, 11 April 1996.
69. Computer Weekly, 4 April 1996, 18 April 1996, 21 March 1996.
70. Computer Weekly, 26 January 1995.
71. See C. Harman, State and Capitalism Today, International Socialism 51, for a general refutation of these ideas.
72. Financial Times, 3 August 1995.
73. Financial Times, 5 August 1995.
74. Financial Times, 11 August 1995, 15 August 1995.
75. Financial Times, 27 September 1995.
76. Financial Times, 27 February 1995, 6 September 1995.
77. Computer Weekly, 4 January 1996, 9 February 1995.
78. B. Gates, The Road Ahead, op. cit., pp. 235, 238.
79. Financial Times, 3 May 1995.
80. Computer Weekly, 16 February 1995.
81. Financial Times, 1 July 1995.
82. Computer Weekly, 28 March 1996.
83. Computer Weekly, 30 March 1995.
84. Financial Times, 3 August 1995.
85. Computer Weekly, 9 February 1995.
86. Computer Weekly, 28 March 1996.
87. K. Marx, Preface to A Contribution to the Critique of Political Economy, Early Writings (Harmondsworth 1975), pp. 425–426.
88. B. Gates, The Road Ahead, op. cit., pp. 80, 165–166, 175–176.
89. The Guardian, 28 September 1996.
90. The Guardian, 19 September 1996.
91. K. Marx and F. Engels, The German Ideology (London 1974), p. 55.
Last updated on 16 May 2021