Is all technology authoritarian?
“In three years, Cyberdyne will become the largest supplier of military computer systems. All stealth bombers are upgraded with Cyberdyne computers, becoming fully unmanned. Afterwards, they fly with a perfect operational record. The Skynet Funding Bill is passed. The system goes online August 4th, 1997. Human decisions are removed from strategic defense. Skynet begins to learn at a geometric rate. It becomes self-aware at 2:14 a.m. Eastern time, August 29th. In a panic, they try to pull the plug.”
The Terminator franchise draws from a deep well of mythology – preying on our fears that humans will be crushed by the technology that we ourselves create. The horrible inevitability of our fate makes it a compelling story.
There is a converse myth: that through ever-advancing technological progress, rational man (and we can go down a whole other rabbit hole about gender, Francis Bacon, the enlightenment and rationality) can overcome all challenges, and create whatever he needs to further our interests.
As Langdon Winner writes in 1980, in Do Artifacts Have Politics?:
A long lineage of boosters have insisted that the “biggest and best” that science and industry made available were the best guarantees of democracy, freedom, and social justice. The factory system, automobile, telephone, radio, television, the space program, and of course nuclear power itself have all at one time or another been described as democratizing, liberating forces.
Both are powerful myths, and any discussion of technology must acknowledge them – particularly when we’re talking about technology without realising it. When we talk about makers and a Maker Movement, we’re often talking about technology-led transformations. Ruskin and Morris were talking about the role of makers after an upheaval in manufacturing technology. The likes of Chris Anderson, Mark Hatch, even skeptics like Evgeny Morozov, are talking about grassroots makers, DIY-ers and budding entrepreneurs working in a period of digital technology transformation: in every field, from fabrication to collaboration, from learning to distribution.
This post is part of a series of hunches (in the Steven Johnson sense) that I’m writing in public to try and work through some questions I have about whether contemporary making cultures can be liberatory forces.
Scepticism about the liberatory potential of technology has a long history, so here, I want to collect some perspectives from that history.
Lewis Mumford, Authoritarian and Democratic Technics, 1964
He sets out the bargain we make: material abundance, in return for compliance to a system of constraint:
The bargain we are being asked to ratify takes the form of a magnificent bribe. Under the democratic-authoritarian social contract, each member of the community may claim every material advantage, every intellectual and emotional stimulus he may desire, in quantities hardly available hitherto even for a restricted minority: food, housing, swift transportation, instantaneous communication, medical care, entertainment, education. But on one condition: that one must not merely ask for nothing that the system does not provide, but likewise agree to take everything offered, duly processed and fabricated, homogenized and equalized, in the precise quantities that the system, rather than the person, requires. Once one opts for the system no further choice remains. In a word, if one surrenders one’s life at source, authoritarian technics will give back as much of it as can be mechanically graded, quantitatively multiplied, collectively manipulated and magnified.
And he suggests a way forward, informed by a general approach of “small is beautiful”:
But if we are not to be driven to even more drastic measures than Samuel Butler suggested in Erewhon, we had better map out a more positive course: namely, the reconstitution of both our science and our technics in such a fashion as to insert the rejected parts of the human personality at every stage in the process. This means gladly sacrificing mere quantity in order to restore qualitative choice, shifting the seat of authority from the mechanical collective to the human personality and the autonomous group, favoring variety and ecological complexity, instead of stressing undue uniformity and standardization, above all, reducing the insensate drive to extend the system itself, instead of containing it within definite human limits and thus releasing man himself for other purposes. We must ask, not what is good for science or technology, still less what is good for General Motors or Union Carbide or IBM or the Pentagon, but what is good for man: not machine-conditioned, system-regulated, mass-man, but man in person, moving freely over every area of life.
He even makes an optimistic reach for automation as a positive force, a hope shared by many critics of technology systems:
The very leisure that the machine now gives in advanced countries can be profitably used, not for further commitment to still other kinds of machine, furnishing automatic recreation, but by doing significant forms of work, unprofitable or technically impossible under mass production: work dependent upon special skill, knowledge, aesthetic sense.
This is a topic I will return to.
The Frankfurt School, 1944
They draw out an aspect of authoritarianism that comes from the relationship between technology and the resources used by that technology. If your view (or design) of technology frames humans and nature as resources, then two unsavoury outcomes follow:
- The ‘human resources’ embedded in the system must have their behaviour and agency circumscribed so they fit into the system with maximum efficiency.
- The natural resources exist only to be extracted and used, which is an unsustainable approach for most resources on this planet (excepting sunlight, perhaps).
We also get a criticism of Enlightenment ideals, and the ideas both of domination generally – never a word to prompt feelings of warmth – and domination of nature and humans specifically; tying together these concerns about social justice and our relationship to the environment.
Ursula Franklin, The Real World of Technology, 1989
The constraint of human will and behaviour to conform to technical systems is a central theme of Ursula Franklin’s book, The Real World of Technology.
Franklin contrasts holistic technology with prescriptive technology. I’ll quote here from Mandy Brown’s excellent post on Franklin’s book:
Holistic technologies are normally associated with the notion of craft. Artisans, be they potters, weavers, metal-smiths, or cooks, control the process of their own work from beginning to end. Their hands and minds make situational decisions as the work proceeds, be it on the thickness of the pot, or the shape of the knife edge, or the doneness of the roast. These are decisions that only they can make while they are working. And they draw on their own experience, each time applying it to a unique situation … Using holistic technologies does not mean that people do not work together, but the way in which they work together leaves the individual worker in control of a particular process of creating or doing something.
Today’s real world of technology is characterized by the dominance of prescriptive technologies. Prescriptive technologies are not restricted to materials production. They are used in administrative and economic activities, and in many aspects of governance, and on them rests the real world of technology in which we live. While we should not forget that these prescriptive technologies are exceedingly effective and efficient, they come with an enormous social mortgage. The mortgage means that we live in a culture of compliance, that we are ever more conditioned to accept orthodoxy as normal, and to accept that there is only one way of doing “it.”
You can see clear parallels between Mumford’s democratic and authoritarian technics, and Franklin’s holistic and prescriptive technology. More specifically, we could use Franklin’s approach to look for indicators of non-authoritarian technology:
- Do users have oversight over the whole process?
- Are they directly connected to the material being worked, so they can adjust their work as they go?
- Can they learn and apply new skills or knowledge not assumed by the designer of the system, to improve the thing being made?
- Can they make themselves a critical point in the system, or are they merely fungible?
Langdon Winner, Do Artifacts Have Politics?, 1980
Some technologies seem to inherently require authoritarian systems of administration. Winner gives the example of nuclear power:
An especially vivid case in which the operational requirements of a technical system might influence the quality of public life is now at issue in debates about the risks of nuclear power. As the supply of uranium for nuclear reactors runs out, a proposed alternative fuel is the plutonium generated as a by-product in reactor cores. Well-known objections to plutonium recycling focus on its unacceptable economic costs, its risks of environmental contamination, and its dangers in regard to the international proliferation of nuclear weapons. Beyond these concerns, however, stands another less widely appreciated set of hazards – those that involve the sacrifice of civil liberties. The widespread use of plutonium as a fuel increases the chance that this toxic substance might be stolen by terrorists, organized crime, or other persons. This raises the prospect, and not a trivial one, that extraordinary measures would have to be taken to safeguard plutonium from theft and to recover it if ever the substance were stolen. Workers in the nuclear industry as well as ordinary citizens outside could well become subject to background security checks, covert surveillance, wiretapping, informers, and even emergency measures under martial law – all justified by the need to safeguard plutonium.
In the same piece, he suggests that other forms of energy technology are less inherently authoritarian:
Thus environmentalist Denis Hayes concludes, “The increased deployment of nuclear power facilities must lead society toward authoritarianism. Indeed, safe reliance upon nuclear power as the principal source of energy may be possible only in a totalitarian state.” Echoing the views of many proponents of appropriate technology and the soft energy path, Hayes contends that “dispersed solar sources are more compatible than centralized technologies with social equity, freedom and cultural pluralism.”
It’s tempting to think of the internet as a similarly dispersed technology, but of course its physical and computational infrastructure is highly centralised, from the network cables joining continents, to the ICANN-controlled domain name system, to de facto internet gateways such as Facebook or Google.
David Noble, Forces of Production, 1984
In this now-famous work about the transformation of the American manufacturing industry, Noble gives a detailed case study of the introduction of CNC systems at General Electric. CNC has inherent qualities that force operators to comply with certain ways of working (it is prescriptive, in Franklin’s terminology). But Noble argues further that the technology is authoritarian because it reflects the ideology of the people who developed, marketed and bought it:
Thus, while General Numeric advertises its CNC system to job shops in the name of shop floor editing and operator control, the company promotes its wares to managers of large firms in the interest of “better security” and greater management control. … “TOTAL COMMAND: Introducing The Management-Run CNC Jung Surface Grinder,” reads one advertisement in the September 1982 issue of Modern Machine Shop. “Management-Run?” the advertisement asks, driving home its central message, “Why Not? Install a terminal in your office and you can control it yourself. . . . The operator simply loads and unloads.”
This is precisely the approach taken by General Electric, where managers now fantasize about the “paperless factory” as well as machines without men. Here, as elsewhere, operators are routinely locked out of the CNC controls; at one site, at least, operators caught with keys to the controls are subject to immediate dismissal. … “Computer-aided manufacturing,” observes GE manager William Waddell, “is, in fact, a communications system, and, when successful, it forces an organization into a disciplined approach to manufacturing.”
An even stronger claim is made by Winner again, when he argues that Robert Moses’ public works projects in mid-century America deployed technology specifically designed to further the narrow interests of the ruling class; using gating architecture to keep poor and black communities out of the new parks he was building around New York:
Anyone who has traveled the highways of America and has become used to the normal height of overpasses may well find something a little odd about some of the bridges over the parkways on Long Island, New York. Many of the overpasses are extraordinarily low, having as little as nine feet of clearance at the curb. Even those who happened to notice this structural peculiarity would not be inclined to attach any special meaning to it. In our accustomed way of looking at things like roads and bridges we see the details of form as innocuous and seldom give them a second thought.
It turns out, however, that the two hundred or so low-hanging overpasses on Long Island were deliberately designed to achieve a particular social effect. Robert Moses, the master builder of roads, parks, bridges, and other public works from the 1920s to the 1970s in New York, had these overpasses built to specifications that would discourage the presence of buses on his parkways. According to evidence provided by Robert A. Caro in his biography of Moses, the reasons reflect Moses’s social-class bias and racial prejudice. Automobile-owning whites of “upper” and “comfortable middle” classes, as he called them, would be free to use the parkways for recreation and commuting. Poor people and blacks, who normally used public transit, were kept off the roads because the twelve-foot tall buses could not get through the overpasses. One consequence was to limit access of racial minorities and low-income groups to Jones Beach, Moses’s widely acclaimed public park. Moses made doubly sure of this result by vetoing a proposed extension of the Long Island Railroad to Jones Beach.
Again, we can look for a modern parallel with digital technology. While I suspect Facebook and other social networks are not bothered by the racial makeup of their users, they do design their technology to exploit the labour of publishers, content creators and platform users, in order to further their interests as aggregators of attention. The technology may not be inherently authoritarian (though it might be), but it certainly embodies monopolistic and exploitative design intent.
I like to quote from older historical sources, because our perspective looking back on these ideas helps give them some clarity. But I’ll also quote one source from the present day, which is a good example of a concern about a current technology trend – that we’re building technical systems that rely on data collection to inform machine learning, target advertising, or provide ‘intelligent’ assistance, and these systems have the potential to be used as tools of mass surveillance.
Maciej Ceglowski, The Moral Economy of Tech, 2016:
In our attempt to feed the world to software, techies have built the greatest surveillance apparatus the world has ever seen. Unlike earlier efforts, this one is fully mechanized and in a large sense autonomous. Its power is latent, lying in the vast amounts of permanently stored personal data about entire populations.
We started out collecting this information by accident, as part of our project to automate everything, but soon realized that it had economic value. We could use it to make the process self-funding. And so mechanized surveillance has become the economic basis of the modern tech industry.
We’re used to talking about the private and public sector in the real economy, but in the surveillance economy this boundary doesn’t exist. Much of the day-to-day work of surveillance is done by telecommunications firms, which have a close relationship with government. The techniques and software of surveillance are freely shared between practitioners on both sides. All of the major players in the surveillance economy cooperate with their own country’s intelligence agencies, and are spied on (very effectively) by all the others.
Much has been written in a similar vein since Snowden, but Ceglowski articulates something particularly relevant in this context. The writers I’ve quoted above describe technology as a deal (perhaps an involuntary one) in which we accept constraints on our labour in exchange for material abundance. But the technology wave we’re riding today offers a new deal, and one with even less room to negotiate. We can have an abundance of free services and cheaper goods, from books to hotels, but in return we give up our data and help build a surveillance state that may or may not operate in our interests.
These writers are all concerned with what you might call ‘big’ technology: mass manufacturing, the Industrial Revolution, the Silicon Valley tech giants. My interest is chiefly in grassroots technology movements. Can DIY technologists and makers design technology that does not have these characteristics? Can we build institutions incentivised to act in ways that promote the design and adoption of democratic or holistic technologies?