The Public Voice in Electronic Commerce
La place du citoyen dans le commerce électronique

OECD  Paris - October 11th, 1999
OCDE  Paris - 11 octobre 1999

Presentation by
Phil Agre

First-World Myopia:
The Invisible Context of Computing

Philip E. Agre
Department of Information Studies
University of California, Los Angeles
Los Angeles, California 90095-1520
USA

pagre@ucla.edu
http://dlis.gseis.ucla.edu/pagre/

This is a draft. Please do not quote from it.
Version of 16 October 1999.
1100 words.

Information is everywhere and nowhere, immaterial and abstract, cleanly separable from the concrete world of cows and cars. That is the common sense we learn in school. But this common sense cannot be quite right, or else we would not need the Internet to move all of this information around. The Internet is strange in this way, an ontological hybrid, with one foot in the world of atoms and the other foot in the world of bits. Metaphysics since Plato and political science since the Enlightenment make it easy to imagine the world of universal information that the Internet promises to bring -- so easy, in fact, that we can underestimate the work that will be required to bring this world about.

Here is the problem. We all know that computers are complex beasts. But for all of their internal complexity, computers are embedded in an outside world that is every bit as complicated. Yet the complexity of this embedding is largely invisible to the people who design computers, and to the people who make a living promoting their use. Call it first-world myopia: taking for granted the sprawling background of infrastructure, institutions, and information that make modern societies possible.

It may seem implausible that infrastructure, institutions, and information could be invisible, since in public discourse these days we hardly seem to discuss anything else. But it happens all the time, and here is why. Modern society exhibits a tremendous division of labor, and a division of labor is only possible if every occupational community can focus on its own speciality, letting everything else fade into the background. In a society with a thousand and one occupations, everyone will be socialized into an occupational discourse and practice that simply presupposes the products of a thousand others. This is efficient, but it is also dangerous. The products of industrial society often do not travel well. Computers, for example, require electricity and an overnight delivery system for spare parts -- both complex infrastructures that are not always available. These are familiar examples; let us consider three examples that are less familiar.

Think first about data. What is data? In fact computer scientists have thought about data in several ways (Agre 1997). In the early days, they spoke of computation as data processing, the idea being that data was an industrial material like iron ore. But this metaphor did not do justice to the representational nature of data -- the fact that data makes claims about something in the world. So techniques like data modeling arose to make clearer what sorts of entities in the world the machine was supposed to represent. But that hardly exhausts the topic. Once a computer is filled with well-defined numbers, consider what it means to add those numbers (or subtract them, or compare them). Those numbers originated somewhere, for example a 3 from Europe and a 4 from Asia. The sum of those numbers, 7, is only meaningful if the 3 and the 4 are commensurable, and that requires that they be measured in the same way. A working computer thus requires more than functioning circuits. It also requires a far-flung institutional arrangement that provides for the standardized capture of data. Bowker (1994) refers to this kind of realization as an infrastructural inversion, and it is a good example of the first-world myopia that takes such things for granted.
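
The commensurability point can be made concrete with a minimal sketch in Python. The sketch is purely illustrative, and the Measurement type and its unit labels are invented for the example: the addition itself is trivial, and the real work lies in the institutional arrangement that guarantees the two values were captured under the same standard.

    # Illustrative sketch: a toy Measurement type that refuses to add
    # values recorded under different measurement standards.
    from dataclasses import dataclass

    @dataclass(frozen=True)
    class Measurement:
        value: float
        unit: str  # the agreed-upon standard under which the value was captured

        def __add__(self, other: "Measurement") -> "Measurement":
            # The arithmetic is easy; the precondition is institutional.
            if self.unit != other.unit:
                raise ValueError(f"incommensurable data: {self.unit!r} vs {other.unit!r}")
            return Measurement(self.value + other.value, self.unit)

    # A 3 captured in Europe and a 4 captured in Asia only sum to a meaningful 7
    # if both sites measured in the same way.
    europe = Measurement(3, "tonnes")
    asia = Measurement(4, "tonnes")
    print(europe + asia)                  # Measurement(value=7, unit='tonnes')

    asia_short = Measurement(4, "short tons")
    # europe + asia_short would raise ValueError: the numbers are not commensurable.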

Think next about the settings in which computers are used. It is extraordinarily common for organizations to invest large sums in complex computers without any investment in training. Schools often invest their scarce resources in computers without any thought to the curriculum. In some cases the responsible authorities are duped by claims that the systems are easy to use. In other cases it is assumed that computers will pay for themselves by displacing staff, and further investments in human capital seem like the opposite of that intention. In each case, what is neglected is what Kling (1992) calls the web of relationships around the computer. Computers are easy to see, but webs are not.

This effect is especially pronounced with the Internet. The Internet's design was motivated in large part by "end-to-end arguments" (Saltzer, Reed, and Clark 1984), according to which it is more efficient to move complex networking functions from the network itself to the computers that use it. This made perfect sense in places like MIT and UCLA, where the researchers could depend on a web of skilled people and advanced technical resources. It makes less sense in the real world, and organizations that adopt the Internet are often unprepared for the cost and managerial overhead of hiring a system administrator to maintain it. In the telephone system, most of those administration skills are internalized by the telephone company (Odlyzko 1998). The Internet's distributed architecture also distributes the social and technical complexity. An analogy can be found in Latour's (1988: 90 [?]) account of the spread of Pasteurization in France. Pasteur's process worked in his laboratory, and in order for the process to spread around the country, the relevant aspects of the laboratory had to be spread around as well. The consequences for Internet services are considerable. For example, most first-worlders are accustomed to the decades of cross-subsidies that brought basic telephone service to rural areas. They take this universality for granted, and they too readily imagine the rural utopias that universal Internet service will bring.
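
The end-to-end idea itself can be pictured with a small sketch, again purely illustrative, with the unreliable_network and send_with_end_to_end_check functions invented for the purpose: the network makes no promise of correct delivery, so the endpoints compute and verify a checksum and retry until it matches. Keeping such functions at the endpoints is precisely what presupposes capable machines and skilled people at every endpoint.

    # Illustrative sketch: an end-to-end integrity check performed by the
    # endpoints, over a network layer that guarantees nothing.
    import hashlib
    import random

    def unreliable_network(payload: bytes) -> bytes:
        """Simulated network: occasionally corrupts the payload, guarantees nothing."""
        if random.random() < 0.3:
            return payload[:-1] + bytes([payload[-1] ^ 0x01])  # flip one bit
        return payload

    def send_with_end_to_end_check(payload: bytes, max_tries: int = 10) -> bytes:
        # In a real protocol the digest would accompany the data; here the two
        # endpoints share one process, which is enough for the sketch.
        digest = hashlib.sha256(payload).digest()            # computed at the sending end
        for _ in range(max_tries):
            received = unreliable_network(payload)
            if hashlib.sha256(received).digest() == digest:  # verified at the receiving end
                return received
        raise RuntimeError("delivery failed despite retries")

    print(send_with_end_to_end_check(b"hello, world"))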

Think, finally, about neoclassical economics, which continues to dominate first-world economics departments despite its programmatic neglect of institutions and information (e.g., Hodgson 1988). Such a theory is only plausible if a tremendous framework of both can be taken for granted. Of course, some neoclassically oriented economists have spent the 1990s relaxing these extreme assumptions, so that institutions and information have started to become visible in mainstream discourse (e.g., Stiglitz 1985). Nonetheless, the neoclassical idealizations of perfect information and perfectly functioning economic institutions are deeply ingrained in a vast economic rhetoric, and this rhetoric is still routinely applied in contexts where the idealizations do not hold. Economic playing fields are thus made to seem much more level than they really are.

These few examples hardly exhaust the depths of first-world myopia. But they do get us started on an important project: understanding how technologies and ideas can be perfectly valid in one context and disastrously wrong-headed in another. Once we acquire this new, clearer-sighted variety of common sense, we will become less susceptible to what Gail Hornstein and Susan Leigh Star (1990) call "universality biases": the uncritical assumption that discoveries in one context will necessarily apply in another. Instead, we will take the transfer of technology and ideas as an opportunity to make visible the taken-for-granted background of technical and economic work. Bad advice will be replaced by dialogue, and we will all be better for it.

References

Philip E. Agre, Beyond the mirror world: Privacy and the representational practices of computing, in Philip E. Agre and Marc Rotenberg, eds, Technology and Privacy: The New Landscape, Cambridge: MIT Press, 1997.

Geoffrey Bowker, Information mythology: The world of/as information, in Lisa Bud-Frierman, ed, Information Acumen: The Understanding and Use of Knowledge in Modern Business, London: Routledge, 1994.

Geoffrey M. Hodgson, Economics and Institutions: A Manifesto for a Modern Institutional Economics, Cambridge, UK: Polity Press, 1988.

Gail A. Hornstein and Susan Leigh Star, Universality biases: How theories about human nature succeed, Philosophy of the Social Sciences 20(4), 1990, pages 421-436.

Rob Kling, Behind the terminal: The critical role of computing infrastructure in effective information systems' development and use, in William Cotterman and James Senn, eds, Challenges and Strategies for Research in Systems Development, London: Wiley, 1992.

Bruno Latour, The Pasteurization of France, translated by Alan Sheridan and John Law, Cambridge: Harvard University Press, 1988.

Jerome H. Saltzer, David P. Reed, and David D. Clark, End-to-end arguments in system design, ACM Transactions on Computer Systems 2(4), 1984, pages 277-288.

Joseph E. Stiglitz, Information and economic analysis: A perspective, Economic Journal 95(supplement), 1985, pages 21-41.