John Leslie King, Research Interests


I study the relationship between technical change and social change, concentrating on information technologies and change in social institutions. My early academic career (the 1970s through late 1980s) focused on the study of information technology use and effects in complex organizations, mainly municipal corporations and other government entities. This started with a focus on policy, such as whether to centralize or decentralize service structures in organizations. In time, I became more interested in the problem of design and development of sophisticated socio-technical information infrastructures that must function effectively in complex organizational and institutional settings, usually over long periods of time.

At the moment I have two major focal areas for this work. One I call epistemic infrastructure, organized around three topics: the academy (including institutionalized education of all kinds, but focusing on higher education); systematic collecting (libraries, archives, museums, galleries, aquaria, zoos, etc.); and crowd-sourced knowledge (Wikipedia, interactive blogs, electronic information markets, etc.). The other embodies the strategy of sampling at the extremes and concentrates on the interesting situation of institutionalized learning in one of the world's most remote places, Antarctica. Epistemic infrastructure and learning in Antarctica are both examples of socio-technical change in highly institutionalized production sectors, a concept I explain below.

We have many existence proofs that it is possible to build complicated and long-lived systems (e.g., air traffic control, airline reservation systems, the Internet). Interestingly, we also have considerable evidence that we cannot build such things on purpose. The first time we built them we were trying to do something that had not been done before (e.g., defend ourselves against long-range bombers, in the case of air traffic control and airline reservation systems, both outcomes of Project SAGE). The useful systems we got were spin-off benefits, not anticipated as part of the original work. Since they were not anticipated, they were built without much attention to requirements: they were entirely new, and no one thought about them in those terms. When we try to build such systems on purpose the situation is quite different: we undertake to build them because of requirements, to get something specific that we want. Unfortunately, our performance in such cases is not very good. It is estimated that half of all deliberate efforts to build systems like these fail to achieve their objectives. This is not because the people trying to build the systems are incompetent: it is because building such things on purpose is much more difficult than it first appears.

Research into failures of system development in such cases has repeatedly shown the biggest source of the problem to be inadequate understanding of the requirements for the proposed system. This is not surprising; such efforts are often motivated by a desire to improve already-embedded infrastructures that have grown up around the original systems. Understanding embedded infrastructure requires what Bowker and Star call infrastructure inversion, which is difficult to accomplish. It also requires working across multiple conceptual levels of analysis. Studying this requires data collection in a number of highly institutionalized production sectors. I call this research high-level requirements analysis because of its focus on issues of organizational and institutional usability that are seldom considered by software engineers or system developers.

A highly institutionalized production sector is any world of human enterprise affected by the regulation and/or influence of social institutions, both formal (e.g., governmental entities) and informal (e.g., professional associations, scientific societies, etc.). The intellectual groundwork for this understanding of institutionalized production is found in a 1994 paper titled Institutional Factors in Information Technology Innovation, which appeared in Information Systems Research. Implementation of this research program in highly institutionalized production involved the development of information infrastructure in support of other service and physical infrastructures.

I have learned about infrastructures in support of the following highly institutionalized production sectors:

Logistics and transport. This is one of the world's largest production sectors. Depending on how you count, it constitutes between 14% and 17% of global production. My research in this area focused initially on air freight movement, because such movement is inherently intermodal (meaning freight must move on multiple modes such as motor and air transport), and it is highly regulated by institutions at the global level. It is also increasingly dependent on information management. This work was done with Barrie Nault, Amelia Regan and Paul Forster. Support came from the National Science Foundation, the UCI Graduate School of Management Corporate Partners, the International Air Transport Association, Cargo Network Services, and KLM Cargo. CNS, in particular, sponsored a large baseline survey of information handling among the 1,800 air carriers, motor transport carriers who handle air freight, and freight forwarders. At the time this was the largest and most comprehensive survey of information handling ever attempted in an entire freight sector. Eventually this work spread to intermodal transport generally.

Common carrier communications, particularly global telephony. The international telephone system is the largest networked information infrastructure on Earth. It is much larger and more penetrating than the Internet, although the advent of smart phones in cellular telephony is blurring the distinction between telephony and the Internet. In the past it was highly regulated and institutionalized, although regulatory and institutional structures are changing as a result of cellular telephony. Working with colleagues then at the University of Jyvaskyla in Finland, I studied the evolution of key institutional enablers that explain why the diffusion of cellular telephony occurred much faster in the Nordic countries than it did anywhere else, including in the United States, where the technology was invented by AT&T in 1947. This was a study in technical history, involving access to original source documentation related to signaling protocols, spectrum allocation, and addressing. A complementary assessment of source documentation on these issues was conducted in the U.S. and Japan, the other primary actors in the spread of global cellular telephony. The dominant factors influencing the speed of diffusion of these technologies were each region's unique institutional and regulatory structures, which affected the speed with which signaling, spectrum, and addressing standards could be set. Much of this work focused on establishment of the first wide-use analog standard (NMT-450/900) and the subsequent establishment of the GSM-900/1800 digital standard that is now the dominant world standard. The relatively slow start of cellular telephony in the US was due in large measure to the complexities of regulatory structures and policies at the FCC, in which competition for radio spectrum for cellular telephony put the telephone industry into direct conflict with the broadcasting industry, both of which were (and are) regulated by the FCC. This work was supported by the Academy of Finland (roughly equivalent to the US NSF), and conducted with Kalle Lyytinen, Joel West, Ashley Andeen, Vladislav Fomin, Ari Mannien and other colleagues.

Electric power generation and distribution. Electric power is a critical network infrastructure that is highly institutionalized and regulated. It is also undergoing major institutional reform. This research was done in cooperation with Scott Samuelson of the National Fuel Cell Research Center, with support from the California Energy Commission and industry sources. Robb Klashner was the primary driver behind this research.

Criminal courts. The criminal justice system is a crucial element of service infrastructure. It is also highly institutionalized and regulated. Research in this sector focuses on computerized case management systems in felony courts, and in particular, the Superior Court of Los Angeles County, the largest criminal court system in the US. Support was from the State Justice Institute. Margaret Elliott was the principal researcher in this project.

Health care. Health care is one of the largest and most important service infrastructure activities in the US. Research in this area has focused mainly on defining high-level requirements for advanced patient record systems. Research support was initially from the University of California Industry/University Cooperative Research Program in Life Sciences Informatics, with company support from a California software firm that specialized in patient record systems. The research program established the parameters for the design of patient record automation to allow multiple uses of patient records for patient care, administration, and medical research without requiring the constant intervention of people in the records process. Researchers on this work included Mark Ackerman, Wanda Pratt and Madhu Reddy.

Global Electronic Commerce. This study concentrated on national policies and conventions related to enterprise in the spread of electronic commerce in the United States and nine other countries. It was supported by the National Science Foundation, and was run by Jason Dedrick and Kenneth Kraemer at UC Irvine. Kalle Lyytinen and Vladislav Fomin participated in this work.

Epistemic Infrastructure. This refers to major collections -- libraries, archives, museums, galleries, zoos, aquaria -- that together are often referred to as cultural institutions, but that more importantly constitute the primary mechanisms for creation of knowledge communities. This research was primarily historical, tracing the rise of such institutions from classical antiquity through the early modern period, and on through the scientific and industrial revolutions. A major focus was on the implications of the Internet and other information infrastructure for the role of such institutions. This work was done initially for the Organization for Economic Cooperation and Development (OECD). My collaborator was Margaret Hedstrom.

Higher education. I started working in this area through study of the University of California's systems for articulation agreement with community colleges and the California State University System. Articulation is the way students from community colleges and Cal State can come into the UC system and transfer credits directly. Such articulation was an important element of Clark Kerr's vision for California public higher education, and was the main mechanism by which students could start at the community college level and move up into Cal State and/or UC. This work was done mainly by Suzanne Schaefer. Subsequently I studied the California Virtual University and the Western Governors University. Since moving to Michigan I have followed the Michigan Virtual University, which is interesting in that it started out focused on higher education but eventually concentrated its efforts on K-12. I have also done work in this area for the National Science Foundation and the Computing Research Association. For a couple of years I was Vice Provost for Strategy at the University of Michigan, which allowed me to focus directly on this topic. Collaborators in this work have included Cory Knobel, Nick Berente, Amy Ramirez-Gay, and Victor Wong.

Antarctica and the Southern Ocean. Since visiting Antarctica as part of a National Research Council study in 2011 I have begun to focus on that continent and the surrounding Southern Ocean as a highly institutionalized production sector. Polar work (exploration and research) has always been dependent on technology because the environmental conditions are extreme: it is cold all the time, and day and night each last six months. But Antarctica has been even more remote than the northern polar region because humans could not reach the continent of Antarctica until the late 19th century. James Cook visited the Antarctic region in HMS Resolution in the mid-1770s and speculated about a large land mass there, but could not get his sail-powered ship through the pack ice. Steamships were required. Carsten Borchgrevink's team used a steamship to make the first real landing on the continent in 1898, kicking off the Heroic Age of Antarctic exploration that featured Shackleton, Scott, Amundsen and others now known as early Antarctic explorers. Permanent settlement waited until the middle of the 20th century, becoming serious only as a result of the International Geophysical Year (1957-58). Since then Antarctica has become an important outpost for research of many kinds. My interest in Antarctica is two-fold. First, the contemporary institutional construct of Antarctic research is the Antarctic Treaty System, which was born largely because of the Cold War and now must change to fit a post-Cold War world. Second, the kind of research expanding most rapidly in Antarctica uses remote sensing and collection of vast amounts of data that rely on cyberinfrastructure. The interplay of this institutional and technical change provides a good opportunity to examine aspects of high-level requirements.

This sometimes seems to me like a strange and disconnected mix of work. It probably seems this way to others. Nevertheless, I have found there is a method to this madness. As a philosophy undergraduate I was intrigued by a central problem of reductionism, namely the question of where to stop. Reductionist strategies can succumb to a variety of fallacies in reasoning if not taken far enough or if taken too far. Like Goldilocks, we need something that is just right: to reduce things enough to accomplish what we set out to do. It is common in requirements analysis to stop too early.

Here's an example. A system developer claims that a specification for a system is based on a statement of requirements. Whose requirements? The users' requirements, says the developer. How does the developer know that the requirements actually reflect what the users want or need? The users signed off on them, says the developer. Do the users know what they want or need? That's their problem, says the developer. Actually, that is everybody's problem, or at least everybody involved in creating the system. If the requirements aren't right, the specifications cannot be right, and the resulting system will not be right. Stopping too early does not help. Why don't the users know what they want or need? And how is this problem to be fixed?

Getting good requirements is harder than it appears in many cases because users do not know what they want or need. That's when one has to go to the high level and begin to determine what the requirements must be. That's the gist of my work: determining what requirements must be. A corollary is determining what requirements cannot be. The scheme is to narrow the search space around requirements such that the actual requirements are probably somewhere within the residual. It ain't pretty, but it's better than guessing.
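As a rough illustration of what I mean by narrowing the search space, the small Python sketch below treats candidate requirements as a set, applies "must be" constraints that keep a candidate in, and "cannot be" constraints that rule a candidate out; whatever survives is the residual in which the actual requirements probably lie. The candidates and constraints are invented for illustration only and are not drawn from any system described above.

    # Hypothetical illustration only: narrowing a space of candidate requirements.
    # "must be" constraints keep a candidate in; "cannot be" constraints rule it out.
    # The residual is where the actual requirements probably lie.

    candidate_requirements = {
        "records readable by clinicians without retyping",
        "records reusable for administration and research",
        "every record change mediated by a clerk",
        "single fixed vocabulary imposed on all departments",
        "audit trail kept for every access to a patient record",
    }

    must_be = [
        lambda r: "record" in r,            # stays within the records domain
    ]

    cannot_be = [
        lambda r: "clerk" in r,             # constant human intervention is ruled out
        lambda r: "fixed vocabulary" in r,  # known not to work across departments
    ]

    residual = {
        r for r in candidate_requirements
        if all(test(r) for test in must_be)
        and not any(test(r) for test in cannot_be)
    }

    for requirement in sorted(residual):
        print(requirement)

Run as written, the sketch prints the three surviving candidates. The point is not the code but the shape of the reasoning: the requirements emerge from what must hold and what cannot hold, rather than from asking users to enumerate them.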