Actor Network Theory

BACK: Technological Systems

Bruno Latour, Michel Callon, John Law

One of the basic intuitions of actor network theory is that our society is kept together through artifacts and technology. What is all the stuff that isn’t human that is required for our social system to function?

Actor network theory suggests that there is no society separate from technology and science. Society is more than just bodies and norms. It also doesn't make much sense to distinguish between science and technology–they are so intertwined that there's little reason to separate the two.

In actor network theory they are the same: technoscience. We cannot subtract technoscience from society. What does society look like from this perspective? A large network in which all kinds of actors interact with each other.

In STS, Bloor brought up the principle of symmetry: when we ask why certain relevant social groups triumphed over others, or why something became popular, the answer shouldn't be influenced by whether we think the underlying claim is true or false.

ANT generalizes this idea into "generalized symmetry": we don't even distinguish between humans and nonhumans. It doesn't matter whether an actor in the network is human or not–it still exerts influence.

As you can imagine, this was quite shocking to people studying the subject. ANT was claiming that there is no difference between humans and our artifacts, whereas we had previously always cited humans as the explanation for a phenomenon.

Latour on the “Missing Masses”

He applies this idea to how society works. We see a lot of things going on, but if we only credit or blame people, there's not much explanation–we're only looking at a tiny part of the picture. We need to look at many other things: the missing masses, the nonhuman things.

Consider a door closer.

Its job is to close the door after someone goes through it. Is this a good citizen of our society? It's doing work for us. If we placed a sign on a door that said "please close the door behind you," how many people would actually do it? Instead, we created a whole machine that does it for us. In this way, we have achieved that goal smoothly and unproblematically–what a great result! We've delegated this task to the door closer and instantiated some morality in it: "it's a good thing to close doors behind you."

Seen this way, human behavior and machines work on each other. The machine limits what the human can do–we can't pull the door off or leave it open without a door stopper. If we visit a webpage, we're limited to the buttons we can click on. In a way, the machine tells us what to do as well.

Designers break tasks down into small components and then distribute them onto different actors, and each of these actors does its part in a very coordinated manner–this is the network. This distribution is known as machination. Everything is related, and the system is stable and works as long as every actor does what it's supposed to do. Failure happens when an actor doesn't do its part–whether it's the machine that stops working or the human who can't use the door handle, it doesn't matter which; either is the cause of the failure.

The door is part of the system of the room, which is part of the system of the university, which is part of … this extends to a large, global network. Latour suggests that we've underestimated the role of technology in our lives–from seat belt indicators that beep at us until we buckle up, to stop lights that tell us when to stop and go, to ATMs that walk us through getting cash.

The Berlin Key

To get through the apartment gate, you have to push the key all the way through the lock, walk through, and then take the key out from the inside. This forces you to close the gate–otherwise you wouldn't get your key back. A program of action is inscribed into the key itself.

Artifacts are always part of programs of action, which include both people and artifacts.

Latour wants us to rethink 'society' completely. It should be considered a large collective of humans and nonhumans.

Latour on Machines

Science continuously acts on our society. The diesel engine: Latour retells the story of the diesel engine (much as he does with Edison and the lightbulb) from the perspective of ANT. The inventor is seen as a strategist who needed to enroll others and control their behavior. Nothing could be allowed to go wrong with the engine, and everything had to be tweaked to fit the network.

How do we do this?

  1. Translation of interests: Offering new interpretations and thus channeling people’s actions in specific directions; recruiting allies (Bloor)

  2. Keep the interested groups in line: Enroll and control others by making their behavior predictable (building a machine).

The windmill is the outcome of negotiations that interest the wind in the fabrication of bread. All the elements have been made interested in each other's work–we force the wind to work for us in order to make bread. This, obviously, makes the work of designing machines extremely complex.

Translation vs Diffusion

Latour suggests that "diffusion" is misleading–we can't just make a machine and hope it diffuses out into society. There is work involved, and it requires fighting against the groups that might resist the technology. Diffusion derives from the idea of technological determinism.

Translation, by contrast, requires someone to move the technology along–it is the translation of interests that creates allies, people who become interested in the technology.

Post-It Notes: Latour discusses the emergence of these. The sticky note didn't use to be very sticky–and why would people want that? Nobody really wanted it. It failed at first, until management gave it to their engineers, who started writing down notes and sticking them to their computers. The company then used this as a marketing strategy: we used it this way, and you could too.

Through this, they found alliances.

So how would Latour tell the history of the invention of the bicycle, as opposed to the SCOT approach?

Callon on “Heterogeneous Technoscience”

Heterogeneous technoscience: there are no distinct phases in the process of innovation–no separate technical, economic, and marketing stages. ANT suggests that all of these things have been mixed together since the beginning of the technology: you are working on "what kind of market will this sell to?" from the very start.

  • Failure of the electric car in France: people didn't understand why they should use something new when they already had something that worked well. The social meaning of this technology wasn't obvious–there were no allies. ANT tells us that those engineers would also have had to reconstruct French society, because new machines create a new society.

Actor-network: a heterogeneous association of unstable elements that continuously influence and re-define each other–a new description of the dynamics of society.

Differences from Hughes’ notion of “large technical systems”

  1. No evolutionary stages: the network never really gains momentum. It might expand when we tweak something to fit into it, but it never grows out of hand or becomes unstoppable.
  2. No distinction between the system and its environment: Latour and Callon never talk about the background of a system–it's just there, part of the network, functioning with us. Hughes, on the other hand, would say that a system can be set apart from society, so that we can talk about the social background of a system.
  3. Question of agency–Human vs Non-Human: nonhumans are given agency. All the different bits do their own duty–there is a constant attribution of agency to humans and nonhumans alike.

Scientific Theories According to ANT

Science and technology work by translating material actions and forces from one form into another. At some point we abstract the ideas from one phenomenon and carry them into another–for predictions, explanations, and so on.

Abstract theories are the product of manipulations at the local level–just a handful of observations–translated into global phenomena. There's no 'inductive process' of the kind we've traditionally attributed to science; there's a material process, and we need to follow that material process before we can abstract.

ANT also addresses the universality of science (the claim that you can apply it anywhere and it works the same way). ANT thinkers say, "yes, that's the end goal, but it's not a given from the beginning." Science isn't universal until all the work has been put into applying it to different case studies, gathering evidence, and so on. They emphasize the effort behind science. Universality is an achievement; theories aren't universal by default. You have to create an alliance that is global.

This isn't something that Bloor or others would disagree with, but ANT goes a step further.

Inscription devices: scientific instruments that make it possible to write down a description that is abstract enough to be replicable by other people.

Immutable mobiles: This is like the toolbox–once you have your papers and formulas, they can be moved around and used to control and understand more phenomena.

Centers of Calculation: Nature, once stabilized in a lab and turned into marks on paper or in a computer, is manipulable here, so we can produce new predictions and abstract representations. These representations are often taken to be nature itself.

Pasteurization of France

Pasteur became famous because of the microbial theory of disease and the anthrax vaccine. Much of the rest of social life was revolutionized because of this insight.

Latour discusses the difficulty of trying to isolate the microbes that Pasteur worked with–how do we know that they're isolated to the point that they can be abstracted?

On top of this, he had to convince people that these microbes actually exist and gain the trust of other relevant social groups. Once he had his abstraction and knowledge, he had to convince other people that there was actually a use for it–medical groups, cattle farmers, etc.

Callon on Scallops of Saint-Brieuc

Scientists were called in to revive a scallop fishery that wasn't doing well. This is a story of failure. The different groups (fishermen, scientists, scallops) that should have collaborated didn't. They all had their own interests, which would have had to be aligned for success, but the actors didn't behave as they should have. Because of this, the network failed and crumbled.

Criticisms of ANT:

  1. There is no "explanation" provided–ANT is "culturally flat." It's a very sophisticated description, but it doesn't explain anything. The cultural element isn't there–where scientists come from isn't important in ANT. It is much more abstract and distant, and we lose a lot of the fine grain of the events. In the bicycle example, it was extremely interesting to hear which social groups wanted to shape the bicycle's design and how–for displays of masculinity or for transportation? In ANT, none of that is relevant.
  2. "Conservative" implications: things are described as they appear to technologists, scientists, and users. There is a risk of sliding back toward technological determinism by heading in this direction.
  3. Human agency: what does it mean to give agency to machines? No theory before had taken the humanness out of agency. STS's key idea was to identify the social dimension, the human element, that explains our technology; ANT takes that away from how we explain science and technology.

NEXT: Cold War