THE SOCIAL ENGINEERING SOLUTION TO THE MURDER IN THE MILGRAM EXPERIMENT
Eugen Tarnow, Ph.D.
Society's power to make us obey allows for peaceful existence, economic prosperity and efficiency, but it also amplifies faulty decisions into catastrophes. In 1963 Stanley Milgram showed that the vast majority of humans exhibit excessively obedient behavior in the presence of an authority and can easily be made to encourage or tolerate real torture and murder.
In this advocacy paper, the overdue issue of how to limit excessive obedience is addressed. Stress is placed on eliminating the Milgram Prediction Error, i.e. the discrepancy between what we think we will do and what we actually do in situations of authority. Barriers and dynamics in our society that keep us from breaking, and even enforce, our habit of obeying excessively are discussed. For example, society does not know what the strong situations are and therefore cannot put up a defense against them; the law does not punish excessively obedient behavior; and the teaching of ethics is hampered by illusions of its efficacy.
A sketch of a solution to the problem of excessive obedience is made, involving experiential training; mappings of authority fields, rules, and strong situations; and policy changes.
Thou shalt not follow a multitude to do evil . . .
(Exodus, 23:2, suggestion from about four thousand years ago)
The Milgram obedience experiment has become quite famous over the last forty years (for reviews, see Milgram, 1974, and Miller, 1986): if one mentions the experiment at a party, some of the participants will vaguely remember it. But while it makes for good conversation over a beer, the finding has yet to produce a single useful action. In fact, it did just the opposite: it provoked other researchers to kill the messenger and declare the experiment unethical. It is thus not surprising that over time the result has not improved: the experiment yielded the same horrendous obedience rate in 1985 (Meeus and Raaijmakers, 1995) as in 1963 (Milgram, 1974). But, as we shall see, simply telling people about the experiment may not be enough anyway; the behaviors challenged are just too difficult to change.
The Milgram obedience experiment reveals what physicists would call an instability in our society: a tendency toward limitless obedience to authority. That is, while our society is quietly humming along, a catastrophe may lurk around the corner once too many people start to obey a bad set of directives. Much has been written about the role of excessively obedient behavior in world events such as the Holocaust (Arendt, 1970; McKellar, 1951; Miller, 1986), the My Lai massacre, the treatment and disappearance of people during the military regime in Argentina (Kelman and Hamilton, 1989), and the NASA space shuttle disaster (Feynman, 1990). If one accepts the description of these events in terms of excessive obedience, then they serve as additional motivation for reexamining the problem posed by the Milgram experiment; if not, this paper will not argue one way or the other.
In the Milgram experiments, a subject, the Teacher, is asked by the Experimenter to give electrical shocks to a confederate, the Learner. The stated purpose of the experiment is to understand how punishment affects memory recall. The Learner, who claims to have a heart problem, fakes increasing discomfort, and as the fake electrical shocks increase to dangerous levels, he suddenly becomes quiet, which can reasonably be interpreted as his being dead. (Mantell (1971) conducted a replication of the Milgram experiment in Germany and interviewed the subjects afterwards; many claimed that they believed the Learner had been dead or at least unconscious.) Milgram found with this simple experiment that most people can be made to seriously injure and "kill" by verbal orders. Even though the subjects may feel intuitively that they are doing something terrible, the forces of obedience are overpowering.
Milgram also discovered that predictions by psychiatrists, graduate students and faculty in the behavioral sciences, college sophomores, and middle-class adults of the rate of inflicting maximal injury in one of the experimental conditions were consistently much smaller (0-1%) than the actual rate (65%; see Milgram, 1974, p. 31). This discrepancy is referred to as the Milgram Prediction Error.
The murder in the Milgram experiment occurred because the Experimenter was able to use his authority to limit the Teacher’s options for thought and behavior in a purposively designed, deceptive and gradually presented “strong” situation and because our society has created individuals who are much too easy to command. The Experimenter was able to limit the subjects’ interpretations of the experiment to the idea that it was a reasonable study in learning. Not one of a thousand Teacher subjects acted on the alternative interpretation that it was a dangerous experiment and called the police or freed the Learner (Zimbardo, 1974; Milgram had alerted the local police department beforehand because he expected such calls (Alexandra Milgram, private communication)). Likewise, the Teacher subjects assigned the responsibility for their actions to the Experimenter and concentrated on performing the task at hand in the most efficient way possible. They no longer saw the choice of disobedience, but only the choices with which obedience could be improved. Some went as far as to assign responsibility for the Learner's death to the Learner (Milgram, 1974).
The murder is not committed by subjects who enjoy killing others: Martin, Lobb, Chapman and Spillane (1976) found that high obedience rates can be obtained even when the result is self-immolation.
Excessive obedience can be defined on an individual level as well as from a larger perspective.
Would a manager like her staff to be totally obedient? At first glance one would think so: the manager bears the responsibility for a department, and in order to run it properly she needs everybody's help. But what if she gives an erroneous order? Then it would probably be better for her staff not to carry it out. Thus it is evident, at least for the sake of error correction, that the degree of obedience should not always be 100%.
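The error-correction argument can be put in toy quantitative form. The sketch below is purely illustrative; the model, the `error_rate` and `catch_rate` parameters, and the function name are assumptions of this example, not taken from the paper. It shows that staff who never challenge orders execute every erroneous order, while even a modest willingness to challenge filters some errors out.

```python
# Toy model (illustrative assumption): orders arrive, and a fraction
# `error_rate` of them are wrong. A challenged order is caught, i.e.
# recognized as wrong before execution, with probability `catch_rate`.
def executed_errors(error_rate, obedience, catch_rate=0.8):
    """Expected fraction of all orders that are both erroneous and executed.

    `obedience` is the probability that an order is carried out without
    being challenged; 1.0 models totally obedient staff.
    """
    caught = (1.0 - obedience) * catch_rate
    return error_rate * (1.0 - caught)

# Totally obedient staff execute every erroneous order ...
assert executed_errors(0.05, obedience=1.0) == 0.05
# ... while staff willing to challenge orders filter most errors out.
assert executed_errors(0.05, obedience=0.0) < 0.05
```

The point of the sketch is only that the executed-error rate is maximized at 100% obedience; any nonzero willingness to challenge strictly reduces it.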
Let us take a specific example of a manager and her staff: the airplane captain and the first officer. Up to 20% of all airplane accidents may be preventable by optimizing the "monitoring and challenging" of captain errors by the first officer. Shortly before one such crash, the cockpit voice recorder captured the following exchange:

First Officer: just .. you just gonna stay up here as long as you can?
Captain: yes. guard the hor- I mean an speeds one hundred.

At the point the plane is scraping the trees, the following dialogue occurs:

Captain: did you ah click the ah airport lights .. make sure the co-common traffic advisory frequency is set. [sound of seven microphone clicks]. click it seven times?
First Officer: yup yeah I got it now. [momentary sound of scrape lasting for .1 secs]

According to the NTSB (NTSB, 1994a), the crash was caused by several factors, among them the failure of the first officer to monitor the erroneous descent and alert the captain. Had the first officer been less obedient, it is likely that he, the captain and the other people on the plane would be alive today.
From the individual point of view, excessive obedience can be defined as behavior for which the obedience level is higher than that predicted by the individual, i.e. when the Milgram Prediction Error is positive.
Here are presented eight barriers and dynamics which keep us from breaking, and even enforce, our habit of excessive obedience.
The Milgram Prediction Error erects one barrier to the elimination of excessive obedience: it keeps the consequences of excessive obedience out of our awareness. If we do not recognize that there is a problem, fixing it will not become a priority. Thus it is a lot easier to find people to protest whoever is the current president than to ask Congress to form a committee and pass laws about strong situations. When the verdicts come in from trials involving strong situations, the newspapers rarely point out that most other people would have done the same thing and that societal obedience, or the strong situation itself, is the problem. Instead we are happy to conclude that the convicts are different from us and that we would never have done what they did.
The Milgram Prediction Error is part of a larger social illusion of the effectiveness of ethical teachings. Milgram wrote that "the force exerted by the moral sense of the individual is less effective than social myth would have us believe. Though such prescriptions as 'thou shalt not kill' occupy a pre-eminent place in the moral order, they do not occupy a correspondingly intractable position in the human psychic structure. A few changes in newspaper headlines, a call from the draft board, orders from a man with epaulets, and men are led to kill with little difficulty... Moral factors can be shunted aside with relative ease by a calculated restructuring of the informational and social field" (Milgram, 1974, pp. 6-7). Teaching ethics by simple instruction is ineffective in strong situations; the Teachers had no doubt received such instruction ("it is wrong to kill") and did not expect to punish the Learner as severely as they did.
In the field of scientific authorship (with which the author happens to be familiar), Eastwood, Derish, Leash and Ordway (1996) found that training in research ethics correlated with an individual's belief that it influenced the conduct of scientific research and publishing, and that it heightened his sensitivity to misconduct. However, training in ethics was actually uncorrelated with willingness to commit unethical or questionable research practices in the future, and was positively correlated with a tendency to award honorary authorship. The intention to award honorary authorship also increased dramatically for those with first-hand experience of inappropriate authorship (having been asked to list an undeserving author, having been named as an author together with an undeserving author, or having been unfairly denied authorship). The authors concluded that "despite the respondents' own standards in this matter, their perception of the actual practice of authorship assignment in the research environment has fostered a willingness to compromise their principles."
During a stint as a trainer in a hospital setting I noticed an example of the social illusion. As an icebreaker, my boss and I would ask the members of a hospital department to state something about their values. Almost everybody would then subscribe to the golden rule. Even with my limited knowledge I knew that this did not describe the behavior of several of the people involved, but nobody objected and no one laughed. I have noticed another example when wars and catastrophes are discussed in religious settings. Inevitably, the problem is described by seemingly well-meaning people as belonging to those bad people, and, of course, nobody present would ever do anything like that. One gets a cozy, thankful feeling: "I am so lucky to belong to this group of people that will make sure I am always safe." I sit there and wish for Stephen Katz to come and present the toilet situation in
The Milgram Prediction Error asks us to face two tough truths:
· Milgram's finding that anybody is likely to seriously injure the Learner means that we are not safe from our neighbors. This presumably also makes it very difficult to discuss in groups since it points out the fallibility of the group members and therefore of the group itself.
· That we injure the Learner against our later judgment means that we ourselves cannot be trusted.
Just as the obedience rate in the Milgram experiment stayed constant, the Milgram Prediction Error (our non-anticipation of the result) had not changed by 1985 either: Meeus and Raaijmakers (1995) performed an obedience experiment involving "administrative violence," depriving someone of his job. The predicted obedience rate was 10%; the actual rate was 95%.
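The size of the error can be stated as a single difference. The percentages below are simply the ones quoted above; the function name is this example's own.

```python
# The Milgram Prediction Error, computed from the rates quoted in the
# text (percentages as plain integers, error in percentage points).
def prediction_error(predicted_pct, actual_pct):
    """Gap between the predicted and the observed obedience rate."""
    return actual_pct - predicted_pct

# Milgram (1974): predictions of 0-1%, actual rate 65%.
assert prediction_error(1, 65) == 64
# Meeus and Raaijmakers (1995): predicted 10%, actual 95%.
assert prediction_error(10, 95) == 85
```

In both cases the error is enormous and of the same sign: people predict far less obedience than actually occurs.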
In the Milgram experiment not one of a thousand Teacher subjects came up with an interpretation alternative to the Experimenter's and, for example, called the police or freed the Learner (Zimbardo, 1974). This limited perspective of the Teacher has been investigated by many researchers.
Milgram developed the theory of the "agentic" state to explain his experimental results (Milgram, 1974). It is a hypnotic state in which one assigns all responsibility for one's actions to the supervisor and concentrates on performing the task at hand in the most efficient way possible. One no longer "sees" the choice of noncompliance, but only the choices by which compliance can be improved. The theory of the agentic state explains the tendency to assign responsibility for the Learner's death to the Learner (ibid.): after the task has become all-important to the Teacher, the Learner is perceived as one of the variables left to be optimized, and thus the Teacher wishes the Learner to try his best. The Learner's death is self-inflicted because he refuses. Likewise, the Nazi concentration camp guards stopped thinking about the horrors they were perpetrating and concentrated on the efficient processing of their victims: the guards posted signs saying "work brings freedom" (Vrba, 1964, p. 90), screamed at the victims to go faster, faster (ibid., pp. 132-4), made them believe that executions were medical checks (ibid., p. 144), etc.
A short note by Kohlberg suggests that the limited perspective problem is related to the subjects' moral development (Kohlberg, 1969).
Milgram, and Kelman and Hamilton, also refer to the limited perspective problem as the "narrowing of the cognitive field" (Milgram, 1974, p. 38) and as "dehumanization" and "neutralization."
There is a halo effect that favors excessive obedience over dissent: a person who obeys has much of society's validation behind him, and society has had a long time to "beautify" his behavior (uniforms, monetary rewards, etc.). Thoreau, a pioneer of civil disobedience, remarked in 1849 about the obedient majority: "They will wait, well disposed, for others to remedy the evil, that they may no longer have it to regret" (Thoreau, ed. 1980, p. 226). Those in the obedient majority can look around, see others behaving just like them, and have their behavior reinforced.
Dissent, on the other hand, often becomes ugly. Ziemke wrote as much in the context of the denazification of post-war Germany (Ziemke, 1975).
Strong situations occur daily, and we need to know what they are in order to decrease excessive obedience rates. Only a few examples of studies can be found in the literature: unknown doctors ordering nurses to inject an unknown medicine (Hofling et al., 1966), and bureaucratic orders to disturb a potential employee taking a test (Meeus and Raaijmakers, 1986).
Milgram wrote: "Obedience, because of its very ubiquity, is easily overlooked as a subject of inquiry" (Milgram, 1974, p. xi). Twenty years later, it may be that psychologists studying obedience have missed an important level of analysis, the rule, perhaps because of its ubiquity. There are rules to create peace, to uphold standards, to increase efficiency, to spare people's feelings, etc. The corresponding field of rules (similar to Milgram's binding factors; Milgram, 1974, p. 148) has never been mapped out. Without knowledge of what we are obeying, we cannot lessen excessive obedience. Authority benefits from rules being elusive, and may perpetuate this situation because it has more experience of the situation and is more powerful.
We often do not know what the consequences for breaking rules are, traffic and criminal laws excepted. Indeed, there might not be a fixed penalty. Imagine that one wants to enter a particular university library and insists on breaking one rule: the rule that one must possess a library card. Thus one walks past the guard. Here are three examples of consequences:
· Nobody sees you and there is no consequence.
· The guard does not mind. Breaking the rule costs little.
· The guard minds, and tells you to get yourself a card, it takes only five minutes anyway. You insist that you want to enter the library and break this one rule. The guard calls security. Two bouncers enter and demand that you leave the premises. You explain the situation to them as you did to the guard. They think you are crazy and perhaps dangerous and they pull a gun on you.
In each case it is interesting to ask which rule was actually broken: the rule about the card, or the rule about obeying the guard? It seems difficult to separate the infringements from each other.
It is likely that the mental barriers involved in upholding and breaking rules are set by impressions from childhood (Milgram, 1974, p. 136; Zimbardo, 1974). Since our childhood authorities may be more influential and important than our adult authorities, the costs of breaking rules may be erroneously valued.
Axelrod writes that "it is the ability to identify and punish defectors that makes the growth and maintenance of norms possible" (Axelrod, 1985). The norm for excessive obedience is much easier to maintain than the norm against it. It is difficult to identify a person who excessively obeys, since he typically does not stand out from the crowd. A dissenter, on the other hand, stands out clearly and can be easily penalized. It has also been pointed out that the power of authority is multiplied in the presence of an obedient group.
Identification of the culprit in excessive obedience is also difficult because overall responsibility inherently involves two or more people: the power to make decisions and the power and knowledge to carry them out are often separate. Life within large bureaucratic organizations exacerbates this, since hundreds of people can be involved in a crime in large and small ways. Milgram also found that observers of and participants in the experiment have different views (Milgram, 1974).
Punishment for excessive obedience is therefore very elusive.
Indeed, the Milgram obedience experiments present an unsolved legal paradox. Since almost everyone would commit a crime in strong situations, it is doing justice to the criminal not to convict him (see also Le Bon, ed. 1982, p. 163-165). On the other hand, the absence of a conviction does not serve the victim, nor does it protect society from future crimes.
"Turning the other cheek" is a heuristic that often lends credence to excessive obedience, because it can be construed as obedient behavior that further strengthens the obedience field.
In the late eighteenth century, James Madison stressed the vulnerability of our society to the violence of "factions." A third example is the civil rights movement in the United States.
Let us discuss some of the possible ways of decreasing excessive obedience.
· Experiential education. Learning by instruction is likely ineffective (just as teaching about authorship ethics actually created more inappropriate authorship). Learning by experiential education is more appropriate because of the presence of the social pressures involved in "doing the right thing" (indeed, Milgram interviewed his subjects after the experiment, and many felt they had learned something important). Since we know that role playing of the Milgram experiment can give close to the same result as the experiment itself (Meeus and Raaijmakers, 1995, found that with sufficient intensity it gives the same result as the original), this seems an appropriate endeavor. At each step of a strong situation the participants would be taught to see the full perspective of choices available to them. Spectators could learn by viewing the experiences that it is imperative to accept the dissenter who may emerge, the somewhat different type of person she might be, or has to be, and to accept the unattractiveness that accompanies dissent.
· Once excessive obedience is more widely understood, we can catalog the strong situations. Zimbardo emphasized the need "for more knowledge about those conditions in our everyday life where, despite our protest - 'I would never do what they did' - we would, and we do" (Zimbardo, 1974). The mapping of work situations that are strong for individuals can be done by undercover order-giving (Hofling et al., 1966; Meeus and Raaijmakers, 1986; Tarnow, 2000a). For example, imagine an organization where pleasing the bosses is more important than the work output; excessive obedience is pervasive. At random times, each of the managers could be asked to give what the board of directors considers a nonsensical or unethical order. If the unethical order is obeyed, the situation is too strong. If a situation is found to be too strong, it should be pointed out and discouraged. The regular occurrence of obedience-testing questions will serve to create a norm for what orders can be given, and to encourage critical evaluation of future orders (a specific example is obedience optimization in the airplane cockpit and other high-risk workplaces).
· Axelrod (1985) studied the emergence of norm systems and found a necessary criterion for the viability of a new norm system: the ability of the agent to modify the unwanted behavior. It makes little sense to punish a person unless they or others are given the power to behave differently in the future. To help make awareness of excessive obedience a norm, and to decrease excessive obedience, the law may be useful both in eliminating strong situations and in increasing our individual armament against social pressures. In the former case, laws may need to regulate the size and communication structure of groups: a meeting between one employee and two managers, for example, may be a questionable situation, and the existence of large bureaucracies, which create strong obedience fields, could be questioned on this ground. In the latter case, excessive obedience must be identified and punished often enough for it to disappear as a norm. If a group of people was involved, partial individual responsibilities should be assessed and clear rules for the distribution of punishment made. If assigning responsibility becomes impossibly difficult, the proper legal actor must be the full group. If the situation is somewhere in between, one can assign responsibility both to the executor of the crime and to the people responsible for the obedience "field." The legal arena may also be useful in removing excessive obedience once a social policy has been adopted that defines, and enlightens citizens about, the dangers of excessive obedience.
· The structure of the communication flow in an organization needs to be considered. For example, if a group of people were sitting in a circle (or in a row in a movie theater) and were only allowed to talk to their nearest neighbors, we can speculate that dissent would be relatively easy: it costs you the opinions of at most two people. In a workplace with privacy-inhibiting cubicles, however, dissent would be much more costly.
· The mapping of the conscious and unconscious contracts and "covenants" that exist in the workplace needs to be performed. Efforts should be made to simplify the contracts and covenants so that individuals are not overwhelmed. It should be apparent to everybody when no contract exists (Hobbes stressed "the silence of the law" as important for liberty (Hobbes, ed. 1960, p. 143)).
· The real consequences of not entering into a contract, or of disobeying a rule, need to be understood. To illustrate, one could construct a "social crime" table showing temptation levels and actual breaking rates, to give us a sense of how strongly social rules are enforced.
· We need to understand the functions of the rules around us. The addition of rules can serve many hidden purposes: authorities can "fix" problems, inefficiency can be hidden, the obedience field is strengthened, and the breaking of even minute rules can lend credence to firing individuals. Sometimes we may ask ourselves whether we want to communicate the existence of a rule, thereby strengthening it, if we do not agree with it.
· An alternate way to diminish excessive obedience is to encourage dissent (also emphasized by Kelman and Hamilton, 1989). While the U.S. Constitution guarantees some individual rights against the Government, it does not apply to private organizations; in these there are no First Amendment rights and few privacy rights. Since private organizations are, by virtue of their contribution to the popular mindset, essential to the problem of excessive obedience, the lack of comprehensive civil rights in the private sector, beyond those related to discrimination, should be reconsidered. Whistle-blowing was recently encouraged by the government under special circumstances, but it could be further encouraged in other arenas. The legal rights people do have in organizations could be taught more vigorously: people who have little idea about the laws that rule them are not empowered to insist on their rights.
· Since "turning the other cheek" can enhance the obedience field, it needs to be taught more carefully and properly contrasted with the opposite heuristic, tit-for-tat. Axelrod (1984) found the latter to be the most robust "ethic" in a computer tournament; it can be error-correcting, and some generalizations to human behavior support this algorithm. It may be that "turning the other cheek" should be thought of as a way to correct tit-for-tat, not to replace it.
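The contrast between tit-for-tat and unconditional "other-cheek" cooperation can be sketched with a toy iterated prisoner's dilemma. The payoff values below (5, 3, 1, 0) are the conventional choice from Axelrod's tournaments and are an assumption of this sketch, not something this paper specifies:

```python
# Toy iterated prisoner's dilemma, after Axelrod (1984).
# 'C' = cooperate, 'D' = defect; PAYOFFS maps a pair of moves to the
# pair of scores (player A, player B) for that round.
PAYOFFS = {('C', 'C'): (3, 3), ('C', 'D'): (0, 5),
           ('D', 'C'): (5, 0), ('D', 'D'): (1, 1)}

def tit_for_tat(own_history, their_history):
    """Cooperate first, then mirror the opponent's previous move."""
    return their_history[-1] if their_history else 'C'

def always_cooperate(own_history, their_history):
    """Unconditional cooperation: always 'turn the other cheek'."""
    return 'C'

def always_defect(own_history, their_history):
    return 'D'

def play(strategy_a, strategy_b, rounds=20):
    """Return the two strategies' total scores after `rounds` rounds."""
    hist_a, hist_b = [], []
    score_a = score_b = 0
    for _ in range(rounds):
        move_a = strategy_a(hist_a, hist_b)
        move_b = strategy_b(hist_b, hist_a)
        pay_a, pay_b = PAYOFFS[(move_a, move_b)]
        score_a += pay_a
        score_b += pay_b
        hist_a.append(move_a)
        hist_b.append(move_b)
    return score_a, score_b

# Tit-for-tat loses only the opening round to a pure defector ...
assert play(tit_for_tat, always_defect) == (19, 24)
# ... while unconditional cooperation is exploited in every round.
assert play(always_cooperate, always_defect) == (0, 100)
# Two tit-for-tat players settle into stable mutual cooperation.
assert play(tit_for_tat, tit_for_tat) == (60, 60)
```

The sketch illustrates the robustness Axelrod reported: tit-for-tat cooperates with cooperators yet cannot be exploited indefinitely, whereas always turning the other cheek strengthens the defector's position round after round.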
In this article I have stressed the need for a solution to the societal instability pointed out by Stanley Milgram, explained some of the barriers to a solution, and made a plausible sketch of what such a solution might look like. From here it is the responsibility of policy makers and granting agencies to pick up the task.
Arendt, Hannah (1970). Eichmann in Jerusalem.
Axelrod, R. (1984). The Evolution of Cooperation, Basic Books.
Axelrod, R. (1985). "Modeling the Evolution of Norms." Speech delivered at the American Political Science Association.
Davison, Allan J.; Higgins, Nancy C. (1993). "Observer Bias in Perceptions of Responsibility." American Psychologist, v48, 584.
Eastwood, S., Derish, P., Leash, E., Ordway, S. (1996) Ethical issues in Biomedical Research: Perceptions and Practices of Postdoctoral Research Fellows Responding to a Survey, Science and Engineering Ethics 2: 89-114.
R.L. (1997). “Surviving the Americans”,
Hobbes, T. (1960). Leviathan. Basil Blackwell.
Hofling, C.K.; Brotzman, E.; Dalrymple, S.; Graves, N.; Pierce, C.M. (1966). "An Experimental Study in Nurse-Physician Relationships." Journal of Nervous and Mental Disease 143: 171-180.
Isenman, M.K. (1990). Review of Crimes of Obedience, by Herbert C. Kelman and V. Lee Hamilton.
Janis, I.L. (1972). Victims of Groupthink: A Psychological Study of Foreign-Policy Decisions and Fiascoes. Houghton Mifflin.
Kelman, H.C., and Hamilton, V.L. (1989). Crimes of Obedience. Yale University Press.
Le Bon, Gustave (1982). The Crowd: A Study of the Popular Mind.
Madison, James, in Rossiter, Clinton (ed.) (1961). The Federalist Papers. Penguin Group.
Mantell, D.M. (1971). "The Potential for Violence in Germany."
Martin, J.; Lobb, B.; Chapman, G.; Spillane, R. (1976). "Obedience Under Conditions Demanding Self-Immolation." Human Relations: 345-356.
McKellar, Peter. (1951) "Responsibility" for the Nazi policy of extermination. Journal of Social Psychology, v34:153-163.
Meeus, W.H., and Raaijmakers, Q.A. (1986). "Administrative Obedience: Carrying Out Orders to Use Psychological-administrative Violence." European Journal of Social Psychology 16: 311-324.
Meeus, W.H., and Raaijmakers, Q.A. (1995). "Obedience in Modern Society: The Utrecht Studies." Journal of Social Issues 51: 155-175.
Milgram, S. (1974). Obedience to Authority: An Experimental View. Harper and Row.
Miller, A.G. (1986). The Obedience Experiments: A Case Study of Controversy in Social Science. Praeger.
National Transportation Safety Board (1994a). Controlled collision with terrain.
Sherif, M. (1961). Conformity-Deviation, Norms, and Group Relations. In Conformity and Deviation, ed. by I.A. Berg and B.M. Bass. Harper and Row.
Thoreau, Henry David (1980). "On the Duty of Civil Disobedience," in Walden and "Civil Disobedience." Penguin.
Vrba, R., with Bestic, A. (1964). I Cannot Forgive.
Ziemke, Earl F. (1975). Army Historical Series: The U.S. Army in the Occupation of Germany, 1944-1946.
Zimbardo, P.G. (1974). On "Obedience to Authority". American Psychologist 29:566-567.
The author, Eugen Tarnow, is a consultant with a degree in physics (Ph.D. M.I.T., 1989). His interests include groupware, training in customer relations, task efficiency, business vision statements, the performance of large and small work groups, and cockpit crews.
The author thanks Carol Caruthers, Rafi Kleiman, Arianna Montorsi, Mats Nordahl, and Barbara Smith for critical readings of the manuscript, and Steve Maaranen and Nicklas Nilsson for useful discussions.
Correspondence concerning this article should be addressed to Eugen Tarnow, 18-11 Radburn Road, Fair Lawn, NJ 07410, USA; firstname.lastname@example.org (E-mail).