A brief history of groupthink

Why two, three, or many heads aren't always better than one.

Illustration by Andy Martin

Thirty-five years ago, Yale psychologist Irving Janis published an essay in the Yale Alumni Magazine explaining how a group of intelligent people working together to solve a problem can sometimes arrive at the worst possible answer. He called his radical new theory "groupthink," and it changed the way we think about decision making. The idea remains so influential, says Yale political scientist Donald Green, that "the term 'groupthink' must come up once a day in common conversation." Janis's essay is still the alumni magazine's most requested reprint. His book on the subject went into a second edition that is still in print as a college textbook. (Janis died in 1990.)

Janis came up with the idea of groupthink during a Yale seminar on the psychology of small groups. His reading about the Bay of Pigs fiasco had led him to wonder how intelligent people like John F. Kennedy and his advisers could have been "taken in by such a stupid, patchwork plan as the one presented to them by the CIA representatives." During his seminar, he found himself suggesting that what had happened in the White House might be similar to what happened among ordinary citizens in the groups he studied for his research: they often developed a "pattern of concurrence-seeking . . . when a 'we' feeling of solidarity is running high."

To investigate further, Janis studied several policy fiascoes, including the Bay of Pigs, the failure to protect Pearl Harbor, and the escalation of the Vietnam War. In each case, the participants "adhered to group norms and pressures toward uniformity, even when their policy was working badly and had unintended consequences that disturbed the conscience of the members," he wrote. "Members consider loyalty to the group the highest form of morality."

Participants in those critical decisions, Janis found, had failed to consider the full range of alternatives or consult experts who could offer different perspectives. They rejected outside information and opinion unless it supported their preferred policy. And the harsher the preferred policy -- the more likely it was to involve a moral dilemma -- the more zealously members clung to their consensus: "Each member is likely to become more dependent than ever on the in-group for maintaining his self-image as a decent human being and will therefore be more strongly motivated to maintain group unity."

Janis suggested several steps for preventing groupthink, though he cautioned that they were hypothetical. His recommendations include careful impartiality on the part of the leader as to what decision the group should make; formation of competing teams to study the same problem; and giving "high priority to airing objections and doubts."

Today, groupthink is studied in military colleges, business schools, the management training industry, and academe. It also influences real-world national policy. In 2005, the presidential commission on U.S. intelligence about weapons of mass destruction (Yale president Richard C. Levin was a member of the commission) released a lengthy study; it included the finding that "'groupthink' on an international scale" was one of the reasons Western intelligence services all agreed that Iraq was a genuine WMD threat.

In response, the CIA, for one, changed its ways. In April 2006, John A. Kringen, head of the CIA's Directorate of Intelligence, published an op-ed in the Washington Post entitled "How We've Improved Intelligence: Minimizing the Risk of 'Groupthink.'" He outlined a dozen new procedures, including routinely consulting academics and outside experts and setting up "alternative analysis" teams.

Was groupthink responsible for the 2005 National Intelligence Estimate finding -- since reversed -- that Iran was working to produce a nuclear weapon? Kringen didn't respond directly when asked by the Yale Alumni Magazine. The directorate "has been diligent in integrating fresh thinking and new perspectives into our analysis," he wrote in an e-mail, and he listed several more "structured analytic techniques" the CIA now uses to test hypotheses.

What about the decision to invade Iraq? Histories and analyses both pro and con will surely be written in bulk as to whether groupthink in the Bush administration led the nation into war. That is already a favorite charge of left-leaning pundits, especially after the release of State of Denial: Bush at War, Part III by Bob Woodward ’65. Woodward wrote that in moments when the president "had someone from the field there in the chair beside him, he did not press, did not try to open the door himself and ask what the visitor had seen and thought."

Below, several academics whose work today bears on, and draws on, groupthink share their thoughts about Irving Janis's powerful insight.

Avoiding Enrons

Groupthink is as fresh and relevant a concept today as it was back in ’72. And that's a good and bad thing. Obviously, in terms of intellectual vitality and the robustness of insights into human nature, it's quite wonderful that it was captured and coined. The bad thing is that, while the term has become increasingly recognized, the underlying pathology lives on and there's no sign of it diminishing. From Enron to WorldCom, you see every single core issue that Janis raised. Illusions of invulnerability -- this wild, reckless optimism. Collective rationalization -- the belief that dissent is disloyalty or stupidity.

In every one of these disasters, the single common quality was that the relevant knowledge was within arm's reach. It wasn't that there was some expert who was inaccessible; they were employed by the organization and they had made efforts to be heard. The enterprise suppressed information. There's a company called Paribas that had an analyst, a fellow called Dan Scotto, who put out a research report on Enron that revealed some of the problems -- two months before Enron started its plunge. Paribas basically pressured him out, and six years later he is unemployed. And he should have been a hero. Think how well Paribas investors would have done if they had gotten out early.

Governance issues are so much more profound than the simplistic legal and accounting rule-making that we often see from governance enthusiasts. Take just one example -- Janis's idea of having a devil's advocate role shared across the board. Too often these days, in the well-intentioned spirit of good governance, somebody in a group is appointed to be the governance czar. Then that person gets typecast. Other people can discount any criticism from them -- whenever they speak, they're just reading out the script that gives them a role in the room. To have somebody playing the good-governance cop while the rest of the group goes on its merry way is very dangerous and unproductive.

Jeffrey A. Sonnenfeld
Senior Associate Dean for Executive Programs and
Lester Crown Professor in the Practice of Management

Yale School of Management

The secret power of leaders

Within most academic fields, the ideas that are perpetuated are ideas that can be turned into PhD dissertations by graduate students. Groupthink is complex enough that it was never hot in that sense. Nobody today says, My area is groupthink. But what emerged subsequent to groupthink was an area called "judgment and decision making," which is one of the most important areas in all of psychology. In fact, Danny Kahneman won the 2002 Nobel Prize based on his research into how rational people make irrational decisions.

Groupthink also preceded the development of an area called political psychology. Janis's work, plus the work of many others, has led many psychologists to say that what's really important is applying psychology to the political domain -- understanding ideology, understanding how political leaders work and why constituents follow as they do.

Janis showed that members of a group who are smart and rational and well trained may make irrational decisions because they look only for evidence that will confirm their stated objectives, their stated goals. And the key is whether the leader makes his position clear in advance of the group's deliberations. Not sufficiently emphasized in reviews of groupthink is that it revealed the secret power of leaders to influence group decision making by simply having their values or perspective become known. Groupthink becomes worse when the leader creates the concept that everybody has to be a team player. History will record that nobody has done this more than George W. Bush.

Philip Zimbardo '59PhD
Professor Emeritus of Psychology, Stanford

When do groups know best?

Since the ’60s, most of the scholars in the field of organizational behavior have written about the merits of group over individual decision making. Individual decision making represented autocracy, whereas decisions by groups meant partnership, collaboration, and teamwork. Into that arena stepped my late colleague, Irving Janis, who pointed out that sometimes groups make very, very bad decisions. His concept of groupthink flew in the face of the existing orthodoxy, at least in the management literature.

Part of the difference lay in the specific outcomes of decision making which were stressed. Irv looked at the quality of the decisions. He showed that groups could make decisions that ignored critical information about the consequences of the chosen course of action. On the other hand, social scientists such as Rensis Likert and Kurt Lewin focused on the motivational benefits of participation. They looked at the commitment generated by participation and the energy elicited in carrying out the decisions made. 

Now the basic ideas of groupthink live on in the literature on management and leadership as part of contingency theories of leadership style. A contingency theory is one that attempts to specify the kinds of situations in which different forms and degrees of participation in decision making are likely to prove most effective. To be useful to managers and leaders, such theories must address multiple outcomes of decision making, including quality and implementation.  They must also consider the nature of the situation confronting the leader, such as specific properties of the group, the leader, the problem, and its organizational context.

Victor H. Vroom
BearingPoint Professor of Management and Professor of Psychology, Yale

Getting off the bus to Abilene

Army culture is susceptible to groupthink. We have a strong emphasis on teamwork and cohesion, and we are much more hierarchical than most civilian organizations -- there is a leader with a lot of power at the top of every group. So we work pretty hard to ensure that groupthink isn't an issue in military decision making.

The army even has a term for it -- "the bus to Abilene." Any army officer can tell you what that means. It came originally from an article by a civilian management researcher. It's now a management training video, and every officer has watched it at least three times by the time they get to the War College. It's about a family sitting on a porch in Texas on a hot summer day, and somebody says, "I'm bored. Why don't we go to Abilene?" When they get to Abilene, somebody says, "You know, I didn't really want to go." And the next person says, "I didn't want to go -- I thought you wanted to go," and so on. Whenever you're in an army group and somebody says, "I think we're all getting on the bus to Abilene here," that is a red flag. You can stop a conversation with it. It is a very powerful artifact of our culture.

We also have something called the military decision making process. Usually, you've got to come up with at least three or four alternate courses of action to prevent you from locking onto one answer. And something that has become much more prevalent in the last ten years is red-teaming. One group -- typically experts from outside the organization -- plays the role of antagonist, and they try to pick apart our logic.

As for the decision to invade Iraq, that history hasn't been written yet. Just because you make a bad decision doesn't mean it was because of groupthink. It could be because you didn't have all the information, or you had bad intelligence, or you made bad assumptions. I think we have to wait a few years before we can really say. 

Colonel (Ret.) Stephen J. Gerras
Professor of Behavioral Sciences,
U.S. Army War College
