Michael Roberto's The Art of Critical Decision Making Day 2

Decision Making Series

by Raza Usman, 20 April 2013


Prof. Michael A. Roberto's The Art of Critical Decision Making, Day 2
Presented by Mr. Raza Usman, Meridian Consulting (Knowledge Management)

Keeping Conflict Constructive
How can you diagnose whether a debate is becoming unproductive and dysfunctional? Ask the following questions.
Have people stopped asking questions intended to gain a better understanding of others’ views?
Has the group stopped searching for new information?
Have individuals stopped revising their proposals based on the feedback and critiques offered by others?
Has no one asked for help with the interpretation of ambiguous data?
Have people begun to repeat the same arguments, only more stridently and loudly over time?
Has no one admitted concerns about their own proposals recently?
Have less outspoken individuals begun to withdraw from the discussions?

Two Forms of Conflict
1. Cognitive conflict is task oriented. It’s debate about issues and ideas.
2. Affective conflict is emotional and personal in nature. It’s about personality clashes, anger, and personal friction.
The key is to stimulate the cognitive conflict while minimizing the affective conflict.
Note that effective leaders channel the emotions; they do not try to eliminate them. It’s nearly impossible to eliminate emotions from a lively debate. In fact, emotions, at times, can be helpful.

What can leaders do during the decision-making process to help stimulate constructive conflict?
Redirect people’s attention and recast situations in a different light.
Reframing requires asking curious, nonthreatening questions.
The language people use in those questions matters a great deal.
Present ideas and data in novel ways so as to enhance understanding and spark new branches of discussion.
Revisit basic facts and assumptions when the group appears to reach an impasse.
One executive I interviewed discussed how he always brought people back to certain core facts and assumptions when the debate seemed to simply be at an impasse among a set of proposals.
The idea is to find some points of common ground in those heated moments.

What can leaders do after the decision-making process to help stimulate constructive conflict?
Evaluate the process and develop lessons learned for application in the future. Conducting good after-action reviews can help improve a group’s ability to manage conflict constructively.
Attend to damaged relationships and hurt feelings that may not have been apparent to all during the process.
Ensure that people remember, and even celebrate, the effective ways in which they handled difficult disputes.
Share that best practice with the entire organization.

What can leaders do before the decision-making process to help stimulate constructive conflict?
Establish ground rules for how people should interact during the deliberations.
Clarify the role that each individual will play in the discussions.
Build mutual respect, particularly with regard to differences in the cognitive styles of each team member.
It’s important for people to understand one another’s cognitive styles.
Teams can and should spend some time discussing each member’s cognitive style.

Let’s take a look at IDEO, one of the world’s leading product-design firms.
Several years ago, ABC News produced an interesting program showing IDEO trying to design a new shopping cart in just 5 days.
Many case studies have also been written about the company.
In addition, one of the founders has written a book about how IDEO works.

Creativity and Brainstorming
Here are some key steps in the IDEO creative process.
Everyone at the firm becomes an ethnographer. They go out and directly observe how people are using particular products in natural settings. They are not just asking people what they do or what they like.
As it turns out, people do not always do what they say they do. Thus, observation is critical to truly understanding people’s needs, habits, and so on.
Employees share what they have learned from their data gathering in a wide-open session.
They vote on ideas using Post-it notes at times, to help narrow down the long list of concepts that come from a brainstorming session.
They practice deferred judgment during brainstorming sessions, holding back on critiquing one another’s ideas until they have lots of views on the table. The leaders intervene periodically to shape the process and to keep the team moving.
They engage in rapid prototyping—an essential part of the creative process.
They even build specialized prototypes that focus on one particular product dimension to help drive innovative ideas.
They have subgroups work in parallel at times to create divergent thinking and to speed up the design process.
They take their prototypes out to the field to gather lots of feedback.

IDEO’s culture and organizational context are conducive to creativity.
They have a work environment that is fun and encourages free-flowing ideas.
They do not have much formal hierarchy.
The workplace has few symbols of status.
The ground rules for how to have a productive brainstorming session are written on the walls as reminders.
There are materials everywhere so that people can think visually and so that crude prototypes can be built.
They keep old failures around, to remind people that you have to take risks to be creative and that you have to accept some rate of failure.

The leaders play an interesting role at IDEO.
They do not tell people what to design.
They guide and shape the process, and they intervene to keep it on track.
Team leaders are chosen carefully, with an emphasis on communication and interpersonal skills rather than seniority or technical capabilities.
The leaders openly encourage people to disagree with them.

We can identify 3 important steps in the creative process.
You must use experts and expert knowledge in an appropriate manner. There are many examples of how experts can be wrong, particularly when the external environment suddenly changes or when the environment is turbulent for a long period of time.
You have to keep surfacing and testing underlying assumptions and orthodoxies. You have to wipe away old assumptions and beliefs, and unlearn old ways of working, before you can creatively generate new ideas.
You have to frame problems in a way that does not constrict the debate or the range of solutions that will be considered. You have to use multiple frames on the same issue. For the shopping cart, IDEO identified 4 different areas of focus, and they explored them all.

In conclusion, creativity requires a willingness to focus intently on avoiding premature convergence on a single idea.
You have to defer judgment and generate many diverse ideas. You also have to be willing to experiment and to fail.
Willingness to fail—and encouraging people to make useful and intelligent mistakes—is critical.
Indecision is not simply a trait of particular leaders; it’s often a trait of organizational cultures.
Culture is defined as the taken-for-granted assumptions of how things work in an organization, of how members approach and think about problems.
With regard to indecision, it often arises from certain dysfunctional patterns of behavior that become ingrained over time within certain cultures.
There are 3 types of problematic cultures, which we call the culture of no, the culture of yes, and the culture of maybe.

The Curious Inability to Decide
Many leaders and companies have a persistent problem with indecision: a “curious inability to decide.”

The “culture of no” is a phrase coined by Lou Gerstner when he took over as CEO of IBM.
Gerstner faced a tremendous challenge. The company lost more than $8 billion in 1993. Mainframe revenues, the company’s mainstay, had declined precipitously.
One cultural problem was that the powerful heads of various units at IBM could effectively veto major initiatives, even as lone dissenters.
IBM even had a name for this: They called it “issuing a nonconcur.”
Gerstner discovered that IBM managers had actually designed a formal nonconcur system into the company’s strategic planning process.
In a rather incredible memo that Gerstner uncovered, an executive went so far as to ask each business unit to appoint a “nonconcur coordinator” who would be responsible for blocking projects and proposals that would conflict with the division’s goals and interests.

Sometimes, a culture of no arises in an organization because meetings have become places where people strive to deliver “gotchas.”
Some organizations reward those who are great at dissecting others’ ideas, even if they offer no alternatives themselves. You get rewarded for good “gotcha” moments, as if you were still in an MBA classroom.
It’s important to note that there is a key difference between the effective use of devil’s advocates versus the culture of no.
Dissenters in a culture like IBM’s back before Gerstner were simply trying to tear down or block proposals and ideas. They were trying to shut down a particular avenue for the firm.
Effective devil’s advocates are putting forth critiques in an attempt to ultimately strengthen a proposal, or to open up a discussion by helping to generate lots of new options.

A Culture of Yes
In senior management meetings people tended to stay silent if they disagreed with a proposal on the table. They seemed to indicate that they endorsed the proposal, at least by their silence.
Then, however, they would later express their disagreement, lobby to overturn the choice, or try to undermine the implementation of the plans with which they disagreed.
You end up with a false consensus that emerges at meetings. You think everyone is behind a plan, when they are actually not.
In these kinds of situations, you have to remember that silence does not mean assent. When people are not contributing to a discussion, they may disagree strongly but not wish to voice their dissent in the meeting.

A Culture of Maybe
A culture of maybe entails management teams that strongly desire to gather as much information as possible, so much so that they get caught in “analysis paralysis.”
Analysis paralysis is when you constantly delay decision and action because you think just a bit more information and analysis might clarify your choice.
The culture of maybe afflicts people and organizations who have a hard time dealing with ambiguity, or who engage in conflict avoidance when someone disagrees with a majority position of the group.

When we think about information search, we have to remember that there is always some theoretically optimal amount of data that we should gather.
The cost of gathering information tends to rise at an increasing rate. That is, when you already have 90% of available data on an issue, the incremental cost of gathering additional information is much higher than when you only have 10% of the available data.
Meanwhile, the benefit of gathering additional information tends to exhibit diminishing returns. In other words, when you already have 90% of the available data, the incremental benefit of gathering additional bits of information tends to be much smaller than when you have only 10% of the available data. Thus, what you ideally search for is the point where there is the biggest gap between the total benefits of gathering data and the total costs of gathering data. Put another way, you want to gather additional information as long as the incremental benefit exceeds the incremental cost.
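As a rough illustration of that marginal logic, here is a minimal sketch with made-up benefit and cost curves; the specific functions and numbers are assumptions chosen only to show diminishing returns and rising costs, not figures from the lecture.

```python
# Illustrative only: hypothetical benefit and cost curves for information search.
# benefit() shows diminishing returns; cost() rises at an increasing rate.

def benefit(x):              # x = fraction of the available data gathered (0..1)
    return 100 * x ** 0.5    # each additional bit of data adds less value

def cost(x):
    return 60 * x ** 2       # the last bits of data are the most expensive to get

steps = [i / 100 for i in range(101)]
best = max(steps, key=lambda x: benefit(x) - cost(x))   # widest gap between benefit and cost
print(f"Stop gathering at roughly {best:.0%} of the available data")

# Equivalently: keep searching only while the incremental benefit of the next
# step of data gathering still exceeds its incremental cost.
```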
Of course, as a leader you cannot calculate these costs and benefits precisely. However, you can make judgments as to when you are no longer garnering additional value by searching for additional information.

When leaders face a chronic problem of indecision, they often look for ways to accelerate decision making in the organization. They seek shortcuts.
They might grasp for analogies more frequently and leave themselves vulnerable to flawed analogical reasoning.
They might try to adopt some rules of thumb that have become conventional wisdom in their industry or organization.
They might imitate what their competitors are doing, even though simply copying your rivals is unlikely to lead to competitive advantage.

Where do indecisive cultures originate?
They often originate from past success. Some variant of what appears to be dysfunctional behavior actually worked for the firm in the past.
This appears to have been the case at Digital Equipment Corporation under Ken Olsen.
Certain behaviors proved effective during the remarkable rise of that firm, but then the firm became rigid.
When the environment shifted, the firm could not adapt.

What is procedural justice, or fair process?
Legal scholars first pioneered this concept.
They showed that people did not simply care about the verdict in a legal proceeding. They also cared about the process; they wanted it to be fair as well.

Procedural Justice
Leaders not only have to make good decisions; they also have to be able to implement them effectively.
The process we use to make decisions often has a great impact on whether we will be able to implement them successfully.
1. Consensus is the key to smooth implementation.
2. Consensus does not mean unanimity.
3. Consensus is the combination of commitment and shared understanding.
4. You must have both. Commitment without shared understanding just equals blind devotion. Understanding without commitment means you won’t get the dedication and cooperation you need.

The problem is that the more vigorous the debate you induce, the harder it may be to arrive at consensus.
1. Thus, you have to be sure to keep conflict constructive.
2. However, you must do more than that. You must ensure that the decision process is both fair and legitimate.

Fair processes provide a “cushion of support” when you make decisions that are not necessarily popular with all those involved.
1. People’s satisfaction with legal proceedings did not vary much between unfair and fair processes when they won the verdict.
2. However, those who lost the verdict were much happier in the fair process condition. They weren’t as happy as those who won the verdict, but the gap was much smaller than in the unfair process.
3. Thus, the argument is that leading a fair process will help you get people on board with your decision even if it’s not popular.

The Three Principles of Fair Process
Fair process matters in management as well, not simply in legal matters. Fair process helps to build consensus, which in turn fosters effective implementation.
1. People do not want to see a “charade of consultation”.
2. In a charade of consultation, people develop alternatives, make a decision, consult with their team, steer the discussion toward their preferred choice, and then announce the decision that they had made at the outset.

What are the components of fair process?
You must give people ample opportunity to express their views—and to discuss how and why they disagree with other group members.
People must feel that the decision-making process has been transparent (i.e., the deliberations have been relatively free of behind-the-scenes maneuvering).
They must believe that the leader listened carefully to them and considered their views thoughtfully and seriously before making a decision.
They must perceive that they had a genuine opportunity to influence the leader’s final decision.
They have to have a clear understanding of the rationale for the final decision.

Put another way, fair process means a leader demonstrating genuine consideration of others’ views. To do that, leaders should do the following.
Provide a process road map at the outset of the decision process.
Reinforce an open mind-set.
Engage in active listening.
Explain their decision rationale.
Explain how others’ inputs were employed.
Express appreciation for everyone’s input.

What is procedural legitimacy?
Procedural legitimacy refers to the notion that a decision process is perceived to be consistent with certain socially acceptable and desirable norms of behavior.
What types of actions in the decision process convey procedural legitimacy?
1. You can gather extensive amounts of data.
2. You can present many different alternatives.
3. You can conduct a great deal of formal analysis.
4. You can bring in outside experts and consultants.

The challenge is that many efforts to promote legitimacy may actually diminish legitimacy if people perceive the actions as purely symbolic (i.e., if the decision is preordained).
1. Sometimes people just gather lots of information and present many options because they want to make it seem as though they were very thorough and comprehensive, so as to build legitimacy.
2. However, their minds may already be made up.
How can you preserve procedural legitimacy?
1. Share information equally with all participants.
2. Avoid token alternatives.
3. Separate advocacy from analysis.
It’s important that leaders test for alignment between their perceptions of the process and the participants’ perceptions. At times, leaders think the process is fair and legitimate, but their team members have very different perceptions. This can be quite problematic.

How do we reconcile a desire for conflict in the decision process with a need for procedural justice and legitimacy?
A. The two are not at odds with one another.
B. You will enhance perceptions of procedural justice and legitimacy if you give your team members an opportunity to not only air their views, but also debate them with others in an open and transparent manner.

The traditional prescriptive model of decision making suggests that we should go through a linear progression of divergence and then convergence.
The model suggests that you should diverge in the early stages of a decision process, gathering as many diverse perspectives and views as possible.
Then you should try to converge, narrowing down the options and coming to a decision.
However, research suggests that the most effective way to achieve closure is not to pursue such a linear process.

Achieving Closure through Small Wins
1. Research suggests that effective leaders, such as Eisenhower, pursue an iterative process of divergence and convergence.
2. They stimulate debate, but they are always on the lookout for areas of common ground.
3. Those moments of agreement help the group avoid extreme polarization and dysfunctional conflict, and they help build momentum toward closure.
4. The idea is that leaders should pursue small wins throughout the decision process, rather than waiting to converge toward the end of the process.
5. Andrew Venton and his management team demonstrate an effective process of small wins, ultimately leading to efficient closure.

Why are small wins important?
Small wins bring new allies together and give people proof that they can reconcile differences constructively.
One agreement serves as a catalyst for more productive debates and further agreements down the line.
Two obstacles are overcome by a small-wins approach: one is cognitive, and the other is socioemotional in nature.
The cognitive obstacle in many complex decision-making situations is that individuals experience information overload. Ambiguity and complexity become overwhelming.
The socioemotional obstacle is that many decision makers experience frustration, stress, and personal friction during complex situations.
Breaking large, complex problems into smaller parts and then gaining small wins on those parts can be an effective way of dealing with these cognitive and socioemotional obstacles.

Two Types of Small Wins
There are process-oriented small wins. These involve goals and objectives, assumptions, and decision criteria. There are outcome-oriented small wins. These involve the way you take alternatives off the table gradually, seek option-oriented agreements at times, and adopt contingency plans.

At some point, you must still find a mechanism for shifting into decision mode if you want to achieve closure in a timely fashion. You have to be able to get your team to see that the time for debate is over.
There are 3 keys to shifting into decision mode effectively.
1. First, leaders can develop a clear set of expectations regarding how the final decision will be made.
2. Second, they can develop a language system that helps them communicate how their role in a decision process will change at a critical juncture in order to achieve timely closure.
3. Finally, leaders can build a relationship with a confidante who can not only offer sound advice but also bolster the leader’s confidence when he or she becomes tepid and overly risk averse.

The final important point is for leaders to work hard to sustain closure after it is achieved.
1. Sometimes individuals will try to unravel decisions that have already been made.
2. Leaders need to hold people accountable for decisions in which they have taken part and not allow them to undermine a group’s choice during the implementation process.
3. Leaders must work hard to build and sustain the trust of their colleagues, because in the end, that is the most important attribute that they can use to help achieve and sustain closure.

We shift to a focus on the organizational unit of analysis.
In our first module, we focused on individual decision making as our unit of analysis. We largely discussed cognitive issues—things going on in the minds of individuals.
In our second module, we focused on group dynamics.
In this third module, we focus on the organizational unit of analysis.

Normal Accident Theory
1. We begin by looking at 2 different perspectives on organizational decision-making failures, one structural and the other behavioral. The structural perspective is called normal accident theory.
2. Then we examine several theories that attempt to bring together organizational analysis with cognitive and group-dynamics perspectives. These “multiple lenses” approaches can be very powerful.
3. Using a multiple-perspectives approach, we look at the particularly challenging problem of how organizations make decisions in the face of ambiguous threats.
4. Finally, we look at how organizations can make better decisions in high-risk/high-ambiguity environments.
5. We will try to wrap up the main concepts and ideas from the course.

Charles Perrow developed normal accident theory to explain how decision failures happen in complex, high-risk organizations.
Perrow examined the structural characteristics of organizational systems that involve high-risk technologies such as nuclear power. Most famously, he studied the Three Mile Island nuclear power plant accident that occurred several decades ago.
1. Perrow’s conceptual framework classifies all high-risk systems along 2 dimensions: interactive complexity and coupling.
2. Interactions within a system may be simple and linear, or complex and nonlinear.
3. Coupling may be either loose or tight.
4. Perrow argues that systems with high levels of interactive complexity and tight coupling are especially vulnerable to catastrophic failures.
5. In fact, he argues that accidents are inevitable in these situations; certain failures constitute “normal accidents.”
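As a rough illustration, here is a minimal sketch of Perrow’s two-dimensional classification. The lecture only singles out the complex-and-tight combination as especially vulnerable, so treating the mixed quadrants as “moderate” is an assumption made purely for illustration.

```python
# Illustrative only: a toy encoding of Perrow's 2 x 2 classification.
# The lecture treats a nuclear power plant as complex and tightly coupled;
# the "moderate vulnerability" label for mixed cases is an assumption.

def normal_accident_risk(interactions: str, coupling: str) -> str:
    """interactions: 'linear' or 'complex'; coupling: 'loose' or 'tight'."""
    if interactions == "complex" and coupling == "tight":
        return "highest vulnerability: 'normal accidents' become likely"
    if interactions == "complex" or coupling == "tight":
        return "moderate vulnerability"
    return "lower vulnerability: errors stay visible and contained"

print(normal_accident_risk("complex", "tight"))   # e.g., a nuclear power plant
print(normal_accident_risk("linear", "loose"))    # a hypothetical low-risk system
```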
Interactive complexity refers to the extent to which different elements of a system interact in ways that are unexpected and difficult to perceive or comprehend.
1. Often, these interactions among elements of the system are not entirely visible to the people working in the organization.
2. Simple, linear interactions characterize systems such as a basic manufacturing assembly line. In that instance, the failure of a particular piece of equipment typically has a direct, visible impact on the next station along the line.
3. The operations of a nuclear power plant do not follow a simple linear process; instead, they are characterized by complex and nonlinear interactions among various subsystems.
4. The failure of one component can have multiple unanticipated effects on various subsystems, making it difficult for an operator to diagnose the symptoms of a developing catastrophe.

Tight coupling exists if different elements of an organizational system are highly interdependent and closely linked to one another, such that a change in one area quickly triggers changes in other aspects of the system.
1. Tightly coupled systems have 4 attributes: time-dependent processes, a fairly rigid sequence of activities, one dominant path to achieving the goal, and very little slack.
2. When such rigidity exists within an organization, with few buffers among the various parts, small problems can cascade quickly throughout the system, leading to catastrophe.
3. Loose coupling exists when subsystems are not as tightly integrated, such that small errors in one area can be isolated or absorbed without affecting other subsystems.

Perrow points out that engineers often build redundancies into complex systems to try to protect against catastrophic failures.
1. Unfortunately, such redundancies may actually add to the complexity and rigidity of the system, making it more vulnerable to failure in some circumstances.
2. Thus, in the end, Perrow comes to the difficult conclusion that some accidents are simply inevitable in systems that exhibit interactive complexity and tight coupling.

There are some limitations and critiques of normal accident theory as a way to explain organizational decision-making failures.
First, many scholars and practitioners find the theory frustrating, in that it does not move us toward an understanding of how to prevent catastrophic accidents. It appears to have little prescriptive value.
1. Some have argued that we should be exploring ways to reduce interactive complexity and tight coupling in organizations.
2. That is possible to some extent, but not completely.
3. Toyota is clearly an example of a company that has tried to do this, as are many hospitals that are trying to reduce the likelihood of tragic medical accidents.

A second major criticism refers to the problems inherent in the classification scheme itself.
1. The 2 system dimensions articulated by Perrow are useful in helping us understand the vulnerability of organizations.
2. However, one cannot easily classify organizations in his 2 × 2 matrix.
3. For instance, where does commercial aviation fit?

Despite the limitations, Perrow’s theory, along with subsequent work by others, has helped us understand how complex organizational decision-making failures happen.
We have come to learn that most of these failures do not trace back to one single cause.
They involve a chain of decision failures, a series of small errors that often build upon one another in a cascading effect.
In many situations, one seemingly small decision failure can snowball, leading to a whole series of other errors that ultimately leads to a catastrophic failure.

Psychologist James Reason has done some interesting work in this area. He has crafted the famous “Swiss cheese analogy” for thinking about how organizations can limit the risk of these catastrophic decision failures.
1. He describes an organization’s layers of defense or protection against accidents as slices of cheese, with the holes in the block of cheese representing the weaknesses in those defenses.
2. In most instances, the holes in a block of Swiss cheese do not line up perfectly, such that one could look through a hole on one side and see through to the other side.
3. In other words, a small error may occur, but one of the layers of defense catches it before it cascades throughout the system.
4. However, in some cases, the holes become completely aligned, such that an error can traverse the block (i.e., cascade quickly through the organizational system).
5. Reason argues that we should try to find ways to reduce the holes (i.e., find the weaknesses in our organizational systems) as well as add layers (build more mechanisms for catching small errors). We shall discuss this more in upcoming lectures.
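To make the intuition concrete, here is a minimal back-of-the-envelope sketch. It assumes the layers fail independently and uses made-up probabilities; neither assumption comes from the lecture, and in practice weaknesses in different layers can be correlated.

```python
# Illustrative only: if each layer of defense independently misses an error with
# some probability, the chance that an error slips through every layer is the
# product of the per-layer miss probabilities (the "holes lining up").

from math import prod

miss_prob = [0.10, 0.20, 0.05]           # hypothetical per-layer "hole" sizes
print(f"Chance an error traverses all layers: {prod(miss_prob):.4f}")   # 0.0010

# Reason's two levers, in this toy model: shrink the holes (lower a miss
# probability) or add another layer of defense.
smaller_holes = [0.05, 0.20, 0.05]
extra_layer = miss_prob + [0.10]
print(f"With smaller holes: {prod(smaller_holes):.4f}")                 # 0.0005
print(f"With an added layer: {prod(extra_layer):.5f}")                  # 0.00010
```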
Normalizing Deviance
In her groundbreaking book on the Challenger space shuttle disaster, Diane Vaughan explained her theory of the normalization of deviance. She argued that engineers and managers moved down a dangerous slippery slope in a gradual evolutionary process that took place over many years.
At first, NASA officials did not expect or predict O-ring erosion on shuttle flights. It was not in their original designs.
When a small amount of erosion was discovered on an early, successful shuttle mission, engineers considered it an anomaly.
Then it happened again. Gradually, the unexpected became the expected. O-ring erosion began to occur regularly.
Engineers rationalized that sufficient redundancy existed to ensure no safety-of-flight risk. Small deviations became taken for granted. Over time, however, deviations from the original specification grew. Engineers and managers expanded their view of what constituted acceptable risk.
Vaughan recounted to me that as the years unfolded, the “unexpected became the expected became the accepted.”
The launch decision, therefore, could only be understood in the context of this long pattern of decisions, during which a gradual normalization of deviance took place. In short, history matters a great deal. Decisions and catastrophic failures cannot be understood without examining their historical context.
A key point Vaughan argued is that the culture shaped this evolutionary process.
1. NASA operated under tremendous schedule pressure throughout the years.
2. It had a culture that emphasized a distinction between engineers and managers.
3. NASA had cast space flight as routine, and that mind-set permeated the culture.
4. NASA always operated under the influence of their perceptions of the political culture in which they existed.
5. Thus, the culture shaped, and even encouraged, the normalization of deviance over time.

Allison’s Model: Three Lenses
In this lecture, we examine a groundbreaking conceptual model that shows how we can look at organizational decision making through 3 lenses. Graham Allison was a political scientist who taught at the Kennedy School of Government at Harvard. Allison sought to question whether we could simply look at the leaders of a large, complex organization to understand decisions that were made. He wanted to understand how decisions might not simply be the product of a leader’s thinking, and how they might instead be the outcome of group dynamics, organizational processes, and organizational politics. He wrote a classic book in which he developed 3 lenses for looking at any complex organizational decision, and he illustrated the model by looking at the Cuban missile crisis through each lens. Each lens provided an alternative way of explaining why certain decisions were made during that crisis.

The First Lens
Allison’s first lens was what he called the rational actor model.
Allison wrote that many people looked at organizational decision making as the product of the thinking of a rational leader at the top. By “rational,” he referred to how economists and game theorists modeled decision making.
In game theory, the notion is that individuals make decisions based on an assessment of what other players (perhaps rivals) will do.
1. Game theorists point out that self-interested behavior on the part of each individual party sometimes leads to a suboptimal outcome. Collaboration can yield a better outcome.
2. The classic example of this problem is called the prisoner’s dilemma.
3. In the prisoner’s dilemma, 2 criminals each do what is in their self-interest, given their expectation of how the other will behave.
4. Unfortunately, they would be better off if they could collude.
5. Game theorists have then looked at how collaboration might evolve without collusive behavior. They focus, in particular, on how collaboration might evolve over time if a “game” is repeated many times.
6. One strategy for enticing other parties to collaborate is called the tit-for-tat strategy.
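To make this concrete, here is a minimal sketch of a repeated prisoner’s dilemma with a tit-for-tat player. The payoff numbers are conventional textbook values and the always-defect opponent is an illustration; neither comes from the lecture.

```python
# Illustrative only: repeated prisoner's dilemma with conventional payoffs.
# 'C' = cooperate, 'D' = defect. Each tuple is (player A's payoff, player B's payoff).
PAYOFFS = {
    ('C', 'C'): (3, 3),   # mutual cooperation
    ('C', 'D'): (0, 5),   # A is exploited by B
    ('D', 'C'): (5, 0),   # A exploits B
    ('D', 'D'): (1, 1),   # mutual defection: worse for both than cooperating
}

def tit_for_tat(opponent_history):
    """Cooperate first, then mirror the opponent's previous move."""
    return 'C' if not opponent_history else opponent_history[-1]

def always_defect(opponent_history):
    return 'D'

def play(strategy_a, strategy_b, rounds=10):
    history_a, history_b = [], []
    score_a = score_b = 0
    for _ in range(rounds):
        move_a = strategy_a(history_b)   # each player sees the other's past moves
        move_b = strategy_b(history_a)
        pay_a, pay_b = PAYOFFS[(move_a, move_b)]
        score_a, score_b = score_a + pay_a, score_b + pay_b
        history_a.append(move_a)
        history_b.append(move_b)
    return score_a, score_b

print(play(tit_for_tat, tit_for_tat))     # (30, 30): cooperation is sustained
print(play(tit_for_tat, always_defect))   # (9, 14): one exploitation, then both do poorly
```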
Allison and others have criticized these game theory models for several reasons.
1. Most importantly, they assume that the other party is rational and is clearly pursuing its self-interest.
2. Critics wonder if decision makers canvass the full range of options or if they satisfice, as James March and Herbert Simon suggest.
3. Game theory also assumes that each party can accurately predict and take into account their payoffs, as well as their rivals’, under alternative scenarios. Some have questioned this presumption.
4. Allison wondered whether it was reasonable to think that Khrushchev was acting rationally during the Cuban missile crisis.

Allison’s other 2 lenses are models of organizational processes and coalition politics.
In terms of organizational processes, Allison points out that organizations develop routines, decision rules, and procedures.
1. He builds heavily on the work of Herbert Simon, Richard Cyert, and James March in crafting this second model.
2. The argument is that decisions do “bubble up from below” at times, based on the decision rules and routines of various subunits of the organization.
3. Clearly, during the Cuban missile crisis, we see how the procedures of various units of the government shaped and constrained the ultimate decisions that were made.

In terms of organizational politics, Allison focuses on the notion that all organizations are governed not simply by a single leader, but by a “dominant coalition.”
1. That senior management team, or dominant coalition, engages in bargaining and negotiating during decision-making processes.
2. Each member of the coalition has his or her own interests and objectives, beyond simply the shared organizational goals.
3. The balance of shared and personal goals creates room for negotiation.
4. Tension arises between what negotiation scholars call value-claiming and value-creating behavior.
5. Decisions result from the outcome of these complex negotiations.
6. These negotiations involve a hefty amount of political behavior in many cases.

In sum, Allison argues that we can’t simply look at the leader to understand decision making in complex organizations. Decisions are not simply the product of the leader’s cognitive process.

Over the years, scholars have tried to understand complex organizational decisions by examining them through multiple lenses.
However, rather than simply looking at the lenses as alternative ways to explain a set of decisions, scholars have tried to integrate multiple perspectives.
Scholars have examined situations from individual, group, and organizational units of analysis and then tried to integrate these perspectives.

Practical Drift
1. One example of this work is Scott Snook’s research on a friendly-fire accident that took place in 1994 in northern Iraq.
2. Snook examined the incident using theories of individual, group, and organizational decision making.
3. Then he integrated these perspectives, creating a cross-levels analysis. From this, he developed his theory of practical drift for how some faulty organizational decisions are made. We’ll look at that theory in this lecture.

Snook argued that these different lenses were not sufficient to explain what happened. He crafted his theory of practical drift, which he described as a “cross-levels” phenomenon. Here is how practical drift works.
All organizations establish rules and procedures.
Units within the organization engage in practical action that is locally efficient.
These locally efficient procedures become accepted practice and perhaps even taken for granted by many people.
Gradually, actual practice drifts from official procedure.
The drift is not a problem most of the time, but in certain unstable situations, it gets us into big trouble.

In conclusion, we see many organizational decisions that occur because of practical drift.
Organizations have many informal processes that drift away from official procedure.
Some of that informal action is thoughtful and entrepreneurial.
But sometimes communication breakdowns (and other barriers) cause organization members to not understand how their actions may affect others in other units.
Unforeseen interactions occur at times, and this can be problematic.

Ambiguous Threats and the Recovery Window
Multiple levels of analysis can help us understand a particularly thorny decision-making problem for organizations, namely, situations in which the organization is faced with ambiguous threats. All organizations face ambiguous threats at times, where a problem exists but its consequences are highly uncertain.
Organizations usually have some finite opportunity to recover from those initial threats.
However, many organizations systematically discount and underreact to ambiguous threats.
This raises 2 questions:
1. Why do organizations discount ambiguous threats when making decisions?
2. How can organizations improve their decision making in these situations?

It’s important to introduce the concept of a recovery window at this point.
1. A recovery window is the time period between the emergence of an ambiguous threat and the actual occurrence of a catastrophic failure, during which some preventive action can be taken.
2. Recovery windows can last minutes, weeks, or months.
3. Many organizations fail to take advantage of these recovery windows.
4. Simply raising awareness of this concept can be important in helping organizations become more effective at making decisions.

How can organizations cope with ambiguous threats more effectively?
We learn that coping with ambiguous threats requires organizational leaders to do the following.
1. Amplify threats and make it clear to everyone that a recovery window is now open.
2. Engage in learning by doing, particularly through simple, low-cost, rapid experimentation.
3. Lead a top-down, directed effort to establish focused, cross-disciplinary problem-solving teams to address threats.

Connecting the Dots
How can leaders foster more effective connecting of the dots within organizations? There are various reasons why organizations do not share and integrate information effectively.
In complex organizations, we have problems of structural complexity.
We also have high levels of differentiation among subunits, which can come at the expense of integration.
Finally, concerns about power can impede information sharing. People can cling to information because it provides them power.
More generally, we know that even small groups have trouble sharing information, as we discussed in an earlier lecture. We know that people tend to focus on information they hold in common with others and pay much less attention to privately held information.

Leaders can work on their facilitation skills and approaches in various group settings.
1. They can “manage airtime”—ensuring that a few people do not dominate the discussion.
2. They can reiterate ideas and statements that emerged quickly but perhaps did not receive adequate attention from others.
3. They can ask many clarifying questions to ensure and test for understanding.
4. They can invite dissenting views and induce debates.
5. Finally, leaders can take time near the end of a decision-making process to highlight the areas of remaining uncertainty that would ideally be resolved before making a decision.

1. At the organizational level, leaders might adopt centralized or hierarchical structures to help connect the dots.
2. Such hierarchical approaches can be quite problematic though.
Alternatively, organizations can work on other types of mechanisms to foster sharing and integration of information.
1. Leaders can foster the formation and enhancement of social networks across organizational units.
2. Leaders can use technology and mass collaboration techniques to marshal the collective knowledge and intellect of many people throughout an organization (through things such as wiki technology).
3. Most importantly, leaders need to work on the mind-set of people throughout the organization, to see that sharing information becomes more acceptable and that problem prevention becomes as rewarded and valued as problem solving.

Seeking Out Problems
While many scholars have studied organizational decision-making failures, we also have a body of researchers who have studied why some complex organizations in high-risk environments have operated with very few accidents over many years. They have coined the term “high-reliability organizations” (HROs) to describe these enterprises.
Scholars have examined organizations such as aircraft carriers and air traffic control centers. The error rates for these organizations are remarkably low given the hazardous conditions in which they operate.
1. For instance, Karlene Roberts has noted that the number of accidents for pilots operating on naval aircraft carriers is amazingly low—slightly less than 3 fatalities per 100,000 hours of flight time.
2. Scholars in the high-reliability field have argued that some organizations seem to have found a way to cope with the interactive complexity and tight coupling that, in Perrow’s account, led to inevitable failures.

Karl Weick and Kathleen Sutcliffe have coined the term “mindfulness” to describe the 5 characteristics of most HROs.
First, HROs appear to be preoccupied with failure of all sizes and shapes.
Second, HROs exhibit a reluctance to simplify interpretations.
Third, HROs demonstrate sensitivity to operations.
Fourth, HROs exhibit a commitment to resilience.
Finally, HROs ensure that expertise is tapped into at all levels of the organization.

Asking the Right Questions
Leaders need to identify a problem before they can solve it.
In many instances, leaders do not spot a threat until it is far too late.
At times, leaders set out to solve the wrong problem.
In order to be an effective leader, you need to become a better problem finder, not just a better problem solver.
Organizational breakdowns and collapses tend to evolve over time, beginning with small errors that are compounded and eventually gain momentum.
Leaders need to become hunters who venture out in search of problems that might lead to disasters for their firms. The sooner they can identify and reveal problems, the more likely it is that they can prevent a catastrophe.

What’s the common thread that ties together the great problem finder with the great problem solver or decision maker?
Leaders must discard the notion that they have all the answers.
Leaders must focus on shaping and directing an effective decision-making process, marshaling the collective intellect of those around them.
Leaders must focus on process, not just content.