Biases heavily influenced many of the decisions in this case. One instance was the disagreement between BP and Transocean workers over the buildup of pressure in the well, which led to poor coordination between the two firms: each group was anchored to its own conclusion. To me, this shows how the situation was viewed in a one-sided, partial way. The text states that "judgmental heuristics represent rules of thumb or shortcuts that people use to reduce information-processing demands" (Kreitner & Kinicki, 2013, p. 335). Many shortcuts were taken in this case study; in particular, information about less risky well designs should have been gathered instead of simply choosing the cheapest option among the alternatives.
The confirmation bias was displayed in how they set out to purchase the less costly design without investigating alternative designs; as a result, they discounted other methods that balanced cost and safety. The confirmation bias describes how we seek out information that supports our instinctive decisions while downplaying the contradictory information that exists (Hammond, Keeney, & Raiffa, 1998). The overconfidence bias was also present. According to Kreitner and Kinicki (2013, p. 336), the disastrous oil spill arose because "technological arrogance, hubris, and the overconfidence bias played major roles." It is therefore essential to avoid falling into the trap of the overconfidence bias. In agreement with Hammond, Keeney, and Raiffa, overconfidence tends to lead to errors in judgment, therefore resulting in bad decisions.
In terms of the anchoring bias, the organization should institute a practice of regularly revisiting the original decision in light of newly gathered data. Additionally, the decision maker should avoid the confirmation trap, in which, as Bazerman and Moore (2009) argue, people tend to seek information that confirms their expectations and hypotheses. To recognize the bias, Mike Francis could
Storr invested at least $2,500 of his own money into this chapter of The Unpersuadables. While that may seem outrageous, it is only the tip of this chapter's shocking iceberg; after all, Storr centered the chapter on his experience with the infamous World War II conspiracy theorist David Irving and his Nazi-sympathizing followers. The chapter is reminiscent of Storr's time with John Mackay in many ways: like Mackay, Irving held unswayable beliefs, and, as at the beginning of the book, Storr attempted to find out why. The difference in this chapter, however, was that Storr approached the case with a clearer understanding of his purpose among men he understandably did not wish to associate with. With that understanding came a few important gems of knowledge.
The key takeaway from this case is how rational decision making and bias avoidance could have altered the outcome of events. Identifying the problem and generating a solution could have helped the team learn that not only was there a problem with pressure on the well, but that a series of other anomalies were occurring, anomalies that on their own were not devastating but that together could be catastrophic to the well construction. Accounting for the known biases of the individuals involved would also have helped the team address the problem rationally, giving those biases consideration without allowing them to become the basis for the decisions made. It is understandable that individuals would
I really enjoyed reading the article "You Are Not So Smart"; it brought up some valid points about confirmation bias using examples from everyday life. There were many interesting pieces of information in the article, but a few really drew my attention. For instance, there was the example of teaching confirmation bias in a classroom using numbers: the teacher shows the students three numbers, and the students must guess why the numbers are in that order, i.e., the secret rule the teacher used to select them. The students are then asked to come up with their own three numbers following the rule they think was used throughout
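The classroom exercise described above resembles Wason's classic 2-4-6 experiment. A minimal sketch of why it demonstrates confirmation bias, assuming (as in the common version of the experiment) that the teacher's hidden rule is simply "any strictly ascending sequence," while students typically hypothesize the narrower rule "numbers increasing by two":

```python
# Sketch of Wason's 2-4-6 task. The hidden rule here is an assumption
# for illustration: "any strictly ascending sequence."

def hidden_rule(seq):
    """Teacher's secret rule (assumed): strictly ascending numbers."""
    return all(a < b for a, b in zip(seq, seq[1:]))

# Confirming tests: triples that fit the student's narrower hypothesis
# ("increase by 2"). Every one of them passes, so the student's guess
# is never challenged.
confirming_tests = [(2, 4, 6), (10, 12, 14), (100, 102, 104)]
for seq in confirming_tests:
    print(seq, "->", hidden_rule(seq))

# A disconfirming test: violates "increase by 2" yet still passes,
# which would reveal that the real rule is broader.
print((1, 2, 3), "->", hidden_rule((1, 2, 3)))
```

Because every confirming test returns True, students who only test sequences that fit their own hypothesis never discover that the rule is broader, which is exactly the trap the article describes.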
Confirmation bias is when a person would rather try to confirm or support a hypothesis than try to falsify it. For example, in the Wason task, each card shows "E," "J," "6," or "7," which follows
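The excerpt is cut off before stating the rule, but the standard rule in the Wason selection task is "if a card has a vowel on one side, it has an even number on the other." Under that assumed rule, a small sketch shows which of the cards named in the text must be turned over to test it:

```python
# Wason selection task with the cards named in the text (E, J, 6, 7)
# and the standard rule, assumed here: "if a card has a vowel on one
# side, it has an even number on the other."

VOWELS = set("AEIOU")

def must_turn(card):
    """A card needs turning only if it could falsify the rule:
    a vowel (the hidden side might be odd) or an odd number
    (the hidden side might be a vowel)."""
    if card.isalpha():
        return card.upper() in VOWELS   # vowel: check for an odd number
    return int(card) % 2 == 1           # odd number: check for a vowel

cards = ["E", "J", "6", "7"]
print([c for c in cards if must_turn(c)])  # -> ['E', '7']
```

People commonly pick "E" and "6" instead, seeking a case that confirms the rule ("6" can never falsify it) while ignoring "7," the card that could disprove it, which is the confirmation bias the passage describes.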
Confirmation bias arises from the direct influence of desire on beliefs. When people would like a certain idea or concept to be true, they end up believing it to be true; people tend to be motivated by wishful thinking. This error leads an individual to stop gathering information once the evidence gathered so far confirms the views (prejudices) he or she would like to be true (Heshmat, 2015). It is therefore very important for people to use their critical thinking skills. Good critical thinking requires that we evaluate evidence thoroughly, remain aware of social and cognitive errors in our thinking so that we can evaluate any given situation effectively, and avoid jumping to conclusions or acting quickly on preconceived ideas (Boss,
Biases can be inherent to an individual and can affect a decision that he or she is making or involved in. Eight biases can affect decision-making skills: availability, representativeness, confirmation, anchoring, overconfidence, hindsight, framing, and escalation of commitment. In the case of Deepwater Horizon, the biases that shaped the decision making were representativeness, overconfidence, hindsight, and escalation of commitment. The representativeness bias appears when individuals use data to forecast the chance that an event will occur under uncertainty. In this case, the individuals involved did not feel that the data collected showed a high probability of an oil spill based on previous
Decision-making biases played a devastating role in the Deepwater Horizon disaster. The biases present were the representativeness heuristic, the overconfidence bias, the confirmation bias, and the escalation-of-commitment bias. Based on my findings, each of these biases contributed, to some extent, to the disaster.
Chapter 2 of the textbook begins with an exercise designed to test the reader's knowledge: for each question, the reader is to give a range wide enough that he or she is 98% confident it contains the true answer. I failed miserably at this exercise, which is probably why the chapter leads with it. Bazerman writes that overconfidence is "the most robust finding in the psychology of judgment" (p. 14); it appears to be an innate characteristic for much of the population. Psychologists who study overconfidence commonly identify three forms of it: overprecision, overestimation, and overplacement. I am glad to know that I am part of much of the population.
Scientific research is a process for discovering truth. The process begins with the researcher developing a theory about a phenomenon, which is formed into a hypothesis: an educated guess about a relationship between variables that is then examined further. This research process can be distorted by social cognition biases, including belief perseverance, confirmation bias, and the availability heuristic. Confirmation bias and the availability heuristic are two social cognition biases that can lead to errors in research: confirmation bias leads the researcher to ignore information that does not support his or her beliefs, while the availability heuristic leads the researcher to overestimate the occurrence of events in order to make the research seem more relevant.
Biases made a huge impact in this case: the companies were simply too overconfident, exhibiting the overconfidence bias, the tendency to be overconfident about estimates or forecasts (Kreitner & Kinicki, 2013). BP ignored the warning signs because of its overconfidence and incompetence. As explained above, BP's and Transocean's approach was more characteristic of the garbage can model, because decision making was haphazard and sloppy, and lives were lost because no orderly series of steps was taken for making
A presidential oil-spill commission attributed the 2010 Deepwater Horizon oil rig explosion to technological arrogance, hubris, and the overconfidence bias (Kreitner & Kinicki, 2013). Numerous engineering analyses pointed to warning signs that decision makers should have paid more attention to. This disaster could have been prevented, as it was a direct result of cost-cutting decisions made by BP and its business partners.
There are many everyday examples of people exhibiting confirmation bias. A student researching only one side of an argument to confirm a thesis may fail to search the topic fully for information inconsistent with what he or she is writing. Likewise, a reporter writing an article on an important issue may only interview
Yes, there were cognitive biases at Level 3 and other companies, as well as among the investors, during 1997-2001. The biases were…
More specifically, they were trying to avoid blame and further their individual motives. When I saw the film, I thought to myself that there is no possible way this happens in real life, but the article gives two examples. Both the Challenger case and the Macondo well blowout showed that even when engineers noticed a problem, management or other engineers ultimately decided it was safe enough to proceed. Then, during the investigations, company leaders or project managers played innocent and ignorant. The other thing I found interesting is the "practice defines facts" model: if correlations are found, they become facts until they are disproven. This is the problem with the model. A fact should not be able to change; if a fact is truly a fact, then the evidence should always support it. When we spread misinformation by stating that a correlation is a fact, it can lead many people to think it is the truth. This goes back to the first articles we read in class about misleading information in science. The Macondo/Challenger article ties not only into the film we watched but also into other topics we learned about regarding ethics. People in certain situations have an innate self-interest that hurts