Task 5 - The Emotional Tail Wags the Rational Dog

Summary

This document examines judgment and decision-making, focusing on psychological concepts like heuristics, base rates, and the influence of emotion. It explores how biases and cognitive shortcuts affect our choices, drawing examples from real-life scenarios and studies to illustrate the theories discussed.

Full Transcript

Task 5 - The emotional tail wags the rational dog
Judgement and decision-making - Eysenck chapter

Introduction

Judgement - deciding on the likelihood of various events using incomplete information. A measure of good judgement is accuracy.
Decision-making - selecting one option from several possible options. It differs from problem solving in that problem solvers must generate their own solutions rather than choose among presented options. Decisions are typically assessed in terms of their consequences; however, good decisions can still lead to poor consequences.
Judgement is typically an important initial part of decision-making (e.g. someone deciding which car to buy may make judgements about how much various cars would cost to run, how reliable they would be, etc.).

Judgement research

Bayesian inference - a form of statistical inference in which initial beliefs (prior probabilities) are modified by evidence or experience to produce posterior probabilities.

Neglecting base rates

According to Bayes' theorem, individuals making judgements should take account of base-rate information (the relative frequency of an event within a given population). However, most people tend to ignore or deemphasize such information.
Heuristic - a strategy that ignores part of the given information, with the goal of making decisions more quickly, frugally and/or accurately than more complex methods.
Representativeness heuristic - the assumption that an object or individual belongs to a specified category because it is representative (typical) of that category.
Study: participants were given the lawyer-engineer problem: "Jack is a 45-year-old man. He is married and has four children. He is generally conservative, careful and ambitious. He shows no interest in political and social issues and spends most of his free time on his many hobbies, which include home carpentry, sailing and numerical puzzles." Half were told that the description had been randomly selected from 100 descriptions, 70 of which were of engineers and 30 of lawyers; the other half were told there were 70 lawyers and 30 engineers. On average, participants decided there was a 0.90 probability that Jack was an engineer regardless of whether most of the 100 descriptions were of lawyers or engineers => participants ignored the base-rate information and used the representativeness heuristic, because Jack's description is typical of an engineer (see the Bayes sketch below).
Conjunction fallacy - the mistaken belief that the conjunction or combination of two events (A and B) is more likely than one event (A or B) on its own.
When people are given the following description, they use the representativeness heuristic and fall for the conjunction fallacy. To the question "Is it more likely Linda is a bank teller or a feminist bank teller?", they answer that it is more likely she is a feminist bank teller: "Linda is 31 years old, single, outspoken and very bright. She majored in philosophy. As a student, she was deeply concerned with issues of discrimination and social justice, and also participated in anti-nuclear demonstrations." Most people misinterpret the statement "Linda is a bank teller" as implying she is not a feminist. However, the conjunction fallacy is still found when people interpret the problem correctly.
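As a minimal sketch of what Bayes' theorem prescribes for the lawyer-engineer problem above, assume (purely for illustration; the chapter gives no likelihoods) that Jack's description is four times as likely for an engineer as for a lawyer. A Bayesian judge should then give noticeably different answers under the two base rates:

```python
# Bayes' theorem applied to the lawyer-engineer problem.
# The likelihoods below are illustrative assumptions, not values from the chapter:
# suppose Jack's description is 4 times as likely for an engineer as for a lawyer.
P_DESC_GIVEN_ENGINEER = 0.8
P_DESC_GIVEN_LAWYER = 0.2

def posterior_engineer(prior_engineer: float) -> float:
    """Posterior probability that Jack is an engineer, given his description."""
    prior_lawyer = 1 - prior_engineer
    numerator = P_DESC_GIVEN_ENGINEER * prior_engineer
    denominator = numerator + P_DESC_GIVEN_LAWYER * prior_lawyer
    return numerator / denominator

# Base rate of 70 engineers / 30 lawyers vs. 30 engineers / 70 lawyers.
print(round(posterior_engineer(0.70), 2))  # ~0.90
print(round(posterior_engineer(0.30), 2))  # ~0.63
```

Under these assumed likelihoods a Bayesian judge would lower the estimate to roughly 0.63 when lawyers are in the majority, whereas participants reported about 0.90 in both conditions.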
Heeding base rates

Even though people underuse base-rate information, they do use some of it when making judgements. People use base-rate information when strongly motivated to do so.
Study: participants were asked to put some saliva on a strip of paper. If it turned blue, this indicated a health problem; there was a 10% chance that the test was misleading. The paper turned blue. Most participants used the base-rate information (the 10% chance of a misleading result) to argue that the test was inaccurate. However, participants who were told that the paper turning blue meant they did not have a health problem perceived the test as accurate.

Availability heuristic

Affect heuristic - using one's emotional responses to influence rapid judgements or decisions.
Study: participants were asked to judge the benefits of medical tests (e.g. an MRI scan for migraine headaches). Those receiving information indicating that a given test might cause harm experienced more negative affect and judged the benefits of the test to be smaller.
Availability heuristic - the rule of thumb that the frequencies of events can be estimated accurately by the subjective ease with which they can be retrieved from long-term memory.
Study: people were asked to judge the relative likelihood of different causes of death. Causes attracting much publicity (e.g. murder) were judged more likely than those that do not (e.g. suicide), even when the opposite was the case. These results could reflect use of both the availability heuristic and the affect heuristic.
The availability heuristic can be based on media coverage, on one's own experience, or both. Analysis showed that the best predictor of judged frequencies of different causes of death was availability based on recall of direct experiences (rather than media coverage).
Study: doctors faced with a case resembling ones recently encountered tended mistakenly to make the diagnosis appropriate only to the earlier cases.
The availability heuristic is sometimes overridden.
Study: participants were presented with pairs of names (one famous, one non-famous) and asked to indicate which was more common. They would have chosen the famous names if using the availability heuristic, but mostly selected the non-famous ones, using deliberate thought to override the availability heuristic. When the task was performed under cognitive load to reduce deliberate thought, participants mistakenly chose the famous name 80% of the time.

Overall evaluation

Anchoring-and-adjustment heuristic - using an initial estimate (an anchor) and then adjusting it to produce a final estimate; however, the adjustment is generally insufficient.
Inaccurate judgements are not necessarily due to biased processing. Instead, they often occur because individuals have been exposed to a small and biased sample of information (an environmental factor). Therefore, the cause of the observed experimental results may not be internal (use of heuristics) but external (exposure to a biased sample of information).

Natural frequency hypothesis

Natural frequency hypothesis - people are better at understanding information and making decisions when it is presented in a simple and natural way, such as actual counts or frequencies of events, rather than as fractions or percentages.
Study: when a judgement task was presented in probabilities, only 16% of participants produced the correct answer. In contrast, 46% of participants were correct when frequencies were used.
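A minimal sketch of the natural-frequency idea, using made-up numbers (the chapter reports only the 16% vs 46% accuracy figures): the same diagnostic question is posed once in probabilities and once as counts, and the frequency version lets the answer be read off directly.

```python
# Hypothetical screening problem (numbers are illustrative, not from the chapter):
# base rate 1%, hit rate 80%, false-alarm rate 10%.

# Probability format: requires applying Bayes' theorem.
p_disease = 0.01
p_pos_given_disease = 0.80
p_pos_given_healthy = 0.10
p_disease_given_pos = (p_pos_given_disease * p_disease) / (
    p_pos_given_disease * p_disease + p_pos_given_healthy * (1 - p_disease)
)
print(round(p_disease_given_pos, 3))  # ~0.075

# Natural-frequency format: the same information as counts out of 1000 people.
sick = 10                      # 1% of 1000
sick_and_positive = 8          # 80% of the 10 sick people
healthy_and_positive = 99      # 10% of the 990 healthy people (rounded)
print(sick_and_positive / (sick_and_positive + healthy_and_positive))  # 8 of 107, ~0.075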
Theories of judgement

Support theory

Support theory of judgement - an event appears more or less likely depending on how it is described. The subjective probability of "What is the probability you will die on your next summer holiday from a disease, a car accident, a plane crash, contaminated food, or any other cause?" is higher than that of "What is the probability you will die on your next summer holiday?" The theory gives two main reasons for this effect: 1) An explicit description draws attention to aspects of the event that are less obvious in the non-explicit description. 2) Memory limitations may prevent people from remembering all the relevant information if it is not supplied.
Subadditivity effect - the tendency to judge the probability of the whole set of outcomes to be less than the total of the probabilities of its parts (this follows from the theory).
Study: expert doctors received the description of a woman with abdominal pain. Half assessed the probabilities of two specified diagnoses (gastroenteritis and ectopic pregnancy) and a residual category (everything else). The other half assigned probabilities to five specified diagnoses (including gastroenteritis and ectopic pregnancy) and everything else. The subjective probability of all diagnoses other than gastroenteritis and ectopic pregnancy was 0.50 with the non-explicit description but 0.69 with the explicit one.
An explicit description can reduce subjective probability if it leads one to focus on low-probability causes.
Study: participants estimated the probability that an American person selected at random would die of disease rather than some other cause at 0.55. The probability was similar when three typical diseases (heart disease, cancer, stroke) were explicitly mentioned. However, it was only 0.40 when three atypical diseases (pneumonia, diabetes, cirrhosis) were mentioned.

Fast-and-frugal heuristics

One view of heuristics is that we possess an "adaptive toolbox" of fast-and-frugal heuristics, which are often quite accurate.
Take-the-best heuristic - when having to choose between two things, picking the option favored by the most important or relevant piece of information. For example, if you are trying to decide between two smartphones and one has a much longer battery life, you might simply choose that one without considering other factors such as camera quality or price.
Recognition heuristic - a special case of the take-the-best strategy: using the knowledge that only one of two objects is recognized as the basis for making a judgement. For example, if you are asked to choose between two vacation destinations and you have heard of Destination A but not Destination B, you might pick Destination A simply because it is familiar to you.
Study: American students were presented with pairs of German cities and decided which was larger. When only one city name was recognized, participants apparently used the recognition heuristic 90% of the time. However, selecting the recognized city does not necessarily mean the recognition heuristic was used - it could have been chosen for other reasons.
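A minimal sketch of the take-the-best heuristic with recognition as the first cue, using invented cue values (the cities and cue data are not from the chapter): cues are checked in order of importance, and the first cue that discriminates decides.

```python
# Take-the-best: compare two options cue by cue, in order of cue importance,
# and stop at the first cue that discriminates between them.
# Recognition acts as the first (most important) cue.

def take_the_best(option_a: dict, option_b: dict, cue_order: list[str]) -> str:
    for cue in cue_order:
        a, b = option_a.get(cue), option_b.get(cue)
        if a is not None and b is not None and a != b:
            return option_a["name"] if a > b else option_b["name"]
    return "guess"  # no cue discriminates

# Invented example: which German city is larger?
cue_order = ["recognized", "has_football_team", "has_international_airport"]
city_a = {"name": "City A", "recognized": 1, "has_football_team": 0, "has_international_airport": 1}
city_b = {"name": "City B", "recognized": 0, "has_football_team": 1, "has_international_airport": 0}

# Recognition discriminates immediately, so later cues are never consulted.
print(take_the_best(city_a, city_b, cue_order))  # City A
```

A strict recognition heuristic like this never consults further cues once recognition discriminates, which is the issue examined in the studies that follow.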
When using the recognition heuristic, people tend to ignore other information (apart from recognition).
Study: participants were told that German cities with football teams tend to be larger than those without. When participants decided whether a recognized city without a football team was larger or smaller than an unrecognized city, they apparently used the recognition heuristic 92% of the time.
However, people sometimes do utilize additional information.
Study: German students were asked to decide which of two American cities was larger. The recognized city was identified as larger more often when they were told it had an international airport than when told it did not (98% vs 82%, respectively).
Many findings are better explained by theories that allow additional information to be used (e.g. parallel constraint satisfaction theory, which assumes that all available information is integrated automatically in parallel).

Dual-process theory: Kahneman (2003)

Kahneman's dual-process model - probability judgements depend on processing within two systems:
System 1 - fast, automatic, effortless, implicit and often emotionally charged; difficult to control or modify. Generates intuitive answers (e.g. based on the representativeness heuristic).
System 2 - slower, serial, effortful; more likely to be consciously monitored and deliberately controlled; relatively flexible and can be rule-governed. Monitors, evaluates and may correct the responses generated by System 1.
The theory was later modified so that: System 1 and System 2 processes can operate in parallel. System 1 processes cause some information to be strongly activated; this information is then strongly overweighted and leads to biased judgements. System 1 processing often produces errors, which may then be corrected by System 2 processing.
Study: participants producing the correct answer to the Linda problem (discussed above) took almost 40% longer than those apparently using System 1.
Study: on standard base-rate problems, participants were less confident about their responses when they produced incorrect answers than when they produced correct ones, suggesting there is some processing of base-rate information even when it does not influence judgement.
When participants are asked to produce the first answer that comes to mind on base-rate problems and then a more deliberate answer, studies tend to find that base-rate information often influenced their immediate answer. This contradicts dual-process theory.
Because of this contradiction, it has been suggested that the distinction between System 1 and System 2 processes is not useful. Instead, responses should be considered to differ primarily in complexity: more complex responses are more time-consuming and challenging to produce.
Alignment problem - even though Systems 1 and 2 are assumed to be distinct and related to attributes such as consciousness, deliberation and speed, these attributes are much less correlated than assumed theoretically. Therefore, it is not possible to define System 1 processes with precision.
Good/bad fallacy - the assumption that System 1 processing is often "bad" and error-prone whereas System 2 processing is "good" and leads to rational judgements. There is considerable evidence that this assumption is oversimplified.

The way forward?

De Neys' logical intuition model - rapid intuitive heuristic processing and intuitive logical processing occur in parallel. This initial processing is sometimes followed by deliberate System 2 processing if the two types of initial processing generate different responses. This model accounts for more of the data than the classical dual-process model and also clarifies how conflicts are rapidly detected and trigger System 2 processes.
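A toy sketch of the logical-intuition idea, with invented response functions (nothing here comes from the chapter): a heuristic intuition and a shallow logical intuition run in parallel, and deliberate processing is engaged only when the two disagree.

```python
# Toy illustration of De Neys' logical intuition model: two fast intuitive
# processes run in parallel; slow deliberation is triggered only on conflict.

def heuristic_intuition(problem: dict) -> str:
    # Stereotype-based answer (representativeness), ignoring base rates.
    return problem["stereotyped_answer"]

def logical_intuition(problem: dict) -> str:
    # Shallow sensitivity to base rates: favour the majority category.
    return max(problem["base_rates"], key=problem["base_rates"].get)

def deliberate_system2(problem: dict) -> str:
    # Placeholder for effortful reasoning; here it simply trusts the base rates.
    return max(problem["base_rates"], key=problem["base_rates"].get)

def judge(problem: dict) -> str:
    fast_heuristic = heuristic_intuition(problem)
    fast_logical = logical_intuition(problem)
    if fast_heuristic == fast_logical:
        return fast_heuristic              # no conflict: answer immediately
    return deliberate_system2(problem)     # conflict detected: deliberate

# Lawyer-engineer style problem with a 30/70 base rate against the stereotype.
problem = {"stereotyped_answer": "engineer",
           "base_rates": {"engineer": 30, "lawyer": 70}}
print(judge(problem))  # the two intuitions conflict, so deliberation decides
```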
Decision-making under risk

von Neumann and Morgenstern's expected utility theory - we try to maximize utility (the subjective value attached to an outcome) when we choose between several options. Utility is assessed with the formula: Expected utility = (probability of a given outcome) x (utility of the outcome).

Losses and gains

Problem 1 - you are offered a gamble in which a tossed coin pays $200 if it comes up heads but loses you $100 if it comes up tails (average expected gain = $50). People tend to refuse to bet on the coin toss.
Problem 2 - you are offered either a sure gain of $800, or an 85% chance of gaining $1,000 and a 15% chance of gaining nothing (average expected gain = $800 for the sure option vs $850 for the gamble). People tend to prefer the choice with the smaller expected gain.
Problem 3 - you are offered either a sure loss of $800, or an 85% chance of losing $1,000 with a 15% chance of avoiding any loss (average expected loss = $800 for the sure option vs $850 for the gamble). People tend to prefer the choice with the larger expected loss.

Prospect theory

Prospect theory - developed to explain the above results, based on two assumptions: 1) Individuals identify a reference point representing their present state. 2) Individuals are much more sensitive to potential losses than to potential gains (loss aversion). They also prefer certain gains over potentially larger but less certain gains (risk aversion). The positive subjective value associated with gains increases slowly as gains become greater, whereas the negative value associated with losses increases relatively rapidly as losses become greater.
Framing effect - decisions can be influenced by situational aspects irrelevant to optimal decision-making.
Study: all participants were given the following information (the Asian disease problem): "Imagine that the US is preparing for the outbreak of an unusual Asian disease, which is expected to kill 600 people. Two alternative programs to combat the disease have been proposed. Assume that the exact scientific estimates of the consequences of the programs are as follows..."
Gain-frame condition - participants chose between the following prospects:
▪ If program A is adopted, 200 people will be saved.
▪ If program B is adopted, there is a 1 in 3 probability that 600 people will be saved, and a 2 in 3 probability that no people will be saved.
Program A (the certain gain) was chosen by 72%, even though both programs would lead to the saving of 200 lives on average.
Loss-frame condition - participants chose between the following prospects:
▪ If program C is adopted, 400 people will die.
▪ If program D is adopted, there is a 1 in 3 probability that nobody will die, and a 2 in 3 probability that 600 people will die.
Program D was chosen by 78%.
According to prospect theory, the framing effect occurs because people focus on potential gains in the gain-frame condition, whereas they are motivated by loss aversion in the loss-frame condition.
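A short sketch checking the expected values quoted above; the numbers come directly from the problems, and the calculation shows why the framing in the Asian disease problem is normatively irrelevant (both frames have identical expected outcomes).

```python
# Expected value = sum of (probability x outcome) over the possible outcomes.
def expected_value(prospect: list[tuple[float, float]]) -> float:
    return sum(p * x for p, x in prospect)

# Problems 1-3 (gains/losses in dollars).
print(expected_value([(0.5, 200), (0.5, -100)]))    # Problem 1 gamble: +50
print(expected_value([(0.85, 1000), (0.15, 0)]))    # Problem 2 gamble: 850 (vs sure 800)
print(expected_value([(0.85, -1000), (0.15, 0)]))   # Problem 3 gamble: -850 (vs sure -800)

# Asian disease problem (expressed as lives saved out of 600).
print(round(expected_value([(1.0, 200)])))                  # Program A: 200 saved
print(round(expected_value([(1/3, 600), (2/3, 0)])))        # Program B: 200 saved on average
print(round(600 - expected_value([(1.0, 400)])))            # Program C: 400 die => 200 saved
print(round(600 - expected_value([(1/3, 0), (2/3, 600)])))  # Program D: 400 die on average => 200 saved
```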
Performance on the Asian disease problem may be affected by social and moral factors deemphasized by prospect theory.
Study: in the gain-frame condition, participants strongly preferred the deterministic option when the problem concerned six unknown patients. However, they preferred the probabilistic option when it concerned six close relatives, presumably because participants were concerned about fairness in that condition (all patients share the same fate).
Sunk-cost effect - persisting in a course of action because of the prior investments in that option, rather than because of the future consequences of pursuing it. For example, the more money and effort individuals have invested in an unhappy relationship, the more likely they are to stay.
Study: participants were told that two people had paid a $100 non-refundable deposit for a weekend at a resort. On the way there, both became slightly unwell and felt they would probably have a more pleasurable time at home. Many participants argued the two people should drive on to avoid wasting the $100.
Explanation 1) Many people would find it embarrassing if others knew they had wasted resources on an abandoned project.
Explanation 2) In a study, individuals chose whether to continue with an option in which they had already invested time and money (the sunk-cost option) or to switch to an alternative option with a higher probability of success. Those choosing the sunk-cost option did so because it satisfied their need to feel competent more than the alternative did.
The idea of loss aversion has been supported in many studies, but it is less pronounced when small amounts are involved:
Choice 1) 50% chance to win £1; 50% chance to lose £1.
Choice 2) 50% chance to win £5; 50% chance to lose £5.
Individuals do not favor one option over the other unless the stakes are high => this is contrary to prospect theory.
Hypothesis) Attentional processes influence loss aversion: those who attend more to losses (relative to gains) exhibit greater loss aversion.
Study: participants' attention to losses and gains was manipulated, and the hypothesis was confirmed. Therefore, loss neutrality with small stakes may be explained by the fact that small losses do not attract much attention.
Prospect theory deemphasizes individual differences. People differ significantly in willingness to take risks, even with very high stakes. Individuals high in narcissism engage in riskier stock-market investing because they have high sensitivity to reward but low sensitivity to punishment.
Prospect theory predicts that people should be risk averse for gains but risk seeking to avoid losses.
Study: 25% of participants showed this pattern, whereas 44% were consistently risk averse or consistently risk seeking for both gains and losses. This suggests that individuals especially motivated to avoid regret are consistently risk averse, whereas those attaching less concern to possible regret are more likely to be consistently risk seeking.
Prospect theory deemphasizes the impact of social and emotional factors. It has poor predictive power when the potential losses associated with a decision are of great personal relevance (e.g. medical side effects).
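As a hedged illustration of the loss-aversion and diminishing-sensitivity assumptions discussed above, here is the standard prospect-theory value function with the commonly cited parameter estimates from Tversky and Kahneman's cumulative prospect theory (alpha = beta = 0.88, lambda = 2.25); the chapter itself does not give a formula, so treat this as a sketch rather than the chapter's own model.

```python
# Prospect-theory value function: v(x) = x**alpha for gains,
# v(x) = -lam * (-x)**beta for losses. Parameters are commonly cited
# estimates (alpha = beta = 0.88, lam = 2.25), not values from the chapter.
ALPHA, BETA, LAM = 0.88, 0.88, 2.25

def value(x: float) -> float:
    if x >= 0:
        return x ** ALPHA
    return -LAM * (-x) ** BETA

# Loss aversion: a $100 loss looms larger than a $100 gain.
print(round(value(100), 1), round(value(-100), 1))      # ~57.5 vs ~-129.5

# Problem 1: 50% win $200 / 50% lose $100 has a positive expected value,
# but its prospect-theory value is negative, so the gamble is rejected.
print(round(0.5 * value(200) + 0.5 * value(-100), 1))   # ~-11.8 (negative)
```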
Decision-making: emotional and social factors

Emotional factors

Study: participants gambled, and either they or the computer made the choices. When participants lost, they experienced regret if they had made the decision but disappointment if the computer had; regret was followed by riskier choices than disappointment. When participants won, they rejoiced if they had made the decision or experienced elation if the computer had; elation was followed by riskier choices. Therefore, the results were more consistent with prospect theory when participants had a sense of personal agency (the regret and rejoicing conditions): the humans described by prospect theory are guided by the immediate emotional impact of gains and losses.
Prospect theory makes two predictions about feelings: 1) There is diminishing sensitivity to changes in value as gains and losses increase, and the same diminishing sensitivity may be found for the feelings associated with gains and losses. 2) The subjective value of a given loss is greater than the impact of an equivalent gain (loss aversion), so feelings should be influenced more by a given loss than by an equivalent gain. The first prediction was confirmed empirically, but the second was not. It was concluded that the feelings are equal in intensity, but individuals attend more to (and weight more heavily) the negative feelings anticipated from a loss than the positive feelings anticipated from a gain.
Impact bias - the impact of given losses or gains is greater on expected feelings than on experienced feelings (i.e. people overestimate the intensity and duration of negative emotional reactions to losses and of positive emotional reactions to gains).
Prospect theory is much less applicable to affect-rich choices (e.g. medical side effects) than to affect-poor ones (e.g. monetary losses).
Study: participants faced an affect-rich or an affect-poor decision. With affect-rich problems, they imagined they suffered from an unspecified illness requiring medication and chose between two medications, each producing a given side effect with some probability (e.g. medication A: insomnia with a probability of 15%; medication B: fever with a probability of 10%). With affect-poor problems, participants chose between two monetary lotteries, each leading to a given loss with some probability. The decisions of 90% of participants with affect-poor problems conformed to the expectations of prospect theory (cumulative prospect theory, CPT). This was not the case with affect-rich problems: many more participants used the minimax rule, which ignores probabilities and focuses only on outcomes, and which is inconsistent with prospect theory.
Omission bias - a preference for risking harm through inaction over risking harm through action.
Study: British parents were willing to accept a greater risk of their children having a disease than of their children suffering adverse reactions to vaccination. Omission bias with respect to vaccination occurs because parents believe that acting would increase their anticipated responsibility and regret.
Study: perceived regret was greater for action than for inaction when social norms favored inaction. However, this effect was much reduced (or even reversed) when social norms favored action.
Status quo bias - a preference for maintaining the status quo rather than acting to change one's decision. For example, many individuals maintain the same allocation of retirement funds year after year even when no costs would be incurred by changing.
Study: mistaken rejection of the status quo triggered stronger feelings of regret than mistaken acceptance of the status quo.
In addition, mistaken rejection of the status quo was associated with greater activation in brain regions associated with regret (medial PFC; insula).
"Gut feelings" may be important for risk-taking.
Study: financial traders working on a London trading floor showed superior interoceptive ability, perceiving their own heartbeats more accurately than controls. Traders with the highest interoceptive ability generated greater profits and survived longer in the financial markets.

Social factors

Social functionalist approach - people act like intuitive politicians: they need to answer to different groups, and their ability to maintain a positive image over time depends on how well they can predict and address the concerns others might have about the actions they might take.
Accountability for decisions may exacerbate the sunk-cost effect.
Study: some participants were told their decisions would be shared with instructors and other students (high-accountability condition), whereas others were told their decisions would be confidential (low-accountability condition). High-accountability participants were more likely to continue with a previously ineffective course of action. They showed a stronger sunk-cost effect because they experienced a greater need to justify their previous decisions.
Accountability may also exacerbate the status quo bias.
Study: participants decided on the acceptability of a new drug not yet on the market. They were told it would probably save many lives but would also probably cause many deaths. There was much more evidence of the status quo bias (maintaining the present state by not accepting the drug) when they felt accountable for their decision.
Accountability may also influence experts' decisions.
Study: medical experts were asked to choose a treatment for a patient with osteoarthritis. Their decision-making was more biased when they were made accountable for their decision by writing an explanation for it and agreeing to be contacted later to discuss it.
Politicians are more prone to various decision-making biases (the sunk-cost effect; the status quo bias) than other people, because their decisions are often open to public scrutiny (and thus accountability).

Applied and complex decision-making

Real-world decision-making is more complex than laboratory decision-making, has more serious consequences, and usually involves several decisions over time as we strive towards important goals, compared with a single decision in the laboratory.

Fast-and-frugal heuristics: applied decision-making

The fast-and-frugal approach states that simple heuristics can be more effective than complex strategies in real-world decision-making.
Study: a simple heuristic (1/N: allocate funds equally to each of N funds) was compared with 14 more complex models in investment decision-making. None of the more complex models outperformed 1/N. This may be because complex models are sensitive not only to past data but also to the noise in those data.
Strokes can be diagnosed using a simple approach: administer three tests of eye movements and then apply the heuristic that a stroke is diagnosed if at least one test indicates an abnormality. This approach performs better than MRI.
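A minimal sketch of the two fast-and-frugal rules just described (the function names and data are illustrative): 1/N splits an investment equally across funds, and the stroke rule flags a stroke if any of the three eye-movement tests is abnormal.

```python
# 1/N heuristic: ignore past performance and allocate equally across N funds.
def one_over_n_allocation(funds: list[str], amount: float) -> dict[str, float]:
    return {fund: amount / len(funds) for fund in funds}

# Fast-and-frugal stroke rule: diagnose a stroke if at least one of the
# three eye-movement tests indicates an abnormality.
def stroke_suspected(test_abnormal: list[bool]) -> bool:
    return any(test_abnormal)

print(one_over_n_allocation(["Fund A", "Fund B", "Fund C", "Fund D"], 1000.0))
print(stroke_suspected([False, True, False]))  # True: one abnormal test suffices
```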
Complex decision-making

Utility - how rewarding or satisfying a given outcome is perceived to be subjectively. In an ideal world, our decision-making would involve strategies that maximize utility.
Multi-attribute utility theory - an approximation to an ideal decision-making strategy involving the following stages:
1) Identify the attributes relevant to the decision.
2) Decide how to weight those attributes.
3) List all options under consideration.
4) Rate each option on each attribute.
5) Obtain a total utility by summing the weighted attribute values for each option, and select the option with the highest weighted total.
Individuals rarely adopt this optimal strategy because they usually possess incomplete knowledge of decision-relevant information and are constrained by processing limitations (e.g. small working-memory capacity). However, if decision-making is only moderately complex, people can approximate this optimal strategy.
Study: participants were asked to act as job interviewers and choose one of two candidates based on their attributes (3, 4 or 5 attributes) and the importance of those attributes (weights of 1-4). Participants were required to make rapid decisions (mean response time = 1.5 s). Mean overall accuracy was 90% for 3-attribute problems and 84% for 5-attribute problems, and 59% of the participants used a strategy approximating that assumed by multi-attribute utility theory. Since decisions were made rapidly, the processing of attributes and their weights must have occurred fairly automatically.
Satisficing - simplifying the decision-making process by using heuristics and ignoring some relevant sources of information. When making complex decisions, we cannot use the multi-attribute utility strategy and instead engage in satisficing.
There are individual differences in how much one tends to satisfice. Satisficers (content with making reasonably good decisions) are happier and more optimistic than maximizers (perfectionists) and experience less regret and self-blame. Maximizers also set higher goals (i.e. they aim to choose the best), use different strategies (seeking out numerous options and comparing them) and are more willing to put effort into achieving desirable goals.
Elimination-by-aspects theory - decision-makers eliminate options by considering one relevant attribute or aspect after another. Elimination is cognitively undemanding, but it may produce different results depending on the order in which attributes are considered => the selected option may not be the best one.
Two-stage theory - a modified version of elimination-by-aspects with two stages:
Initial stage - resembles elimination-by-aspects: only options fulfilling certain criteria are retained, which reduces the number considerably.
Second stage - involves detailed comparisons of the patterns of attributes of the retained options and is only feasible when the number of options is relatively small.
Study: students were asked to decide among flats based on information about various attributes (e.g. rent; distance from campus). When there were many flats to consider, the students typically started with a simple strategy (e.g. satisficing; elimination-by-aspects). When only a few flats remained, they often switched to a more complex strategy corresponding to the assumptions of multi-attribute utility theory (a sketch contrasting the two strategies follows below).
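A minimal sketch contrasting the multi-attribute weighted-sum strategy with elimination-by-aspects, using invented flat data, weights and thresholds (none of these values come from the chapter).

```python
# Two decision strategies over the same (invented) options.
# Attributes are scored so that higher is always better.
flats = {
    "Flat A": {"low_rent": 3, "near_campus": 5, "size": 2},
    "Flat B": {"low_rent": 5, "near_campus": 2, "size": 4},
    "Flat C": {"low_rent": 4, "near_campus": 4, "size": 3},
}
weights = {"low_rent": 4, "near_campus": 2, "size": 3}  # importance weights

# Multi-attribute utility theory: weighted sum of attribute values, pick the maximum.
def weighted_total(option: dict[str, int]) -> int:
    return sum(weights[attr] * value for attr, value in option.items())

best = max(flats, key=lambda name: weighted_total(flats[name]))
print(best, {name: weighted_total(opt) for name, opt in flats.items()})  # Flat B wins

# Elimination-by-aspects: consider one attribute at a time (most important first)
# and drop any option falling below a threshold on that attribute.
def eliminate_by_aspects(options: dict, thresholds: list[tuple[str, int]]) -> list[str]:
    remaining = dict(options)
    for attr, cutoff in thresholds:
        remaining = {name: opt for name, opt in remaining.items() if opt[attr] >= cutoff}
        if len(remaining) <= 1:
            break
    return list(remaining)

print(eliminate_by_aspects(flats, [("low_rent", 4), ("near_campus", 4)]))  # ['Flat C']
```

In this invented example the two strategies disagree: the weighted sum favors Flat B, while elimination-by-aspects ends with Flat C, which illustrates why the option selected by elimination need not be the best one overall.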
Study: single women made selections from a real dating website with 4, 24 or 64 potential dates. The weighted-averaging strategy (multi-attribute utility theory) was used by 81% of those with 4 potential dates but by only 41% of those choosing from 64. The respective figures for elimination-by-aspects were 39% and 69%, and for satisficing 6% and 16% (the numbers exceed 100% because many women used multiple strategies).
Speed-dating decisions at large events were determined mostly by easily assessable attributes (e.g. age, height, weight). At small events, the opposite was true => this probably reflects the increased cognitive load at large events.

Complicating factors: changing preferences and selective exposure

Contrary to what the above theories assume, the assessed utility of an attribute does not remain constant.
Study: participants decided between two job offers using four attributes (e.g. salary; commuting time). They were then told that one of the jobs was in a much better location. This increased their preference for the desirable attributes of the chosen job and decreased their preference for its undesirable attributes.
Decisions can even cause individuals to misremember factual information used during decision-making.
Study: advanced nursing students prioritized a male or a female patient for surgery. Afterwards, their memory for the facts (e.g. chance of surviving surgery) was distorted to increase the apparent support for their decision.
Changing preferences can be rational if people's initial preferences are based on uncertain observations. Preference change is more common when time pressure decreases, suggesting it is based on deliberate/analytic thinking.
Selective exposure - a preference for information that strengthens pre-existing views and avoidance of information conflicting with those views. Increased selective exposure is found when people have high defense motivation (they need to defend their personal position), and also when they have high accuracy motivation but restricted access to information. Selective exposure is reduced when high accuracy motivation is produced by instructing decision-makers to make the best choice. Accuracy motivation reduces selective exposure when it is triggered by the goal of making the optimal decision, but increases it when it is triggered during the search for information.

Naturalistic decision-making

Galotti's theory of naturalistic decision-making in unstructured environments - involves five phases with a flexible order (decision-makers can return to previous phases when struggling to make a decision):
1) Setting goals.
2) Gathering information.
3) Structuring the decision (i.e. listing options and criteria for deciding among them).
4) Making a final choice.
5) Evaluating the decision.
When important real-life decisions were studied (e.g. students choosing a college or their main subject), the following was found:
Decision-makers constrained the amount of information they considered, focusing on 2-5 options (mean = 4) at any given time. The number of options considered decreased over time.
The number of attributes considered at any given time was 3-9 (mean = 6). The number of attributes did not decrease over time; sometimes it actually increased. Individuals of higher ability and/or more education considered more attributes.
Most of the decisions were assessed as good.
The fact that people consistently limited the amount of information considered is consistent with bounded rationality but not with multi-attribute utility theory. The number of options considered decreased by 18% over several months; elimination-by-aspects theory predicts a reduction, but a larger one.
Klein's recognition-primed decision-making model - a model of experts' rapid decision-making under pressure. When the situation is familiar/typical, experts match the situation to learned patterns of information stored in long-term memory using pattern recognition. This rapid automatic process usually leads to the retrieval of a single option. It is followed by mental simulation (imagining what would happen if the expert acted on the retrieved option). If the imagined outcome is satisfactory, that option rapidly determines the expert's actions.
Study: across over 600 decisions, various kinds of experts tended to consider only one option at a time. They typically rapidly categorized even a novel situation as an example of a familiar type of situation and then simply retrieved the appropriate decision from long-term memory.
The model deemphasizes individual differences. Consistent with the model, most individuals use an intuitive (System 1) thinking style when making decisions in areas involving relevant expertise. However, some individuals prefer a reflective (System 2) thinking style even when they possess relevant expertise.
Most of the data supporting the model come from real-life situations, which are quite complex, making it hard to identify the key factors triggering experts' decisions. Moreover, there has been extensive reliance on experts' fallible memories of their thought processes during high-pressure situations.

Unconscious thought theory

Unconscious thought theory - conscious thinking is constrained by the limited capacity of conscious awareness, so unconscious thinking is better than conscious thinking at integrating large amounts of information and is therefore more effective during complex decision-making.
The paradigm for testing this theory is to:
1) Present the participant with a problem providing several options.
2) Distract the participant in order to prevent conscious thought about the problem.
3) Allow the participant to select an option.
The evidence is mixed: unconscious thought sometimes, but not always, produces superior decision-making. It has been argued that performance is optimal when it involves both conscious and unconscious thought. Furthermore, participants who are assumed to rely heavily on unconscious, intuitive processes in their decision-making report relying on conscious memory.

Additional Articles

Article Summary - Nudging healthy and sustainable food choices: three randomized controlled field experiments using a vegetarian lunch-default as a normative signal
Article summary here.
