Chapter 2: Impediments to Critical Thinking

Category 1 Impediments: How We Think

1. Self-Interested Thinking
   ○ Definition: Accepting claims solely because they align with personal desires, goals, or convenience.
   ○ Mechanism: Driven by cognitive biases like confirmation bias (focusing on evidence that supports pre-existing beliefs) and motivated reasoning (rationalizing conclusions to fit emotions).
   ○ Overcoming:
      1. Actively seek disconfirming evidence.
      2. Reflect on whether your interests are clouding judgment.
   ○ Examples:
      1. Climate Change Denial: A fossil fuel executive dismissing scientific evidence to protect profits.
      2. Academic Overconfidence: A student skipping study sessions because they believe they "already know everything," despite poor past performance.
      3. Health Choices: A smoker ignoring lung cancer statistics to justify their habit.
   ○ Peer Insight: Self-interested thinking often pairs with the sunk-cost fallacy (e.g., continuing a failing project because of prior investment).

2. Group Thinking
   ○ Definition: Conforming to group opinions to avoid conflict, often leading to irrational decisions.
   ○ Key Fallacies:
      1. Appeal to Popularity: "Millions buy this product, so it must work!"
      2. Appeal to Tradition: "We've always used coal energy; why switch now?"
      3. Stereotyping: "All teenagers are irresponsible drivers."
   ○ Examples:
      1. Historical: The Salem Witch Trials (1692), where fear and group pressure led to false accusations.
      2. Corporate: The Challenger Space Shuttle disaster (1986), where pressure to meet the launch deadline led managers to override engineers' safety concerns.
      3. Social Media: Viral misinformation (e.g., "5G causes COVID") amplified by echo chambers.
   ○ Peer Insight: Groupthink thrives in hierarchical structures where dissent is punished. Modern algorithms exacerbate this by filtering out opposing views.

Category 2 Impediments: What We Think

1. Relativism
   ○ Subjective Relativism: "Truth depends on individual belief."
      1. Objections:
         1. Self-defeating: If "all truth is relative," then relativism itself cannot be universally true.
         2. Implies infallibility: If your belief is your truth, you can never be wrong.
      2. Examples:
         1. Moral debates: "Cheating is wrong for you, but acceptable for me."
         2. Health myths: "I believe crystals heal cancer, so it's true for me."
         3. Historical revisionism: "The Holocaust didn't happen for me."
   ○ Social Relativism: "Truth depends on cultural/societal norms."
      1. Objections:
         1. Contradicts universal principles (e.g., human rights).
         2. Justifies harmful practices (e.g., slavery in ancient societies).
      2. Examples:
         1. Cultural relativism: Defending female genital mutilation as a "cultural tradition."
         2. Dietary norms: Eating dogs in some cultures vs. viewing them as pets in others.
         3. Legal systems: Stoning as punishment in some societies vs. global human rights laws.
   ○ Peer Insight: Relativism challenges cross-cultural dialogue but can be countered by distinguishing between descriptive (what is) and normative (what ought to be) claims.

2. Philosophical Skepticism
   ○ Definition: Doubting the possibility of knowledge, often by questioning the reliability of perception or evidence.
   ○ Key Arguments:
      1. Descartes' Evil Genius: A hypothetical being deceives all sensory input.
      2. The Matrix Hypothesis: Reality is a simulation.
   ○ Objections:
      1. Pragmatic Response: Absolute certainty is unnecessary; probabilistic reasoning suffices for practical knowledge.
      2. Scientific Method: Skepticism drives inquiry but assumes an observable reality.
   ○ Examples:
      1. Radical Skepticism: "Can we really prove the Earth is round?"
      2. Historical Doubt: Denying well-documented events (e.g., the moon landing) due to potential biases.
      3. Conspiracy Theories: "Governments hide alien existence."
   ○ Peer Insight: Moderate skepticism is healthy (e.g., peer review in science), but radical skepticism paralyzes decision-making.

Propositional Knowledge

Three Pillars:
   ○ Belief: Mental acceptance of a claim (e.g., believing water boils at 100°C).
   ○ Truth: Correspondence with reality (e.g., water does boil at 100°C at sea level).
   ○ Justification: Evidence or rationale (e.g., scientific experiments).

Gettier Problems: Cases where justified true belief isn't knowledge.
   ○ Example: A broken clock shows the correct time twice a day. You glance at it at the right moment, forming a true, justified belief, but it is still luck, not knowledge.

Peer Insight: Knowledge requires reliability (e.g., a working clock) to avoid Gettier-style luck.

Course Assumptions
   1. Objective Truth: Propositions are true/false independently of beliefs (e.g., "The Earth orbits the Sun").
   2. Attainable Knowledge: Justified true beliefs are possible (e.g., medical diagnoses based on tests).

Study Strategies
   ○ Apply Concepts: Analyze news articles for fallacies (e.g., political speeches using appeal to tradition).
   ○ Debate Relativism: Is morality universal or culturally constructed? Use examples like genocide or free speech.
   ○ Challenge Skepticism: How much evidence is "enough" to know something? Discuss vaccine efficacy data.
   ○ Peer Tip: Use the SEEC Method (State, Explain, Example, Conclude) to structure arguments. Example:
      ○ State: Self-interested thinking hinders objectivity.
      ○ Explain: It prioritizes personal gain over evidence.
      ○ Example: Tobacco companies hiding cancer research.
      ○ Conclude: Critical thinking requires confronting biases.

Key Takeaways:
   ○ Category 1 impediments stem from cognitive shortcuts; Category 2 from flawed epistemic views.
   ○ Examples bridge theory to real-world contexts (e.g., corporate scandals, historical events).
   ○ Critical thinking demands vigilance against both internal biases and external misinformation.

1. Types of Reasoning

Deductive Reasoning
   ○ Definition: Moves from general premises to a specific conclusion.
   ○ Key Features:
      ○ Validity: The conclusion necessarily follows if the premises are true.
      ○ Soundness: Requires validity and true premises.
   ○ Example:
      ○ Premise 1: All humans are mortal.
      ○ Premise 2: Socrates is a human.
      ○ Conclusion: Socrates is mortal.
   ○ Critical Thinking Focus:
      ○ Check logical structure (validity).
      ○ Verify the truth of premises (soundness).
      ○ (A formal sketch of this distinction follows this section.)

Inductive Reasoning
   ○ Definition: Moves from specific observations to a general conclusion.
   ○ Key Features:
      ○ Strength: The conclusion is probable given the premises.
      ○ Cogency: Requires strength and true premises.
   ○ Example:
      ○ Observation: Every crow observed in Ontario is black.
      ○ Conclusion: All crows in Ontario are black.
   ○ Critical Thinking Focus:
      ○ Assess the likelihood of the conclusion (strength).
      ○ Avoid hasty generalizations (e.g., "I met one rude French person → all French people are rude").
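To make the validity/soundness distinction concrete, here is a minimal formal sketch of the Socrates syllogism in predicate-logic notation. The predicate letters H (is human) and M (is mortal) and the constant s (Socrates) are illustrative labels, not notation from the lecture.

```latex
\documentclass{article}
\usepackage{amsmath}
\begin{document}
% Validity is a property of the argument's form alone;
% soundness additionally requires that the premises are true.
\[
\begin{array}{lll}
\text{Premise 1:} & \forall x\,\bigl(H(x) \rightarrow M(x)\bigr) & \text{(all humans are mortal)}\\
\text{Premise 2:} & H(s) & \text{(Socrates is a human)}\\ \hline
\text{Conclusion:} & M(s) & \text{(Socrates is mortal)}
\end{array}
\]
\end{document}
```

Swapping in a false generalization for Premise 1 (such as "All birds fly") leaves the form, and therefore the validity, untouched, but the argument is no longer sound; that is exactly the penguin example in the peer insights below.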
2. Connecting Reasoning to Barriers
   ○ Deductive Reasoning vs. Relativism:
      ○ Relativism undermines objective truth, making premise validation impossible.
      ○ Example: A relativist rejecting "All humans need water" due to subjective truth claims.
   ○ Inductive Reasoning vs. Group Thinking:
      ○ Groupthink leads to hasty generalizations (e.g., "Everyone invests in crypto → It's safe").
      ○ Solution: Use inductive reasoning cautiously (e.g., analyze diverse data).
   ○ Skepticism & Soundness:
      ○ Radical skepticism questions all premises, but sound deductive arguments require accepting some truths (e.g., "Water boils at 100°C at sea level").

Peer Insights & Examples
   ○ Deductive Fallacy:
      ○ "All birds fly. Penguins are birds → Penguins fly."
      ○ Flaw: False premise (not all birds fly); the argument is valid but unsound.
   ○ Inductive Strength:
      ○ "10/10 patients recovered after taking Drug X → Drug X is effective."
      ○ Weakness: Small sample size; no control group. (See the sketch after this list.)
   ○ Relativism in Media:
      ○ "My truth" narratives in debates (e.g., flat-Earthers dismissing scientific consensus).
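The "Inductive Strength" peer example can be made quantitative. Below is a minimal sketch, assuming the Wilson score interval as one standard way to put error bars on an observed proportion; the helper name wilson_interval and the 95% confidence level are illustrative choices, and only the 10-out-of-10 figure comes from the example above.

```python
import math

def wilson_interval(successes: int, n: int, z: float = 1.96) -> tuple[float, float]:
    """Approximate 95% Wilson score confidence interval for a proportion."""
    p_hat = successes / n
    denom = 1 + z**2 / n
    centre = (p_hat + z**2 / (2 * n)) / denom
    half_width = (z / denom) * math.sqrt(p_hat * (1 - p_hat) / n + z**2 / (4 * n**2))
    return max(0.0, centre - half_width), min(1.0, centre + half_width)

# Peer example: 10 out of 10 patients recovered after taking Drug X.
low, high = wilson_interval(10, 10)
print(f"Observed recovery rate: 100%, but the 95% interval is roughly {low:.0%} to {high:.0%}")
# With n = 10 the interval stretches down to about 72%, so "10/10 recovered"
# is compatible with a drug that fails roughly a quarter of the time, and
# without a control group even that says nothing about causation.
```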