
Summary

These are study notes for a philosophy midterm exam. They cover the philosophy of science, critical thinking, and predictable irrationality, and include definitions, key concepts, and examples for each topic.

Full Transcript

Chapter 1: What is philosophy of science?
=========================================

What is science?
----------------

- Sciences attempt to explain certain aspects of reality based on observation.
- Pseudoscience refers to theories that may appear scientific but are not.
- Realists think that scientific theories represent reality truthfully.
- Anti-realists think that scientific theories can make accurate predictions, but deny that they actually represent reality.
- Karl Popper: scientific change happens gradually; new theories represent the world more truthfully than the theories they replaced.
- Thomas Kuhn: sciences undergo revolutions, discarding everything that came before. Anti-realist: sciences do not come closer to the truth over time, they just switch ways of looking at the world.
- Scientists are constantly making choices about what to study (what they find important), and therefore science can never be completely objective.
- Bad theories lead to bad practices and bad policies, so we must permanently question the theories, models and assumptions we use.

What is philosophy?
-------------------

Philosophy is:

1. A rational way of thinking (understanding the world by using your own power of understanding)
2. Radically critical (it questions the grounds or foundations of any theory)
3. Looking beyond the boundaries of different domains (unlike the sciences)

The importance of philosophy of science
---------------------------------------

- Its importance lies in its reflective and critical approach as well as in its broad scope (reflecting on theories and assumptions, and putting findings in a broader context).
- It also attends to the processes of science instead of only the results.

Chapter 2: Predictably irrational
=================================

What is (and what is not) critical thinking?
--------------------------------------------

Critical thinking consists of forming beliefs in a rational way (not intuitive and/or emotional) and an autonomous way (not relying on tradition and/or authority). Critical thinking is not the same as negative, skeptical, intelligent, creative or well-informed thinking.

The goal of critical thinking
-----------------------------

Critical thinking aims to distinguish sense from nonsense. Critical thinking must be learned; we must constantly beware of reasoning errors. No one is immune to irrational thinking.

The usefulness of critical thinking
-----------------------------------

We make many decisions every day, and it is relatively new in the history of humankind that we have so much information at our disposal. In the Middle Ages, life was already settled before it began; today that is not the case. We have never been so dependent on information, and we must find out for ourselves whether information is reliable. Nonsense has always been around, but the amount of nonsense we are served today is greater than ever. Moreover, nonsense breeds more nonsense.

The tenacity of nonsense
------------------------

Whereas blatantly irrational views generally seem completely absurd to an outsider, people within groups that hold these views are often not aware of the bizarre nature of their convictions. This makes it harder to expose them. With our intuitions or common sense, we can perhaps expose the most outrageous claims, but certainly not all illusions. On the contrary, irrationality often stems from our intuitive and spontaneous thinking. So normal thinking leads us astray; it makes us predictably irrational, according to Ariely.

Three rules of thumb
--------------------

In most cases we can set our thinking straight by using three rules of thumb (Braeckman):

1. Do not accept a claim merely because it sounds plausible; rely on external support to assess its reliability.
2. Use Occam's razor: the most economical or parsimonious explanation is often the best.
3.
Beware of cognitive pitfalls; everyone is susceptible to them. Cognitive illusions are systematic, permanent and universal.

Predictably irrational
----------------------

To learn to think critically, we first need to become aware that our spontaneous thinking deceives us in predictable ways.

List of reasoning errors
------------------------

### General reasoning errors

- Confirmation bias: The tendency to search for, interpret, favor and recall information in a way that confirms one's preexisting beliefs or hypotheses.
- Irrational cognitive dissonance reduction: When gathered information contradicts our belief, we tend to interpret that information in such a way that it no longer contradicts our belief.
- Overconfidence bias: Too much confidence in the correctness of our own beliefs.
- Dunning-Kruger effect: The tendency of laypeople to overestimate their knowledge and of experts to underestimate theirs.
- Bias blind spot: We detect reasoning errors much more easily in the reasoning of others than in our own.
- Self-overestimation: We overestimate our own talents and prospects in life.
- Belief bias: Accepting the validity of an argument simply because the conclusion sounds plausible or because you agree with it.
- Hindsight bias: With hindsight knowledge, it often seems quite probable that something would occur, and we overestimate the probability that we would have known.
- Stereotyping: Expecting an individual of a particular group to have certain characteristics without having information about that individual.

### Reasoning errors of investors/consumers

- Choice-supportive bias (or post-purchase rationalization): We remember the choices we made in the past as being better than they actually were.
- Endowment effect: We accord more value to something simply because we own it.
- Bandwagon effect (ingroup bias): We adopt beliefs too quickly when they come from people in our group, and we blindly follow the behavior of the group.
- Anchoring: A given piece of information can strongly influence our estimates.
- Framing effect: Drawing different conclusions from the same information because it is presented differently.
- Loss aversion: We feel the negative impact of a loss more intensely than the positive impact of a gain of the same size.
- Sunk cost fallacy: Taking incurred and non-recoverable costs into account when deciding whether to continue with a project.

### Statistical/mathematical reasoning errors

- Statistical reasoning errors: Intuitively, we perform poorly at estimating probabilities.
- Base rate fallacy: We tend to ignore base rates when estimating the probability that something will occur. In general, we often turn a blind eye to general, implicit information and focus exclusively on specific, explicit information.
- Availability bias: We overestimate the likelihood that something will occur when it is easy to recall or imagine.
- Gambler's fallacy: Expecting a statistical correction when that expectation is not justified.
- Hyperactive pattern detection: Seeing patterns in random series.
- Exponential reasoning errors: We underestimate exponential growth because we are used to linear growth.

Ariely summarizes: Even the most analytical thinkers are predictably irrational; the really smart ones acknowledge and address their irrationalities.

Chapter 3: Why are we irrational?
=================================

The peculiar architect of our thinking
--------------------------------------

Since Darwin, the architect of our thinking apparatus is known: evolution by natural selection. It is blind, has no say in the materials with which it works, and has one goal: reproduction.
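Two of the statistical reasoning errors listed above become vivid with a little arithmetic: the base rate fallacy and the exponential reasoning error. The sketch below uses hypothetical illustration numbers (the disease prevalence, test accuracy, and doubling scenario are not from the text):

```python
# Base rate fallacy: a screening test is 99% sensitive with a 5%
# false positive rate, but the disease affects only 1 in 1000 people.
# Intuition says a positive result means ~99% chance of disease;
# Bayes' theorem says otherwise, because the low base rate dominates.
prevalence = 0.001          # P(disease) -- hypothetical
sensitivity = 0.99          # P(positive | disease)
false_positive = 0.05       # P(positive | no disease)

p_positive = sensitivity * prevalence + false_positive * (1 - prevalence)
p_disease_given_positive = sensitivity * prevalence / p_positive
print(f"P(disease | positive) = {p_disease_given_positive:.3f}")  # 0.019

# Exponential reasoning error: linear intuition says doubling one
# cent 30 times yields pocket change; the true result is enormous.
dollars_after_30_doublings = 0.01 * 2 ** 30
print(f"One cent doubled 30 times: ${dollars_after_30_doublings:,.0f}")
```

Despite the "99% accurate" test, a positive result here implies under a 2% chance of disease, and thirty doublings turn a cent into more than ten million dollars — exactly the gap between intuitive and calculated estimates the notes describe.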
Evolution by natural selection
------------------------------

Evolution is the fact that species change over time and that all life forms share a common ancestor. It is mainly driven by natural selection. The process consists of three steps:

1. Random genetic mutations occur.
2. Genetic mutations are passed on to offspring.
3. Mutations that help the organism survive and reproduce in its environment are selected.

What does this entail for our thinking?
---------------------------------------

Because natural selection cannot anticipate the future and does not always have optimal mutations to select from, the process often results in suboptimal designs, our brains included. We can assume that, with regard to our brains, more optimal designs are possible. But the goal of natural selection is reproduction, and genes are successful to the extent that they provide the organism with characteristics that increase its chance to reproduce.

Truth is an expensive means to an end
-------------------------------------

Each characteristic of an organism is selected only insofar as it yields a reproductive advantage. Truth is usually the best strategy to increase the chances of survival and reproduction. But truth comes at a cost: it requires a lot of brain power. More brain power can only evolve if the benefits it generates for the organism are greater than the additional costs it requires. Natural selection is interested in the truth only to the extent that it is relevant for survival and reproduction, and it wants the truth to be as inexpensive as possible.

System 1 and system 2
---------------------

Our thinking apparatus was developed to function rapidly and economically. As a result, we are equipped with a thinking system that is both fast and frugal (economical). System 1 works automatically, quickly and intuitively. System 2 is the system that can check the output of system 1; it is slow, conscious and requires effort. In general, system 1 is in control.
The fallibility of system 1
---------------------------

1. System 1 is a system of approximation.
2. Error management: avoid costly mistakes; it is better to make many cheap mistakes than a few costly ones.
3. Evolutionary mismatch: system 1 has been designed to guide us through our historical environment.

### Heuristics: simplicity trumps complexity

System 1 regularly leads us astray because it makes use of heuristics. Heuristics are simple rules of thumb that generally produce good results but can sometimes be misleading. System 1 is a system of approximation: it applies simple rules to get as much relevant truth as possible, as quickly and cheaply as possible, but it is fallible. This gives rise to the availability bias.

### Error management

System 1 can be misleading, as it replaces complex reasoning processes that require much information with simple reasoning processes that can be carried out in a split second. Natural selection has not designed system 1 to be as accurate as possible, but to avoid costly mistakes; therefore it makes more mistakes overall. This is error management.

### Evolutionary mismatch

The third reason why system 1 sometimes deceives us is that heuristics have been designed to guide us through the environment in which we have spent most of our evolutionary history, not the environment of today.

Taking stock
------------

System 2 is the lazy controller, which can put our thinking back on track but requires a lot of effort. System 2 only intervenes when there is no response from system 1 or when we deliberately switch it on. A critical thinker recognizes that system 1 is fallible and calls upon system 2 to check the output of system 1.

Other sources of irrationality
------------------------------

### The social environment

Humans are part of a social environment; our survival and reproductive opportunities depend on our relationships with members of the group. Our cognitive apparatus is therefore also designed to navigate the social environment, where truth is less important.
We often benefit from deceiving others. To deceive others successfully, natural selection has equipped us with a resourceful bias: the best way to deceive others is to deceive yourself. This explains self-overestimation, and here too system 1 leads us astray.

### The irrationality of system 2

Our slow and conscious thinking processes also deceive us in certain ways, due to their function in the social environment. Reasoning is primarily used to argue with others, where persuading others that you are right is more important than actually being right. Therefore, natural selection has equipped us with capabilities that are designed to persuade. The cognitive mechanism that helps us persuade others that we are right is the confirmation bias, and it often comes at the expense of actually being right: the confirmation bias selectively suppresses everything that could contradict our beliefs and opinions. It also creates another bias, the overconfidence bias: the tendency to overestimate the odds that we are right.

### Emotions

Another prominent distorter of truth is emotion. We are hotheaded primates who see the world through a prism of emotions. The reason that we have emotions: natural selection is only interested in actions, and to act you need a belief and a desire. The affect heuristic consists of making decisions on the basis of emotional reactions, not on available information and objective analysis. It makes us bad investors and bad policy makers. The ingroup-outgroup bias also has profound negative consequences for society and distorts our thinking.

Chapter 4: Irrationality in action
==================================

Superstition, horoscopes and palm reading
-----------------------------------------

An important source of irrationality is our 'overperception' of causal relations; this bias leads to superstition. We are all prone to making false connections.
It is a by-product of the fact that natural selection wanted to make sure that we did not overlook any causal relations in our environment.

Correlation does not imply causation
------------------------------------

Correlation does not imply causation.

Order in randomness and randomness in order
-------------------------------------------

We often underestimate the role of chance, randomness and coincidence. We see order in randomness and imitate randomness with order.

Chance blindness
----------------

The chance factor is grossly underestimated. When we are confronted with extreme coincidences, we often refuse to see those events as merely accidental.

Causal reasoning errors
-----------------------

We are overzealous in finding causal relations: we see many causal relations that are not there, and if we cannot find out the real causes, we just make something up. Moreover, we misinterpret causal relations.

Conspiracy theories
-------------------

Conspiracy theories provide accounts of events that differ strongly from their official version. The official version is seen as a cover story set up by the guilty party. Conspiracy theories often arise from basic causal reasoning errors.

### The ingredients for conspiracy theories

Other biases also play a role. Confirmation bias: conspiracy theorists focus almost exclusively on the evidence that supports their theory. Ingroup-outgroup bias: they surround themselves with like-minded people. Shielding a theory from criticism is uncritical thinking, and conspiracy theories have immunization strategies that protect the theory against refutation.

### Debunking conspiracy theories

Conspiracy theories tend to become increasingly less likely as they evolve, as they need to come up with ever more supporting evidence. Remember Occam's razor: the most economical explanation is the most probable. The more questions a theory raises, the less likely it is.
Pseudoscience
-------------

### Popper's demarcation criterion

According to Popper, a theory is scientific if it can in principle be refuted by means of observation.

### Freud vs Einstein

Popper's insight came while he was reflecting on Freud's psychoanalysis and Einstein's theory of relativity. Freud's theory could not be debunked by observable facts, while Einstein's could.

### The importance of criticism

Science remains open to criticism, while pseudoscience protects or immunizes its theories from refutation. Still, scientists are susceptible to the confirmation bias. Fortunately, the scientific context and methodology protect the sciences against it: in the scientific context, there is no lack of motivation to refute a theory.

### Immunization strategies

This environment of open criticism is not present in the pseudosciences. Theories are shielded from criticism by immunization strategies:

1. Pseudoscientists often weaken their claim or give it a new interpretation when there is strong counterevidence.
2. They build enough vagueness into the theory.

### Healthcare: a perfect storm!

Pseudoscience is most prominent in healthcare. Three factors explain why this is the case:

1. The confirmation bias of the therapist
2. The placebo effect on the patient
3. Spontaneous healing of the patient

### Randomized double-blind trials with control group

To avoid these pitfalls, therapies and medication should be tested in randomized double-blind trials with a control group.

### What about traditional medicine?

Just because something has been practiced for centuries does not mean that it is beneficial, or even that it is not harmful.

Religion
--------

Religion is universal but often comes at an evolutionary cost. Irrationality thrives here on a large scale.

### The ingredients for supernatural beliefs

- Hyperactive agency detection: seeing intentional agents behind random natural events (thunder, lightning).
- Intuitive dualism: we tend to perceive mental phenomena as separate from physical or material phenomena.
- Preference for teleofunctional explanations (explanations in terms of purposes).
- Ingroup-outgroup bias: religious beliefs spread like wildfire within groups.

The myth of homo economicus
---------------------------

Irrationality also shows up in everyday life. The traditional view held that economic actors are always rational (homo economicus); this belief has been overthrown. We are very prone to anchoring and framing. Our behavior appears to be far removed from the rational behavior assumed in traditional economics.

Chapter 5: Mastering critical thinking
======================================

Three sources of reasoning errors
---------------------------------

Learn how you can protect your thinking from the irrationality that arises from:

1. Intuitive thinking (system 1)
2. Interference from emotions
3. The confirmation bias and overconfidence bias (biases of system 2)

Intuitive reasoning errors
--------------------------

### There is no off-switch for system 1

Realize that system 1 cannot be switched off; the only protection is to check its output with system 2. We are not inclined to reflect upon the reliability of our intuitions. Even in contexts where we mainly rely on our conscious and reflective thinking processes, system 1 remains active behind the scenes.

### Can we never trust our intuition?

Whether we can rely on our intuition depends on its origin and the context.

### Two kinds of intuitions

Intuition refers to two sources of beliefs:

1. Genetically anchored or innate thought processes: fast and frugal reasoning abilities shaped to navigate our ancestral environment; fallible, with reasoning errors due to error management and evolutionary mismatch.
2. Automatic but acquired thought processes.

### Ecological rationality

Gigerenzer: heuristics are not misleading because they are simple; rather, they are well-adjusted tools that evolved to deal with important, ecologically relevant problems. They enable us to solve such problems quickly and accurately.
But these heuristics/intuitions are only reliable insofar as they are applied in an ecologically valid context (the ancestral context). We can trust our intuitions if we apply them in an ecologically valid context, but we should always be aware of the cognitive pitfalls inherent to our intuitive thinking and switch to system 2 in other contexts.

### Acquired intuitions

Automatic and unconscious thinking is also shaped by our experience. When first learning a skill, you use system 2; after enough practice, system 1 takes over.

### A manual for intuitions

Can we trust our intuition?

1. Check its origin: innate intuition or acquired intuition.
   a. If innate: check whether the context is one in which our intuition is generally reliable.
      i. If ecologically valid: the intuition could be true.
      ii. If not: do not follow the intuition.

Emotions
--------

Emotion is another source of irrationality. Our thinking is not always isolated from our feelings; we often develop emotional ties with our beliefs.

### Irrational forms of 'cognitive dissonance reduction'

When we are presented with strong counterevidence to our beliefs, a state of cognitive dissonance occurs: we find it unpleasant that our beliefs are not consistent with the information that comes from reality. As emotional beings we often refuse to discard or adjust our beliefs. Instead we engage in irrational dissonance reduction: we do not adapt the belief to the facts, but interpret the facts in such a way that the belief remains intact.

How can we curb the confirmation bias?
--------------------------------------

Our conscious and reflective thinking also leads us astray. The biases arising from system 2 are the confirmation bias and the overconfidence bias. We can protect our thinking in two ways:

1. Limit the confirmation bias by being aware that we are all affected by it and making a conscious effort to record evidence that would refute our beliefs (as Darwin did).
2. Surround ourselves with people who think differently.
### The wisdom of the crowds

Other people are very good at uncovering fallacies in our arguments, and vice versa; only, we seem to lose most of this ability when it comes to our own beliefs and arguments. Groups generally come to more accurate beliefs than individuals: the wisdom of the crowds. Large groups of laymen often prove better at making predictions than the best experts. However, the group should not behave as a group, otherwise social emotions take over and the wisdom of the crowds disappears.

### The overconfidence bias

By making our beliefs vulnerable we also get rid of the overconfidence bias. People who are generally very confident that they are right tend to predict worse than people who are less certain. The more certain we feel, the more we succumb to tunnel vision and the more oblivious we become to counterevidence. So expose your beliefs to the critical gaze of others.

### The extended mind thesis

For most of our history as modern human beings, we did not get much further than making fire and rudimentary tools. Not the use of brain activity in isolation, but the use of external elements in our thinking made possible the leap forward to where we are now.

### Three levers for our thinking

Three types of mind-external levers can bring our thinking to a higher level:

1. Other minds: collaboration through time and collaboration at the same time.
2. Cognitive artifacts: the use of logic, mathematics and language in our thinking.
3. Instruments: extending memory with writing, extending the reach of our senses with technology, and performing complex computational operations.

### Outsourcing our thinking

The true power of our thinking resides outside of our heads; we must outsource our thinking. To think better, we must acknowledge the limitations of our thinking. A critical thinker is someone who consciously makes an effort to get rid of tunnel vision.

### Thinking about thinking

We should keep questioning the output of our thinking.
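The wisdom-of-the-crowds effect described earlier in this chapter can be sketched with a small simulation. The jar-of-beans setup and all numbers are hypothetical illustrations; the key assumption, matching the text, is that guesses are independent (the group does not "behave as a group"):

```python
import random
import statistics

random.seed(42)

TRUE_COUNT = 800   # actual number of beans in the jar (hypothetical)
GUESSERS = 1000

# Each guess is independently noisy around the truth.
guesses = [TRUE_COUNT + random.gauss(0, 200) for _ in range(GUESSERS)]

crowd_estimate = statistics.mean(guesses)
crowd_error = abs(crowd_estimate - TRUE_COUNT)
typical_individual_error = statistics.mean(abs(g - TRUE_COUNT) for g in guesses)

print(f"crowd error: {crowd_error:.1f}")
print(f"typical individual error: {typical_individual_error:.1f}")
# The crowd's average lands far closer to the truth than the
# average individual, because independent errors cancel out.
```

If the guesses were correlated — say, everyone anchored on one loud voice — the errors would no longer cancel, which is exactly why the notes warn that social emotions destroy the effect.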
Chapter 6: The importance of critical thinking
==============================================

Are there beneficial illusions?
-------------------------------

Some illusions may be useful, but we must be on guard, because:

1. They often come with negative consequences.
2. Illusions tend to branch out in our thinking.

The impact of irrationality on the world
----------------------------------------

### Overconfidence and war

Overconfidence can lead to war. Johnson: there is a correlation between the form of political decision-making and the chance of undertaking military action. The way in which protagonists deal with intelligence also plays a role.

### The ingredients of financial crises

Overconfidence and the illusion of control also impact the financial world; it is an illusion that you can predict the market. Hyperactive pattern detection shows up in attempts to detect patterns in the movements of markets. Success bias also plays a role in the financial market. The price for the irrationality of investors falls on society as a whole.

What about religion?
--------------------

One domain of illusions was long considered beneficial and even necessary: religion. Religion was traditionally regarded as the foundation of morality.

### Religion and morality

For the vast majority of human history, religious beliefs were not linked to moral rules. Moral norms were introduced into religions quite recently, because groups with such moral religions were better able to maintain internal harmony and cooperation. The moral concern of 'big gods' comes with a dark side. Moral norms work in two directions: on the one hand they increase cooperation within a group; on the other hand, they increase hostility towards other groups. So moral norms in religion are mainly focused on relationships within the group, and religion often hinders moral progress.

Critical thinking and moral progress
------------------------------------

True moral progress comes from rational, critical thinking.
Rationality itself does not lead automatically to moral behavior; our reasoning ability and capacity for critical thinking do. Singer's escalator: by reasoning about morality we come to moral behavior that is far removed from the type of behavior for which natural selection has equipped us with moral intuitions.

### The natural selection of moral intuitions

Moral intuitions evolved to strengthen harmony and cooperation within small hunter-gatherer bands, not to extend that courtesy to rivaling bands. In the modern context, where different groups live together, the ingroup-outgroup bias causes racism and wars.

### Four centuries of moral progress

By thinking rationally about morality, we can make moral progress. We see a huge wave of moral progress from the moment that philosophers began to reflect on morality (after the Middle Ages). Our rational and critical thinking has greatly increased the scope of our empathy and has made the world a much better place.

Critical thinking and progress in general
-----------------------------------------

The importance of critical thinking goes beyond the domain of morality. In the history of Western thought, there were two breakthroughs in which dogmatic thinking was replaced by critical thinking: first with the birth of philosophy in ancient Greece, and second with the advent of modernity in the Renaissance. Irrationality is not innocent: it leads people to wage war in the name of God or some utopian ideology and makes people surrender to the ingroup-outgroup bias. Bad thinking leads to bad outcomes; a better world follows from better thinking.

The major challenges of today
-----------------------------

Today, there are challenges for which natural selection has not equipped us. Pessimists think humanity is headed towards its end; optimists argue that necessity is the mother of invention. Our thinking is both our greatest asset and our greatest threat.
A lasting struggle
------------------

Critical thinking is not a spontaneous way of thinking; it must be learned. Human thinking can revert into dogmatism. Anyone can participate in critical thinking, and it brings progress. The fate of future generations will be sealed by the quality of our thinking.

Chapter 7: The importance and reliability of science
====================================================

The scientific method
---------------------

Which aspects of the scientific method make the sciences reliable? There is no single scientific method that all sciences share.

Human and natural sciences
--------------------------

Leaving aside the formal sciences (mathematics and logic), there are two distinct families of sciences: the social or human sciences and the natural sciences.

### Different objects, different aims

The human sciences focus on human thinking, acting and interacting. The natural sciences focus on the physical and natural world. Dilthey: the two types have different goals. The natural sciences seek to explain; the social sciences aim to understand. He criticized the positivists, who aimed to provide the social sciences with the same quantitative methods as the natural sciences and sought to discover general laws within the domain of the human sciences. Dilthey: the natural and human sciences not only have completely different objects, but also very different methods and aspirations.

### Looping effects in the human sciences

Another difference: in the natural sciences there is no interaction between a theory and its object, while in the human sciences there often is. Hacking's looping effect: in the human sciences, a theory can influence its object, because a theory can inform the people it describes, and this can influence their thinking and actions.

A self-correcting process
-------------------------

According to some, the term 'social sciences' is an oxymoron, since society can never be described scientifically.
Yet the human sciences and natural sciences have an important characteristic in common: they are self-correcting. Sagan: the methodology and context of science correct for the mistakes made by scientists.

How is science protected against the reasoning errors of scientists?
--------------------------------------------------------------------

Scientists are susceptible to reasoning errors. The methodology and framework of science protect against the reasoning errors to which the human mind is susceptible:

- Scientific methods protect against intuitive reasoning errors by making extensive use of cognitive artifacts (mathematics, logic, statistics).
- The scientific framework and context protect against the pervasive biases of system 2 through peer review. Scientists do not lack motivation to try to revise the influential theories of their time.
- The scientific framework protects against the overconfidence bias, because experiments must be reproducible.
- Scientists engage in meta-analysis: pooling all studies on a given phenomenon to reach more robust conclusions. This can weed out distorted results.
- Scientists publish not only the results of their research, but also their methodology.

These steps make science a self-correcting process. The scientific framework also protects against emotional distortion through the existence of rival research groups.

### The power of the community

The more scientists join the ranks and try to improve each other's theories, the faster we move forward. The only condition for progress is that new theories are shared with the entire scientific community and thus subjected to criticism.

The importance of scientific progress
-------------------------------------

It can hardly be overestimated. Rational thinking is the driving force behind improving living conditions throughout history. The sciences have a crucial role not only in improving our living standards and conditions, but also in improving society.
To improve society, we need to develop the human sciences, which are still in their infancy compared to the natural sciences.

Self-censorship in the human sciences
-------------------------------------

Reaching a better understanding of humans and society is often impeded because there are taboo subjects or issues that scientists steer clear of for fear of causing negative social consequences.

The demarcation criterion
-------------------------

The sciences must therefore be inclusive and not engage in self-censorship in the search for truth. That does not mean that everything must be admitted. On the basis of what criteria do we distinguish legitimate science from pseudoscience?

### Popper's falsifiability

The most influential demarcation criterion is Popper's criterion of falsifiability: a theory is only scientific if it is testable, meaning it must in principle be possible to refute the theory on the basis of observation. In Popper's time, the traditional demarcation criterion was verifiability: a theory is only scientific when observation shows that the hypothesis is right. Popper: no theory has ever been verified; there is always a possibility that future observations will falsify it. Another proposed demarcation criterion is confirmation, but a hypothesis is not scientific merely because it is supported by observation. Verifiability is too strong; confirmation is too weak. Popper: scientific progress is driven by conjectures and refutations; the consequence is that we can never state with absolute certainty that a theory is true. Scientific investigation is a constant attempt to refute existing theories. Popper's demarcation criterion has been criticized: scientists do not appear to practice science in the way Popper envisioned it — they do not throw a theory overboard when confronted with counterproof — and moreover, pseudoscientists also sometimes make falsifiable claims.
### Feyerabend's epistemological anarchism

A more fundamental criticism came from Feyerabend, who considered himself an epistemological anarchist. There is not one correct way to understand reality, but many different and valuable ways. The world is more complex than presented in scientific models, and when we limit ourselves to them, we are left with an impoverished worldview. His principle: anything goes. He argued that a demarcation criterion prevents new knowledge from being acquired, and this prevents knowledge from progressing. Major scientific breakthroughs came about because scientists broke the rules of their time.

### Postmodern constructivism

Feyerabend's view fits in the context of postmodernism: there are no objective facts, only constructions and interpretations. Scientists are sculptors of reality. Feyerabend strove for a separation of state and science.

Sokal's hoax
------------

This postmodern constructivism did not convince everyone, and Sokal responded in a remarkable way. Postmodern thinking opens the door to a whole lot of nonsense, and Sokal thought that we should distinguish meaningful theories from nonsensical ones. So he came up with a hoax: he submitted a purposely nonsensical article to a leading academic journal and got it published.

### The danger of 'anything goes'

If we open the doors of what is academically acceptable too widely, we risk drowning in nonsense. Without a demarcation criterion, science inevitably loses its power, for two reasons:

1. The sciences can only progress if epistemic and methodological principles are shared by the scientific community.
2. Scientists build on the work of others.

Striking a balance
------------------

Try to strike a balance between openness and restriction, and develop the habit of gauging the reliability of a belief by considering the way it came about. The future will be determined by the quality of our thinking.
