Science, Technology, and Society: Module 1

Summary
This module provides an overview of Science, Technology, and Society (STS). It explores the relationship between scientific progress and its social, ethical, and cultural contexts. The module covers topics such as the history of science, scientific revolutions, ethical dilemmas in science and technology, and a general survey of science and technology in the Philippines.
MODULE 1

Section 1: Science and Technology

Science, Technology, and Society (STS) forms part of the foundation courses in which every student is expected to demonstrate competence, primarily for its role in shaping a holistic individual and productive citizen.

Science - comes from the Latin word scientia, meaning "knowledge." It is a system or method of finding answers to questions about the nature of the universe. As a discipline, it is empirical, meaning its arguments are based on observations and experience, which are systematically collected and then analyzed. It is also evidence-based, meaning its descriptions of the behavior of the universe and of living and non-living things and systems are grounded on data, which have been systematically and critically analyzed. Note that this empirical and evidence-based nature is of as much value in the study of the SOFT SCIENCES - psychology, sociology, philosophy, and communication - as in the HARD SCIENCES - physics, biology, chemistry, and physiology.

For his part, the famous American science historian John Heilbron viewed science not only in terms of what it is but also in light of what it does. For him, "modern science is discovery as well as invention" (Heilbron, 2003, p. vii). On the one hand, he considered science as a discovery of regularity in nature, enough for natural phenomena to be described by principles and laws. On the other hand, he explained that science required invention to devise techniques, abstractions, apparatuses, and organization in describing these natural regularities and their law-like descriptions.

We also learned from our early dealings with science and technology as an academic subject that technology, which almost always comes along with science, is the application of scientific knowledge, laws, and principles to produce services, materials, tools, and machines aimed at solving real-world problems. While science comes from the Latin scientia, technology comes from the Greek root techne, meaning "art, skill, or cunning of hand." Needless to say, technology is very purposive. It has practical reasons beyond just describing the behavior of the universe. Always, this purpose is to make our lives more convenient, more efficient, and more comfortable. Technology empowers humans. This is most succinctly captured in Facebook CEO and co-founder Mark Zuckerberg's definition of technology, given in response to an audience question during a December 2014 live public town hall session (Stalling, 2019). Zuckerberg quipped: "What defines a technological tool-one historical definition-is something that takes a human's sense or ability and augments it and makes it more powerful. So, for example, I wear contact lenses or glasses; that is a technology that enhances my human ability of vision and makes it better."

It is interesting to note that, because modern science drives present-day technological advancements, many of us think that science came before technology, when in fact the opposite is true. Technology is much older than science. Technology existed even before our ancestors formalized the study of the behavior of the universe. In fact, some of the great leaps of technology during critical periods of science history took place without much contribution from science as a formal field of study. Instead, our ancestors used trial and error methods to test the tools, materials, products, and even infrastructure they built back in the day.
For example, the precise and state-of-the-art engineering sciences we enjoy today progressed from an ancient engineering practice which tested the strength of a building using the "five-minute theorem." Our ancestors believed that if a building stood still five minutes after its scaffoldings and supports were removed, it would probably last forever. Against our modern understanding of engineering sciences, the five-minute theorem not only sounds unscientific but also risky and very dangerous. And yet, looking at some of the great man-made wonders of the ancient world, we cannot help thinking how, without as much science as we enjoy today, these ancient structures and buildings remain among the greatest feats of engineering and continue to stand the test of time. There is no debate anymore as to which came first. In fact, the only reason we still mention this part of science and technology history is that it will come in handy in the succeeding discussions on the moral and ethical foundations of science and technology.

The Birth of STS as a Field of Study

The birth of STS as an academic and social discipline can be traced back to two important periods of 20th-century history. First, the period between the end of World War I and the beginning of World War II was a relatively brief yet immensely pivotal junction. This period, popularly known as the "Interwar Period," saw significant changes in economy, politics, and society on a global scale. Second, the period after World War II, commencing in 1945 until roughly 1991, saw a time when previously warring nations continued to wage war through indirect conflict, and is thus popularly called the "Cold War." This period foregrounded the post-World War II geopolitical tensions between the former Union of Soviet Socialist Republics (USSR) and primarily the United States of America (USA) and its allies in Europe.

During these periods, when countries upgraded their capacities for DIRECT (e.g., military and nuclear technologies) and INDIRECT (e.g., industrial, communication, and transportation technologies) war and conflict, social scientists, such as historians and sociologists, and scientists themselves, such as those who were involved in the discovery and production of science and technologies, became increasingly concerned about the impacts and drawbacks of wars and conflict and the role that science and technology play. At the same time, they became increasingly interested in the seemingly intricate links between scientific knowledge, the technological systems produced out of this knowledge, and society - how society shapes and is shaped by this knowledge and these systems. For example, the scientists who participated in the development of atomic theory could not have easily predicted the impact of their discovery on technology and, by extension, society. It was when they saw the applications of their theory in the building of atomic bombs, and the subsequent devastation brought by their (indirect) participation in the outcomes of these technological applications of science, that they became disturbed and, from being mere spectators, decided to take a more active role in the decision-making processes related to the conduct and applications of science to technology. It was at that critical period of the 20th century that STS as an academic and social enterprise had arguably been born.
STS, therefore, is a relatively young academic field that combines previously independent and older disciplines, such as the history of science, philosophy of science, and sociology of science, which are common undergraduate programs in the United States, United Kingdom, Canada, and Australia, to name a few. STS emerged as an interdisciplinary field out of questions about the dynamic interaction of science and technology with various aspects of society, and it is thus viewed as a socially embedded enterprise. As a socially embedded enterprise, the tenets of STS are meant not only to be learned within the four walls of the classroom but also to empower students and citizens to take a more active role in social change and nation-building through a renewed understanding of the space that science and technology take up in the various spheres of everyday living.

As the Kennedy School effectively encapsulates, STS seeks to bridge the gap between two traditionally exclusive cultures - the humanities (interpretive) and the natural sciences (rational) - so that humans will be able to better confront the moral, ethical, and existential dilemmas brought by the continued rise of science and technology. It applies methods drawn from history, philosophy, and sociology to study the nature of science and technology and judge their value, scope, and impact on society. No one field owns STS. It is exclusive neither to those majoring in the hard sciences nor to those in the social sciences. Instead, it is a hybrid field which aims to integrate cross-disciplinary frameworks, encourage civic engagement, and promote critical thinking regarding timely and relevant science and technology issues affecting people's lives. It questions how scientific discovery and technological developments link up with broad social ventures, such as politics and governance, public policy, law and justice, ethics and morality, economy, sociology, anthropology, and culture. As a result of the awakening that took place during the Interwar Period and the Cold War, STS as an academic discipline commenced. It now concerns itself with how we, individually as humans and collectively as a society, are producers and users of science and technology. This field of study hopes to offer us pedagogical, discursive, and interactive resources with which we can evaluate the benefits and risks and the perils and promises of modern science and technology - just as the atomic bomb scientists did during their time.

Is Science Dangerous?

One classic debate in STS is encapsulated in a question that Lewis Wolpert (2005), a South African-born British developmental biologist, author, and broadcaster, asked in his Medawar Lecture 1998: "Is Science Dangerous?" This question, the author proposes, is where your study of the field as students of STS should start. Studying STS today happens at a time when there is overwhelming fear and distrust of science. There is anxiety over whether scientists are driven by wisdom and social responsibility, rather than ambition, in the pursuit of their research. In pop culture, scientists "playing God" has been a recurring theme. For the most part, this general apprehension towards science has become deeply ingrained in our culture. In the Medawar Lecture 1998, Wolpert (2005) responds to the same question he asked, asserting arguably the most thought-provoking passage in the article as follows: "In contrast to technology, reliable scientific knowledge is value-free and has no moral or ethical value" (p. 1254).
Wolpert (2005) began unpacking this statement by arguing that "science is not the same as technology" (p. 1253). To support this, he himself delineated one from the other. On the one hand, science simply describes how the universe works. Its aim is to produce ideas and explanations about how the world works. On the other hand, technology is the application of that knowledge to create something or use it for some practical purpose. While technology is much older than science, as earlier discussed, modern technology is now based on fundamental science. The problem is that scientists are often faced with political and financial constraints in pursuing their research and transforming their ideas into practice. Scientists need funding from political and business interests and need to be supported with marketing and business skills, which are often outside their own skillset. In contrast to this practical and purposeful nature of technology, how can we impose moral standards and ethical values on a discipline whose only concern is to describe the inner workings of this world and this universe we live in? That we could have possibly evolved from ancestors which shared characteristics with apes, or that our past experiences which lie in an unconscious mind could, at any given moment, influence our behavior, is neither good nor bad. It is simply what it is. The danger arises when this knowledge is used and applied in technology. Sadly, many people today still take science and technology as one and the same, leading to a serious problem, which Wolpert (2005) calls the conflation of science and technology.

This distinction raised in the Medawar Lecture 1998 does not create a perfect dichotomy between science and technology. In fact, if we look closer at the passage, Wolpert (2005) seems to have very judiciously placed the word "reliable" as a safeguard against those who might take his viewpoint as an absolute dichotomy, whereas, in truth, some sciences cannot be free from moral or ethical value. Reliable is the operative word in this case. Indirectly, Wolpert (2005) tells us that when some science or scientific experiment is proven unreliable or breaches the principles of the scientific method, then we can impose moral and ethical standards upon such science or experiments. For example, some experiments raise ethical issues concerning their test subjects, say, when they require human trials, such as vaccine development experiments. Others, such as research on genetic modification of crops and other food products, pose risks to the safety of people and to the richness and balance of biodiversity. Other scientific endeavors are problematic not in what they are concerned about but in how the experiments and research are funded. Research funding has become a staple financial lifeline, especially for researchers and scientists who often have brilliant ideas but very little support from government and science ministries. As such, and without much discernment on their end as well, they become willing or unwilling subjects of manipulation by political or business interests. Conflicts of interest are a very common cause for concern in many scientific disciplines today.
Hence, in many scientific journals, especially those in fields which have direct impact on human health and safety, editorial policies include a statement of conflict of interest and declaration of funding and support for the sake of transparency and, more importantly, so that readers can vet the reliability of the methods and findings used in scientific inquiry. Yes, science is value-free and has no moral or ethical value, but, first, it has to be good. In this context, "good" means reliable. Only when science has established itself as reliable and has proven it presents a reliable description of how the world works-not influenced by external forces or conflicts of interest- can we take Wolpert's words as true. Socially Responsible Science Modern science and technology ushered in an era where many of the challenges and difficulties faced by those who came decades and centuries before us are no longer a cause for concern today. From communication technologies to advancements in transportation, from food safety to breakthroughs in public health, from entertainment to advanced technologies for national security, humans today enjoy the comfort, convenience, and efficiency of 21st-century living. Yet, when we look around us and assess modern life, we cannot help feeling that the problems we face today are made complicated by the same science and technology that are meant to address them. Problems like data breach and invasion of privacy, public health emergencies and pandemics, and hate crimes and gun violence have beset societies even more today as a result of the misuse and abuse of science and the collapse of ethical and moral fibers in the conduct of technology. This is the irony of complexity. As scientific and technological advancements move us forward, they, at the same time, open up the possibilities for more catastrophic outcomes. The popular American scientist Carl Sagan, as quoted in Tom Head's (2004) book, could not put it more clearly: "We live in a society absolutely dependent on science and technology and yet have cleverly arranged things so that almost no one understands science and technology. That's a clear prescription for disaster." It is against this backdrop that we realize the need for something more than just reliable or good science. The complex entanglements of science, technology, and modern societies demand a science that does not only follow the minimum requisites for it to be good, but one that speaks truth to power at a time when governments and business interests seem to use it as a tool for the maintenance of their power and influence. What we need today is not only good science but a socially responsible science. Here are some ways in which scientists can engender socially responsible science as adapted from science and engineering ethicist and consultant Prof. Stephanie J. Bird (2014): 1. Ensure accurate and reliable research - As a primer and as previously discussed, scientists must ensure methodological accuracy in order for science to generate reliable ideas about how the world works and, ultimately, be free from moral and ethical values. 2. Oppose misuse or abuse in the application of research findings - Scientists must stand and speak against those who misuse and abuse scientific ideas for selfish gains and unfair advantage, such as in politics and governance and business and economy. 3. 
Attend to both the limitations and the foreseeable impacts of their work - Scientists must be fully transparent about their research, and that includes discussing the limitations of their experiments and, as a departure from the conservative view on the role of science and scientists, tackling the social, political, economic, and cultural consequences of their findings. 4. Participate in discussions and decisions regarding the appropriate use of science in addressing societal issues and concerns - Beyond their laboratories and research settings, scientists must use their voice and engage in discursive participation on how to use and how not to use their findings in addressing social problems. 5. Bring their specialized knowledge and expertise to activities and discussions that promote the education of students and fellow citizens - Scientists must take an active role in bringing the discussion to the grassroots level (e.g., local communities and everyday users of science) for science to be accessible and meaningful to the ordinary person. This includes participating in activities where scientists talk about their ideas and findings using non-technical language and communicate empathically so as to encourage students and citizens to be genuinely interested and concerned about science and its scope and impact on their lives. 6. Enhance and facilitate informed decision-making and democracy - Where opportunity exists, scientists must participate in discussions of how science can inform effective governance and active citizenship. This includes volunteering their expertise in shaping science-driven policies, such as those in the areas of public health, transportation and communication, data analytics, and food safety and security.

Clearly, the social obligations of scientists are different from those of ordinary people. In the grand scheme of things, scientists have obligations to support democratic societies and ensure that the rights of ordinary people are taken care of by and through science. Wolpert (2005) says that this comes precisely from scientists' privileged access to specialized knowledge about how the world works, which is not easily accessible to ordinary people. Thus, their knowledge has to be used in the pursuit of the greater good.

Public Understanding of Science

One goal of Science, Technology, and Society as an academic discipline is to promote public understanding of science. Ordinary citizens have a general apprehension towards science brought about by a fear of not having sufficient technical and sophisticated understanding of scientific disciplines. To them, casually talking about science seems far-fetched: they think they have neither the technical knowledge nor the appropriate language to do so. As a result, throughout history, scientific findings have been, for the most part, exclusive and accessible only to scientists and a small inner circle, and society has thus missed out on the many benefits of public understanding of science. Marincola (2006) explains that, to succeed in modern society, it is crucial that the public participate in science issues affecting their lives and their own self-interest. Beset with problems which are increasingly and more clearly connected with matters of science, the public will benefit from good science education, as this will allow them to succeed and be productive in science-driven markets and economies and to engage in science-driven issues of democratic participation.
Public understanding of science will also benefit the scientific enterprise itself. When ordinary citizens understand science, innovations and products of scientific endeavors are more likely to be accepted, more likely to be shaped by shared values, and more likely to be geared towards the pursuit of the greater good. The public must demand competitive science education and transparency of scientific data affecting their lives. Scientists must practice an inclusive approach to science education and information dissemination. Governments must encourage participation of the public on matters involving science and public policy.

The importance of a public understanding of science could not be underscored better than in governments' responses to the COVID-19 pandemic. Countries which have done well in preventing, mitigating, and responding to the health, economic, and societal impacts of COVID-19 are countries which have a very vibrant public understanding of science, alongside a local scientific community that has a voice in policy and response and a government that listens to science and its people. To exemplify this, we zoom in on the case of the Republic of Korea. South Korea has had arguably one of the best COVID-19 responses in the world. It was among the first countries to report a rising number of COVID-19 cases in February 2020 as a result of aggressive mass testing of its citizens, contact tracing and isolation, and a highly efficient healthcare system to begin with. In a BBC interview, then South Korean Foreign Minister Kang Kyung-wha underscored the basic principles of South Korea's COVID-19 response: openness, transparency, and fully keeping the public informed. The South Korean public was informed about the government's strategic plans every step of the way. South Koreans were given full transparency of the COVID-19 situation, the government's response, and the people-focused initiatives, which for FM Kang was the reason the South Korean government won the public's trust and support. Alongside this people-focused approach, FM Kang named the keys to South Korea's successful response, with explanations of each drawn from the BBC interview:

a. Testing is central - This leads to early detection, minimizes further spread of the virus, and allows those found with the virus to be treated quickly. As of March 15, 2020, South Korea had tested around 260,000 of its citizens.

b. Government quickly approved the testing system - After the Chinese health authorities released the genetic sequence of the virus in mid-January 2020, South Korean authorities quickly conferred with research institutions and shared that result with pharmaceutical companies, who then produced the reagent and the equipment needed for the testing.

c. People were monitored afterwards - South Korea did not go into the same kind of lockdowns or social exclusion that many European countries opted for at the time, but instead monitored South Koreans through a mobile phone application.

d. The South Korean public is very demanding - Being faithful to the values of South Korean democracy, the government was fully in the service of the South Korean people, who were engaged in the COVID-19 discourse and who expected the highest standards of government service.

South Korea's response to COVID-19 is telling of the role of public understanding in critical science issues that beset global societies today.
One common denominator among countries that did well in their COVID-19 response is that governments in these countries not only listened to their people but also encouraged them to speak up. These governments, like South Korea's, were comfortable with the idea of a demanding public and were unafraid of a public that understands science. The COVID-19 pandemic exposed the cracks not only in countries' healthcare systems and government policies, but also in their state of science education and in how, sadly, many countries have citizens who fall behind the ideals of public understanding of science. Have Filipinos been as demanding as South Koreans have been towards the government's response to COVID-19? Did we expect the highest standard of government service during this pivotal point in history? What forces shaped the seemingly lackluster understanding among Filipinos of their role in shaping science-related public policy, such as the COVID-19 response? Do we have a government that listens to its people? When and from whom shall we demand accountability for the state of science education and public understanding of science in the Philippines? And where do we go from here?

Emerging Ethical Dilemmas

Ethical dilemmas are issues in which a difficult choice has to be made between two or more options. They are called dilemmas because, even as individuals and societies are presented with multiple approaches in dealing with them, no single approach resolves the issue totally free from any potential breach of ethical guidelines. When faced with an ethical dilemma, a person needs to select a response even though no available option fully aligns with existing and established ethical codes, societal norms, or their personal moral judgment of what is right and wrong. In science and technology, ethical dilemmas are issues which challenge us to decide on the most plausible course of action, whose potential benefits to human health, safety, and security outweigh the possible risks. For example, the COVID-19 pandemic is an ethical dilemma in science and technology because, even if we set aside the approaches ill-informed by science that some governments picked as part of their response, the science-based approaches are themselves surrounded with questions regarding their compliance with accepted ethical guidelines. One dilemma was the appropriate action to take at a point in the COVID-19 timeline when the economy was failing yet the number of cases was surging. Did we have to open up the economy? Which components of the economic infrastructure and in what manner did we have to open? Did we enforce maximum restrictions on people's movement? If so, what happens immediately after the restrictions to ensure that such restrictions are not prolonged? And what types of support do we provide the most economically vulnerable? Between opening up the economy and restricting people's movement, can there be a compromise course of action? If there is, which does it lean more towards? What are the safeguards of the compromise? Pandemics, such as the COVID-19 pandemic, are among the most challenging ethical dilemmas in science and technology, which we will all ultimately have to live with. Writer, consultant, and tech ethicist Dr. Jessica Baron, in collaboration with the John J.
Reilly Center for Science, Technology, and Values of the University of Notre Dame (Indiana, USA), has been publishing an annual top 10 list of emerging ethical dilemmas in science and technology since 2012. For 2021, the list, including descriptions from the Tech Top 10 website, consists of the following:

1. Robot abuse - Why would anyone hit a robot? And why is it so uncommon?
2. Doomscrolling - Should you put down your phone for your mental health?
3. Your "Digital Twin" - A digital version of yourself could involve everything from your genome to your search history. But will it be used for good?
4. The tech battle for the Arctic - A new "Cold War" or the setting for WWIII?
5. Secret surveillance apps - Is someone tracking your every move? You may not even know what apps lurk in your devices.
6. Facebook - Is it time to stop pretending that this platform is making our lives better?
7. Selfie medicine - Is this the future of medical care?
8. The sleep-tracking app that alters your dreams - Real-life Inception: cool or creepy?
9. CIVVL - A most uncivil app for evicting people from their homes
10. The weaponization of data voids - People are creating rabbit holes of misinformation.

Even though a few of these items sound familiar, the list points to the ever-growing challenges and complex entanglements of science, technology, and living in the modern world.
These ethical dilemmas will ultimately come knocking at our doors in the future and, when they do, we have to face them head on. Discussing them does not aim to provoke even more fear and anxiety towards science and technology. Quite the contrary, tackling these now is aimed to dispel our fears of things simply because we do not know and understand them. Knowledge of these things is the first step in freeing us from science anxiety. To this end, Science, Technology, and Society as an academic and social discipline capitalizes on the interpretive frame of the humanities and social sciences and the rational lens of the natural sciences to prepare students and future professionals for the great moral and ethical challenges that science and technology in modern society bring into our lives and the lives of those we care for. Section 2 Place of the History of Science in STS Science, Technology, and Society is an interdisciplinary discipline, which brings together previously independent and older disciplines; among those is the history of science. As a subdiscipline of STS, History of Science is, in itself, broad. It encompasses the study of the development of science across time. It seeks to explore how science evolved and is evolving, and analyzes the formation of the sciences as part of the human experience. It also asks questions regarding the relationship between scientific progress and larger social, political, economic, and cultural contexts. These aims span across periods of history, that is from the ancient to modern ages, and different disciplines of the natural and social sciences. Professor Lawrence Principe of Johns Hopkins University explains that historians of science aim to get a true and accurate depiction of science in the past. Particularly, they study how scientific ideas developed (evolution), where they came from (genesis), and why (context). By doing so, historians of science achieve the following applications: 1. Getting a better idea of science in the past gives us a better picture of science nowadays which, in turn, encourages interest in the subject. 2. Studying the subject tells us more about how our ancestors were involved in the same processes of discovery about how the world works. 3. Discussing the history of science benefits students as it dispels the false notion that getting into science requires being a genius. 4. Providing access to the development of science across time encourages more people to go into science. History of Science covers many aspects of scientific historical development. This section zooms in on the historical antecedents of scientific and technological inventions-the contextual forces that necessitated their discovery and production and how these inventions themselves ushered society into greater heights. What's in a Historical Antecedent? An antecedent is defined as a precursor to the unfolding or existence of something. A historical antecedent in science and technology, therefore, can be defined as the previous state of science and technology before something more advanced was created. There are two ways in which this specific definition holds. First, antecedents can refer to the older and tangible counterparts of a more advanced, more efficient, and more useful tool, device, or technology. 
Second, antecedents can also refer to the prevailing and non-tangible societal conditions, which required scientists to think of inventing new tools, devices, and technologies, or improving existing ones, in order to better address the ever-growing challenges at the time. To illustrate this, let us focus on the historical development of bitcoin, a digital or virtual cryptocurrency invented in 2009, which uses peer-to-peer technology and strong cryptography to facilitate online transactions and secure online payments. A bitcoin is essentially a computer file stored in a digital wallet application on a person's smartphone or computer. A bitcoin, or just a part of it, can be sent to your or another person's digital wallet. These transactions are recorded in a public list called the "blockchain." As a medium of exchange, one tangible antecedent of the bitcoin is our traditional currencies and monetary units, such as copper coins and paper money. As for the non-tangible antecedent, several events shaped the need for the creation of the bitcoin. One such event that many refer to in the historical development of bitcoin is the 2008 Financial Crisis, which culminated with Lehman Brothers Holdings Inc.'s declaration of bankruptcy and which is considered the worst financial disaster since the Great Depression of 1929 to 1939. There was mistrust in centralized financial systems as the crisis exposed their weaknesses, which proved costly for investors, many of them small-time investors who relied on stock market investments to grow their money. Thus, the creation of the first bitcoin came after this crisis as a strategy for users to take control of their money, instead of trusting the financial system, which had proved vulnerable to collapse, making people lose their hard-earned money and investments in an instant. Attention to the tangible and non-tangible definitions of a historical antecedent in science and technology is one of the ways in which historians of science, which is what you practically become by studying STS, are able to understand the evolution and genesis of science and technology and the social, political, economic, and cultural contexts that shaped them.

Ancient Age

During the Ancient Age, our ancestors relied on protoscience, as it was an era when the scientific method was just unfolding. In the history of science, the development of proper science, through the rise of scientific thinking and the scientific method, took place only during the Middle Ages. As such, during the Ancient Age, knowledge and understanding about how the world works were handed down through generations using oral tradition. It was a difficult period in the development of science as early humans were focused on survival. This quest for survival paved the way for our ancestors to successfully build ancient civilizations such as Mesopotamia (3500-500 B.C.), Indus (3300-1900 B.C.), Ancient Egypt (3150-31 B.C.), Ancient Greece (2700-479 B.C.), Ancient China (2100-221 B.C.), and Ancient Rome (550 B.C.-465 A.D.). It was in these civilizations that the foundations of proper science were discovered and put in place. For example, forms of writing allowed for knowledge to be documented and communicated to the next generations with greater accuracy. Components of more advanced modes of transportation in the later ages took form in many of these ancient civilizations. The transition from hunting to agriculture in the Ancient Age paved the way for a surplus of food.
Under these civilized conditions, our ancestors were able to devote more time to activities besides survival, including the pursuit of knowledge for its own sake. Outstanding Ancient Age Inventions Ancient Wheel. People from ancient civilizations had long used animals for transportation ages before the invention of the wheel. No one knows exactly who invented and when the wheel was invented. There is, however, a general agreement that the ancient wheel grew out of a mechanical device called the Potter’s wheel - a heavy flat disk made of hardened clay which was spun horizontally on an axis. It is believed that the Sumerians invented the potter's wheel shortly after 3500 B.C. The invention of the ancient wheel is often credited to the Sumerians since no other ancient civilization used a similar device at the time. It could be that a potter thought of shifting the potter's wheel to a 90-degree angle for the purpose of transportation or the wheel was reinvented for this purpose. Nonetheless, it would not be until 1000 to 1500 years later that the wheel was first used on carts. Paper. Roughly around 3000 B.C., ancient Egyptians began writing on papyrus, a material similar to thick paper. The Egyptian papyrus was made from the papyrus plant that grew near the Nile River. It was lightweight, strong, durable, and, more importantly, portable. Before the Egyptians invented the papyrus, writing was made on stone. Because of the difficulty of writing on stone, writing was reserved only for very important occasions. With the advent of the papyrus, documentation and record-keeping became efficient and widespread. Through the use of the papyrus, information dissemination moved exponentially faster. Records were kept and stood the test of time more durably and efficiently. Shadoof. The shadoof was an early tool invented and used by ancient Egyptians to irrigate land. Among Egyptians who lived near the Nile river, irrigation was necessary to water their crops. Shadoof, also spelled as "shaduf," was a hand-operated device for lifting water. Its invention introduced the idea of lifting things using counterweights. Because of the invention of the shadoof, irrigation and farming became more efficient. The shadoof is also believed to be an ancient precursor of more sophisticated irrigation tools. Antikythera mechanism. Even before the advent of the antecedents of the modern computer, the Greeks had already invented the ancient world's analog computer. Discovered in 1902 and retrieved from the waters of Antikythera, Greece, the Antikythera mechanism was similar to a mantel clock. Upon its discovery, bits of wood on its fragments suggest that it must have been housed in a wooden case. It was akin to a clock in the way that the case had a circular face and rotating hands. A knob on the side made it possible for it to be wound forward or backward. As this knob wound forward or backward, its mechanism would have made it display celestial time. Thus, it is widely believed that the Antikythera mechanism was used to predict astronomical positions and eclipses for calendar and astrological purposes. It is also believed that the Antikythera mechanism, which is one of the oldest known antecedents of modern clockwork, was invented by Greek scientists between 250–87 B.C. Aeolipile. Also known as Hero's engine, the aeolipile is widely accepted as the ancient precursor of the steam engine. Hero of Alexandria is credited for the demonstration of the aeolipile in first century A.D. 
The aeolipile was a steam-powered turbine, which spun when the water container at its center was heated, thus making it the first rudimentary steam engine. It is not immediately clear whether the aeolipile served any practical purpose, and it is believed to be one of many "temple wonders" at the time. Nonetheless, Vitruvius, a Roman author, architect, and civil engineer, described the aeolipile as a scientific invention through which "the mighty and wonderful laws of the heavens and the nature of winds" may be understood and judged.

Middle Ages

Between the collapse of the Western Roman Empire in the 5th century A.D. and the colonial expansion of Western Europe in the late 15th century A.D., major advances in scientific and technological development, including a steady increase of new inventions, the introduction of innovations in traditional production, and the emergence of scientific thinking and method, had taken place. The Middle Ages was not as stagnant as alternate terms, such as the "Medieval Period" and the "Dark Ages," suggest. To better make sense of the development of science in the Middle Ages, it might be helpful to divide this period into three subperiods: the Early Middle Ages (476-1000 A.D.), the High Middle Ages (1000-1250 A.D.), and the Late Middle Ages (1250-1500 A.D.).

Early Middle Ages (476-1000)

In the years following the collapse of the Roman Empire, society became more concerned with peacekeeping and empire building than with honing centers of learning and knowledge production. As such, the period immediately following the fall came to be known as the "true Dark Ages," in which society slipped from a period of reason and high philosophy into one of barbarism and ignorance. While it was a period marked by frequent wars and conflict, population shifts, and the visible disappearance of urban life, it is incorrect to say that the Early Middle Ages was completely devoid of any contribution to scientific advancement. Although scant, evidence exists that medieval thinkers and great minds were trying to find answers about the nature of the universe. Progress in the scientific method took shape in the production of illuminated manuscripts in Ireland. Other examples include Saint Bede's meticulous record of the Saxon era and a book on using astronomical observations to determine the beginning of Easter in England, craftsmanship and metalwork by the Vikings and Saxons, and Norse sailors' use of stars for navigation. Outside the purlieus of the fallen empire, a Chinese man named Bi Sheng (990-1051 A.D.) invented movable type printing towards the end of the Early Middle Ages during the Song Dynasty. This technology would replace woodblock printing, which was a widely used but expensive and time-consuming printing technology. The Early Middle Ages proved to be too long a period for the products of science and technology and of the scientific method, which were few and far between at the time. Understandably, this outcome was a result of an increasingly rural and dispossessed population, who were concerned more than anything else about being able to rebuild their lives after the fall of the empire. Nonetheless, the period saw monastic study keep some of the scientific processes alive and well. While monks in both the West and East were mostly concerned about scholastic endeavors on the Bible and Buddhism, those from Western Europe also endeavored to understand medicine so that they could care for the sick, and studied astronomy so that they could observe the stars and determine the date for Easter.
In 725 A.D., in Asia, a Buddhist monk named Yi Xing invented the world's first mechanical clock, which ran by dripping water on a wheel that made one revolution every 24 hours. Small contributions to astronomy ensured that mathematics and geometry as scientific disciplines remained relevant, although the methods remained largely an echo of the sophisticated mathematical functions introduced by the Romans and Greeks.

High Middle Ages (1000-1250)

After more than five centuries of constant warfare, Europeans slowly began to crawl out of the "Dark Ages." Population growth and a shared Christian identity triggered this settling, as people came to embrace the belief that they had more in common and found some unity of purpose. While European infighting declined, the High Middle Ages was a time of prolonged war between Christians and Muslims and one of territorial bickering between Spain and the East. As a fringe benefit, trade flourished as it became standard for merchants and mercenaries to share their practices and experiences from Spain, the Holy Land, and Byzantium. Translation grew steadily, instigated by the Muslims who translated books and texts from Greek to Arabic. Spain became a hotbed of this practice by the mid-11th century as scholars from all around Europe came to translate even more Arabic books and texts into Latin. The growth of translation allowed knowledge transfer to blossom as it channeled what the Greeks wrote and knew into the European consciousness. Around this time, medieval universities, individually known as a studium generale, were also booming in Europe, among them the University of Bologna (founded in 1088), the University of Oxford (1167), the University of Cambridge (1209), and the University of Paris (1215). These studia generalia hosted many High Middle Age scholars, like the Italian Gerard of Cremona (c. 1114-1187) of the Italian School of Translators, whose work relied on knowledge and use of Arabic. By the 12th century, several studia generalia drew scholars whose contributions were in blending ancient Greek knowledge with the discoveries of Muslim philosophers and scientists. As a result, the requisites of Christian scholasticism were established, which, while focused largely on theology, initiated the nexus of scientific empiricism and religion. Although the period saw a continued vacuum in terms of great technological advances, particularly those of the Greeks, Romans, Chinese, Persians, and Muslims, great thinkers at the time worked tirelessly to advance the agenda of establishing the scientific method. Among them, Thomas Aquinas led the transition from Platonic reasoning to Aristotelian empiricism. Robert Grosseteste lobbied for the dualistic scientific method. Similar to the ancient astronomers' practice of using observation to determine trends and propose astronomical models, Grosseteste believed that empiricism should be used to propose laws governing the universe and that these laws may be used to forecast outcomes. Roger Bacon took Grosseteste's work and that of Aristotle and the Islamic alchemists to propose the idea of induction as the foundation of empiricism. For his work describing the method of observation, hypothesis, experimentation, independent verification, and detailed documentation of results, Bacon stood alongside Aristotle, Avicenna, Galileo, and Newton as one of the great minds behind the scientific method. Elsewhere, flashes of technological advancement were present.
In Middle Age China, for example, it would not be until 1092, more than 350 years after the first mechanical clock was invented, that Su Song developed a more sophisticated version of the mechanical clock, called the "cosmic engine." The cosmic engine was 200 years ahead of its European counterpart. Navigational compasses, which were more sophisticated versions of those invented between the second century B.C. and the first century A.D., were used in Chinese ships by 1000 A.D. While the precursors of the navigational compasses were used in feng shui to decide on building layouts, navigational compasses allowed for more efficient navigation by the Chinese. It is believed that these compasses were so efficient that Arabs who were trading in China learned about the technology and brought it to Europe for replication and further improvement.

Late Middle Ages (1250-1500)

The end of the Middle Ages is considered to have delivered the world from a medieval society into a modern one. This period, known as the Late Middle Ages, is generally considered to have begun in the mid-13th century and ended in the 15th century. It was a time which wiped out the gains of the flowering of science and the widespread prosperity of the High Middle Ages. The plummeting social order was triggered by a series of famines and plagues which swept through Europe, including the Great Famine (1315-1317) and the Black Death (1346-1353). These events caused the population to fall by almost half. Other cataclysmic events of the 14th and 15th centuries exacerbated the prevailing social conditions and led to social unrest, including the Hundred Years' War (1337-1453), the Peasants' Revolt (1381), the burning of Joan of Arc at the stake (1431), and the fall of Constantinople to the Turks (1453). The crises of the Late Middle Ages were rather unfortunate as they came at a time of leaping scientific progress. Late Middle Age thinkers layered the philosophy of Christian scholasticism with accompanying science. For example, the 14th-century English Franciscan friar William of Ockham proposed the idea of parsimony based on his work in logic. His principle of parsimony, known as the famous Ockham's Razor, is still used in modern science to choose between two or more competing theories, whereby the simpler theory or explanation of a phenomenon is to be preferred. The influential 14th-century French philosopher Jean Buridan established the antecedents of Newtonian physics, particularly that of inertia, by challenging Aristotelian physics and developing the theory of impetus, which is a motive force enabling a body to move in the direction in which the mover sets it in motion. For this, Buridan provided a mathematical formula: impetus = weight x velocity. The 14th-century English physicist Thomas Bradwardine and his colleagues differentiated kinematics from dynamics and proposed the mean speed theorem, which they demonstrated through the Law of Falling Bodies and which can be considered a predecessor of Galileo's work on falling objects. Nicolas d'Oresme, a 14th-century French philosopher, in his Livre du ciel et du monde, discussed a theory about a heliocentric universe. Although Oresme himself sat on the fence about this theory and none of his arguments were conclusive, he was the first to argue that it would be more economical for the Earth to rotate on its own axis rather than for the entire sphere of stars to rotate.
He argued this two centuries before Nicolaus Copernicus' seminal work De revolutionibus orbium coelestium would be published. Finally, during this period, many Christian scholastics became more open to factoring out divine intervention in their attempts to explain natural phenomena. It was during this period that scholars sought simpler and natural causes of events rather than dismissing them as divine providence. Strangely, the advances of Late Middle Age philosophers and scholars were forgotten in favor of those of the Renaissance and the Age of Enlightenment. Oddly enough, the Black Death halted scientific and technological advancement. It would not be until the Renaissance (1400-1600) that knowledge, which took a backseat because of the mass disruption brought by the plague during the Middle Ages, would reemerge.

Outstanding Middle Age Inventions

Heavy Plough. Perhaps one of the most important technological innovations during the Middle Ages is the invention of the heavy plough. Clay soil, despite being more fertile than lighter types of soil, was not cultivated because it was too heavy to work. However, through the invention of the heavy plough, it became possible for the first time to harness clay soil. University of Southern Denmark professor Thomas Bernebeck Andersen succinctly describes the impact of the invention of the heavy plough as follows: "The heavy plough turned European agriculture and economy on its head. Suddenly, the fields with the heavy, fatty, and moist clay soils became those that gave the greatest yields" (Andersen, Jensen, & Skovsgaard, 2016). This resulted in the rapid economic prosperity of the northern territories of Europe. The heavy plough spurred an agricultural revolution in Northern Europe marked by higher and healthier agricultural yields and more efficient agricultural practices.

Gunpowder. Around 850 A.D., Chinese alchemists accidentally invented black powder, or gunpowder. Multiple accounts suggest that gunpowder might have been an unintended by-product of attempts made by the Chinese to invent the elixir of life, which is why the Chinese called it huoyao, roughly translated as "fire potion." Prior to the invention of gunpowder, swords and spears were used in battles and wars. Towards the end of the 13th century, the explosive invention crept into most parts of Europe and Asia. Since its invention, gunpowder has allowed for more advanced warfare. From fiery arrows to cannons and grenades, gunpowder has become the basis for almost every new weapon used in war since its invention. It ushered in unprecedented advancements in warfare and combat throughout the Middle Ages.

Paper Money. Although it was not until the 17th century that bank notes would begin to be used in Europe, the first known versions of paper money can be traced back to the Chinese in the 7th century A.D. as an offshoot of the invention of block printing. Before the introduction of paper money, precious metals, such as gold and silver, were used as currency. However, the idea of assigning value to a marked piece of paper did not immediately become popular. In fact, 13th-century Mongols attempted to introduce paper money into the Middle Eastern market, but this immediately proved to be a failure. Nonetheless, traders and merchants eventually realized the huge advantage of using paper money because it was much easier to transport compared to the previous forms of currency.
Mechanical Clock. Although devices for timekeeping and recording had existed since ancient times, such as the Antikythera mechanism, it would not be until the Middle Ages that clockwork technology would develop to the point that mechanical clocks could accurately keep track of time for the first time. Clockwork technology at the time grew so sophisticated that it became possible to determine not only the hour but also the minute and second of every moment, and this drastically changed the way days were spent and work patterns were established, particularly in the more advanced Middle Age cities.

Spinning Wheel. Another important invention of the Middle Ages is the spinning wheel. It is a machine used for transforming fiber into thread or yarn, which is eventually woven into cloth on a loom. Although there is no consensus about the origins of the spinning wheel, it is theorized that the Indians invented it between the 6th and 11th centuries A.D. Prior to the invention of the spinning wheel, thread was produced predominantly through the more time-consuming and tedious process of hand spinning. According to White (1974), the invention of the spinning wheel sped up the rate at which fiber could be spun by a factor of 10 to 100 and removed this bottleneck to cloth production. Thus, White argued that this invention ushered in a breakthrough in linen production when it crept into Europe in the 13th century A.D.

Modern Age

The Modern Age is the post-medieval era, beginning in the 1500s and continuing to the present. It is marked by a steady population increase worldwide, technological innovations, urbanization, scientific discoveries, and globalization. Like the Middle Ages, the Modern Age is also often split into subperiods: the early modern period and the late modern period. While the periods discussed in this section are generally based on the Modern Age as it unfolded in Europe and the rest of the Western world, modernity in other parts of the world came later or even earlier and, as such, a different periodization may be more logical there. Nonetheless, there is a consensus that the defining characteristics of the Modern Age unfolded predominantly in Europe.

Early Modern Period

The foundations of the Great Divergence, a period during which the West overcame pre-modern growth constraints and reached unprecedented levels of wealth and power in the 19th century, were laid during the Early Modern Period. The period began with the invention of Johannes Gutenberg's movable type printing press, which made the processes and practices of knowledge production more efficient. Literacy rates rose and educational reforms were introduced, spurring the 14th-16th century Renaissance and the 16th-century Protestant Reformation. It was also during the Early Modern Period that older methods of science were replaced by empiricism and modern science through the 16th- and 17th-century scientific revolutions (discussed separately in Module 1, Section 3). This transition was made possible through the work of Middle Age thinkers such as Roger Bacon, whose work on the scientific method was the foundation of experimentation and hypothesis testing during the Early Modern Period. As knowledge became accessible at exponential levels, other areas of life improved as well: transportation improved and politics became more secular. Capitalism grew and countries became more powerful. Alongside these, science became more independent.
Reason, rationalism, and faith in scientific inquiry became the hallmarks of Early Modern Period science, replacing the previously dominant forces of monarchy and church.

Late Modern Period

Into the Late Modern Period, beginning sometime between 1750 and 1815, huge political, social, and economic changes took shape. These massive changes were brought by the combined and complex effects of the First Industrial Revolution (1750), the American Revolution (1776), and the French Revolution (1789). The First Industrial Revolution had far-reaching consequences as it altered not only the ways goods were produced but also the fundamental framework within which the economy, society, and even culture operated. The invention of the internal combustion engine, steam-powered ships, and railways led to a dramatic increase in production capacity and output. Manufacturing took the economic spotlight away from agriculture. In turn, people moved closer to the busy city centers where mass production was concentrated, understandably for economic and social mobility reasons. Factory work became such a popular model of mass production that it served as a template for other fields such as education. Schools were designed to follow the template of a factory in which students were treated as raw materials to be transformed into products able to competently meet the demands of modern life. Consumer goods became the go-to options of people who, in the midst of personal and societal economic growth, could no longer produce their own food, clothing, and supplies. As such, the economic boom led to a domino effect in which increased production led to greater wealth for more people. For their part, the French and American Revolutions paved the way for national sovereignty and representative democracy, which replaced monarchy. These 18th-century revolutions ushered in a period of greater secularism and gave birth to democracy as a product of individual rights and progress. Going into the Contemporary Period, more and more people preferred living in the city, received education, read books and newspapers, participated in politics, spent on consumer goods, and embraced the identity of citizenship in an industrialized nation. Urbanization and mass media brought people closer to each other through a sense of shared mass culture, transcending regional, social, and cultural boundaries and differences.

Outstanding Modern Age Inventions

Compound Microscope. A Dutch spectacle maker named Zacharias Janssen is credited with the invention of the first compound microscope in 1590. Together with his father, he began experimenting with lenses and placed several of them in a tube. This led to an amazing discovery: an object placed near the end of the tube appeared magnified far larger than what a simple magnifying lens could achieve. The Janssen compound microscope was an important progression from the single-lens microscope. It was capable of magnifying objects three times their size when fully closed and up to 10 times when extended to the maximum. Today, the compound microscope is an important instrument in many scientific studies, such as in the areas of medicine, forensic studies, tissue analysis, atomic studies, and genetics.

Telescope. Perhaps the single most important technological invention in the study of astronomy at the time was the practical telescope, built by Galileo Galilei. This invention could magnify objects 20 times larger than the Dutch perspective glasses could.
It was Galileo who first used the telescope skyward and made important astronomical discoveries, such as the presence of craters and mountains on the moon. Galileo's remarkable technological contribution drastically changed astronomical science. For the first time, it became clear that the universe was far larger than previously imagined and that the Earth was far smaller relative to the universe it was in.

Jacquard Loom. As the Industrial Revolution reached full speed, the Jacquard loom was viewed as one of its most critical drivers. Built by French weaver Joseph Marie Jacquard, the Jacquard loom was a device that simplified textile manufacturing. Prior to its invention, a draw loom, which required two individuals (the weaver and a "drawboy"), was used when figured designs on textile were needed. In 1801, Jacquard demonstrated the ingenuity of his version of a loom, in which a series of cards with punched holes automatically created complex textile designs. Prior to the Jacquard loom, more manual labor and greater effort had to be exerted to produce complex designs; the Jacquard loom made this easier and also made mass production easier. It was also an important antecedent of modern computer technology, as it demonstrated the use of punched cards to instruct a machine to carry out complex tasks (i.e., make different textile patterns).

Engine-Powered Airplane. Orville and Wilbur Wright are credited with designing and successfully operating the first engine-powered aircraft on December 17, 1903 at Kitty Hawk, North Carolina. The Wright brothers approached the design of powered aircraft and flight scientifically. Wilbur and Orville proved that aircraft could fly without airfoil-shaped wings. They demonstrated this in their original "Flying Machine" patent (US patent #821393), in which slightly-tilted wings, which they referred to as aeroplanes, were the key features of a working flying machine. Their pioneering success marked the age of powered flight. Even without modern knowledge of aerodynamics and a comprehensive understanding of the workings of aircraft wings, the Wright brothers were brilliant scientists who paved the way for modern aircraft science and technology.

Television. The Scottish engineer John Logie Baird is largely credited with the invention of the modern television. Baird successfully televised objects in outline in 1924, recognizable human faces in 1925, and moving objects in 1926 at the Royal Institution in London, and demonstrated colored images in 1928. Baird's television technology caught on swiftly. In fact, the British Broadcasting Corporation (BBC) used it for its earliest television programming in 1929. Even though it was the first television, Baird's technology would later be criticized for its fuzzy and flickering images, primarily because it was mechanical compared to the electronic versions that were being developed at the time.
Historical Antecedents of Science and Technology

By providing a snapshot of the unfolding of science and technology across different periods of time, the foregoing discussion sought to establish the importance of studying the historical antecedents of science and technology. Particularly, the goal is to understand how scientific ideas developed, where they originated, who invented them, and their intended and unintended purposes and effects. Science and technology are all around us today. As such, if we are to capitalize on their benefits, we may also be interested in a thing or two about their historical antecedents. By understanding how scientific ideas developed, replication and improvement become possible. By knowing where products of science and technology originated, it becomes easier to understand the social, political, and economic contexts that shaped them and were shaped by them. By being familiar with thinkers, scientists, and inventors and their work and example, greater participation in science and technology is engendered, as ordinary students and citizens alike come to understand that scientists are but fallible individuals who had questions to ask and wanted answers to them. Overall, understanding the historical antecedents of science and technology should lead to a greater interest in the field. For the longest time, science has been perceived as a discipline that only a gifted few can pursue. Nothing is farther from reality. On the contrary, science and technology are for everyone who is thirsty for knowledge and concerned not only about one's own life but also about the lives of others. More individuals can be encouraged to participate in science and technology if we speak about it as the only means through which we can live healthy and meaningful lives, because it is.

Section 3 Revolutions in the History of Science

Throughout the long history of science, one area which caught the interest of those studying STS is that of scientific revolutions, starting with those that took place during the 16th and 17th centuries. Scientific revolution is the term used to refer to a period in history when drastic changes in scientific thought, scientific communities, and the scientific method took place. Prior to these revolutions, the work of pre-Socratic Greek philosophers dominated widely held beliefs about the nature of the universe, much of which focused on human society, ethics, and religion. The Greek views about nature were popular for almost 2,000 years before major shifts emphasized abstract reasoning, quantitative thinking, and the development of an experimental scientific method. Given the vast amount of new information emerging at the time, these scientific movements began to question religious and moral beliefs and principles and traditional views about nature. Needless to say, the scientific revolutions benefited from increasing secularism at the time, which paved the way for the view of nature as a machine rather than as divine providence. They thus scrutinized traditional institutions, practices, and beliefs and demanded new modes of communicating and disseminating information, as the existing modes no longer fit the changing and emerging science of the time. For one, it was no longer enough to publish the results of scientific inquiry in expensive books, which tended to be accessible only to the few who could afford them, given the need for scientific information to be communicated efficiently, widely, and quickly, so that communities could benefit from it.
These revolutions introduced important innovations such as scientific societies, which provided scientists a platform where they could discuss and validate new discoveries, and scientific papers, which served as a venue for scientists to report, disseminate, and have their new discoveries reviewed by other scientists. Scientific societies and scientific papers allowed for a more comprehensive, diligent, and reliable vetting of the new discoveries and hypotheses being advanced. By and large, scientific revolutions were paradigm shifts. They were changes in scientific perspectives from the traditional to the novel, paving the way for the emergence of modern science. In the words of the 18th-century French astronomer, mathematician, and freemason Jean Sylvain Bailly, scientific revolutions involved a two-stage process of sweeping away the old and establishing the new. In the process, challenging long-held views about the nature of the universe required presenting a more efficient alternative, one which was the result of a disciplined and comprehensive experimental scientific method. Against this backdrop, however, key figures during different scientific revolutions in history did not have it easy. They and their discoveries and hypotheses were often met with huge resistance and controversy. Governments and churches, which were already at the losing end of an increasingly secular society, found the need to pull society back to traditional beliefs and principles and away from an enlightened view of nature and the universe. Power relations between traditional institutions and the scientific communities and their key figures disproportionately favored the former, such that the revolutions would have to drag on for decades and even a few centuries before wide-scale acceptance of alternative theories and principles was reached. Throughout this, it took entire scientific communities, with one key figure after another corroborating and supporting each other, to dismantle old scientific beliefs, ways, and practices. The foregoing typification can be used to make sense of the unfolding of many scientific revolutions from the 16th century to the present. In this section, however, we zoom in on representative scientific revolutions from representative centuries: the 16th-century Copernican Revolution in astronomy, the 19th-century Darwinian Revolution in evolutionary biology, and the 19th- to 20th-century Freudian Revolution in psychoanalysis.

Copernican Revolution

The Copernican Revolution refers to the 16th-century paradigm shift named after the Polish mathematician and astronomer Nicolaus Copernicus. Copernicus formulated the heliocentric model of the universe. At the time, the belief was that the Earth was the center of the solar system based on the geocentric model of Ptolemy (i.e., the Ptolemaic model). Copernicus introduced the heliocentric model in a 40-page outline entitled Commentariolus. His model was formalized in the publication of his treatise, De revolutionibus orbium coelestium (On the Revolutions of the Celestial Spheres), in 1543. In his model, Copernicus repositioned the Earth from the center of the solar system and introduced the idea that the Earth rotates on its own axis. The model illustrated the Earth, along with other heavenly bodies, revolving around the Sun. The heliocentric model is encapsulated in the following seven axioms or key ideas:
1. The celestial spheres do not have one common center. The Earth is not at the center of everything.
2. Earth is not the center of the universe, only the center of gravity and of the lunar orbit. Only the Moon orbits the Earth.
3. All the spheres orbit the Sun. "Spheres" here means the planets.
4. Compared to the distance to the stars, the Earth-to-Sun distance is almost nonexistent. The stars are very much farther away than the Sun.
5. The apparent motion of the stars is due to the Earth rotating on its axis.
6. The apparent motion of the Sun is the result of the Earth's motions (rotation and revolution).
7. The retrograde and forward motions of the planets are caused by the Earth's motion. This is because Earth's orbit is of a different length than those of the other planets.

The idea that the Sun is at the center of the universe instead of the Earth proved to be unsettling to many when Copernicus first introduced his model. In fact, the heliocentric model was met with huge resistance, primarily from the Church. Although Copernicus faced no persecution while he was alive, resistance came with a wave of Protestant opposition after Copernicus' death and the publication of his book, both in 1543. The idea that it was not the Earth, and, by extension, not man, that was at the center of all creation was unthinkable for both Protestants and Catholics. This led the Catholic Church to prohibit the reading of De revolutionibus for two centuries beginning in 1616. Moreover, although far more sensible than the Ptolemaic model, which as early as the 13th century had been criticized for its shortcomings, the Copernican model also had multiple inadequacies that were later filled in by astronomers who participated in the revolution. Nonetheless, despite problems with the model and persecution by the Church, the heliocentric model was soon accepted by other scientists of the time, most profoundly by Galileo Galilei.

The contribution of the Copernican Revolution is far-reaching. It served as a catalyst that swayed scientific thinking away from age-long views about the position of the Earth and toward an enlightened understanding of the universe. This marked the beginning of modern astronomy. Although very slowly, the heliocentric model eventually caught on among other astronomers, who further refined the model and contributed to the recognition of heliocentrism. This was capped off by Isaac Newton's work a century later. Thus, the Copernican Revolution marked a turning point in the study of cosmology and astronomy, making it a truly important scientific revolution.

Darwinian Revolution

English naturalist, geologist, and biologist Charles Darwin is credited for stirring another important scientific revolution in the mid-19th century. His treatise on the science of evolution, On the Origin of Species, was published in 1859 and began a revolution that brought humanity to a new era of intellectual discovery. The Darwinian Revolution benefitted from the earlier intellectual revolutions of the 16th and 17th centuries in that it was guided by confidence in human reason's ability to explain phenomena in the universe. During a five-year surveying mission aboard the Royal Navy brig HMS Beagle, which took him to the Galapagos Islands, Darwin became fascinated with the work of the 19th-century Scottish geologist Charles Lyell, whose work on geology focused on uniformitarianism. In his Principles of Geology, Lyell argued that observable processes occurring in the present are sufficient to explain all geological formations and features across a vast period of time.
Darwin would apply this theory of uniformitarianism in his observations on board the Beagle and at the Galapagos Islands to account for the varying features in living systems and organisms. For his part, Darwin gathered evidence pointing to what is now known as natural selection, an evolutionary process by which organisms, including humans, inherit, develop, and adapt traits that favor survival and reproduction. These traits are manifested in offspring that are more fit and better suited to the challenges of survival and reproduction. His most important observations applying the theory of evolution include the famous Darwin's finches, a group of 14 or so closely related species of finches which went through rapid adaptation to an unstable and challenging environment. Diversification and speciation then took place among the finches, reflected in phenotypes such as beak size and shape, body size, plumage, and feeding behavior. In his book, On the Origin of Species, Darwin presented a logical argument for the mechanism of natural selection based on two observations and inferences. First, he argued that individuals in a species vary to some degree in their traits. Second, a species produces more offspring than actually survive to mature and reproduce. Out of these two observations and inferences, Darwin explained that individuals with traits better fitted to their environments are more likely to survive and reproduce; hence, their offspring are more likely to inherit their adaptive traits. Darwin's theory of evolution was, of course, met with resistance and considered controversial. Critics either accused the theory of falling short in accounting for the broad and complex evolutionary process or insisted that the functional design of organisms was a manifestation of an omniscient God. The Darwinian Revolution can be likened to the Copernican Revolution in its demonstration of the power of a lawful system in nature, in this case evolution, in explaining the biological phenomena of survival and reproduction. Nonetheless, the importance of the Darwinian Revolution in modern science cannot be overstated. It made the Copernican Revolution from three centuries earlier come full circle by demonstrating that, even in the case of biology and evolution, nature can be described as a lawful system that can be explained through scientific thought. Through the Darwinian Revolution, the development of organisms and the origin of unique forms of life and humanity could be rationalized as an orderly process of change underpinned by the laws of nature.

Freudian Revolution

The 19th-century Austrian neurologist Sigmund Freud is credited for stirring the 20th-century scientific revolution named after him, the Freudian Revolution. Psychoanalysis, as a school of thought in psychology, is at the center of this revolution. Freud developed psychoanalysis as a scientific method of understanding the inner and unconscious conflicts embedded within one's personality, springing from the free associations, dreams, and fantasies of the individual. Psychoanalysis immediately shot into controversy. Psychoanalytic concepts of psychosexual development, libido, and ego were met with both support and resistance among scholars. Freud suggested that humans are inherently pleasure-seeking individuals. These notions were particularly caught in the crossfire over whether Freud's psychoanalysis fit within the scientific study of the brain and mind. He also proposed the id, superego, and ego as the components of his structure of personality.
He referred to the id as the primitive and instinctual part of the mind, containing sexual and aggressive drives and hidden memories; the superego as the ethical component, providing the moral standards and acting as the moral conscience of an individual; and the ego as the realistic component, mediating between the desires of the id and the demands of the superego. Freud's arguments hinge on individuals' unconscious conflicts. According to him, the human mind tends to keep evil thoughts and desires away, banishing them in the process to the unconscious mind. This creates a two-sided personality (i.e., a dual personality). Individuals tend to keep things which do not threaten their self-esteem in the conscious mind, and those which threaten their self-esteem in the unconscious. In Freudian theory, this dual personality creates a "Jekyll and Hyde" situation in a person. Scientists working on a biological approach to studying human behavior criticized psychoanalysis for lacking vitality and bordering on being unscientific as a theory. Particularly, the notion that all humans are destined to exhibit the Oedipus and Electra complexes (i.e., sexual desire for the opposite-sex parent and exclusion of the same-sex parent) did not seem to be supported by empirical data. In the same vein, it appeared to critics that psychoanalysis was more of an ideological stance than a scientific one. Amidst the controversy, Freud's psychoanalysis is widely given credit for dominating psychotherapeutic practice from the early 20th century onward. Psychodynamic therapies that treat a myriad of psychological disorders remain largely informed by Freud's work on psychoanalysis.

Scientific Revolutions and Society – What's the connection?

Scientific revolutions across the history of mankind ushered in a renewed and enlightened world through a better understanding of nature and the universe. Because of scientific revolutions, more reliable scientific knowledge became available. In turn, the application of more reliable scientific knowledge led to more comfortable, more efficient, and more meaningful lives, better than those constrained by a time when knowledge about public health, internal medicine, weather and climate change, and pollution and waste management was not as robust. Truly, mankind today stands on the shoulders of men and women of science whose struggles laid the foundation of an enlightened, empowered, and healthy society. Even as more daunting and more complex problems arise as society moves into the challenges of 21st-century living, society continues to benefit from the work and example of men and women of science whose struggles for reliable science inspired the generations that came after them to continue the work and gift the future with an even more reliable science and its applications. The University College London aptly puts this cycle of beneficence and progress as follows:
1. The progress-achieving methods of science need to be correctly identified.
2. These methods need to be correctly generalized so that they become fruitfully applicable to any worthwhile, problematic human endeavor, whatever the aims may be, and not just applicable to the one endeavor of acquiring knowledge.
3. The correctly generalized progress-achieving methods then need to be exploited correctly in the great human endeavor of trying to make social progress towards an enlightened, wise world.

Indeed, social progress is the point of all of this.
Society is only able to move forward because science-based and technology-driven decision-making has become possible through the scientific revolutions that came before us. As members of the community of science, it is our reward and privilege to be living in a world that is safer, more connected, and more efficient than ever before. Our responsibility, however, is to hold the line and continue to refine and sharpen not only our understanding of the universe but also, and more importantly, the application of traditional and new knowledge in the pursuit of social justice and equality.

SECTION 4 A General Survey on the Unfolding of Science and Technology in the Philippines

Alongside writing and counting systems that were already in place even before the Spaniards colonized the archipelago, the country's rich history of science and technology was fueled by its rich natural resources, which were, by and large, the source of the medicinal and therapeutic products and methods of early settlers. Even then, Filipino ancestors displayed keen awareness of the country's opulent flora and fauna and an ability to transform these readily available resources to meet their daily and survival needs. Aside from this, precolonial Filipinos also exhibited advanced engineering knowledge as evidenced by the Banaue Rice Terraces, a hallmark of Philippine science and technology at the time (Reyes, 1972). The arrival of the Spaniards sped up the development of science and technology in the country. The catalyst of this period's advancement was, of course, the introduction of formal education and the establishment of scientific institutions. With an understanding of the state of science and technology before they settled in the archipelago, the Spaniards focused on furthering agriculture, among other areas. Rodriguez (1996) also claimed that the colonizers' recognition of the country's lush flora and fauna entailed an emphasis on biology in formal education, as evidenced by a number of botanists who advanced the study of endemic flora in the country. Unfortunately, however, the rise of the Galleon trade meant that agriculture at the time was left relatively underemphasized. The American period continued the progress made during the Spanish era. As Philippine governance and bureaucracy took form and stabilized during the American rule, science and technology progressed at the helm of the Government Laboratories, established in 1901, which dealt with the study of tropical diseases and laboratory projects. This was later replaced by the Bureau of Science in 1905, which was the country's primary research center when World War II broke out (Cariño, 1993). In 1933, the National Research Council of the Philippines was established, which played an important role in the advancement of science and technology at the time (Reyes, 1972). In 1946, the Bureau of Science was replaced by the Institute of Science. In terms of legislation, the seminal statute passed during this era that facilitated the further advancement of science and technology was the Science Act of 1958, which established the National Science Development Board (Cariño, 1993). Under these government bureaus during the American period, science was geared towards the country's traditional strengths, such as agriculture, forestry, medicine, pharmacy, and food processing, in a seeming recognition of the country's rich natural resources, as was the case during the Spanish rule.
The administration of Ferdinand E. Marcos, Sr. placed greater importance on science and technology. The advancement of science and technology became not merely a research area but a matter of national development. This is corroborated by the fact that the prevailing constitution at the time, the 1973 Philippine Constitution, declared in Article XV, Section 9 that the "advancement of science and technology shall have priority in the national development." Raising the status of science and technology to a matter of national policy, the era saw the enactment of various laws that drove the development of science and technology in the country. Several agencies and organizations were established, such as the National Grains Authority (now the National Food Authority), the Philippine Atmospheric, Geophysical and Astronomical Services Administration (PAGASA), the International Rice Research Institute (IRRI), and the National Committee on Geological Sciences, among others. In 1976, Marcos established the National Academy of Science and Technology (NAST) to be the reservoir of scientific and technological expertise in the country. In the succeeding administrations, emphasis on science and technology as a pivotal component of national progress played out in various ways. Cariño (1993) reported that Corazon Aquino's presidency placed an even greater emphasis on science and technology and its role in economic recovery and sustained economic growth, as articulated in the Medium Term Philippine Development Plan of 1987-1992. President Fidel Ramos believed that science and technology was one of the means through which the Philippines could attain the status of a newly industrialized country. Unsurprisingly, it was during his term as president that the number of science and technology personnel and scholars grew rapidly. This was facilitated by the passing of Republic Act 8439 or the Magna Carta for Science and Technology Personnel. During President Joseph Estrada's term, two key laws relating to science and technology were passed: Republic Act 8749 or the Clean Air Act of 1999 and Republic Act 8792 or the Electronic Commerce Act of 2000. President Gloria Macapagal Arroyo's administration was dubbed the "golden age" of science and technology in the Philippines (Rodriguez, 1996). During Arroyo's presidency, numerous laws and projects were passed to push technology as a key to economic progress. During the term of President Benigno Aquino III, efforts were exerted to further improve science and technology. Laws such as Republic Act 10601 or the Agricultural and Fisheries Mechanization (AFMech) Law, which aims to beef up agriculture and fisheries through mechanization, and Republic Act 10692 or the PAGASA Modernization Act, which compels and provides support for the Philippine weather bureau to modernize its technological operational capacity, further strengthen its role as the national weather agency, and become a center of excellence in weather information services, were signed into law. It was also under the term of President Aquino III that Republic Act 10844 or the Department of Information and Communications Technology Act of 2015, which recognizes the vital role of information and communication in nation-building and thereby established the DICT, was signed into law. Finally, under the administration of President Rodrigo Duterte, Republic Act 11035 or the Balik Scientist Program Act was signed into law.
With its signing, the law strengthened the Balik Scientist Program, first established in 1975, by providing support for returning Filipino scientists from abroad to participate in the Grants-in-Aid research program of the Department of Science and Technology and to implement their projects in accordance with government regulations and the need for such programs. The law also provides tax and duty exemptions, the importation of professional equipment and materials, and free medical and accident insurance for returning scientists. In 2020, President Duterte conferred the Order of National Scientist on Emil Q. Javier for his outstanding work in the field of agriculture. This attempt to trace the history of science and technology is surely too concise to account for the leaps and feats that the Philippines has already made in the area. If one is to fully understand the state of science and technology in the Philippines, one has to look into its history in greater detail. Nonetheless, from this brief historical review, one can conjecture what had been and what could be for Philippine science and technology. Since time immemorial, the Philippines has always placed a premium on science and technology and its role in nation-building. Periods and administrations that have come and gone all displayed a collective recognition of the significance of science and technology in building a better Philippines. The number of laws, scientists, and scientific achievements in the country suggests the importance of this area in achieving not only national unity but also economic progress. Hence, one can surmise that if the Philippines is to continue on its track towards a more progressive society and a more robust economy, science and technology has to be a key component of this endeavor.

Inventions by Filipino Scientists

The Philippines boasts of its own history and tradition of scientific and technological innovation. Filipino scientists have long been known for their ingenuity. As with all other inventions, necessity has always been the mother of Philippine inventions. Most of these inventions appealed to the unique social and cultural context of the archipelagic nation. Even during the precolonial era, our Filipino ancestors developed scientific and technological innovations focused on navigation, traditional shipbuilding, textiles, food processing, indigenous arts and techniques, and even cultural inventions. Some of the most important inventions by Filipino scientists are discussed below.

Electronic Jeepney. The jeepney is perhaps one of the most recognizable international symbols of the Philippines and the most popular mode of public transportation in the country. It is also perhaps one of the most enduring symbols of Filipino ingenuity. Jeepneys were designed and improvised from scratch out of the military jeeps that the Americans left in the country after World War II. As demand for more responsive transportation technology arose, the e-jeepney was introduced in Metro Manila and Bacolod City. The e-jeepney was the inventive response to criticisms of the traditional jeepney, which belched smoke that directly caused air pollution and made it unsustainable and uneconomical. The e-jeepneys were designed to be environment-friendly, as they do not produce noise or belch smoke as traditional jeepneys do. E-jeepneys are also more sustainable as they run on electricity, limiting the need for diesel and gasoline.
They are also more economical, as electricity is far cheaper than ordinary diesel, paving the way for more profit for jeepney drivers.

Erythromycin. Perhaps one of the most important local medical inventions is erythromycin. The Ilonggo scientist Abelardo Aguilar discovered this drug from a strain of bacterium called Streptomyces erythreus, from which the drug got its name. As with several other local scientists, however, Aguilar was not credited for this discovery by Eli Lilly Co., Aguilar's US employer, to whom he sent the strain for separation. The US company eventually took entire credit for the discovery.

Medical Incubator. The world-renowned Filipino pediatrician and National Scientist Fe del Mundo is credited for her work leading to the invention of the incubator and of a jaundice-relieving device. Del Mundo, the first woman pediatrician to be admitted to the prestigious Harvard University School of Medicine and founder of the first pediatric hospital in the country, won the 1977 Ramon Magsaysay Award, Asia's premier prize granted to outstanding individuals whose selfless service has remarkably contributed to the betterment of society, for her pioneering work in pediatrics that spanned a total of eight decades. Her original improvised incubator consisted of two native laundry baskets of different sizes that were placed one inside the other. Warmth was generated by hot water bottles placed around the incubator. A makeshift hood over the baskets allowed oxygen to circulate inside the incubator. Del Mundo's incubator was particularly outstanding as it addressed the context of Philippine rural communities that had to contend with the absence of electricity in trying to regulate the body temperature of newborn babies. For this purpose, Del Mundo's invention was truly ingenious.

Mole Remover. In 2000, a local invention that could easily remove moles and warts on the skin without the need for any surgical procedure shot to fame. Rolando dela Cruz is credited for the invention of a local mole remover that made use of extracts of cashew nuts (Anacardium occidentale), which are very common in the Philippines. The indigenous formula easily caught on because it was affordable, was not painful, and did not leave the marks that surgical procedures would. Dela Cruz won a gold medal for this invention at the International Invention, Innovation, Industrial Design, and Technology Exhibition in Kuala Lumpur, Malaysia in 2000.

Banana Ketchup. Filipino food technologist Maria Orosa is credited for the invention of banana ketchup, a variety of ketchup different from the commonly known tomato ketchup. Her invention appeals particularly to Filipinos who love using condiments to go along with their food. Historical accounts posit that Orosa invented banana ketchup against the backdrop of World War II, when there was a huge shortage of tomatoes. As a result, Orosa developed a variety that made use of mashed banana, sugar, vinegar, and spices, which were all readily available. Orosa's banana ketchup is brownish-yellow in its natural color but is dyed red to resemble the color of the much-loved tomato ketchup.