Full Transcript


INFORMATION SYSTEM CONCEPT

An information system (IS) is a collection of hardware, software, data, procedures, and people that work together to provide information necessary for the functioning of an organization. It encompasses a wide range of technologies and processes that manage and manipulate data into meaningful information for various users.

KEY CONCEPTS: Data vs. Information

DATA - raw facts; the figures collected with respect to the subject matter.
INFORMATION - the processed form: collected data that has been analyzed, manipulated, and transcribed, and presented as results.

Key Concepts: Information Technology vs. Information System

INFORMATION TECHNOLOGY - refers to the technology used to manage and process data.
INFORMATION SYSTEM - includes the IT, processes, people, and procedures; in terms of real-life examples, the applications we use, like Facebook, TikTok, and Instagram.

ATTRIBUTES OF INFORMATION

ACCURACY: Information should be free from errors and should reflect reality as closely as possible.
RELEVANCE: Information should be pertinent to the task or decision at hand.
TIMELINESS: Information should be available when needed to be effective.
COMPLETENESS: Information should provide a full picture of the subject matter.
RELIABILITY: Information should be dependable and consistent.

FRAMEWORKS OF INFORMATION SYSTEMS

Various frameworks help conceptualize and structure information systems. Three common frameworks are:

FIVE COMPONENTS FRAMEWORK: Developed by Peter Senge, this framework includes hardware, software, data, procedures, and people as the five essential components of an information system.
INFORMATION SYSTEMS PYRAMID: This framework categorizes information systems into three levels: operational, managerial, and executive, based on their role in the organization.
STRATEGIC INFORMATION SYSTEMS FRAMEWORK: Focuses on aligning IS with an organization's strategic goals and competitive advantage.

COMPONENTS OF INFORMATION SYSTEMS

HARDWARE: Physical devices like computers, servers, and networking equipment.
SOFTWARE: Programs and applications that enable data processing and manipulation.
DATA: Raw facts and figures that serve as the foundation for information.
PROCEDURES: Sets of instructions or protocols for handling data and making decisions.
PEOPLE: Users, analysts, developers, and other personnel involved in the IS lifecycle.

The invention and creation of computers is a complex tapestry of human brilliance, engineering skill, and unrelenting technological pursuit. This voyage captures a fascinating continuum of invention, from the first abacuses to the advanced digital devices we use today. It is crucial to recognize the profound significance of these discoveries and their influence on society as we explore this astonishing trajectory. This study aims to provide a thorough analysis of the historical background by tracing the origins of the earliest computing devices, closely examining the development from the first generation to the present, outlining key computer science turning points, and examining the far-reaching implications for various facets of society. In the words of Charles Babbage, often regarded as the "father of the computer," who envisioned the Analytical Engine in the 19th century: "Errors using inadequate data are much less than those using no data at all." Indeed, the evolution of computers has been an unceasing pursuit of precision, efficiency, and connectivity, shaping the world we inhabit today.
Early Computing Devices

Abacus
The abacus' creation in antiquity marks the beginning of computing as we know it today. It is thought that the abacus was invented in Mesopotamia between 2700 and 2300 BCE. It was a straightforward counting device that let users move beads along rods to accomplish elementary arithmetic calculations. While lacking the complexity of modern computers, the abacus played a significant role in early mathematics and commerce and laid the groundwork for later computational breakthroughs.

Babbage's Analytical Engine
Charles Babbage developed the idea for the Analytical Engine in the 19th century: a mechanical general-purpose computer that could perform numerous calculations automatically. Babbage's theories laid the foundation for modern computing even though the machine was never constructed during his lifetime; he worked on his calculating engines from 1822 until his death in 1871. His design, which resembled the architecture of modern computers, had elements like an arithmetic logic unit (ALU), control flow, and storage. The Analytical Engine was developed with help from Ada Lovelace, who wrote algorithms for it and is frequently cited as the first computer programmer.

Early Punch Card Systems
Punch card systems became a widely used method of data processing in the late 19th and early 20th centuries. In the late 1800s, Herman Hollerith created a punch card tabulating device after being influenced by Charles Babbage's work. His creation was applied to the 1890 US Census, greatly cutting down on the amount of time needed for data analysis. Early punch card systems emphasized the value of data entry and processing, paving the way for the creation of electronic computers.

First Generation (1940s-1950s)
The first generation of electronic computers marked a significant leap forward in computational power. Notable developments during this period include:
- ENIAC (1945): The Electronic Numerical Integrator and Computer (ENIAC) was the world's first general-purpose digital computer. It used vacuum tubes and could perform a wide range of calculations, including ballistics calculations for the military.
- UNIVAC I (1951): The UNIVAC I, short for Universal Automatic Computer, was the first commercially produced computer. It played a role in the 1952 presidential election by accurately predicting the outcome.

Second Generation (1950s-1960s)
The second generation of computers saw the transition from vacuum tubes to transistors, resulting in smaller, faster, and more reliable machines. Key developments included:
- IBM 1401 (1959): This widely used business computer contributed to the automation of data processing tasks, making it accessible to a broader range of organizations.
- COBOL (1959): The development of the COmmon Business-Oriented Language (COBOL) made software more accessible, allowing programmers to write code in a language closer to natural language.

Third Generation (1960s-1970s)
Advancements in integrated circuits and the development of the microprocessor defined the third generation of computers. Notable milestones included:
- IBM System/360 (1964): The System/360 family introduced compatibility across different models, a concept that remains relevant in modern computing.
- Intel 4004 (1971): Intel's 4004 microprocessor marked the birth of microcomputing, paving the way for personal computers.

Fourth Generation (1980s-Present)
The fourth generation witnessed the emergence of personal computers and the widespread use of microprocessors.
Key developments include:
- IBM Personal Computer (1981): The IBM PC became the industry standard and led to the popularization of personal computing.
- World Wide Web (1989): The invention of the World Wide Web by Tim Berners-Lee revolutionized communication and information sharing, setting the stage for modern internet-based computing.

From the abacus, an ancient counting device, to the complicated and interconnected digital marvels of the current era, the evolution of computers captures an amazing journey. This evolution is punctuated by important breakthroughs like Charles Babbage's forward-thinking Analytical Engine and the introduction of electronic computing in the middle of the 20th century, which ushered in a line of increasingly potent and portable machines. This progress has been fueled by significant developments in computer science, such as the development of programming languages, the rise of complex operating systems, and the creation of the internet. Beyond technological advancements, computers have had a profound social impact, transforming business, education, healthcare, and entertainment while reshaping the very fabric of our everyday lives. The development of computers is evidence of humanity's persistent intellect and capacity to use innovation to create significant societal change.

Milestones in Computer Science

Invention of Programming Languages
- Fortran (1957): John Backus and his team created Fortran, the first high-level programming language, simplifying the process of writing code.
- C Programming Language (1972): Dennis Ritchie developed C, a highly influential language that underlies many modern programming languages.

Development of Operating Systems
- UNIX (1969): Ken Thompson, Dennis Ritchie, and others at Bell Labs developed UNIX, which served as the basis for many subsequent operating systems.
- Microsoft Windows (1985): Microsoft's Windows operating system became dominant in the personal computer market.

Birth of the Internet
- ARPANET (1969): The U.S. Department of Defense created ARPANET, a precursor to the modern internet.
- World Wide Web (1990s): The development of web browsers and HTML by Tim Berners-Lee made the internet accessible to the public.

Impact on Society

Business
- The use of computers has revolutionized company processes by enabling the automation of jobs like accounting, inventory control, and customer service.
- E-commerce has thrived, enabling businesses to connect with a global clientele.

Education
- Computers have revolutionized education with e-learning platforms, online resources, and interactive educational software.
- Access to information has been democratized, facilitating research and learning.

Healthcare
- Electronic health records (EHRs) have eased healthcare administration and enhanced patient care.
- Medical imaging and diagnostic tools have advanced, assisting in the early discovery of disease.

Entertainment
- Computers have revolutionized the entertainment industry, enabling video games, digital art, and multimedia experiences.
- Streaming services and digital distribution have transformed how we consume music, movies, and television.

Conclusion
In conclusion, the development of computers has been marked by significant technological developments, starting with primitive tools like the abacus and Babbage's Analytical Engine and continuing to the present.
The present digital environment has been shaped by significant developments in computer science, the creation of programming languages and operating systems, and the advent of the internet. Computers have had a significant impact on society, influencing a variety of industries including commerce, education, healthcare, and entertainment, and they continue to influence our daily lives in many ways.

Contemporary Developments

Emerging Technologies
Emerging technologies in the field of computing are poised to revolutionize various industries.

Quantum Computing
Quantum computing applies the principles of quantum physics to enable new methods of information processing. Its possible uses include drug discovery, optimization problems, and cryptography. Quantum computers could break existing encryption systems, but they could also be used to create unhackable communication channels through quantum key distribution. Quantum simulations may hasten the development of novel substances and medicines.

Artificial Intelligence (AI)
AI is still developing quickly and is having an impact on everything from finance to healthcare. Deep learning approaches have recently made significant advances in speech recognition, computer vision, and natural language processing. Autonomous vehicles powered by AI offer safer mobility, while AI is assisting with medication discovery and disease diagnostics in the medical field. The ethical issues around bias, data privacy, and AI decision-making are still crucial to consider.

Blockchain
Blockchain technology is disrupting industries like finance and supply chain management. Transparent and tamper-proof transactions are ensured by its secure and decentralized ledger technology: each block records the hash of the block before it, so altering a past record invalidates every later link. Beyond cryptocurrencies, blockchain technology can be used to improve trust and security in a variety of industries by enabling identity verification, voting processes, and supply chain traceability. The sketch below illustrates the chaining idea.
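To make the tamper-evidence property concrete, here is a toy, hypothetical sketch of hash chaining in C. It is illustrative only: FNV-1a, a simple non-cryptographic hash, stands in for the cryptographic hashes (such as SHA-256) that real blockchains use, and the block layout and transactions are invented for this example.

```c
/* Toy sketch of the hash-chaining idea behind a blockchain ledger.
 * FNV-1a below is a stand-in for a cryptographic hash; only the
 * linking structure is the point of this example. */
#include <stdio.h>
#include <stdint.h>
#include <string.h>

/* FNV-1a: a simple non-cryptographic hash, for illustration only. */
static uint64_t fnv1a(const void *data, size_t len, uint64_t seed) {
    const unsigned char *p = data;
    uint64_t h = seed ? seed : 14695981039346656037ULL;
    for (size_t i = 0; i < len; i++) {
        h ^= p[i];
        h *= 1099511628211ULL;
    }
    return h;
}

struct block {
    char     payload[64];  /* the recorded transaction */
    uint64_t prev_hash;    /* link to the previous block */
    uint64_t hash;         /* hash over payload + prev_hash */
};

static void seal(struct block *b, const char *payload, uint64_t prev_hash) {
    memset(b, 0, sizeof *b);
    snprintf(b->payload, sizeof b->payload, "%s", payload);
    b->prev_hash = prev_hash;
    b->hash = fnv1a(b->payload, sizeof b->payload, b->prev_hash);
}

int main(void) {
    struct block chain[3];
    seal(&chain[0], "genesis", 0);
    seal(&chain[1], "Alice pays Bob 5", chain[0].hash);
    seal(&chain[2], "Bob pays Carol 2", chain[1].hash);

    /* Tamper with the middle block: the next link now fails to
     * verify, and hiding the edit would require recomputing every
     * later hash in the chain. */
    strcpy(chain[1].payload, "Alice pays Bob 500");

    for (int i = 1; i < 3; i++) {
        uint64_t expect = fnv1a(chain[i - 1].payload,
                                sizeof chain[i - 1].payload,
                                chain[i - 1].prev_hash);
        printf("block %d link %s\n", i,
               chain[i].prev_hash == expect ? "ok" : "BROKEN");
    }
    return 0;
}
```

Each block's prev_hash commits to its predecessor, so a tampered record cannot go unnoticed without recomputing every subsequent hash, which decentralization makes prohibitively hard.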
Recent Innovations
Recent innovations in computer hardware, software, and networking are pushing the boundaries of what computers can achieve.

Quantum Supremacy
Google claimed to have achieved quantum supremacy in 2019 by demonstrating that a quantum computer could solve a specific problem more quickly than even the most sophisticated classical computers. This development heralds the beginning of real-world uses for quantum computing.

Edge Computing
Edge computing reduces latency and enables real-time decision-making by bringing processing capacity closer to data sources. It serves IoT, autonomous driving, and augmented reality applications where split-second reactions are essential.

AI in Generative Models
Astonishing capabilities in natural language synthesis, artistic creation, and even coding have been demonstrated by generative models like GPT-3. Although these models are opening the door for innovative AI applications, they also raise questions regarding deepfakes and false information.

Ethical and Societal Considerations
Advanced computing technologies raise significant ethical and societal concerns.

Privacy
Protecting people's privacy becomes a top priority as data collection grows increasingly prevalent. It can be difficult to strike a balance between privacy protection and data-driven innovation.

Bias in AI
AI algorithms can pick up biases from training data, leading to unfair results. A crucial ethical requirement is to address bias in AI and ensure fairness in decision-making procedures.

Cybersecurity
Greater cybersecurity dangers come with more advanced technology. It remains difficult to secure sensitive data, infrastructure, and personal information from cyberattacks and breaches.

Future Trends
Future computing trends are difficult to predict, but several important themes are starting to emerge.

Quantum Computing Maturity
Quantum computing is expected to mature, enabling breakthroughs in cryptography, optimization, and materials science. Widespread adoption, however, may be a decade or more away.

AI in Healthcare
Predictive analytics, medication discovery, and individualized treatment regimens will all become mainstream as AI's involvement in healthcare grows. In this progression, regulatory frameworks and data privacy will be crucial.

Ethical AI Regulation
Governments and organizations will step up their efforts to regulate AI, concentrating on accountability, transparency, and bias reduction. The establishment of ethical AI standards will help guarantee ethical AI development and application.

Conclusion
In summary, modern technological advancements, particularly cutting-edge technologies like quantum computing, artificial intelligence, and blockchain, are transforming industries and societies all over the world. Recent developments are expanding computing power, but to ensure responsible technology adoption, ethical and sociological issues must be considered. The evolution of ethical AI rules, the maturation of quantum computing, and the growing significance of AI in healthcare are all factors in predicting future trends in computing. These advances offer both benefits and difficulties, highlighting the necessity of ongoing study and moral stewardship in the technology industry.

References:
F. Arute et al., "Quantum supremacy using a programmable superconducting processor," Nature News, https://www.nature.com/articles/s41586-019-1666-5 (accessed Sep. 9, 2023).
S. J. Bigelow, "What is edge computing? Everything you need to know," Data Center, https://www.techtarget.com/searchdatacenter/definition/edge-computing (accessed Sep. 9, 2023).
IBM, "What is quantum computing?," IBM Quantum Computing, https://www.ibm.com/topics/quantum-computing (accessed Sep. 9, 2023).
E. Burns, N. Laskowski, and L. Tucci, "What is artificial intelligence and how does AI work?," Enterprise AI, https://www.techtarget.com/searchenterpriseai/definition/AI-Artificial-Intelligence (accessed Sep. 9, 2023).
A. Hayes, "Blockchain facts: What is it, how it works, and how it can be used," Investopedia, https://www.investopedia.com/terms/b/blockchain.asp (accessed Sep. 9, 2023).
IBM, "The birth of the IBM PC," IBM Archives, https://www.ibm.com/ibm/history/exhibits/pc25/pc25_birth.html (accessed Sep. 9, 2023).
CERN, "A short history of the web," CERN, https://home.cern/science/computing/birth-web/short-history-web (accessed Sep. 9, 2023).
G. Wright, "What is ARPANET and what's its significance?," Networking, https://www.techtarget.com/searchnetworking/definition/ARPANET (accessed Sep. 9, 2023).
L. Heide, "Punched-card systems and the early information explosion, 1880–1945," Project MUSE, https://muse.jhu.edu/book/3454 (accessed Sep. 9, 2023).
M. S. Swaine and P. A. Freiberger, "Analytical Engine," Encyclopædia Britannica, https://www.britannica.com/technology/Analytical-Engine (accessed Sep. 8, 2023).
The Investopedia Team, "Abacus: Definition, how it's used, and modern applications," Investopedia, https://www.investopedia.com/terms/a/abacus.asp (accessed Sep. 8, 2023).
Overview of Computer Resources in Information Systems

Computer resources serve as the backbone of information systems, enabling organizations to process data, make informed decisions, and optimize operations. These resources include hardware, software, networks, and data storage.

Components and Elements of Computer Resources
- Hardware
- Software
- Network
- Data Storage

1. Hardware
Hardware comprises physical devices such as central processing units (CPUs), memory, storage devices, input/output peripherals, and network equipment. Different types of hardware serve specific functions within an organization's information systems, from servers that host applications to end-user devices like desktop computers and smartphones.

2. Software
- Software encompasses operating systems, application software, and middleware that enable hardware components to work together.
- Distinguishing between system software (e.g., OS) and application software (e.g., word processing, CRM) is essential.
- The role of software in managing hardware resources and facilitating information processing is explored.

3. Network
- Networks, including LANs (local area networks) and WANs (wide area networks), connect hardware components and enable data exchange.
- Understanding network protocols, topologies, and security measures is crucial.
- Modern advancements like wireless networks and the Internet of Things (IoT) impact information systems.

4. Data Storage
Data storage systems store and manage the vast amounts of data used by organizations. Types of storage devices (e.g., hard drives, SSDs), databases, and data storage technologies are discussed, with emphasis on data security, redundancy, and scalability in data storage solutions.

Evolution of Computer Resources
Mainframes, Personal Computers (PCs), Servers, Cloud Computing

1. Mainframes
- Mainframes were the initial computing resources used for large-scale data processing.
- They were characterized by high processing power, reliability, and centralized control.
- Mainframes played a vital role in early information systems for tasks like accounting and inventory management.

2. Personal Computers (PCs)
- The advent of PCs decentralized computing, empowering individual users.
- PCs facilitated personal productivity and led to the development of client-server architectures.
- Desktop applications and graphical user interfaces (GUIs) became important in information systems.

3. Servers
- Server technology evolved to support networked computing environments.
- File servers, application servers, and web servers play distinct roles in modern information systems.
- Scalability and redundancy are emphasized in server architectures.

4. Cloud Computing
- Cloud computing marks a paradigm shift in which resources are accessed remotely via the internet.
- Its advantages include scalability, cost-effectiveness, and reduced IT infrastructure management.

Significance of Computer Resources in Information Systems
- Computer resources play a critical role in information systems for data processing, decision-making, and communication.
- Resource allocation and management affect the efficiency and effectiveness of an organization's operations.

MULTI-CORE PROCESSORS: ENABLING PARALLELISM

Architectural Features
- Multi-Core Processor Architecture: Symmetric Multiprocessing (SMP) or Asymmetric Multiprocessing (AMP) configurations.
- Shared Memory vs. Distributed Memory: Cores may share memory (SMP) or operate with separate memory (AMP).

Benefits of Multi-Core Processors
- Enhanced Performance: Executing multiple threads or processes in parallel leads to improved performance.
- Energy Efficiency: Multi-core processors offer better performance per watt, reducing power consumption.

Challenges
- Parallel Programming Complexity: Developing software that efficiently utilizes multiple cores requires expertise in parallel programming.
- Amdahl's Law: Highlights potential bottlenecks in parallelization (stated formally below).
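Amdahl's Law can be made concrete. If p is the fraction of a program's runtime that can be parallelized and N is the number of cores, the achievable speedup is bounded by:

```latex
% Amdahl's Law: upper bound on speedup with N cores,
% where p is the parallelizable fraction of the runtime.
S(N) = \frac{1}{(1 - p) + \frac{p}{N}}
```

For example, with p = 0.9 (90% parallelizable), four cores give S(4) = 1 / (0.1 + 0.225) ≈ 3.1, and no number of cores can push the speedup past 1 / (1 - 0.9) = 10. The serial fraction, not the core count, quickly becomes the bottleneck.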
PARALLEL COMPUTING: ENHANCING PERFORMANCE WITH MULTI-CORE PROCESSORS

Enhancing Performance in the Digital Age
Parallel computing is a computational approach where multiple tasks or processes are executed concurrently, with the aim of achieving improved computational performance and efficiency.

PARALLEL COMPUTING: CORE CONCEPTS

1. Parallelism Fundamentals
Definition: Parallel computing is the concurrent execution of multiple tasks to achieve faster and more efficient computation.
Motivation: Sequential computing faces limitations in processing speed and scalability, driving the need for parallelism.

Types of Parallelism
Task Parallelism: Dividing a problem into smaller tasks that can be executed simultaneously.
Data Parallelism: Processing multiple data elements concurrently.

PARALLEL PROGRAMMING TECHNIQUES

1. Shared Memory Parallelism
Thread-Based Programming: Threads within a single process share memory, allowing for shared data structures and communication.
OpenMP: A directive-based API simplifying the creation of multithreaded applications (see the first sketch after this list).

2. Message Passing
MPI (Message Passing Interface): Facilitates communication among distributed processes via message passing; common in high-performance computing (HPC) and cluster computing (see the second sketch after this list).

3. Data Parallelism
SIMD (Single Instruction, Multiple Data): Executes the same operation on multiple data elements simultaneously (see the third sketch after this list).
GPU Computing: Leveraging Graphics Processing Units for data-parallel workloads using frameworks like CUDA and OpenCL.
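To ground these techniques, the sketches below are minimal illustrations in C rather than definitive implementations; array sizes, values, and file names are invented. First, shared-memory parallelism with OpenMP: a single directive parallelizes an array sum, with reduction(+:sum) handling the shared-data hazard.

```c
/* Minimal OpenMP sketch: parallel array sum.
 * Compile with OpenMP enabled, e.g.: gcc -fopenmp sum_openmp.c */
#include <stdio.h>
#include <omp.h>

int main(void) {
    enum { N = 1000000 };
    static double data[N];
    double sum = 0.0;

    for (int i = 0; i < N; i++)
        data[i] = 1.0;           /* fill with sample values */

    /* The pragma splits loop iterations across available cores;
     * reduction(+:sum) gives each thread a private partial sum and
     * combines them safely at the end. */
    #pragma omp parallel for reduction(+:sum)
    for (int i = 0; i < N; i++)
        sum += data[i];

    printf("sum = %f (threads available: %d)\n", sum, omp_get_max_threads());
    return 0;
}
```

Without the pragma the loop simply runs serially; this directive-based style is what makes OpenMP approachable.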
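Next, message passing with MPI, assuming an MPI implementation such as Open MPI or MPICH is installed. Each process sums a disjoint slice of 1..100 (an arbitrary stand-in workload) and MPI_Reduce combines the partial sums on rank 0.

```c
/* Minimal MPI sketch: distributed sum via message passing.
 * Build/run, e.g.: mpicc sum_mpi.c -o sum_mpi && mpirun -np 4 ./sum_mpi */
#include <stdio.h>
#include <mpi.h>

int main(int argc, char **argv) {
    int rank, size;
    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Comm_size(MPI_COMM_WORLD, &size);

    /* Each process sums a disjoint slice of 1..100, chosen by rank. */
    long local = 0, total = 0;
    for (long i = rank + 1; i <= 100; i += size)
        local += i;

    /* Message passing: partial sums travel to rank 0 and are added. */
    MPI_Reduce(&local, &total, 1, MPI_LONG, MPI_SUM, 0, MPI_COMM_WORLD);

    if (rank == 0)
        printf("total = %ld (expected 5050), processes = %d\n", total, size);

    MPI_Finalize();
    return 0;
}
```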
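Finally, a data-parallelism sketch at the instruction level, assuming an x86 CPU with SSE support (other architectures offer analogous intrinsics): a single _mm_add_ps instruction adds four floats at once, which is the essence of single instruction, multiple data.

```c
/* Minimal SIMD sketch using x86 SSE intrinsics:
 * one instruction performs four float additions. */
#include <stdio.h>
#include <xmmintrin.h>

int main(void) {
    float a[4]   = {1.0f, 2.0f, 3.0f, 4.0f};
    float b[4]   = {10.0f, 20.0f, 30.0f, 40.0f};
    float out[4];

    __m128 va = _mm_loadu_ps(a);     /* load 4 floats (unaligned ok) */
    __m128 vb = _mm_loadu_ps(b);
    __m128 vc = _mm_add_ps(va, vb);  /* 4 additions in one instruction */
    _mm_storeu_ps(out, vc);

    for (int i = 0; i < 4; i++)
        printf("%.1f ", out[i]);     /* prints 11.0 22.0 33.0 44.0 */
    printf("\n");
    return 0;
}
```

GPU frameworks like CUDA and OpenCL scale this same data-parallel idea to thousands of lanes.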
RELEVANCE IN TODAY'S COMPUTING LANDSCAPE

1. Scientific and High-Performance Computing (HPC)
Parallelism is essential for complex simulations, scientific research, and weather forecasting.

2. Big Data Analytics
Parallel processing accelerates data analysis, enabling real-time decision-making in analytics and data-driven industries.

3. Artificial Intelligence (AI) and Machine Learning
Training large AI models benefits from parallelism, reducing training times and enabling faster insights.

4. Gaming and Graphics
Multi-core processors and GPUs power immersive gaming experiences and complex 3D rendering.

5. Web Services and Cloud Computing
Scalable, parallelized services are vital for meeting the demands of cloud-based applications and web services.