INFORMATION SYSTEM REVIEWER

**[LECTURE]** **Information System** - a collection of hardware, software, data, procedures, and people working together.

**Data vs. Information**

- **Information technology** - refers to the technology or applications used in everyday transactions.
- **Information system** - refers to the whole system, which includes system processes.

**Raw Data** - refers to data that was initially generated by a system, device, or operation and has not been processed or altered.

**5 CHARACTERISTICS OF QUALITY INFORMATION**

1. **Accuracy** - should reflect reality as closely as possible.
2. **Timeliness** - should be available when needed to be effective.
3. **Completeness** - information should provide a full picture of the subject matter.
4. **Reliability** - should be dependable and consistent.
5. **Relevance** - information should be pertinent to the task or decision at hand.

**Frameworks for Information Systems** - various frameworks help conceptualize and structure information systems.

- **Five-component framework** - developed by David Kroenke; includes hardware, software, data, procedures, and people.
- **Information system pyramid** - categorizes information systems into three levels (operational, managerial, and executive) based on their role in the organization.
- **Strategic information system framework** - focuses on aligning information systems with an organization's strategic goals and competitive advantage.

**Components of Information Systems**

- **Hardware** - physical devices like computers, servers, and networking equipment.
- **Software** - programs and applications that enable data processing and manipulation.
- **Data** - raw facts and figures that serve as the foundation for information.
- **Procedures** - sets of instructions or protocols for handling data and making decisions.
- **People** - users, analysts, developers, and other personnel involved in the IS lifecycle.

**Computer Resources**

- An enabler of organizational processes and decision-making.
- Computer resources encompass hardware, software, networking, data storage, and IT services that drive organizational activities.
- These resources enable automation, data management, and communication, all critical for modern organizations.

Supporting organizational processes:

1. **Operations management:** ERP (Enterprise Resource Planning) systems, automated production lines, and data analysis tools enhance operational efficiency.
2. **Communication:** cloud services and collaboration platforms (like Slack and Microsoft Teams) allow seamless information sharing.
3. **Customer Relationship Management (CRM):** CRM systems manage customer interactions, improve service delivery, and foster long-term customer relationships.

Driving strategic initiatives:

1. **Digital transformation:** leveraging cloud computing and digital tools helps businesses stay competitive.
2. **Innovation:** computer resources support R&D through simulation, 3D modeling, and prototyping.
3. **Globalization:** cloud-based systems enable companies to operate across borders with shared resources and global collaboration.

Facilitating decision-making (see the sketch after this list):

1. **Data analytics:** computer resources provide access to real-time data, enabling more informed and timely decision-making.
2. **Business intelligence (BI):** tools like Power BI or Tableau help organizations convert data into actionable insights.
3. **Predictive analytics:** forecasting tools and AI-powered algorithms help decision-makers anticipate future trends and customer needs.
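As a minimal illustration of the predictive-analytics idea above, the Python sketch below fits a linear trend to a short series of hypothetical monthly sales figures and projects the next month. The data and the choice of `numpy.polyfit` are illustrative assumptions, not from the lecture; real BI and forecasting tools use far richer models.

```python
import numpy as np

# Hypothetical monthly sales (units); illustrative data only.
sales = np.array([120, 132, 141, 155, 168, 176])
months = np.arange(1, len(sales) + 1)

# Fit a simple linear trend -- a stand-in for a real forecasting model.
slope, intercept = np.polyfit(months, sales, deg=1)

# Project the trend one month ahead (prints roughly 189 units).
next_month = len(sales) + 1
print(f"forecast for month {next_month}: {slope * next_month + intercept:.0f} units")
```

The point is not the model but the workflow: historical data in, a fitted pattern, and a forward-looking number that a decision-maker can act on.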
**Current Trends and Emerging Technologies Impacting Computer Resources**

2.1 **Artificial Intelligence (AI)**

- **Automation:** AI enhances process automation, from robotic process automation (RPA) in business operations to machine learning algorithms that optimize supply chains.
- **Data Analytics:** AI-driven analytics tools can detect patterns in large data sets, enabling faster and more accurate decision-making.
- **Personalization:** in marketing, AI tailors content and recommendations to individual users, improving customer experiences.

2.2 **Internet of Things (IoT)**

2.3 **Edge Computing**

3.1 **Strategic Integration of AI**

- **AI-Driven Decision Support Systems:** invest in AI to enhance predictive analytics and decision-making frameworks.

3.2 **IoT-Enabled Connectivity**

3.3 **Leveraging Edge Computing**

**Emerging Technologies and Their Impact**

- **Enhance Operational Efficiency:** by automating routine tasks and optimizing resource allocation, emerging technologies can reduce costs and improve productivity.

**Computer Resources**

*Overview of Computer Resources in Information Systems*

**Computer resources** serve as the **backbone of information systems, enabling organizations to process data, make informed decisions**, and **optimize operations**. These resources include *hardware, software, networks, and data storage.*

**Components and Elements of Computer Resources**

- Hardware
  - **Hardware comprises physical devices** such as *central processing units (CPUs), memory, storage devices, input/output peripherals, and network equipment.*
  - Different types of hardware serve specific functions within an organization's information systems, from servers that host applications to end-user devices like desktop computers and smartphones.
- Software
  - **Software** encompasses *operating systems, application software, and middleware* that enable hardware components to work together.
  - Distinguishing between **system software** (e.g., OS) and **application software** (e.g., word processing, CRM) is essential.
  - The **role of software** in managing hardware resources and facilitating information processing is explored.
- Networks
  - **Networks,** including *LANs (local area networks)* and *WANs (wide area networks),* connect hardware components and enable data exchange.
  - Understanding network protocols, topologies, and security measures is crucial.
  - Modern advancements like **wireless networks** and the ***Internet of Things (IoT)*** impact information systems.
- Data storage
  - **Data storage systems** **store and manage the vast amounts of data organizations use.**
  - Types of storage devices (*e.g., hard drives, SSDs*), databases, and data storage technologies are discussed.
  - Emphasis on *data security, redundancy, and scalability* in data storage solutions.

**Evolution of Computer Resources**

- Mainframes
  - **Mainframes** were **the initial computing resources used for large-scale data processing.**
  - **They are** characterized by *high processing power, reliability, and centralized control.*
  - Mainframes played a vital role in early information systems for tasks *like accounting and inventory management.*
- Personal Computers (PCs)
  - The advent of PCs decentralized computing, empowering individual users.
  - PCs facilitated personal productivity and led to the development of client-server architectures.
  - Desktop applications and graphical user interfaces (GUIs) became important in information systems.
- Servers
  - **Server** technology evolved to support networked computing environments.
  - File servers, application servers, and web servers each play distinct roles in modern information systems.
  - Emphasis on scalability and redundancy in server architectures.
- Cloud Computing
  - The paradigm shifted towards cloud computing, where resources are accessed remotely via the internet.
  - **Advantages of cloud computing** include *scalability, cost-effectiveness, and reduced IT infrastructure management.*

**Significance of Computer Resources in Information Systems**

- The critical role of computer resources in information systems for **data processing, decision-making, and communication.**
- The impact of resource allocation and management on the **efficiency and effectiveness of an organization's operations.**

**Charles Babbage**

- Known as the "father of the computer"; envisioned the Analytical Engine in the 19th century.
- *"Errors using inadequate data are much less than those using no data at all."*

**Early Computing Devices**

1. **Abacus**
   - Marks the beginning of computing as we know it today.
   - Invented in **Mesopotamia between 2700 and 2300 BCE.**
   - A straightforward counting device that let users move beads along rods to accomplish elementary arithmetic calculations.
   - Played a significant role in early mathematics and commerce and laid the groundwork for later computational breakthroughs, while lacking the complexity of modern computers.
2. **Babbage's Analytical Engine**
   - Babbage developed the idea for the Analytical Engine in the 19th century: a mechanical general-purpose computer that could perform numerous calculations automatically.
   - Although it was never constructed during his lifetime (1791-1871), its design laid the theoretical foundation for modern computing.
   - Developed with help from **Ada Lovelace**, who is frequently cited as the first computer programmer and wrote algorithms for the Analytical Engine.
3. **Early Punch Card Systems**
   - Punch cards became a widely used data processing method in the late 19th and early 20th centuries.
   - Early punch card systems emphasized the value of data entry and processing, paving the way for the creation of electronic computers.
   - Herman Hollerith created a punch card tabulating device after being influenced by Charles Babbage's work.
   - His creation was applied to the 1890 US Census, greatly cutting down on the amount of time needed for data analysis.

**First Generation (1940s-1950s)**

The first generation of electronic computers marked a significant leap forward in computational power. Notable developments during this period include:

- **ENIAC (1945)** - The Electronic Numerical Integrator and Computer (ENIAC)
  - Was the world's first general-purpose digital computer.
  - It used vacuum tubes and could perform a wide range of calculations, including ballistics calculations for the military.
- **UNIVAC I (1951)** - The UNIVAC I, short for Universal Automatic Computer
  - Was the first commercially produced computer.
  - It played a role in the 1952 presidential election by accurately predicting the outcome.

**Second Generation (1950s-1960s)**

The second generation of computers saw the transition from vacuum tubes to transistors, resulting in smaller, faster, and more reliable machines. Key developments included:

- **IBM 1401 (1959)** - This widely used business computer contributed to the automation of data processing tasks, making it accessible to a broader range of organizations.
- **Common Business-Oriented Language (COBOL) (1959)** - The development of COBOL made software more accessible, allowing programmers to write code in a language closer to natural language.

**Third Generation (1960s-1970s)**

Advancements in integrated circuits and the development of the microprocessor defined the third generation of computers. Notable milestones included:

- **IBM System/360 (1964)** - The **System/360** family introduced compatibility across different models, **a concept that remains relevant in modern computing.**
- **Intel 4004 (1971)** - Intel's 4004 microprocessor **marked the birth of microcomputing, paving the way for personal computers.**

**Fourth Generation (1980s-Present)**

The fourth generation witnessed the emergence of personal computers and the widespread use of microprocessors. Key developments include:

- **IBM Personal Computer (1981)** - The IBM PC became the industry standard and led to the popularization of personal computing.
- **World Wide Web (1989)** - The invention of the World Wide Web by **Tim Berners-Lee** revolutionized communication and information sharing, setting the stage for modern internet-based computing.

**MILESTONES IN COMPUTER SCIENCE**

**Invention of Programming Languages**

- **Fortran (1957)** - John Backus and his team created Fortran, the first high-level programming language, simplifying the process of writing code.
- **C Programming Language (1972)** - Dennis Ritchie developed C, a highly influential language that underlies many modern programming languages.

**DEVELOPMENT OF OPERATING SYSTEMS**

- **UNIX (1969)** - Ken Thompson, Dennis Ritchie, and others at Bell Labs developed UNIX, which served as the basis for many subsequent operating systems.
- **Microsoft Windows (1985)** - Microsoft's Windows operating system became dominant in the personal computer market.

**BIRTH OF THE INTERNET**

- **ARPANET (1969)** - The U.S. Department of Defense created ARPANET, a precursor to the modern internet.
- **World Wide Web (1990s)** - The development of web browsers and HTML by Tim Berners-Lee made the internet accessible to the public.

**IMPACT ON SOCIETY**

**Business**

- The use of computers has revolutionized company processes by enabling the automation of jobs like accounting, inventory control, and customer service.
- E-commerce has thrived, enabling businesses to connect with a global clientele.

**Education**

- Computers have revolutionized education with e-learning platforms, online resources, and interactive educational software.
- Access to information has been democratized, facilitating research and learning.

**Healthcare**

- Electronic health records (EHRs) have eased healthcare administration and enhanced patient care.
- Medical imaging and diagnostic tools have advanced, assisting in the early discovery of disease.

**Entertainment**

- Computers have revolutionized the entertainment industry, enabling video games, digital art, and multimedia experiences.
- Streaming services and digital distribution have transformed how we consume music, movies, and television.

**CONTEMPORARY DEVELOPMENTS**

**Emerging Technologies** - emerging technologies in the field of computing are poised to revolutionize various industries.

- **Quantum Computing**
  - Utilizing the ideas behind quantum physics, new methods of information processing are made possible.
  - Its possible uses include drug discovery, optimization issues, and cryptography.
  - Quantum computers could break existing encryption systems, but they could also be used to create un-hackable communication channels through quantum key distribution.
  - Quantum simulations may hasten the development of novel substances and medicines.
- **Artificial Intelligence (AI)**
  - AI is still developing quickly and is having an impact on everything from finance to healthcare.
  - Deep learning approaches have recently made significant advances in speech recognition, computer vision, and natural language processing.
  - Autonomous vehicles powered by AI offer safer mobility, while AI is assisting with medication discovery and disease diagnostics in the medical field.
  - The ethical issues around bias, data privacy, and AI decision-making are still crucial to consider.
- **Blockchain**
  - Blockchain technology is disrupting industries like finance and supply chain management.
  - Its secure and decentralized ledger technology ensures transparent and tamper-proof transactions.
  - Beyond cryptocurrencies, blockchain technology can be used to improve trust and security in a variety of industries by enabling identity verification, voting processes, and supply chain traceability.

**Recent Innovations** - recent innovations in computer hardware, software, and networking are pushing the boundaries of what computers can achieve.

- **Quantum Supremacy** - Google claimed to have achieved quantum supremacy in 2019 by proving that quantum computers can solve a given issue more quickly than even the most sophisticated classical computers. This development heralds the beginning of real-world uses for quantum computing.
- **Edge Computing** - Edge computing reduces latency and enables real-time decision-making by bringing processing capacity closer to data sources. It serves IoT, autonomous driving, and augmented reality applications where split-second reactions are essential.
- **AI in Generative Models** - Astonishing powers in natural language synthesis, artistic creation, and even coding have been demonstrated by generative models like GPT-3. Although these models are opening the door for innovative AI applications, they also raise questions regarding the development of deepfakes and false information.

**Ethical and Societal Considerations** - advanced computing technologies raise significant ethical and societal concerns.

- **Privacy** - Protecting people's privacy becomes a top priority as data collection grows increasingly prevalent. It can be difficult to strike a balance between privacy protection and data-driven innovation.
- **Bias in AI** - AI algorithms can pick up biases from training data, leading to unfair results. A crucial ethical requirement is to address bias in AI and ensure fairness in decision-making procedures.
- **Cybersecurity** - Greater cybersecurity dangers come with more advanced technology. It remains difficult to secure sensitive data, infrastructure, and personal information from cyberattacks and breaches.

**Future Trends** - future computing trends are difficult to predict, but several important themes are starting to emerge.

- **Quantum Computing Maturity** - Quantum computing is expected to mature, enabling breakthroughs in cryptography, optimization, and materials science. Widespread adoption, however, may be a decade or more away.
- **AI in Healthcare** - Predictive analytics, medication discovery, and individualized treatment regimens will all become mainstream as AI's involvement in healthcare grows. In this progression, regulatory frameworks and data privacy will be crucial.
- **Ethical AI Regulation** - Governments and organizations will increase their efforts to control AI, concentrating on accountability, transparency, and bias reduction. The establishment of ethical AI standards will guarantee ethical AI development and application.

**PARALLEL COMPUTING: ENHANCING PERFORMANCE WITH MULTI-CORE PROCESSORS**

Parallel computing is a computational approach where multiple tasks or processes are executed concurrently to achieve improved computational performance and efficiency.

**Parallel Computing: Core Concepts**

1. **Parallelism Fundamentals**
   - **Definition:** Parallel computing is concurrently executing multiple tasks to achieve faster and more efficient computation.
   - **Motivation:** Sequential computing faces limitations in processing speed and scalability, driving the need for parallelism.
2. **Types of Parallelism**
   - **Task Parallelism:** Dividing a problem into smaller tasks that can be executed simultaneously.
   - **Data Parallelism:** Processing multiple data elements concurrently (see the sketch at the end of this section).

**Multi-Core Processors: Enabling Parallelism**

1. **Architectural Features**
   - **Multi-Core Processor Architecture:** Symmetric Multiprocessing (SMP) or Asymmetric Multiprocessing (AMP) configurations.
   - **Shared Memory vs. Distributed Memory:** Cores may share memory (SMP) or operate with separate memory (AMP).
2. **Benefits of Multi-Core Processors**
   - **Enhanced Performance:** Execution of multiple threads or processes in parallel leads to improved performance.
   - **Energy Efficiency:** Multi-core processors offer better performance per watt, reducing power consumption.
3. **Challenges**
   - **Parallel Programming Complexity:** Developing software that efficiently utilizes multiple cores requires expertise in parallel programming.
   - **Amdahl's Law:** Highlights potential bottlenecks in parallelization: the serial (non-parallelizable) portion of a program caps the speedup that adding cores can deliver (see the worked example at the end of this section).

**PARALLEL PROGRAMMING TECHNIQUES**

1. **Shared Memory Parallelism**
   - **Thread-Based Programming:** Threads within a single process share memory, allowing for shared data structures and communication.
   - **OpenMP:** A directive-based API simplifying the creation of multithreaded applications.
2. **Message Passing**
   - **MPI (Message Passing Interface):** Facilitates communication among distributed processes via message passing, common in high-performance computing (HPC) and cluster computing.
3. **Data Parallelism**
   - **SIMD (Single Instruction, Multiple Data):** Executes the same operation on multiple data elements simultaneously.
   - **GPU Computing:** Leveraging **Graphics Processing Units** for data-parallel workloads using frameworks like CUDA and OpenCL.

**RELEVANCE IN TODAY'S COMPUTING LANDSCAPE**

1. **Scientific and High-Performance Computing (HPC)** - Parallelism is essential for complex simulations, scientific research, and weather forecasting.
2. **Big Data Analytics** - Parallel processing accelerates data analysis, enabling real-time decision-making in analytics and data-driven industries.
3. **Artificial Intelligence (AI) and Machine Learning** - Training large AI models benefits from parallelism, reducing training times and enabling faster insights.
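To make Amdahl's Law concrete, here is a minimal Python sketch that computes the theoretical speedup cap, 1 / ((1 - p) + p / n), where p is the parallelizable fraction and n the number of cores. The 90% figure is an illustrative assumption, not from the lecture.

```python
def amdahl_speedup(parallel_fraction: float, cores: int) -> float:
    """Theoretical max speedup per Amdahl's Law: 1 / ((1 - p) + p / n)."""
    return 1.0 / ((1.0 - parallel_fraction) + parallel_fraction / cores)

# Even with 90% of the work parallelizable, the 10% serial part
# caps the speedup at 10x no matter how many cores are added.
for cores in (2, 4, 8, 64):
    print(f"{cores:>3} cores -> {amdahl_speedup(0.9, cores):.2f}x speedup")
# 2 -> 1.82x, 4 -> 3.08x, 8 -> 4.71x, 64 -> 8.77x
```

And here is a minimal sketch of data parallelism on a multi-core processor using only Python's standard library. The prime-counting task and the chunk sizes are illustrative assumptions chosen because the work is CPU-bound and splits cleanly.

```python
from concurrent.futures import ProcessPoolExecutor

def is_prime(n: int) -> bool:
    """Trial division -- deliberately CPU-bound so parallelism pays off."""
    if n < 2:
        return False
    f = 2
    while f * f <= n:
        if n % f == 0:
            return False
        f += 1
    return True

def count_primes(bounds: tuple[int, int]) -> int:
    """Count primes in [lo, hi) -- one independent chunk of the data."""
    lo, hi = bounds
    return sum(1 for n in range(lo, hi) if is_prime(n))

if __name__ == "__main__":
    # Data parallelism: the same operation (count_primes) runs concurrently
    # on different slices of the data, one worker process per core.
    chunks = [(i, i + 50_000) for i in range(0, 200_000, 50_000)]
    with ProcessPoolExecutor() as pool:
        total = sum(pool.map(count_primes, chunks))
    print(f"primes below 200,000: {total}")
```

Because each chunk is independent, the work scales across cores with no shared state; a message-passing (MPI) version of the same idea would distribute the chunks across machines instead of processes.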
