CHAPTER ONE: Emerging Technology (Andualem T.)

Contents
- Evolution of Technology
- Role of Data for Emerging Technology
- Introduction to Programmable Devices
- Human to Machine Interaction
- Future Trends in Emerging Technologies

Emerging Technology
- "Emerging technology" is a term generally used to describe a new technology. It may also refer to the continuing development of an existing technology.
- The term commonly refers to technologies that are currently developing, or that are expected to be available within the next five to ten years.
- It is usually reserved for technologies that are creating, or are expected to create, significant social or economic effects.

What are the root words of Technology and Evolution?
- Technology: 1610s, "discourse or treatise on an art or the arts," from Greek tekhnologia, "systematic treatment of an art, craft, or technique," originally referring to grammar, from tekhno- (see techno-) + -logy. The meaning "science of the mechanical and industrial arts" is first recorded in 1859.
- Evolution: the process of developing by gradual changes. This noun is from Latin evolutio, "an unrolling or opening," combined from the prefix e-, "out," plus volvere, "to roll."

List of some currently available emerged technologies
- Artificial Intelligence
- Blockchain
- Augmented Reality and Virtual Reality
- Cloud Computing
- Angular and React
- DevOps
- Internet of Things (IoT)
- Intelligent Apps (I-Apps)
- Big Data
- Robotic Process Automation (RPA)

Introduction to the Industrial Revolution (IR)
- A period of major industrialization and innovation that took place during the late 1700s and early 1800s.
- An industrial revolution occurs when a society shifts from using tools to make products to using new sources of energy, such as coal, to power machines in factories.
- The revolution started in England, with a series of innovations that made labor more efficient and productive.
- The IR was a time when the manufacturing of goods moved from small shops and homes to large factories.
- This shift brought about changes in culture as people moved from rural areas to big cities in order to work.
- The American Industrial Revolution, commonly referred to as the Second Industrial Revolution, started sometime between 1820 and 1870. Industries such as textile manufacturing, mining, glass making, and agriculture all underwent changes; for example, prior to the Industrial Revolution, textiles were primarily made of wool and were handspun.
- From the first industrial revolution (mechanization through water and steam power) to mass production and assembly lines using electricity in the second, the fourth industrial revolution will take what was started in the third with the adoption of computers and automation, and enhance it with smart and autonomous systems fueled by data and machine learning.
- Generally, the following industrial revolutions fundamentally changed and transformed the world around us into modern society:
  - The steam engine
  - The age of science and mass production
  - The rise of digital technology
  - Smart and autonomous systems fueled by data and machine learning
The Most Important Inventions of the Industrial Revolution
- Transportation: the steam engine, the railroad, the diesel engine, the airplane.
- Communication: the telegraph, the transatlantic cable, the phonograph, the telephone.
- Industry: the cotton gin, the sewing machine, electric lights.

Historical Background (IR 1.0, IR 2.0, IR 3.0)
- The Industrial Revolution began in Great Britain in the late 1770s before spreading to the rest of Europe.
- The first European countries to be industrialized after England were Belgium, France, and the German states.
- The final cause of the Industrial Revolution was the effects created by the Agricultural Revolution. As previously stated, the Industrial Revolution began in Britain in the 18th century due in part to an increase in food production, which was the key outcome of the Agricultural Revolution.
- The four types of industries are:
  - Primary industry involves getting raw materials, e.g. mining, farming, and fishing.
  - Secondary industry involves manufacturing, e.g. making cars and steel.
  - Tertiary industries provide a service, e.g. teaching and nursing.
  - Quaternary industry involves research and development, e.g. IT.

Industrial Revolution (IR 1.0)
- The Industrial Revolution (IR) is described as a transition to new manufacturing processes.
- The term was first coined in the 1760s, the time when this revolution began.
- The transitions in the first IR included going from hand production methods to machines, the increasing use of steam power, the development of machine tools, and the rise of the factory system.

Industrial Revolution (IR 2.0)
- Also known as the Technological Revolution; it began somewhere in the 1870s.
- The advancements in IR 2.0 included the development of methods for manufacturing interchangeable parts and the widespread adoption of pre-existing technological systems such as telegraph and railroad networks.
- This adoption allowed the vast movement of people and ideas, enhancing communication.
- Moreover, new technological systems were introduced, such as electrical power and telephones.

Industrial Revolution (IR 3.0)
- Introduced the transition from mechanical and analog electronic technology to digital electronics, which began in the late 1950s.
- Due to the shift towards digitalization, IR 3.0 was nicknamed the "Digital Revolution".
- The core factor of this revolution is the mass production and widespread use of digital logic circuits and their derived technologies, such as the computer, mobile phones, and the Internet.
- These technological innovations have arguably transformed traditional production and business techniques, enabling people to communicate with one another without needing to be physically present.
- Certain practices enabled during IR 3.0 are still practiced to this day, for example the proliferation of digital computers and digital records.
Industrial Revolution (IR 4.0)
- With advancements in various technologies such as robotics, the Internet of Things (IoT), additive manufacturing, and autonomous vehicles, the term "Fourth Industrial Revolution" or IR 4.0 was coined by Klaus Schwab, the founder and executive chairman of the World Economic Forum, in 2016.
- The technologies mentioned above are what we call cyber-physical systems.
- A cyber-physical system is a mechanism that is controlled or monitored by computer-based algorithms, tightly integrated with the Internet and its users.
- Examples: Computer Numerical Control (CNC) machines and Artificial Intelligence (AI).
  - CNC machines are operated by giving them instructions using a computer.
  - AI is one of the main elements that give life to autonomous vehicles and automated robots.

Role of Data for Emerging Technology
- Data is regarded as the new oil and a strategic asset, since we are living in the age of big data; it drives or even determines the future of science, technology, the economy, and possibly everything in our world today and tomorrow.
- Data have not only triggered tremendous hype and buzz but, more importantly, present enormous challenges that in turn bring incredible innovation and economic opportunities.
- This reshaping and paradigm shift are driven not just by data itself but by all the other aspects that could be created, transformed, and/or adjusted by understanding, exploring, and utilizing data.
- The preceding trend and its potential have triggered new debate about data-intensive scientific discovery as an emerging technology, the so-called "fourth industrial revolution." There is no doubt, nevertheless, that the potential of data science and analytics to enable data-driven theory, economy, and professional development is increasingly being recognized.
- This involves not only core disciplines such as computing, informatics, and statistics, but also the broad-based fields of business, social science, and health/medical science.

Enabling Devices and Networks (Programmable Devices)
- In the world of digital electronic systems, there are four basic kinds of devices: memory, microprocessors, logic, and networks.
  - Memory devices store random information such as the contents of a spreadsheet or database.
  - Microprocessors execute software instructions to perform a wide variety of tasks, such as running a word processing program or a video game.
  - Logic devices provide specific functions, including device-to-device interfacing, data communication, signal processing, data display, timing and control operations, and almost every other function a system must perform.
  - A network is a collection of computers, servers, mainframes, network devices, peripherals, or other devices connected to one another to allow the sharing of data. An example of a network is the Internet, which connects millions of people all over the world.
- Programmable devices usually refer to chips that incorporate field programmable logic devices (FPLDs), complex programmable logic devices (CPLDs), and programmable logic devices (PLDs). There are also analog equivalents of these, called field programmable analog arrays.
- Why is a computer referred to as a programmable device? Because a computer follows a set of instructions. Many electronic devices are computers that perform only one operation, but they are still following instructions that reside permanently in the unit.
- List of some programmable devices:
  - Achronix Speedster SPD60
  - Actel's
  - Altera Stratix IV GT and Arria II GX
  - Atmel's AT91CAP7L
  - Cypress Semiconductor's programmable system-on-chip (PSoC) family
  - Lattice Semiconductor's ECP3
  - Lime Microsystems' LMS6002
  - Silicon Blue Technologies
  - Xilinx Virtex 6 and Spartan 6
  - Xmos Semiconductor L series

Service Enabling Devices (Network-Related Equipment)
- Traditional channel service units (CSUs) and data service units (DSUs)
- Modems
- Routers
- Switches
- Conferencing equipment
- Network appliances (NIDs and SIDs)
- Hosting equipment and servers

Human to Machine Interaction (HMI)
- HMI refers to the communication and interaction between a human and a machine via a user interface.
- Nowadays, natural user interfaces such as gestures have gained increasing attention, as they allow humans to control machines through natural and intuitive behaviors.
1. What is interaction in human-computer interaction?
- HCI (human-computer interaction) is the study of how people interact with computers and to what extent computers are or are not developed for successful interaction with human beings.
- As its name implies, HCI consists of three parts: the user, the computer itself, and the ways they work together.
2. How do users interact with computers?
- The user interacts directly with hardware for human input and output, such as displays, e.g. through a graphical user interface.
- The user interacts with the computer over this software interface using the given input and output (I/O) hardware.
3. How important is human-computer interaction?
- The goal of HCI is to improve the interaction between users and computers by making computers more user-friendly and receptive to the user's needs.
- The main advantages of HCI are: simplicity; ease of deployment and operations; cost savings for smaller set-ups; and reduced solution design time and integration complexity.

Disciplines Contributing to Human-Computer Interaction (HCI)
- Cognitive psychology: limitations, information processing, performance prediction, cooperative working, and capabilities.
- Computer science: including graphics, technology, prototyping tools, and user interface management systems.
- Linguistics
- Engineering and design
- Artificial intelligence
- Human factors

Future Trends in Emerging Technologies
- Emerging technology trends in 2019: 5G networks; Artificial Intelligence (AI); autonomous devices; blockchain; augmented analytics; digital twins; enhanced edge computing; and immersive experiences in smart spaces.
- Some emerging technologies will shape the future of you and your business. The future is now: emerging technologies are taking over our minds more and more each day. These are very high-level emerging technologies, though: chatbots, virtual/augmented reality, blockchain, ephemeral apps, and artificial intelligence are already shaping your life whether you like it or not.

CHAPTER TWO: Data Science (Andualem T.)

Contents
- Overview of Data Science
- Data and Information
- Data Processing Cycle
- Data Types and their Representation
- Data Value Chain
- Basic Concepts of Big Data
- Clustered Computing and Hadoop Ecosystem

Overview of Data Science
- Data science is a multi-disciplinary field that uses scientific methods, processes, algorithms, and systems to extract knowledge and insights from structured, semi-structured, and unstructured data.
- Data science is much more than simply analyzing data. Consider some of the data involved in buying a box of cereal from the store or supermarket: whatever your cereal preference (teff, wheat, or barley), you prepare for the purchase by writing "cereal" in your notebook. This planned purchase is a piece of data, even though it is only a pencil note that you can read.

Data and Information
- Data is defined as a representation of facts, concepts, or instructions in a formalized manner, which should be suitable for communication, interpretation, or processing by humans or electronic machines. It can be described as unprocessed facts and figures, represented with the help of characters such as alphabets (A-Z, a-z), digits (0-9), or special characters (+, -, /, *, =, etc.).
- Information is the processed data on which decisions and actions are based. It is data that has been processed into a form that is meaningful to the recipient. Information is interpreted data, created from organized, structured, and processed data in a particular context.
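To make the data-versus-information distinction concrete, here is a minimal sketch in Python (not part of the original slides). It assumes a small, made-up list of daily sales figures as the raw data and shows how processing them into a summary turns them into information a decision-maker can act on.

```python
# Raw data: unprocessed facts and figures (hypothetical daily sales in birr).
daily_sales = [1200, 950, 1430, 1100, 990]

# Processing: organize and summarize the raw figures in a particular context.
total = sum(daily_sales)
average = total / len(daily_sales)

# Information: processed data that is meaningful to the recipient
# and can support a decision (e.g., restocking or staffing).
print(f"Total sales this week: {total} birr")
print(f"Average daily sales:   {average:.1f} birr")
```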
Data Processing Cycle
- Data processing is the restructuring or reordering of data by people or machines to increase its usefulness and add value for a particular purpose.
- Data processing consists of three basic steps: input, processing, and output. These three steps constitute the data processing cycle.
- Input: the input data is prepared in some convenient form for processing. The form depends on the processing machine. For example, when electronic computers are used, the input data can be recorded on any of several types of storage media, such as a hard disk, CD, or flash disk.
- Processing: the input data is changed to produce data in a more useful form. For example, interest can be calculated on a deposit at a bank, or a summary of sales for the month can be calculated from the sales orders.
- Output: the result of the preceding processing step is collected. The particular form of the output data depends on the use of the data. For example, output data may be payroll for employees.

Data Types and their Representation
Data types can be described from diverse perspectives.
1. Data types from a computer programming perspective
- In computer science and computer programming, a data type is simply an attribute of data that tells the compiler or interpreter how the programmer intends to use the data.
- Almost all programming languages explicitly include the notion of data type. Common data types include:
  - Integers (int): used to store whole numbers, mathematically known as integers.
  - Booleans (bool): used to represent values restricted to one of two values, true or false.
  - Characters (char): used to store a single character.
  - Floating-point numbers (float): used to store real numbers.
  - Alphanumeric strings (string): used to store a combination of characters and numbers.
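As a quick illustration of the programming-perspective data types listed above, the following Python sketch (not from the original slides) assigns one value of each kind; the exact type names vary by language, and Python infers them rather than requiring declarations.

```python
# Common data types from a programming perspective (Python's equivalents).
count = 42            # integer (int): a whole number
is_valid = True       # Boolean (bool): restricted to True or False
grade = "A"           # character: Python has no separate char type, so a 1-character string is used
price = 19.99         # floating-point number (float): a real number
label = "Item42"      # alphanumeric string (str): characters and digits combined

for value in (count, is_valid, grade, price, label):
    # type() reports how the interpreter classified each value.
    print(value, "->", type(value).__name__)
```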
2. Data types from a data analytics perspective
- From a data analytics point of view, it is important to understand that there are three common types of data or structures: (A) structured, (B) semi-structured, and (C) unstructured data. The sections below describe the three types of data and metadata.

Structured Data
- Structured data is data that adheres to a pre-defined data model and is therefore straightforward to analyze.
- Structured data conforms to a tabular format with a relationship between the different rows and columns.
- Examples: Excel files or SQL databases. Each of these has structured rows and columns that can be sorted.

Semi-structured Data
- Semi-structured data is a form of structured data that does not conform to the formal structure of data models associated with relational databases or other forms of data tables, but nonetheless contains tags or other markers to separate semantic elements and enforce hierarchies of records and fields within the data.
- It is therefore also known as a self-describing structure.
- Examples: JSON and XML are forms of semi-structured data.

Unstructured Data
- Unstructured data is information that either does not have a predefined data model or is not organized in a pre-defined manner.
- Unstructured information is typically text-heavy but may contain data such as dates, numbers, and facts as well.
- This results in irregularities and ambiguities that make it difficult to understand using traditional programs, as compared to data stored in structured databases.
- Examples: audio or video files, or NoSQL databases.

Metadata (Data about Data)
- From a technical point of view, metadata is not a separate data structure, but it is one of the most important elements for big data analysis and big data solutions.
- Metadata is data about data: it provides additional information about a specific set of data.
- In a set of photographs, for example, metadata could describe when and where the photos were taken. The metadata then provides fields for dates and locations which, by themselves, can be considered structured data.
- Metadata is frequently used by big data solutions for initial analysis.
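The small, hypothetical record below illustrates semi-structured data and metadata as described above: the tags (keys) name each field and enforce a hierarchy without a fixed relational schema, while the embedded metadata (the date and location of a photo) is itself structured.

```python
import json

# A hypothetical photo record: semi-structured because the keys describe
# the fields and their nesting, with no relational schema required.
photo = {
    "file": "holiday_001.jpg",
    "metadata": {                     # data about the data
        "taken_on": "2021-08-08",     # date field: structured by itself
        "location": "Addis Ababa",    # location field: structured by itself
    },
    "tags": ["family", "holiday"],
}

# Serialize to JSON, a common interchange format for semi-structured data.
print(json.dumps(photo, indent=2))
```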
Data Value Chain
- The Data Value Chain is introduced to describe the information flow within a big data system as a series of steps needed to generate value and useful insights from data.
- The Big Data Value Chain identifies the following key high-level activities: data acquisition, data analysis, data curation, data storage, and data usage.

Data Acquisition
- The process of gathering, filtering, and cleaning data before it is put into a data warehouse or any other storage solution on which data analysis can be carried out.
- Data acquisition is one of the major big data challenges in terms of infrastructure requirements. The infrastructure required to support the acquisition of big data must deliver low, predictable latency both in capturing data and in executing queries; be able to handle very high transaction volumes, often in a distributed environment; and support flexible and dynamic data structures.

Data Analysis
- Concerned with making the raw data acquired amenable to use in decision-making as well as domain-specific usage.
- Data analysis involves exploring, transforming, and modeling data with the goal of highlighting relevant data, and synthesizing and extracting useful hidden information with high potential from a business point of view.
- Related areas include data mining, business intelligence, and machine learning.

Data Curation
- The active management of data over its life cycle to ensure it meets the necessary data quality requirements for its effective usage.
- Data curation processes can be categorized into different activities such as content creation, selection, classification, transformation, validation, and preservation.
- Data curation is performed by expert curators who are responsible for improving the accessibility and quality of data.
- Data curators (also known as scientific curators or data annotators) hold the responsibility of ensuring that data are trustworthy, discoverable, accessible, reusable, and fit for their purpose.
- A key trend for the curation of big data is the use of community and crowdsourcing approaches.

Data Storage
- The persistence and management of data in a scalable way that satisfies the needs of applications that require fast access to the data.
- Relational Database Management Systems (RDBMS) have been the main, and almost unique, solution to the storage paradigm for nearly 40 years.
- However, the ACID (Atomicity, Consistency, Isolation, and Durability) properties that guarantee database transactions lack flexibility with regard to schema changes, and their performance and fault tolerance suffer when data volumes and complexity grow, making them unsuitable for big data scenarios.
- NoSQL technologies have been designed with the scalability goal in mind and present a wide range of solutions based on alternative data models.

Data Usage
- Covers the data-driven business activities that need access to data, its analysis, and the tools needed to integrate the data analysis within the business activity.
- Data usage in business decision-making can enhance competitiveness through the reduction of costs, increased added value, or any other parameter that can be measured against existing performance criteria.
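To contrast the two storage styles just mentioned, here is a hedged sketch using only Python's standard library: a fixed-schema relational table via sqlite3, and a schema-free JSON document standing in for a NoSQL-style record. It is illustrative only; real big-data stores add distribution, replication, and fault tolerance on top of this idea.

```python
import json
import sqlite3

# Relational (RDBMS) storage: a fixed schema must be declared up front.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (id INTEGER PRIMARY KEY, item TEXT, amount REAL)")
conn.execute("INSERT INTO sales (item, amount) VALUES (?, ?)", ("cereal", 3.50))
print(conn.execute("SELECT item, amount FROM sales").fetchall())

# NoSQL-style document: schema-free, so new fields can appear per record
# without altering any table definition (here simply serialized as JSON).
document = {"item": "cereal", "amount": 3.50, "tags": ["breakfast"], "store": "Bole branch"}
print(json.dumps(document))
```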
Basic Concepts of Big Data
- Big data is a blanket term for the non-traditional strategies and technologies needed to gather, organize, process, and gain insights from large datasets.
- While the problem of working with data that exceeds the computing power or storage of a single computer is not new, the pervasiveness, scale, and value of this type of computing have greatly expanded in recent years.

What is Big Data?
- Big data is the term for a collection of data sets so large and complex that it becomes difficult to process them using on-hand database management tools or traditional data processing applications.
- In this context, a "large dataset" means a dataset too large to reasonably process or store with traditional tooling or on a single computer.
- This means that the common scale of big datasets is constantly shifting and may vary significantly from organization to organization.
- Big data is characterized by 3Vs and more: Volume, Velocity, Variety, and Veracity.

Characteristics of Big Data
- Volume: large amounts of data (zettabytes, massive datasets).
- Velocity: data is live-streaming or in motion.
- Variety: data comes in many different forms from diverse sources.
- Veracity: can we trust the data? How accurate is it?

Clustered Computing and Hadoop Ecosystem

Clustered Computing
- Because of the qualities of big data, individual computers are often inadequate for handling the data at most stages. To better address the high storage and computational needs of big data, computer clusters are a better fit.
- Big data clustering software combines the resources of many smaller machines, seeking to provide a number of benefits:
  - Resource pooling: combining the available storage space to hold data is a clear benefit, but CPU and memory pooling are also extremely important. Processing large datasets requires large amounts of all three of these resources.
  - High availability: clusters can provide varying levels of fault tolerance and availability guarantees to prevent hardware or software failures from affecting access to data and processing. This becomes increasingly important as we continue to emphasize the importance of real-time analytics.
  - Easy scalability: clusters make it easy to scale horizontally by adding additional machines to the group. This means the system can react to changes in resource requirements without expanding the physical resources on a machine.
- Using clusters requires a solution for managing cluster membership, coordinating resource sharing, and scheduling actual work on individual nodes.
- Cluster membership and resource allocation can be handled by software like Hadoop's YARN (which stands for Yet Another Resource Negotiator).
- The assembled computing cluster often acts as a foundation that other software interfaces with to process the data. The machines involved in the computing cluster are also typically involved with the management of a distributed storage system, which we will talk about when we discuss data persistence.

Hadoop and its Ecosystem
- Hadoop is an open-source framework intended to make interaction with big data easier. It is a framework that allows for the distributed processing of large datasets across clusters of computers using simple programming models.
- It is inspired by a technical document published by Google.
- The four key characteristics of Hadoop are:
  - Economical: its systems are highly economical, as ordinary computers can be used for data processing.
  - Reliable: it stores copies of the data on different machines and is resistant to hardware failure.
  - Scalable: it is easily scalable, both horizontally and vertically; a few extra nodes help in scaling up the framework.
  - Flexible: you can store as much structured and unstructured data as you need and decide to use it later.
- Hadoop has an ecosystem that has evolved from its four core components: data management, access, processing, and storage. It is continuously growing to meet the needs of big data. It comprises the following components, among many others:
  - HDFS: Hadoop Distributed File System
  - YARN: Yet Another Resource Negotiator
  - MapReduce: programming-based data processing
  - Spark: in-memory data processing
  - Pig, Hive: query-based processing of data services
  - HBase: NoSQL database
  - Mahout, Spark MLlib: machine learning algorithm libraries
  - Solr, Lucene: searching and indexing
  - Zookeeper: managing the cluster
  - Oozie: job scheduling

Big Data Life Cycle with Hadoop
- Ingesting data into the system: the first stage of big data processing is Ingest. The data is ingested or transferred to Hadoop from various sources such as relational databases, systems, or local files. Sqoop transfers data from an RDBMS to HDFS, whereas Flume transfers event data.
- Processing the data in storage: the second stage is Processing. In this stage, the data is stored and processed. The data is stored in the distributed file system HDFS and in the NoSQL distributed database HBase. Spark and MapReduce perform the data processing.
- Computing and analyzing data: the third stage is Analyze. Here, the data is analyzed by processing frameworks such as Pig, Hive, and Impala. Pig converts the data using map and reduce and then analyzes it; Hive is also based on map and reduce programming and is most suitable for structured data.
- Visualizing the results: the fourth stage is Access, which is performed by tools such as Hue and Cloudera Search. In this stage, the analyzed data can be accessed by users.
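MapReduce itself runs on a Hadoop cluster, but the programming model behind it can be sketched in a few lines of plain Python. The word-count example below is a local, single-machine imitation of the map, shuffle, and reduce phases, not Hadoop's actual API.

```python
from collections import defaultdict

lines = ["big data needs clusters", "clusters process big data"]

# Map phase: emit (key, value) pairs, here (word, 1) for every word.
mapped = [(word, 1) for line in lines for word in line.split()]

# Shuffle phase: group all values belonging to the same key.
groups = defaultdict(list)
for word, count in mapped:
    groups[word].append(count)

# Reduce phase: combine each key's values into a single result.
word_counts = {word: sum(counts) for word, counts in groups.items()}
print(word_counts)   # {'big': 2, 'data': 2, 'needs': 1, 'clusters': 2, 'process': 1}
```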
CHAPTER THREE: Artificial Intelligence (Andualem T.)

Contents
- Artificial Intelligence
- History of AI
- Levels of AI
- Types of AI
- How humans think
- Applications of AI

Artificial Intelligence (AI)
- Artificial Intelligence is composed of two words: Artificial and Intelligence. Artificial means "man-made," and intelligence means "thinking power," or "the ability to learn and solve problems." Artificial Intelligence therefore means "a man-made thinking power."
- Artificial Intelligence exists when a machine can have human-based skills such as learning, reasoning, and solving problems.
- Intelligence is the ability to acquire and apply knowledge. Knowledge is the information acquired through experience. Experience is the knowledge gained through exposure (training).
- Therefore, artificial intelligence is the "copy of something natural (i.e., human beings) that is capable of acquiring and applying the information it has gained through exposure."
- Artificial Intelligence (AI) is the branch of computer science by which we can create intelligent machines that can behave like humans, think like humans, and make decisions.
- In Artificial Intelligence you do not need to preprogram a machine to do some work; instead, you can create a machine with programmed algorithms that can work with its own intelligence.
- Intelligence is composed of: reasoning, learning, problem solving, perception, and linguistic intelligence.
- An Artificial Intelligence system is composed of an agent and its environment.
  - An agent (e.g., a human or robot) is anything that can perceive its environment through sensors and act upon that environment through effectors.
  - Intelligent agents must be able to set goals and achieve them.
  - Machine perception is the ability to use input from sensors (such as cameras, microphones, and other sensors) to deduce aspects of the world, e.g., computer vision.
- AI deals with developing computing systems that are capable of performing tasks that humans are very good at, for example recognizing objects, recognizing and making sense of speech, and decision making in a constrained environment.
- High-profile examples of AI include: autonomous vehicles (such as drones and self-driving cars), medical diagnosis, creating art (such as poetry), proving mathematical theorems, playing games (such as chess or Go), search engines (such as Google Search), online assistants (such as Siri), image recognition in photographs, spam filtering, prediction of judicial decisions, and online advertisements.
- Machine learning is an advanced form of AI where the machine can learn as it goes rather than having every action programmed by humans.
- Machine learning, a fundamental concept of AI research since the field's inception, is the study of computer algorithms that improve automatically through experience.
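The definition above, algorithms that improve automatically through experience, can be shown with a deliberately tiny sketch (not from the slides): a model y ≈ w·x whose single weight is nudged after every example, so its error shrinks as it sees more data.

```python
# Training data for a made-up relation: y is roughly 2 * x.
examples = [(1, 2.1), (2, 3.9), (3, 6.2), (4, 8.0)]

w = 0.0                 # the model's only parameter, initially ignorant
learning_rate = 0.05

for epoch in range(20):
    total_error = 0.0
    for x, y in examples:
        prediction = w * x
        error = y - prediction
        w += learning_rate * error * x   # adjust the weight from this experience
        total_error += error ** 2
    if epoch % 5 == 0:
        # the squared error falls as the model accumulates experience
        print(f"epoch {epoch:2d}  w={w:.2f}  squared error={total_error:.2f}")
```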
- The term machine learning was introduced by Arthur Samuel in 1959.
- Neural networks are biologically inspired networks that extract features from the data in a hierarchical fashion. The field of neural networks with several hidden layers is called deep learning.

Need for Artificial Intelligence
Why do we need AI at this time?
1. To create expert systems that exhibit intelligent behavior with the capability to learn, demonstrate, explain, and advise their users.
2. To help machines find solutions to complex problems like humans do, and apply them as algorithms in a computer-friendly manner.

Goals of Artificial Intelligence
The main goals of Artificial Intelligence are:
1. Replicate human intelligence.
2. Solve knowledge-intensive tasks.
3. Make an intelligent connection of perception and action.
4. Build a machine which can perform tasks that require human intelligence, such as: proving a theorem, playing chess, planning a surgical operation, or driving a car in traffic.
5. Create systems which can exhibit intelligent behavior, learn new things by themselves, demonstrate, explain, and advise their users.

What Comprises Artificial Intelligence?
- To create AI, we first need to know how intelligence is composed. Intelligence is an intangible part of our brain which is a combination of reasoning, learning, problem solving, perception, language understanding, etc.
- To achieve the above factors in a machine or software, Artificial Intelligence requires several contributing disciplines.

Advantages of Artificial Intelligence
- High accuracy with fewer errors: AI machines or systems make fewer errors and achieve high accuracy, as they take decisions based on prior experience or information.
- High speed: AI systems can be very fast at decision-making; because of this, AI systems can beat a chess champion at chess.
- High reliability: AI machines are highly reliable and can perform the same action multiple times with high accuracy.
- Useful for risky areas: AI machines can be helpful in situations such as defusing a bomb or exploring the ocean floor, where employing a human can be risky.
- Digital assistants: AI can be very useful in providing digital assistance to users; for example, AI technology is currently used by various e-commerce websites to show products matching customer requirements.
- Useful as a public utility: AI can be very useful for public utilities, such as self-driving cars that can make our journeys safer and hassle-free, facial recognition for security purposes, and natural language processing (for search engines, spell checkers, assistants like Siri, and translation like Google Translate).

Disadvantages of Artificial Intelligence
- High cost: the hardware and software requirements of AI are very costly, and it requires a lot of maintenance to meet current world requirements.
- Can't think outside the box: even though we are making smarter machines with AI, they still cannot work outside the box; a robot will only do the work for which it is trained or programmed.
- No feelings and emotions: AI machines can be outstanding performers, but they do not have feelings, so they cannot form any kind of emotional attachment with humans, and may sometimes be harmful to users if proper care is not taken.
- Increased dependence on machines: with the increase in technology, people are becoming more dependent on devices and hence losing some of their mental capabilities.
- No original creativity: humans are creative and can imagine new ideas, but AI machines cannot match this power of human intelligence and cannot be creative and imaginative.
History of AI [Reading Assignment]

A. Maturation of Artificial Intelligence (1943-1952)
- 1943: The first work now recognized as AI was done by Warren McCulloch and Walter Pitts, who proposed a model of artificial neurons.
- 1949: Donald Hebb demonstrated an updating rule for modifying the connection strength between neurons. His rule is now called Hebbian learning.
- 1950: Alan Turing, an English mathematician, pioneered machine learning. Turing published "Computing Machinery and Intelligence," in which he proposed a test that can check a machine's ability to exhibit intelligent behavior equivalent to human intelligence, called the Turing test.

B. The birth of Artificial Intelligence (1952-1956)
- 1955: Allen Newell and Herbert A. Simon created the "first artificial intelligence program," named "Logic Theorist". This program proved 38 of 52 mathematics theorems and found new and more elegant proofs for some theorems.
- 1956: The term "Artificial Intelligence" was first adopted by the American computer scientist John McCarthy at the Dartmouth Conference; for the first time, AI was coined as an academic field. At that time, high-level computer languages such as FORTRAN, LISP, and COBOL were invented, and enthusiasm for AI was very high.

C. The golden years, early enthusiasm (1956-1974)
- 1966: Researchers emphasized developing algorithms that can solve mathematical problems. Joseph Weizenbaum created the first chatbot, named ELIZA.
- 1972: The first intelligent humanoid robot, named WABOT-1, was built in Japan.

D. The first AI winter (1974-1980)
- The period between 1974 and 1980 was the first AI winter. "AI winter" refers to a period in which computer scientists dealt with a severe shortage of government funding for AI research.
- During AI winters, public interest in artificial intelligence decreased.

E. A boom of AI (1980-1987)
- 1980: After the AI winter, AI came back with "expert systems", programs that emulate the decision-making ability of a human expert.
- In 1980, the first national conference of the American Association of Artificial Intelligence was held at Stanford University.
F. The second AI winter (1987-1993)
- The period between 1987 and 1993 was the second AI winter. Again, investors and the government stopped funding AI research due to high costs and inefficient results. The expert system XCON was very cost-effective.

G. The emergence of intelligent agents (1993-2011)
- 1997: IBM's Deep Blue beat world chess champion Garry Kasparov and became the first computer to beat a world chess champion.
- 2002: For the first time, AI entered the home, in the form of Roomba, a vacuum cleaner.
- 2006: AI came into the business world; companies like Facebook, Twitter, and Netflix also started using AI.

H. Deep learning, big data, and artificial general intelligence (2011-present)
- 2011: IBM's Watson won Jeopardy!, a quiz show in which it had to solve complex questions as well as riddles. Watson proved that it could understand natural language and solve tricky questions quickly.
- 2012: Google launched an Android app feature, "Google Now", which was able to provide information to the user as a prediction.
- 2014: The chatbot "Eugene Goostman" won a competition in the infamous "Turing test".
- 2018: IBM's "Project Debater" debated complex topics with two master debaters and performed extremely well. Google demonstrated an AI program, "Duplex", a virtual assistant that took a hairdresser appointment over the phone, and the lady on the other side did not notice that she was talking to a machine.

Levels of AI
Stage 1 - Rule-Based Systems
- Use rules as the knowledge representation.
- A rule-based system applies human-made rules to store, sort, and manipulate data; in doing so, it mimics human intelligence.
- It is a logical program that uses pre-defined rules to make deductions and choices to perform automated actions (a minimal sketch follows this list).

Stage 2 - Context Awareness and Retention
- Algorithms that develop information about the specific domain they are being applied in. They are trained on the knowledge and experience of the best humans, and their knowledge base can be updated as new situations and queries arise.
- E.g. chatbots and "robo-advisors".

Stage 3 - Domain-Specific Expertise
- Expertise and domain-specific knowledge. These systems build up expertise in a specific context, taking in massive volumes of information which they can use for decision making.
- E.g. AlphaGo.

Stage 4 - Reasoning Machines
- These algorithms have some ability to attribute mental states to themselves and others; they have a sense of beliefs, intentions, knowledge, and how their own logic works.
- This means they could reason or negotiate with humans and other machines.
- At the moment these algorithms are still in development; however, commercial applications are expected within the next few years.
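As a concrete (and invented) illustration of the Stage 1 rule-based systems above, the sketch below applies a handful of human-made if-then rules to label an incoming support message; the rules, labels, and messages are all hypothetical.

```python
# Hypothetical human-made rules: (condition, resulting label), applied in order.
rules = [
    (lambda msg: "refund" in msg,   "billing"),
    (lambda msg: "password" in msg, "account"),
    (lambda msg: "error" in msg,    "technical"),
]

def classify(message: str) -> str:
    """Apply the pre-defined rules in order; fall back to a default label."""
    text = message.lower()
    for condition, label in rules:
        if condition(text):
            return label
    return "general"

print(classify("I keep getting an error when I log in"))   # -> technical
print(classify("Please process my refund"))                # -> billing
```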
Stage 5 - Self-Aware Systems / Artificial General Intelligence (AGI)
- These systems have human-like intelligence.
- AGI is the intelligence of a machine that has the capacity to understand or learn any intellectual task that a human being can.

Stage 6 - Artificial Superintelligence (ASI)
- AI algorithms that can outsmart even the most intelligent humans in every domain.
- Logically, it is difficult for humans to articulate what the capabilities might be, yet we would hope examples would include solving problems we have so far failed to solve, such as world hunger and dangerous environmental change.
- A few experts claim it can be realized by 2029.
- Fiction has tackled this idea for a long time, for example in the films Ex Machina and Terminator.

Stage 7 - Singularity and Transcendence
- This leads to a massive expansion in human capability. Human augmentation could connect our brains to each other and to a future successor of the current internet, creating a "hive mind" that shares ideas, solves problems collectively, and even gives others access to our dreams as observers or participants.
- Pushing this idea further, we might go beyond the limits of the human body and connect to other forms of intelligence on the planet: animals, plants, weather systems, and the natural environment.
- Some proponents of singularity, such as Ray Kurzweil, Google's Director of Engineering, suggest we could see it happen by 2045 as a result of exponential rates of progress across a range of science and technology disciplines. The other side of the fence argues that singularity is impossible and that human consciousness could never be digitized.

Types of AI
Artificial Intelligence can be divided into various types: based on capabilities and based on functionality.

Based on Capabilities
1. Weak AI or Narrow AI
- A type of AI which is able to perform a dedicated task with intelligence. It cannot perform beyond its field or limitations, as it is only trained for one specific task.
- Also called weak AI; it is the most common and currently available AI in the world.
- It can fail in unpredictable ways if it goes beyond its limits.
- E.g. Apple Siri, IBM's Watson supercomputer, Google Translate, playing chess, purchasing suggestions on e-commerce sites, self-driving cars, speech recognition, and image recognition.

2. General AI
- A type of intelligence that could perform any intellectual task with efficiency like a human.
- The idea behind general AI is to make a system that could be smarter and think like a human on its own.
- Currently, no system exists which could come under general AI and perform any task as well as a human.
- Systems with general AI are still under research, and it will take a lot of effort and time to develop such systems.

3. Super AI
- A level of intelligence at which machines could surpass human intelligence and perform any task better than a human, with cognitive properties. This refers to aspects like general wisdom, problem solving, and creativity.
- It is an outcome of general AI.
- Some key characteristics of super AI include the ability to think, to reason, to solve puzzles, to make judgments, to plan, to learn, and to communicate on its own.
- Super AI is still a hypothetical concept of Artificial Intelligence; the development of such systems in reality is still a world-changing task.
Based on Functionality
1. Reactive Machines
- The most basic type of Artificial Intelligence.
- They do not store memories or past experiences for future actions.
- They only focus on current scenarios and react to them with the best possible action.
- E.g. IBM's Deep Blue system and Google's AlphaGo.

2. Limited Memory
- These machines can store past experiences or some data for a short period of time, and can use the stored data for a limited time period only.
- As the name suggests, they have limited, or short-lived, memory (see the sketch after this list).
- E.g. self-driving cars: they can store the recent speed of nearby cars, the distance of other cars, speed limits, and other information to navigate the road.

3. Theory of Mind
- Would understand human emotions, people, and beliefs, and be able to interact socially like humans.
- Still not developed.

4. Self-Awareness
- Self-aware AI is the future of Artificial Intelligence: these machines will be super-intelligent and will have their own consciousness, sentiments, and self-awareness.
- These machines will be smarter than the human mind.
- It does not yet exist in reality; it is a hypothetical concept.
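A minimal way to picture the "Limited Memory" type above (a made-up example, not from the slides) is a fixed-size buffer: the agent keeps only the most recent sensor readings and bases its next action on those alone.

```python
from collections import deque

# Keep only the last 3 speed readings of a nearby car (older ones are discarded).
recent_speeds = deque(maxlen=3)

def observe_and_decide(speed_kmh: float) -> str:
    recent_speeds.append(speed_kmh)
    average = sum(recent_speeds) / len(recent_speeds)
    # React using the short-lived memory only: slow down if traffic is speeding up.
    return "slow down" if average > 60 else "maintain speed"

for reading in [50, 55, 70, 80]:
    print(reading, "->", observe_and_decide(reading))
```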
How Humans Think
Intelligence, or the cognitive process, is composed of three main stages:
1. Observe and input the information or data into the brain.
2. Interpret and evaluate the input received from the surrounding environment.
3. Make decisions as a reaction to what was received as input, interpreted, and evaluated.

Mapping Human Thinking to AI Components
- Because AI is the science of simulating human thinking, it is possible to map the human thinking stages to the layers or components of AI systems.
- In the first stage, humans acquire information from their surrounding environments through the human senses, such as sight, hearing, smell, taste, and touch, via human organs such as the eyes, ears, and other sensing organs, for example the hands.
- In AI models, this stage is represented by the sensing layer, which perceives information from the surrounding environment, e.g. sensors such as voice recognition and visual imaging recognition.

Influencers of Artificial Intelligence
- Big data: structured data versus unstructured data.
- Advancements in computer processing speed and new chip architectures.
- Cloud computing and Application Programming Interfaces (APIs). Cloud computing is a general term that describes the delivery of on-demand services, usually through the internet, on a pay-per-use basis.
- The emergence of data science. The goal of data science is to extract knowledge or insights from data in various forms, either structured or unstructured, which is similar to data mining.

Applications of Artificial Intelligence
- AI in agriculture: very helpful for farmers, e.g. agricultural robotics, soil and crop monitoring.
- AI in healthcare: to make better and faster diagnoses than humans.
- AI in education: as a teaching assistant, e.g. chatbots.
- AI in finance and e-commerce.
- AI in gaming: e.g. chess.
- AI in social media: to organize and manage massive amounts of data.
- AI in data security: used to make your data safer and more secure.
- AI in travel and transport: using chatbots which can make human-like interactions with customers.
- AI in robotics.
- AI in entertainment.
- AI in the automotive industry.

AI Tools and Platforms
- AI has developed a large number of tools to solve the most difficult problems in computer science, such as: search and optimization; logic; probabilistic methods for uncertain reasoning; classifiers and statistical learning methods; neural networks; control theory; and languages.
- The most common artificial intelligence platforms include Microsoft Azure Machine Learning, Google Cloud Prediction API, IBM Watson, TensorFlow, Infosys Nia, Wipro HOLMES, API.AI, Premonition, Rainbird, Ayasdi, MindMeld, and Meya.

Sample AI Applications
- Commuting: Google's AI-powered predictions; ridesharing apps like Uber.
- Online shopping: search (Amazon); recommendations for products.
- E-mail: spam filters; smart e-mail categorization.
- Mobile use: voice-to-text; smart personal assistants (Siri).
- AI in education: as a teaching assistant, e.g. chatbots.
- Social networking: Facebook, Instagram.

CHAPTER FOUR: Internet of Things (Andualem T.)

Contents
- Definition of the term IoT
- History of IoT
- Advantages and disadvantages of IoT
- How IoT works
- Architecture of IoT
- Application areas of IoT

Overview of Internet of Things
The most important features of the Internet of Things (IoT) include:
- Artificial intelligence: IoT makes anything virtually "smart", meaning it enhances every aspect of life with the power of data collection, artificial intelligence algorithms, and networks.
- Connectivity: new enabling technologies for networking, and especially IoT networking, mean networks are no longer exclusively tied to major providers; IoT creates these small networks between its system devices.
- Sensors: IoT loses its distinction without sensors. They act as defining instruments that transform it from a standard passive network of devices into an active system capable of real-world integration.
- Active engagement: IoT introduces a new paradigm for active content, product, or service engagement, rather than passive engagement.
- Small devices: devices have become smaller, cheaper, and more powerful over time, so IoT exploits purpose-built small devices to deliver its precision, scalability, and versatility.

Definitions of IoT
Several groups have defined IoT in different ways.
1. According to the Internet Architecture Board's definition, IoT is the networking of smart objects, meaning a huge number of devices intelligently communicating in the presence of the internet protocol that cannot be directly operated by human beings but exist as components in buildings, vehicles, or the environment.
2. According to the Internet Engineering Task Force (IETF), IoT is the networking of smart objects in which smart objects have some constraints, such as limited bandwidth, power, and processing accessibility, for achieving interoperability among smart objects.
3. According to the IEEE Communications magazine's definition, IoT is a framework of all things that have a representation in the presence of the internet, in such a way that new applications and services enable interaction between the physical and virtual worlds in the form of machine-to-machine (M2M) communication in the cloud.
4. According to the Oxford definition, IoT is the interaction of everyday objects' computing devices through the internet that enables the sending and receiving of useful data.
- According to the 2020 conceptual framework, the term Internet of Things is expressed through a simple formula: IoT = Services + Data + Networks + Sensors (a small illustrative sketch follows at the end of this section).
- Generally, IoT is the network of physical objects or "things" embedded with electronics, software, and network connectivity, which enables them to collect and exchange data.
- It is also a system of interrelated computing devices, mechanical and digital machines, objects, animals, or people that are provided with unique identifiers and the ability to transfer data over a network without requiring human-to-human or human-to-computer interaction.
- IoT is a network of devices that can sense, accumulate, and transfer data over the internet without any human intervention.
- Simply stated, the Internet of Things consists of any device with an on/off switch connected to the Internet, ranging from cellphones to building maintenance systems to the jet engine of an airplane. Medical devices, such as a heart monitor implant or a biochip transponder in a farm animal, can transfer data over a network and are members of the IoT.
- The IoT consists of a gigantic network of internet-connected "things" and devices. E.g. Ring, a doorbell that links to your smartphone, is an excellent example of a recent addition to the Internet of Things: Ring signals you when the doorbell is pressed and lets you see who it is and speak with them.

Areas where IoT is applicable
- Connected industry
- Smart cities
- Smart homes
- Smart energy
- Connected cars
- Smart agriculture
- Connected buildings and campuses
- Health care
- Logistics and other domains

IoT systems allow users to achieve deeper automation, analysis, and integration within a system, and they improve the reach and accuracy of these areas. IoT utilizes existing and emerging technology for sensing, networking, and robotics, and exploits recent advances in software, falling hardware prices, and modern attitudes towards technology. Its new and advanced elements bring major changes in the delivery of products, goods, and services, and in the social, economic, and political impact of those changes.
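The IoT = Services + Data + Networks + Sensors formula above can be pictured with a small simulated sketch; the sensor reading, the device name, and the "send" step below are all hypothetical stand-ins, since a real deployment would use actual hardware and a protocol such as MQTT or HTTP.

```python
import json
import random
import time

def read_temperature() -> float:
    """Sensor: a simulated reading standing in for real hardware."""
    return round(20 + random.random() * 5, 1)

def send_over_network(payload: str) -> None:
    """Network: stand-in for publishing to a broker or web service."""
    print("sending ->", payload)

# Data: package the reading with an identifier and a timestamp.
reading = {
    "device": "greenhouse-sensor-1",      # hypothetical device name
    "temperature_c": read_temperature(),
    "timestamp": int(time.time()),
}

# Service: the collected data can now feed a monitoring or alerting service.
send_over_network(json.dumps(reading))
```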
History of IoT
- The Internet of Things has not been around for very long, but since the early 1800s there have been visions of machines communicating with one another.
- In the 1830s and 1840s, machines began providing direct communications when the telegraph (the first landline) was developed.
- On June 3, 1900, the first radio voice transmission, described as "wireless telegraphy," took place, providing another necessary component for developing the Internet of Things.
- In the 1950s, the development of computers began.
- In 1962, the Internet, itself a significant component of the IoT, started out as part of DARPA (the Defense Advanced Research Projects Agency); in 1969 it evolved into ARPANET.
- The traditional fields of automation (including the automation of buildings and homes), wireless sensor networks, GPS, control systems, and others all support the IoT.
- Kevin Ashton, the Executive Director of Auto-ID Labs at MIT, was the first to describe the Internet of Things, during a 1999 speech. Ashton stated that Radio Frequency Identification (RFID) was a prerequisite for the Internet of Things; he concluded that if all devices were "tagged," computers could manage, track, and inventory them.
- To some extent, the tagging of things has been achieved through technologies such as digital watermarking, barcodes, and QR codes.

Advantages of IoT
Advantages of IoT span every area of lifestyle and business. Here is a list of some of the advantages that IoT has to offer:
- Improved customer engagement: current analytics suffer from blind spots and significant flaws in accuracy, and, as noted, engagement remains passive. IoT completely transforms this to achieve richer and more effective engagement with audiences.
- Reduced waste: IoT makes areas of improvement clear. Current analytics give us superficial insight, but IoT provides real-world information leading to more effective management of resources.
- Enhanced data collection: modern data collection suffers from its limitations and its design for passive use. IoT breaks it out of those spaces and places it exactly where humans really want to go to analyze.
- Technology optimization
