Artificial Intelligence Student Handbook Class XI PDF

Summary

This student handbook provides an introduction to Artificial Intelligence (AI) for Class XI students. It covers foundational concepts, outlines the curriculum's key topics, and includes information on the evolution of AI.

Full Transcript


ARTIFICIAL INTELLIGENCE Class XI STUDENT HANDBOOK Subject Code:843 ARTIFICIAL INTELLIGENCE CURRICULUM Student Handbook for Class XI Acknowledgments Patrons Mr. Rahul Singh, IAS, Chairperson, Central Board of Secondary Education Strategic Guidance Dr. Biswajit Saha, Director (Skill Education), Central Board of Secondary Education Sh. Ravinder Pal Singh, Joint Secretary, Department of Skill Education, Central Board of Secondary Education Strategic Advisory Ms. Shipra Sharma, CSR Leader, India/South Asia, IBM Ms. Joyeeta Das, Lead & Strategist, Global Education and Workforce Development, IBM Dr. Mani Madhukar, Program Lead - SkillsBuild, IBM Curriculum Planning Team Mr. Manav Subodh, Founder & Chief Mentor, 1M1B Mr. Saffin Mathew, Programs Director, 1M1B Lead Curriculum Curator Ms. Mehreen M Shamim, AI Curriculum Manager, 1M1B AI Teacher Advisory for Curriculum Revision Mr. Akhil R, TGT, DAV Public School Pushpanjali Enclave, Delhi Ms. Anni Kumar, PGT, Vikas Bharati Public School, Delhi Ms. Harmeet Kaur, PGT, Vasant Valley School, New Delhi Ms. Jyoti P B, PGT, Jyothis Central School, Thiruvananthapuram, Kerala Ms. Meenu Kumar, PGT, Venkateshwar International School, Delhi Mr. Naveen Gupta, PGT, St. Mark's Sr Sec Public School, Meera Bagh, Delhi Ms. Neeru Mittal, PGT, Shaheed Rajpal Dav Public School, Delhi Ms. Rani Kumari, PGT, Dlf Public School, Ghaziabad Uttar Pradesh Ms. Shelly Batra, TGT, Mount Carmel School, Dwarka, Delhi Ms. Smitha R Athreya, PGT, Delhi Public School Rourkela, Odisha Ms. Soumya Iyer, PGT, Sanskriti School, Pune, Maharashtra Ms. Swati Sharma, TGT, Heritage Xperiential Learning School, Gurugram, Haryana Mr. Tushar Upadhyay, TGT, Navrachana Higher Secondary School, Vadodara Gujarat Ms. Varsha Vijay K, TGT, Delhi Public School Bangalore North, Bangalore, Karnataka Ms. Vineeta Garg, PGT, Shaheed Rajpal Dav Public School, Delhi Foreword The world around us is undergoing a dramatic transformation, driven by the relentless advancement of Artificial Intelligence (AI). From self-driving cars navigating city streets to virtual assistants understanding complex inquiries, AI is rapidly reshaping industries, societies, and the very way we interact with technology. This revised textbook, designed for students in Classes XI and XII, dives into the captivating world of AI, offering a comprehensive exploration of its core concepts, applications, and potential impact. As you embark on this journey, you will not only delve into the fascinating algorithms that power AI systems, but also examine its ethical considerations and its profound implications for the future. This is no longer science fiction. AI is here, and it holds immense potential to improve our lives in countless ways. This textbook equips you, the future generation, with the knowledge and critical thinking skills necessary to navigate this rapidly evolving landscape. Through engaging exercises and thought-provoking questions, you will be challenged to not only understand AI but also to consider its role in your own future. The Central Board of Secondary Education (CBSE) recognizes the transformative power of Artificial Intelligence (AI) and its impact on the future. Building upon this successful introduction, CBSE extended the AI subject to Class XI, starting in the 2020-2021 academic session. Thus, allowing students to delve deeper into the world of AI and develop a more comprehensive understanding. 
CBSE acknowledges and appreciates the valuable contribution of IBM India in developing the AI curriculum and conducting trainer programs. This collaborative effort ensures educators are well-equipped to deliver the AI curriculum effectively. By working together, CBSE and its partners aim to empower students to embrace the future. By incorporating AI into their learning experience, students gain the knowledge and skills necessary to not only understand AI but also leverage its potential to enhance their learning and future prospects. The future is full of possibilities, and AI is poised to play a pivotal role. Are you ready to be a part of it? Embrace the challenge. Explore the potential. Shape the future with Artificial Intelligence. INDEX TOPICS PAGE NO. Unit 1: Introduction- AI for Everyone 1 Unit 2: Unlocking your Future in AI 18 Unit 3: Python Programming 28 Unit 4: Introduction to Capstone Project 54 Unit 5: Data Literacy -Data Collection to Data Analysis 70 Unit 6: Machine Learning Algorithms 98 Unit 7: Leveraging Linguistics and Computer Science 122 Unit 8: AI Ethics and Values 135 UNIT 1: Introduction: Artificial Intelligence for Everyone Title: Introduction: AI for Everyone Approach: Example-based learning, Hands-on activities, Discussion Summary: This unit covers various aspects of Artificial Intelligence (AI), including its definition, evolution, types, domains, terminologies, and applications. It explains the fundamental concepts of AI, such as supervised learning, cognitive computing, natural language processing (NLP), computer vision etc. Additionally, it delves into machine learning (ML) and deep learning (DL) and discusses their differences, types, and applications. The content also outlines the benefits and limitations of AI, addressing concerns such as job displacement, ethical considerations, explainability, and data privacy. Learning Objectives: 1. Understand the basic concepts and principles of Artificial Intelligence. 2. Explore the evolution of AI and identify the different types of AI. 3. Learn about the domains of AI, such as data science, natural language processing, and computer vision. 4. Gain knowledge of cognitive computing and its role in enhancing human decision-making. 5. Understand the terminologies associated with AI, including machine learning, deep learning, and reinforcement learning. Key Concepts: 1. What is Artificial Intelligence? 2. Evolution of AI 3. Types of AI 4. Domains of AI 5. AI Terminologies 6. Benefits and limitations of AI Learning Outcomes: Students will be able to - 1. Communicate effectively about AI concepts and applications in written and oral formats. 2. Describe the historical development of AI. 3. Differentiate between various types and domains of AI, including their applications. 4. Recognize the key terminologies and concepts related to machine learning and deep learning. 5. Formulate informed opinions on the potential benefits and limitations of AI in various contexts. Pre-requisites: Reasonable fluency in English language and basic computer skills 1 1. What is Artificial Intelligence (AI)? Artificial Intelligence (AI), has evolved drastically over the years, touching various aspects of our lives. It is a technology that has not only fascinated us but also significantly impacted how we live, work, and interact with the world around us. Within the vast landscape of AI, there exist several distinct Domains of Artificial Intelligence, each with its unique characteristics and applications. 
According to Statista, the global AI market, with a value of billion 113.60 GBP in 2023, is on a continuous growth trajectory, primarily fueled by substantial investments. Artificial intelligence (AI) refers to the ability of a machine to learn patterns and make predictions. In its simplest form, Artificial Intelligence is a field that combines computer science and robust datasets to enable problem-solving. AI does not replace human decisions; instead, AI adds value to human judgment. Think of AI as a smart helper that can understand things, learn from examples, and do tasks on its own without needing to be told exactly what to do each time. For example, AI can: Understand Language: AI can understand and respond to what you say, like virtual assistants such as Siri or Alexa. Recognize Images: AI can look at pictures and recognize what is in them, like identifying animals in photos. Make Predictions: AI can analyze data to make predictions, like predicting the weather or suggesting what movie you might like to watch next. Play Games: AI can play games and learn to get better at them, like playing chess or video games. Drive Cars: AI can help cars drive themselves by sensing the road and making decisions to stay safe. What is not AI? When we talk about machines, not all of them are considered Artificial Intelligence (AI). Here are some examples: Traditional Rule-Based Systems: These machines follow set rules without learning from data. Simple Automation Tools: Basic tools like timers or calculators do specific tasks but do not think or learn. Mechanical Devices: Machines like pulleys or gears work based on physics but do not learn or think. 2 Fixed-Function Hardware: Devices like microwave ovens perform tasks without learning or thinking. Non-Interactive Systems: Machines that do not change based on new information, like a basic electric fan. Basic Sensors: Sensors collect data but do not analyze or understand it. Artificial Intelligence machines are different. They learn from data and can make decisions on their own. For example, a smart washing machine can adjust its settings based on what it is washing. AI goes beyond just following rules; it can learn, adapt, and make decisions based on data and context. 2. Evolution of AI The history of AI can be traced back to ancient times, with philosophical discussions about the nature of intelligence and the possibility of creating artificial beings. However, the modern era of AI began in the mid-20th century with significant developments and milestones: Source:https://www.researchgate.net/figure/Timeline-diagram-showing-the-history-of-artificial-intelligence_fig1_364826401 Time Period Key Events and Developments 1950 was a landmark year for the question of machine intelligence because of Alan Turing's famous paper "Computing Machinery and Intelligence." In this 1950 paper, Turing proposed a thought experiment called the "imitation game" (later known as the Turing test). The Dartmouth Conference was organized by McCarthy that marked the birthplace of AI as a field. The term "Artificial Intelligence" was coined by John 1956 McCarthy. McCarthy, along with Turing, Minsky, and Simon, laid the foundation for AI. Significant progress in AI research that led to the development of expert systems, 1960-1970 early neural networks, exploration of symbolic reasoning, and problem-solving techniques. 
3 Time Period Key Events and Developments Mixed optimism and skepticism about AI with breakthroughs in machine learning, 1980-1990 and neural networks led to "AI winter". Resurgence of interest and progress in AI with advancements in computing power, data availability, and algorithmic innovation. Also, there were 21st Century breakthroughs in machine learning, deep learning, and reinforcement learning. That led to transformative applications of AI in healthcare, finance, transportation, and entertainment. 3. Types of AI Computer scientists have identified three levels of AI based on predicted growth in its ability to analyze data and make predictions. 1. Narrow AI: Focuses on single tasks like predicting purchases or planning schedules. Rapidly growing in consumer applications, such as voice-based shopping and virtual assistants like Siri. Capable of handling specific tasks effectively, but lacks broader understanding. 2. Broad AI: Acts as a midpoint between Narrow and General AI. More versatile than Narrow AI, capable of handling a wider range of related tasks. Often used in businesses to integrate AI into specific processes, requiring domain-specific knowledge and data. 3. General AI: Refers to machines that can perform any intellectual task a human can. Currently, AI lacks abstract thinking, strategizing, and creativity like humans. Artificial Superintelligence (ASI) may emerge, potentially leading to self-aware machines, but this is far from current capabilities. 4. Domains of AI Artificial Intelligence (AI) encompasses various fields, each focusing on different aspects of replicating human intelligence and performing tasks traditionally requiring human intellect. These fields are classified based on the type of data input they handle: 4 a) Data Science: Data Science deals with numerical, alphabetical, and alphanumeric data inputs. It involves the collection, analysis, and interpretation of large volumes of data to extract insights and patterns using statistical methods, machine learning algorithms, and data visualization techniques. b) Natural Language Processing (NLP): NLP focuses on processing text and speech inputs to enable computers to understand, interpret, and generate human language. It involves tasks such as language translation, sentiment analysis, text summarization, and speech recognition, facilitating communication between humans and machines through natural language interfaces. c) Computer Vision: Computer Vision deals with visual data inputs, primarily images and videos. It enables computers to interpret and understand visual information, and perform tasks such as object detection, image classification, facial recognition, and scene understanding, enabling applications such as autonomous vehicles, medical imaging, and augmented reality. Activity: Divide the students into groups and provide them with a list of real-world applications without specifying which domain each application belongs to. Ask each group to categorize the applications into the three domains: Data Science, Natural Language Processing (NLP), and Computer Vision. 1. Gesture recognition for human-computer interaction 2. Chatbots for customer service 3. Spam email detection 4. Autonomous drones for surveillance 5. Google Translate 6. Fraud detection in financial transactions 7. Augmented reality applications (e.g., Snapchat filters) 8. Sports analytics for performance optimization 9. Object detection in autonomous vehicles 10. Recommendation systems for e-commerce platforms 11. 
Customer segmentation for targeted marketing 12. Text summarization for news articles 13. Automated subtitles for videos 14. Medical image diagnosis 15. Stock prediction 5 Natural Language Data Science Computer Vision Processing a. Data Science Data might be facts, statistics, opinions, or any kind of content that is recorded in some format. This could include voices, photos, names, and even dance moves! It surrounds us and shapes our experiences, decisions, and interactions. For example: Your search recommendations, Google Maps history are based on your previous data. Amazon's personalized recommendations are influenced by your shopping habits. Social media activity, cloud storage, textbooks, and more are all forms of data. It is often referred to as the "new oil" of the 21st century. Did you know? 90% of the world's data has been created in just the last 2 years, compared to the previous 6 million years of human existence. Type of Data Structured Data Unstructured Data Semi-structured Data Structured data is like a neatly arranged table, with rows and columns that make it easy to understand and work with. It includes information such as names, dates, addresses, and stock prices. Because of its organized nature, it is straightforward to analyze and manipulate, making it a preferred format for many data-related tasks. On the other hand, unstructured data lacks any specific organization, making it more challenging to analyze compared to structured data. Examples of unstructured data include images, text documents, customer comments, and song lyrics. Since unstructured data does not follow a predefined format, extracting meaningful insights from it requires specialized tools and techniques. Semi-structured data falls somewhere between structured and unstructured data. While not as organized as structured data, it is easier to handle than unstructured data. Semi-structured data uses metadata to identify certain characteristics and organize data into fields, allowing for some level of organization and analysis. An example of semi- structured data is a social media video with hashtags used for categorization, blending structured elements like hashtags with unstructured content like the video itself. 6 Source: https://www.researchgate.net/figure/Unstructured-semi-structured-and-structured-data_fig4_236860222 b. Natural Language Processing: It refers to the field of computer science and AI that focuses on teaching machines to understand and process languages in both written and spoken form, just like humans do. The goal of an NLP-Trained model is to be capable of “understanding” the contents of documents, including the slangs, sarcasm, inner meaning, and contextual definitions of the language in which the text was written. Differences Between NLP, NLU, and NLG? Source: https://www.baeldung.com/cs/natural-language-processing-understanding-generation Natural Language Processing (NLP): This is the broad umbrella term encompassing everything related to how computers interact with human language. Think of it as the "what" - what computers can do with human language. It is like the whole library - filled with different tools and techniques for working with language data. Natural Language Understanding (NLU): This is a subfield of NLP that focuses on understanding the meaning of human language. It analyzes text and speech, extracting information, intent, and sentiment. NLU helps computers understand the language and what it means. Imagine finding a specific book in the library. 
Natural Language Generation (NLG): This is another subfield of NLP, but instead of understanding, it focuses on generating human language. It takes structured data as input and turns it into coherent and readable text or speech. Think of this as writing a new book based on the information gathered in the library. 7 c. Computer Vision: Computer Vision is like giving computers the ability to see and understand the world through digital images and videos, much like how humans use their eyes to perceive their surroundings. In this domain, computers analyze visual information from images and videos to recognize objects, understand scenes, and make decisions based on what they "see." When we take a digital image, it is essentially a grid of tiny colored dots called pixels. Each pixel represents a tiny portion of the image and contains information about its color and intensity. Resolution is expressed as the total number of pixels along the width and height of the image. For example, an image with a resolution of 1920x1080 pixels has 1920 pixels horizontally and 1080 pixels vertically. Higher resolution images have more pixels, providing more detail. Now, here's where AI comes in. To make sense of these images, computers convert them into numbers. They break down the image into a series of numbers that represent the color and intensity of each pixel. This numerical representation allows AI algorithms to process the image mathematically and extract meaningful information from it. For instance, AI algorithms might learn to recognize patterns in these numbers that correspond to specific objects, like cars or faces. By analyzing large amounts of labeled image data, AI systems can "learn" to identify objects accurately. Cognitive Computing (Perception, Learning, Reasoning) Cognitive Computing is a branch of Artificial Intelligence (AI) that aims to mimic the way the human brain works in processing information and making decisions. It involves building systems that can understand, reason, learn, and interact with humans in a natural and intuitive way. 2. The platform (Cognitive computing) uses 1.This is a platform based on Artificial Machine Learning, Reasoning, Natural Intelligence and Signal processing. Language Processing (NLP) and Computer Vision to compute results. 3.Cognitive computing improves human 4.Cognitive computing tries to mimic the decision making human brain Examples of cognitive computing software: IBM Watson, Deep mind, Microsoft Cognitive service etc. 8 In summary, Cognitive Computing integrates Data Science, Natural Language Processing, and Computer Vision to create intelligent systems that can understand and interact with humans in a human-like manner. By combining these technologies, Cognitive Computing enables machines to process and interpret diverse types of data, communicate effectively in natural language, and perceive and understand visual information, thereby extending the capabilities of traditional AI systems. 5. AI Terminologies Artificial intelligence machines don’t think. They calculate. They represent some of the newest, most sophisticated calculating machines in human history. It is a computer system that can perform tasks that ordinarily require human intelligence or human interference. Some can perform what is called machine learning as they acquire new data. 
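Before going further into machine learning terminology, here is a quick illustration of the pixel-to-numbers idea from the computer vision discussion above. This is a minimal sketch using NumPy and is not from the handbook; the tiny 2x2 "image" and its values are invented for demonstration.

```python
# Illustrative only: a digital image represented as an array of pixel numbers.
# The tiny 2x2 RGB "image" below is invented for demonstration.
import numpy as np

# Each pixel holds three values (red, green, blue), each from 0 to 255.
image = np.array([
    [[255, 0, 0], [0, 255, 0]],      # red pixel,  green pixel
    [[0, 0, 255], [255, 255, 255]],  # blue pixel, white pixel
], dtype=np.uint8)

print(image.shape)        # (2, 2, 3): 2 pixels high, 2 pixels wide, 3 colour channels

# A simple grayscale conversion: average the three colour channels of every pixel.
grayscale = image.mean(axis=2)
print(grayscale)          # intensity numbers an algorithm can process mathematically

# Resolution is just the size of the pixel grid; a 1920x1080 image would have
# shape (1080, 1920, 3) and over two million pixels.
print(image.shape[0] * image.shape[1], "pixels in this tiny image")
```

Once an image is in this numerical form, AI algorithms can look for patterns in the numbers that correspond to objects such as cars or faces.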
Machine learning is a subset of artificial intelligence (AI) that focuses on developing algorithms and models that enable computers to learn from data and make predictions or decisions without being explicitly programmed. Other systems, using calculations arranged in ways inspired by the neurons in the human brain, can even perform deep learning with multiple levels of calculations. Deep learning is an AI technique that imitates the working of the human brain in processing data and creating patterns for use in decision making.
- The structure of deep learning is inspired by the structure of neurons and neuron connections in the human brain.
- Neural networks, also known as Artificial Neural Networks (ANNs), are a subset of machine learning and sit at its core.
- They comprise layers of nodes: an input layer, one or more hidden layers, and an output layer.
- If the output of any node is above a specified threshold, that node is activated and sends data to the next layer of the network; otherwise, no data is passed along.
- If a network has more than three layers (counting the input and output layers), it is called a deep neural network.
Machine Learning vs Deep Learning:
1. Machine learning can achieve good accuracy on small datasets; deep learning works on large datasets.
2. Machine learning can run on low-end machines; deep learning depends heavily on high-end machines.
3. Machine learning divides a task into sub-tasks, solves them individually and combines the results; deep learning solves the problem end to end.
4. Machine learning takes less time to train; deep learning takes longer to train.
5. Machine learning may take longer at testing time; deep learning needs less time to test the data.
Example: Imagine you are given the job of sorting items in the meat department at a grocery store. You realize that there are dozens of products and very little time to sort them manually. How could you use artificial intelligence, machine learning, and deep learning to help with your work? To separate the chicken, beef, and pork, you could create a programmed rule in the form of if-else statements. This allows the machine to recognize what is on the label and route the item to the correct basket. To improve the performance of the machine, you expose it to more data so that it is trained on numerous characteristics of each type of meat, such as size, shape, and color. The more data you provide to the algorithm, the better the model gets. By providing more data and adjusting parameters, the machine minimizes errors through repeated guessing and correction. Deep learning models eliminate the need for manual feature extraction. With a deep learning approach you no longer have to define what each product looks like; feature extraction is built into the process without human input. Once you have provided the deep learning model with dozens of meat pictures, it processes the images through the different layers of a neural network, and those layers learn an implicit representation of the raw data on their own. Types of Machine Learning Supervised learning Supervised learning is a type of machine learning where the model learns from labelled data, which means that the input data is accompanied by the correct output. In supervised learning, the algorithm learns to map input data to output labels based on example input-output pairs provided during the training phase. The goal of supervised learning is to learn a mapping function from input variables to output variables so that the model can make predictions on unseen data (a short code sketch follows below).
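To connect the grocery-store example with the description of supervised learning, here is a minimal, illustrative scikit-learn sketch (not part of the handbook). The feature values and labels are invented; a real system would use many more labelled examples and much richer features.

```python
# Illustrative supervised learning sketch: a decision tree learns from labelled examples.
# The numbers below (weight in grams, "redness" of colour on a 0-10 scale) are invented.
from sklearn.tree import DecisionTreeClassifier

# Training data: each row is [weight_in_grams, redness_score]
X_train = [
    [1200, 3],   # chicken
    [1100, 2],   # chicken
    [4000, 8],   # beef
    [3800, 9],   # beef
    [2500, 6],   # pork
    [2300, 5],   # pork
]
y_train = ["chicken", "chicken", "beef", "beef", "pork", "pork"]  # the correct outputs (labels)

model = DecisionTreeClassifier()
model.fit(X_train, y_train)          # learn the mapping from features to labels

# Predict the label of an unseen item from its features.
print(model.predict([[3900, 8]]))    # likely ['beef'] with this toy data
```

The algorithms listed in the next sentence (linear regression, logistic regression, SVMs, neural networks) could replace the decision tree here with the same fit/predict pattern.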
Examples of supervised learning algorithms include linear regression, logistic regression, decision trees, support vector machines (SVM), and neural networks. Unsupervised Learning: Unsupervised learning is a type of machine learning where the model learns from unlabelled data, which means that the input data is not accompanied by the correct output. In unsupervised learning, the algorithm tries to find hidden patterns or structure in the input data without explicit guidance. The goal of unsupervised learning is to explore and discover inherent structures or relationships within the data, such as clusters, associations, or anomalies. Examples of unsupervised learning algorithms include k-means clustering, hierarchical clustering, principal component analysis (PCA), and autoencoders. 12 Reinforcement Learning: 1. 2. 3. 4. Reinforcement learning is a type of machine learning where an agent learns to make decisions by interacting with an environment to maximize cumulative rewards. In reinforcement learning, the agent learns through trial and error by taking actions and receiving feedback from the environment in the form of rewards or penalties. The goal of reinforcement learning is to learn a policy or strategy that guides the agent to take actions that lead to the highest cumulative reward over time. Reinforcement learning is commonly used in scenarios where the agent must make a sequence of decisions over time, such as playing games, controlling robots, or managing financial portfolios. Examples of reinforcement learning algorithms include Q-learning, deep Q-networks (DQN), policy gradients, and actor-critic methods. 13 6. Benefits and limitations of AI BENEFITS: 1. Increased efficiency and productivity: AI automates tasks, analyzes data faster, and optimizes processes, leading to increased efficiency and productivity across various sectors. 2. Improved decision-making: AI analyzes vast amounts of data and identifies patterns that humans might miss, assisting in data-driven decision-making and potentially leading to better outcomes. 3. Enhanced innovation and creativity: AI tools can generate new ideas, explore possibilities, and automate repetitive tasks, freeing up human resources for more creative pursuits and innovation. 4. Progress in science and healthcare: AI aids in drug discovery, medical diagnosis, and personalized medicine, contributing to advancements in healthcare and scientific research. LIMITATIONS: 1. Job displacement: Automation through AI raises concerns about job displacement and the need for workforce retraining and upskilling. 2. Ethical considerations: Concerns exist around bias in AI algorithms, potential misuse for surveillance or manipulation, and the need for ethical guidelines and regulations. 3. Lack of explainability: Some AI models, particularly complex ones, lack transparency in their decision-making, making it difficult to understand how they arrive at their outputs. 4. Data privacy and security: Large-scale data collection and use for AI development raise concerns about data privacy and security vulnerabilities. o Earn a credential on IBM Skills Build on the topic Artificial Intelligence Fundamentals using the link: https://students.yourlearning.ibm.com/activity/PLAN-CC702B39D429 o Semantris, is an NLP-Based game by Google based on Word association powered by semantic search. https://experiments.withgoogle.com/semantris o This is a game built with machine learning. We draw, and a neural network tries to guess what you're drawing. 
https://quickdraw.withgoogle.com/ o The experiment based on the computer vision domain of AI. It identifies that you draw and suggests the related images. To play the game, visit the following link on any computing device with speakers. https://www.autodraw.com/ 14 Extension Activities: These activities provide opportunities for students to explore various aspects of artificial intelligence, develop critical thinking skills, and engage in hands-on learning experiences in the classroom. 1. AI in the News: Have students research recent news articles or stories related to artificial intelligence? They can explore topics such as AI advancements, ethical dilemmas, or AI applications in various industries. Students can then present their findings to the class and facilitate discussions on the implications of these developments. 2. AI Applications Showcase: Divide students into small groups and assign each group a specific AI application or technology (e.g., virtual assistants, self-driving cars, healthcare diagnostics). Ask students to research and create presentations or posters showcasing how their assigned AI technology works, its benefits, potential drawbacks, and real-world examples of its use. 3. AI Coding Projects: Introduce students to basic coding concepts and tools used in AI development, such as Python programming language and machine learning libraries like TensorFlow or scikit-learn. Guide students through hands-on coding projects where they can build simple AI models, such as image classifiers or chatbots. Encourage experimentation and creativity in designing and training their AI systems. 4. AI Film Analysis: Screen and analyze films or documentaries that explore themes related to artificial intelligence, such as "Ex Machina," "Her" "I, Robot," or "The Social Dilemma." After watching the films, facilitate discussions on how AI is portrayed, its potential impact on society, and ethical considerations raised in the narratives. EXERCISE A. Multiple-choice questions (MCQs): 1. Who is often referred to as the "Father of AI"? a. Alan Turing b. John McCarthy c. Marvin Minsky d. Herbert A. Simon 2. In which year was the term "Artificial Intelligence" first used by John McCarthy? a. 1930 b. 1955 c. 1970 d. 2000 3. What does the term "Data is the new oil" imply? a. Data is as valuable as oil. b. Data is used as fuel for machines. c. Data is a non-renewable resource. d. Data and oil are unrelated. 15 4. Divya was learning neural networks. She understood that there were three layers in a neural network. Help her identify the layer that does processing in the neural network. a. Output layer b. Hidden layer c. Input layer d. Data layer 5. Which category of machine learning occurs in the presence of a supervisor or teacher? a. Unsupervised Learning b. Reinforcement Learning c. Supervised Learning d. Deep Learning 6. What does Deep Learning primarily rely on to mimic the human brain? a. Traditional Programming b. Artificial Neural Networks c. Machine Learning Algorithms d. Random Decision Making 7. What is the role of reinforcement learning in machine learning? a. Creating rules automatically b. Recognizing patterns in untagged data c. Rewarding desired behaviors and/or penalizing undesirable ones d. Mimicking human conversation through voice or text 8. Which AI application is responsible for automatically separating emails into "Spam" and "Not Spam" categories? a. Gmail b. YouTube c. Flipkart d. Watson B. Fill in the Blanks: 1. 
To determine if a machine or application is AI-based, consider its ability to perform tasks that typically require _______________ intelligence. 2. Artificial intelligence (AI) enables a machine to carry out cognitive tasks typically performed by ________. 3. Supervised, unsupervised, and reinforcement learning are three categories of ________. 4. ________________ is a subset of artificial intelligence that is entirely based on artificial neural networks. 5. Machine learning can be used for online fraud detection to make cyberspace a ________ place. 16 C. True or False: 1. Chatbots like Alexa and Siri are examples of virtual assistants. 2. Supervised learning involves training a computer system without labeled input data. 3. Unstructured data can be easily analyzed using traditional relational database techniques. 4. Deep learning typically requires less time to train compared to machine learning. 5. Machine learning is not used in everyday applications like virtual personal assistants and fraud detection. D. Short Answer Questions: 1. How is machine learning related to AI? 2. Define Data. List the types of data. 3. Define machine learning. 4. What is deep learning, and how does it differ from traditional machine learning? 5. What do you mean by Reinforcement Learning? Write any two applications of Reinforcement Learning at School. 6. How do you understand whether a machine/application is AI based or not? Explain with the help of an example. E. Case-study/Application Oriented Questions: 1. A hospital implemented an AI system to assist doctors in diagnosing diseases based on medical images such as X-rays and MRI scans. However, some patients expressed concerns about the accuracy and reliability of the AI diagnoses. How can the hospital address these concerns? 17 UNIT 2: Unlocking Your Future in AI Title: Unlocking your Future in AI Approach: Team Discussion, Web search Summary: This lesson explores the global demand for Artificial Intelligence (AI) professionals, highlighting the diverse career opportunities available across various industries. It discusses common job roles in AI, essential skills and tools for prospective AI careers, and opportunities for AI professionals in different sectors. Additionally, it provides a curated list of resources for individuals interested in exploring AI further and staying updated with the latest developments in the field. Learning Objectives: 1. Understand the increasing demand for AI professionals in today's global market. 2. Identify common job roles in the field of Artificial Intelligence and their respective responsibilities. 3. Recognize the essential skills and tools required for a successful career in AI. 4. Explore the diverse opportunities for AI professionals across various industries. 5. Discover curated resources for further learning and staying updated in the field of AI. Key Concepts: 1. The Global Demand 2. Some Common Job Roles In AI 3. Essential Skills and Tools for Prospective AI Careers 4. Opportunities in AI Across Various Industries Learning Outcomes: Students will be able to: 1. Articulate the demand for AI professionals and the diverse career opportunities available in the field. 2. Identify the requisite skills and tools needed to pursue a career in Artificial Intelligence. 3. Understand the potential roles and responsibilities of AI professionals across different industries. 4. Explore resources for further learning and skill development in the field of AI. 5. 
Evaluate their own interests and skills to determine potential pathways for a career in AI. Pre-Requisites: Basic understanding of Artificial Intelligence concepts and applications, familiarity with programming languages such as Python, and interest in exploring career opportunities in the field of Artificial Intelligence. 18 THE GLOBAL DEMAND Artificial Intelligence (AI) was once confined to the realms of science fiction, but today, it permeates our daily lives in ways we often take for granted. From personalized recommendations on streaming platforms to the algorithms powering autonomous vehicles, AI has become an indispensable part of the modern society. As the field continues to evolve and expand, so do the opportunities it presents for career growth and development. Amidst the concerns about automation and job displacement, it is essential to recognize the significant demand for AI professionals across various industries. While it is true that AI technologies may replace some traditional roles, they also create a multitude of new and exciting career paths. Rather than viewing AI as a threat, individuals should embrace it as an opportunity for advancement and innovation. Image Source: https://media.licdn.com The global demand for AI talent is skyrocketing, driven by the rapid pace of technological advancements and the increasing integration of AI solutions into diverse sectors. From healthcare and finance to transportation and retail, organizations are harnessing the power of AI to streamline operations, optimize processes, and deliver enhanced services to consumers. One of the most significant advantages of pursuing a career in AI is the sheer breadth of opportunities it offers. Whether you are passionate about machine learning, natural language processing, robotics, or data analytics, there is a niche within the AI field suited to your interests and skills. Moreover, as AI technologies continue to mature, new specialties and job roles are emerging, creating avenues for specialization and expertise. SOME COMMON JOB ROLES IN AI: In today's market, there is a wide range of job roles within the field of Artificial Intelligence (AI) that are in high demand. Some common job roles include: Machine Learning Engineer: Machine learning engineers bridge software engineering and data science, utilizing big data tools and programming frameworks to develop scalable data science models capable of handling vast volumes of real- time data. Strong mathematical skills, experience in machine learning and deep learning, and proficiency in programming languages like Java, Python, and Scala are essential for success in this role. Data Scientist: Data scientists leverage machine learning and predictive analytics to extract insights from large datasets, to take proper business decisions. Proficiency in big data platforms like Hadoop, Pig, and Spark, fluency in programming languages such as SQL, Python, and Scala, and a solid understanding of descriptive and inferential statistics are the key requirements for this role. 19 Business Intelligence Developer: Business intelligence (BI) developers design, model, and maintain complex data sets to analyse business and market trends, enhance organizational profitability and efficiency. Strong technical and analytical skills, along with expertise in data warehouse design and BI technologies, are essential for success in this role. Robotics Engineer: They design and maintain AI-powered robots, develop mechanical devices capable of performing tasks with human commands. 
Proficiency in programming, along with expertise in disciplines like robotic engineering, mechanical engineering, and electrical engineering, is crucial for success in this field. Software Engineer: AI software engineers build and maintain software products for AI applications, staying updated on the latest artificial intelligence technologies. Proficiency in software engineering, programming languages, and statistical analysis is essential, typically requiring a bachelor's degree in computer science, engineering, or related fields. Natural Language Processing (NLP) Engineer: NLP engineers specialize in human language processing, working on voice assistants, speech recognition, and document processing. A specialized degree in computational linguistics or a combination of computer science, mathematics, and statistics is typically required for this role. Computer Vision Engineer: Computer vision engineers specialize in developing algorithms and systems that enable computers to analyse and interpret visual information from images or videos. Their expertise lies in creating software solutions that can understand and process visual data, requiring proficiency in image processing techniques and programming languages such as Python and C++. AI Ethicist: AI ethicists address ethical considerations and implications related to the development and deployment of AI technologies, ensuring that they are used responsibly and ethically. They provide guidance on ethical frameworks, policies, and practices to promote fairness, transparency, and accountability in AI systems, often requiring a background in ethics, philosophy, or law, combined with expertise in AI technology. AI Consultant: AI consultants offer expert guidance and advice to organizations on how to leverage AI technologies to solve business challenges and drive innovation. They assess business needs, identify opportunities for AI integration, and develop strategic AI initiatives, requiring a deep understanding of AI technologies, business processes, and industry trends, along with strong communication & analytical skills. 20 Activity 1: Divide the class into small groups and distribute the list of AI job roles to each group. Using the roles written in the chit, the teams will identify ten companies currently hiring employees for those specific AI positions. ESSENTIAL SKILLS AND TOOLS FOR PROSPECTIVE AI CAREERS A successful career in artificial intelligence requires a diverse set of skills that encompass both technical expertise and soft skills. According to industry leaders, here are some of the top skills that AI professionals need: Technical Skills: Expertise in neural networks, machine learning, and deep learning is essential for developing advanced AI applications. Knowledge of big data technologies and techniques for handling and analysing large datasets is crucial in AI applications. Understanding of frameworks and libraries like TensorFlow, SciPy, and NumPy is vital for building and deploying AI solutions. Familiarity with programming languages such as Python, R, Java, and C++ is necessary for developing AI models and algorithms. Proficiency in linear algebra, probability, statistics, and signal processing is essential for understanding the mathematical principles underlying AI algorithms. Soft Skills: Effective communication skills are crucial for conveying complex technical concepts to non-technical stakeholders and collaborating with multidisciplinary teams. 
Strong teamwork and collaboration abilities are essential for working effectively in cross-functional teams to develop AI products and solutions. Problem-solving, decision-making, and analytical thinking skills are critical for identifying and addressing challenges in AI projects. Time management and organizational skills are essential for managing multiple projects and meeting deadlines. Business intelligence and critical thinking skills are valuable for understanding business requirements and translating them into AI solutions that deliver tangible value. 21 Your Professional Toolkit In addition to acquiring the necessary skills, it is essential for AI professionals to familiarize themselves with popular AI tools, platforms, and programming languages. Here are some essential tools and their purposes: Python: A versatile programming language with pre-made libraries for advanced computing and scientific computation. R: A programming language for data collection, organization, and analysis, particularly useful for machine learning and statistical functions. Java: Widely used in AI for implementing intelligence programming, neural networks, and machine learning solutions. C++: Known for its flexibility and object-oriented functions, used for procedural programming and hardware manipulation in AI. TensorFlow: An open-source machine learning platform with tools and libraries for developing sophisticated AI applications. SciPy and NumPy: Python libraries for scientific computing and mathematical operations, ideal for manipulating and visualizing data. By acquiring the right combination of technical skills and tools, aspiring AI professionals can position themselves for success in this dynamic and rapidly growing field. Whether you are interested in developing AI algorithms or implementing AI solutions in real-world applications, building a strong foundation of skills and expertise is the key to unlocking exciting career opportunities in artificial intelligence. Activity 2: In continuation with the previous activity, list the technical skills and soft skills listed by any two companies for the specific AI position. Technical Skills Soft Skills Link to the website 22 OPPORTUNITIES IN AI ACROSS VARIOUS INDUSTRIES Artificial intelligence professionals design and develop AI systems that use machine learning and neural networks to predict trends, provide better customer experiences and recommendations, and offer solutions to difficult problems. While some AI professionals work towards the goal of General AI—systems interconnected and able to be nearly as creative as human beings—others focus on narrower applications. This following table gives you a variety of opportunities to choose from depending upon your choice of subject. Employment Relevant Subjects Industry Some Existing/Expected Job Roles Opportunities in School 1. Autonomous Vehicle Engineer: Develops AI algorithms for self-driving cars. Mathematics, Design, 2. Simulation Engineer: Creates virtual Physics, manufacturing, Automobile environments for testing autonomous Computer and sale of motor vehicle technologies. Science/Artificial vehicles. 3. Robotics Engineer: Designs AI-powered Intelligence robots for automotive tasks. 1. Precision Agriculture Specialist: Uses AI- powered drones and sensors for monitoring Biology, Monitoring crop crops. Mathematics, health, optimizing Agriculture 2. Crop Yield Prediction Analyst: Forecasts Computer irrigation, and crop yields using AI models. Science/Artificial maximizing yields. 3. 
Livestock Monitoring Specialist: Tracks Intelligence the health and productivity of farm animals. 1. Inventory Management Specialist: Optimizing Optimizes inventory levels using AI Business Studies, inventory, sales algorithms. Mathematics, forecasting, and 2. Sales Forecasting Analyst: Forecasts Retail Computer enhancing sales using AI models. Science/Artificial customer 3. Customer Experience Designer: Intelligence experience. Enhances customer experience using AI- driven insights. 1. Visual Effects Artist: Uses AI tools for Fine Arts, Media Creating visual creating visual effects. Studies, effects, content 2. Content Creator: Generates content Media Computer generation, and using AI-generated insights. Science/Artificial audience analysis. 3. Audience Analyst: Analyzes audience Intelligence behavior using AI algorithms. Developing AI 1. Machine Learning Engineer: Develops AI Computer algorithms, algorithms and systems. Science/Artificial Information systems, and 2. AI Software Developer: Builds AI- Intelligence, Technology infrastructure for powered applications. Mathematics, various 3. AI Infrastructure Specialist: Maintains Physics applications. and optimizes AI infrastructure. 23 1. Medical Imaging Analyst: Analyzes Medical imaging Biology, medical images using AI algorithms. analysis, Chemistry, 2. Virtual Nurse Assistant: Provides Healthcare personalized Computer personalized healthcare recommendations. healthcare, and Science/Artificial 3. Drug Discovery Researcher: Identifies drug discovery. Intelligence potential drug candidates using AI. 1. Quantitative Analyst: Analyzes market Market analysis, trends using AI algorithms. Economics, fraud detection, 2. Fraud Detection Analyst: Identifies Mathematics, Finance risk management, fraudulent activities using AI models. Computer and investment 3. Financial Advisor: Offers personalized Science/Artificial recommendations. investment recommendations using AI- Intelligence driven analytics. 1. National Security Analyst: Uses AI- Surveillance, powered surveillance systems. Political Science, predictive 2. Defense Contractor: Develops AI- Computer Government analytics, citizen enabled military technologies. Science/Artificial & Military services, and 3. Government AI Specialist: Implements Intelligence, military AI for citizen services and regulatory Mathematics technologies. compliance. 1. Travel Recommendation Engine Personalized Developer: Provides personalized travel Geography, travel recommendations. Business Studies, recommendations, Tourism 2. Chatbot for Customer Service: Assists Computer customer service, travelers with booking and inquiries. Science/Artificial and itinerary 3. Smart Travel Itinerary Planner: Optimizes Intelligence planning. travel routes and schedules. 1. AI-powered Skincare Assistant: Provides Chemistry, Skincare analysis, personalized skincare recommendations. Biology, Beauty & virtual styling, and 2. Virtual Hair Stylist: Simulates different Computer Wellness wellness hairstyles using AI. Science/Artificial guidance. 3. Wellness Chatbot: Offers guidance on Intelligence nutrition and fitness. 1. Loan Approval Specialist: Automates loan Loan approval Economics, approval process using AI. automation, fraud Mathematics, 2. Fraud Detection Analyst: Identifies Banking detection, and Computer fraudulent transactions using AI algorithms. personalized Science/Artificial 3. Financial Advisor: Offers personalized financial advice. Intelligence financial advice using AI-driven analytics. Spatial data Geography, 1. 
Geographic Information Systems (GIS) analysis, remote Geology, Specialist: Analyzes spatial data using AI. Geospatial sensing, and Computer 2. Remote Sensing Analyst: Interprets mapping Science/Artificial satellite imagery using AI algorithms. technologies. Intelligence 24 3. Mapping Technician: Uses AI-enabled drones for mapping. 1. AI-powered Fabric Design Specialist: Creates innovative textile patterns using AI. Fabric design, 2. Textile Quality Control Inspector: Chemistry, Art & quality control, Ensures product quality using AI-enabled Design, Computer Textile and inventory systems. Science/Artificial management. 3. Smart Inventory Management Specialist: Intelligence Optimizes inventory levels using AI algorithms. 1. Generative Design Assistant: Optimizes Design Art & Design, design solutions using AI algorithms. optimization, user Computer 2. AI-powered UX Designer: Enhances user Design experience Science/Artificial experience using AI-driven insights. enhancement, and Intelligence, 3. AI-powered Content Creator: Generates content creation. Mathematics content using AI tools. 1. Marketing Campaign Automation Specialist: Automates marketing Campaign Business Studies, campaigns using AI. automation, Mathematics, Sales & 2. Customer Segmentation Analyst: customer Computer Marketing Segments customers based on behavior segmentation, and Science/Artificial using AI. sales forecasting. Intelligence 3. Sales Forecasting Analyst: Forecasts sales using AI models. 1.AI-powered Fashion Stylist: Personalized Recommends personalized clothing Fashion Design, fashion combinations using AI. Mathematics, Fashion recommendations, 2. Trend Analyst: Analyzes fashion trends Computer trend analysis, and using AI algorithms. Science/Artificial virtual try-on. 3. Virtual Clothing Try-on Specialist: Allows Intelligence virtual try-on of clothing using AI and AR. ADDITIONAL LEARNING RESOURCES: Here are some resources you can explore, bookmark, and keep in mind if you would like to explore more about AI and stay in touch with the latest developments in the field. This is a curated listing. There are many organizations and websites to explore, depending on your interests. News and blogs to stay current Analytics Insight offers insights, latest news, and a magazine featuring opinions and views of top industry leaders and executives who share their journeys, experiences, success stories, and knowledge to grow profitable businesses. Towards Data Science is an online publication in which independent authors who follow their rules and guidelines can publish their work, share their knowledge and expertise, and engage a wide audience on Medium. 25 KDnuggets is a leading site on data science, machine learning, AI, and analytics. It contains excellent tutorial materials, courses, webinars, online events. Data Science Central is a leading online resource for data practitioners. From statistics and analytics to machine learning and AI, Data Science Central provides a community experience that includes a rich editorial platform, social interaction, forum-based support, and the latest information on technology, tools, trends, and careers. Datanami is a news portal dedicated to providing insight, analysis, and up-to-the- minute information about emerging trends and solutions in big data. Free learning opportunities to build skills Note: You will need to sign up for a free account for the following online learning opportunities. 
You can take advantage of IBM SkillsBuild to power your future in tech with job skills, courses, digital credentials, and more. Kaggle offers free online micro courses to help you gain the skills you need to do independent data science projects. Kaggle also allows you to grow your data science and machine learning skills by competing in Kaggle competitions. Find help in the documentation or learn about Community Competitions. Udemy offers a variety of free video-based courses on artificial intelligence, including a short, practical hands-on course on artificial intelligence, called Kickstart Artificial Intelligence. Udemy also offers a course called Artificial Intelligence: Preparing Your Career for AI, which covers what you should be doing now to prepare for the coming of AI. freeCodeCamp.org offers a rundown of All the Math You Need to Know in Artificial Intelligence. Jason Dsouza gives you an overview of the core math principles you need to focus on to work in AI. DataCamp offers a free, two-hour Machine Learning for Everyone course which introduces machine learning without coding involved. W3Schools is the world’s largest web developer site that offers a variety of free online tutorials with hands-on practice. The site includes tutorials on some popular data science programming languages, such as Python, R, and SQL. Codecademy offers free coding classes on 12 different programming languages including Python, Java, and C++. Additional information regarding colleges offering professional course in AI. ✓ IIT Madras – Four year Bachelor of Science Degree in Data Science and Applications https://study.iitm.ac.in/ds/ ✓ AICTE – All India Council for Technical Education’s unique website – “Digital Skilling”. Explore this site for a wide variety of course and internships. https://1crore.aicte-india.org/ ✓ Most of the top colleges in India now offer B.Tech courses in AI and Ml, Data Science, Robotics and Computer Science with specializations. Students can visit college websites to know more about these courses. Also, some colleges are offering BSc in AI and ML. REFERENCES : IBM Skills Build 26 EXERCISES: A. Multiple Choice Questions 1. Which of the following is a job role in AI related to the automobile industry? a. Robotics Engineer b. Virtual Nurse Assistant c. Sales Forecasting Analyst d. Autonomous Vehicle Engineer 2. Identify the important soft skill required for AI professionals. a. Expertise in neural networks b. Effective communication c. Proficiency in Python d. Knowledge of big data technologies 3. Which industry uses AI for personalized travel recommendations? a. Tourism b. Banking c. Healthcare d. Geospatial 4. What is the purpose of the website “Data Science Central”? a. Providing a community experience for data practitioners b. Offering free video-based courses on AI c. Analyzing market trends using AI algorithms d. Providing insight into emerging trends in big data 5. Which industry uses AI for market analysis and fraud detection? a. Finance b. Media c. Textile d. Design B. Short answer questions: 1. Name some common job roles in the field of artificial intelligence (AI). 2. What are some essential technical skills required for a successful career in AI, and why are they important? 3. What is the role of AI professionals in healthcare, finance, and retail industry? 4. List some popular AI tools and programming languages used by AI professionals. 5. What soft skills do AI professionals need, and how do they help them succeed? 6. 
Why is continuous learning crucial in AI, and how do professionals stay updated with the latest advancements? C. Long answer questions. 1. How does the global demand for AI professionals affect career opportunities in the field? 2. What are some common job roles in the field of AI, and how do they contribute to the development and implementation of AI solutions across various industries? 27 UNIT 3: Python Programming Title: Python Programming Approach: Group Discussion, Hands on Practice using the software Summary: This unit will introduce students to the fundamentals/ basics of Python programming language, its history, evolution, operators, variables, constants, lists, strings, iterative and select statements. Students will explore three essential Python libraries: NumPy, Pandas, and Scikit-learn. Students will learn how Python is used to create programs. They will also learn how to use NumPy for numerical computing, Pandas for data manipulation and analysis, and Scikit-learn for implementing machine learning algorithms. Learning Objectives: Students will be able to 1. Understand the basics of python programming language- tokens, data types, lists, string manipulation, iterative and decision statements. 2. Learn how to use NumPy for mathematical operations and numerical computing. 3. Explore Pandas for data manipulation, analysis, and exploration of structured data. 4. Gain proficiency in using Scikit-learn for implementing machine learning algorithms, including classification. 5. Develop the skills necessary to use Python libraries effectively in Data Science and machine learning projects. Key concepts: 1. Basics of python programming language 2. Understanding of character sets, tokens, modes, operators and data types 3. Control Statements 4. CSV Files 5. Libraries – NumPy, Pandas, Scikit-learn Learning Outcomes: Students will be able to – 1. Explain the basics of python programming language and write programs with basic concepts of tokens. 2. ⁠Use selective and iterative statements effectively. 3. Gain practical knowledge on how to use the libraries efficiently. Pre-requisites: Reasonable fluency in English language and basic computer skills 28 Introduction to Python Python is a general-purpose, high level programming language. It was created by Guido van Rossum, and released in 1991. Python got its name from a BBC comedy series – “Monty Python’s Flying Circus” Features of Python High Level language Interpreted Language Free and Open Source Platform Independent (Cross-Platform) – runs virtually in every platform if a compatible python interpreter is installed. Easy to use and learn – simple syntax similar to human language. Variety of Python Editors – Python IDLE, PyCharm, Anaconda, Spyder Python can process all characters of ASCII and UNICODE. Widely used in many different domains and industries. Python Editors There are various editors and Integrated Development Environments (IDEs) that you can use to work with Python. Some popular options are PyCharm, Spyder, Jupyter Notebook, IDLE etc. Let us look how we can work with Jupyter Notebook. Jupyter Notebook is an open-source web application that allows you to create and share documents containing live code, equations, visualizations, and narrative text. It's widely used in data science and research. It can be installed using Anaconda or with pip. 
For more details of the installation, use the link https://docs.jupyter.org/en/latest/install/notebook-classic.html
If you are already familiar with Python, open the command prompt in administrator mode and type: pip install notebook
To run the notebook, open the command prompt and type: jupyter notebook
The notebook opens in your web browser. You can type code in the cell provided and then run the cell to see the output just below it.
Getting Started with Python Programs
A Python program consists of tokens. A token is the smallest unit of a program that the interpreter or compiler recognizes. Tokens consist of keywords, identifiers, literals, operators, and punctuators. They serve as the building blocks of Python code, forming the syntactic structure that the interpreter parses and executes. During lexical analysis, the Python interpreter breaks the source code down into tokens, facilitating the subsequent parsing and interpretation processes. (A diagram of the token categories is available at https://www.studytrigger.com/wp-content/uploads/2022/08/Tokens-in-Python.jpg)
Keywords: Reserved words used for special purposes, such as if, else, for, while, import, in, and, or, not, True, False and None.
Identifier: An identifier is a name used to identify a variable, function, class, module or other object. Keywords cannot be used as identifiers. An identifier cannot start with a digit and cannot contain any special character other than the underscore.
Literals: Literals are the raw data values that are explicitly specified in a program. The different types of literals in Python are string literals, numeric literals (numbers), Boolean literals (True and False), the special literal None, and literal collections.
Operators: Operators are symbols or keywords that perform operations on operands to produce a result. Python supports a wide range of operators:
Arithmetic operators (+, -, *, /, %)
Logical operators (and, or, not)
Relational operators (==, !=, <, >, <=, >=)
Identity operators (is, is not)
Assignment operators (=, +=, -=)
Membership operators (in, not in)
Punctuators: Common punctuators in Python include ( ) [ ] { } , ; : . ' " \ @ and so on.
Example: consider a short program that uses the math library to find the square root of 625 (a sketch of this program, along with Sample Programs 1 and 2, is given after this subsection). The tokens in that program are:
Keyword – import
Identifier – num, root (these can also be called variables)
Literal – 625
Operator – =
Punctuator – " ", ( )
In this program, print() is used to display the output on the screen, the # symbol is used to write comments (comments increase readability and are not executed), and the import statement is used to load functions from a library (here, math).
Variables – Named labels whose values can be used and processed during the execution of the program.
Sample Program-1: Display the string "National Animal-Tiger" on the screen.
Sample Program-2: Write a program to calculate the area of a rectangle, given that the length and breadth are 50 and 20 respectively.
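The handbook shows the token example and Sample Programs 1 and 2 only as screenshots. The following is a minimal reconstruction of what they likely contain; the variable names and the exact wording of the printed messages are assumptions.

import math                      # load the math library

# Token example: square root of 625
num = 625                        # literal 625 assigned to the identifier num
root = math.sqrt(num)            # sqrt() is a function from the math library
print("Square root of", num, "is", root)

# Sample Program-1: display a string on the screen
print("National Animal-Tiger")

# Sample Program-2: area of a rectangle with length 50 and breadth 20
length = 50
breadth = 20
area = length * breadth
print("Area of the rectangle is", area)    # expected output: 1000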
Data Types: Data types are the classification or categorization of data items. A data type represents the kind of value a data item holds and tells what operations can be performed on it. Python supports dynamic typing: a variable pointing to a value of a certain data type can be made to point to a value/object of another data type. The standard or built-in data types in Python are:
Integer – Stores whole numbers. Example: a = 10
Boolean – Represents the truth values of Boolean expressions; it has two values, True and False. Example: Result = True
Floating point – Stores numbers with a fractional part. Example: x = 5.5
Complex – Stores a number having a real and an imaginary part. Example: num = a + bj
String – Immutable sequence (after creation, values cannot be changed in place); stores text enclosed in single or double quotes. Example: name = "Ria"
List – Mutable sequence (after creation, values can be changed in place); stores comma-separated values of any data type between square brackets [ ]. Example: lst = [25, 15.6, "car", "XY"]
Tuple – Immutable sequence (after creation, values cannot be changed in place); stores comma-separated values of any data type between parentheses ( ). Example: tup = (11, 12.3, "abc")
Set – An unordered collection of values of any type, with no duplicate entries. Example: s = {25, 3, 3.5}
Dictionary – An unordered collection of comma-separated key:value pairs within braces { }. Example: dict = {1: "One", 2: "Two", 3: "Three"}
Accepting values from the user
The input() function retrieves text from the user by prompting them with a string argument. For instance: name = input("What is your name?")
The return type of the input() function is string, so to receive values of other types we have to use conversion functions together with the input() function.
Sample Program-3: Write a program to read the name and marks of a student and display the total marks. In this program, float() is used to convert the input into a floating-point value. The explicit conversion of an operand to a specific type is called type casting. (A sketch of this program is given below, together with Sample Programs 4 and 5.)
Control flow statements in Python
Till now, the programs you have created have followed a basic, step-by-step progression, where each statement executes in sequence, every time. However, in many practical programs we have to selectively execute specific sections of the code or iterate over parts of the program. This capability is achieved through selection statements and looping statements.
Selection Statement
The if / if..else statement evaluates a test expression; the statements written below if execute when the condition is true, otherwise the statements written below else execute. Indentation is used to separate the blocks.
Syntax:
if <test expression>:
    statement(s)
else:
    statement(s)
Let us check out different examples to see the working of if and if-else statements.
Sample Program-4: Asmita and her family went to a restaurant. Determine the choice of food according to the options she chooses from the main menu.
Case 1: All members are vegetarians. They prefer to have veg food; there are no other options. (menu: veg)
Case 2: Family members may choose non-vegetarian food if veg food is not available. (menu: veg/non-veg)
Case 3: Family members can choose from a variety of options.
Sample Program-5: Write a program to get the lengths of the sides of a triangle and determine whether it is an equilateral, isosceles, or scalene triangle.
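Sample Programs 3, 4 and 5 appear in the handbook only as screenshots. Below is one possible reconstruction: the number of subjects in Sample Program-3, the exact menu options in Case 3, and the printed messages are illustrative assumptions.

# Sample Program-3: read a student's name and marks and display the total
name = input("Enter the name of the student: ")
mark1 = float(input("Enter the mark in subject 1: "))
mark2 = float(input("Enter the mark in subject 2: "))
print("Total mark of", name, "is", mark1 + mark2)

# Sample Program-4, Case 1: only vegetarian food, no other options (simple if)
menu = input("Enter the menu: ")
if menu == "veg":
    print("The family orders vegetarian food")

# Sample Program-4, Case 2: non-veg is chosen if veg is not available (if-else)
menu = input("Enter the menu: ")
if menu == "veg":
    print("The family orders vegetarian food")
else:
    print("The family orders non-vegetarian food")

# Sample Program-4, Case 3: a variety of options (if-elif-else)
menu = input("Enter the menu: ")
if menu == "veg":
    print("The family orders vegetarian food")
elif menu == "nonveg":
    print("The family orders non-vegetarian food")
else:
    print("The family chooses from the other options")

# Sample Program-5: classify a triangle from the lengths of its sides
a = float(input("Enter side 1: "))
b = float(input("Enter side 2: "))
c = float(input("Enter side 3: "))
if a == b and b == c:
    print("Equilateral triangle")
elif a == b or b == c or a == c:
    print("Isosceles triangle")
else:
    print("Scalene triangle")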
Looping Statements
Looping statements in programming languages allow you to execute a block of code repeatedly. In Python, there are mainly two types of looping statements: the for loop and the while loop.
For loop
A for loop iterates through a portion of a program based on a sequence, which is an ordered collection of items.
The "for" keyword is used to start the loop.
The loop variable takes on each value in the specified sequence (e.g., list, string, range).
The colon (:) at the end of the for statement indicates the start of the loop body.
The statements within the loop body are executed for each iteration.
Indentation is used to define the scope of the loop body; all statements indented under the for statement are considered part of the loop.
It is advisable to use a for loop when the exact number of iterations is known in advance.
Syntax:
for <variable> in <sequence>:
    statement(s)
Example-1 prints the string "Python" five times using for i in range(5); Example-2 prints the loop variable i itself. In these programs, range(5) returns the values 0, 1, 2, 3, 4, and for each iteration of the loop the variable i receives one of these values. In the first iteration i = 0 and print("Python") executes once; the same happens for i = 1, 2, 3 and 4. Whatever is given inside the loop executes repeatedly, so Example-1 prints "Python" five times, while Example-2, which prints i, displays 0 1 2 3 4.
The for loop iterates over each item in the sequence until it reaches the end of the sequence or until the loop is terminated using a break statement. It is a powerful construct for iterating over collections of data and performing operations on each item.
Sample Program-6: Write a program to display the even numbers between 100 and 110 and their squares.
Sample Program-7: Write a program to read a list and display each element and its type (use type() to display the data type). In this program, the control variable word receives each element of the list, so the print statement displays each element and its type. The same program can also be written using index numbers:
for i in range(len(lst)):
    print(lst[i], type(lst[i]))
Here i is used as an index number, so lst[0] gives the first element of the list and lst[-1] gives the last element.
Sample Program-8: Write a program to read a string, split the string into a list of words, and display each word.
Sample Program-9: Write a simple program to display the values stored in a dictionary.
(Sketches of Sample Programs 6 to 10 are given after the next section.)
UNDERSTANDING CSV FILES (Comma Separated Values)
CSV files are delimited files that store tabular data (data stored in rows and columns). A CSV file looks similar to a spreadsheet, but internally it is stored in a different format: the values are separated by commas. Data sets used in AI programming are easily saved in CSV format. Each line in a CSV file is a data record, and each record consists of one or more fields (columns). The csv module of Python provides functionality to read and write tabular data in CSV format. Let us see examples of the opening, reading and writing operations for a file student.csv with the file object file. student.csv contains the columns rollno, name and mark.
Importing the library: import csv
Opening in reading mode: file = open("student.csv", "r")
Opening in writing mode: file = open("student.csv", "w")
Closing a file: file.close()
Writing rows: wr = csv.writer(file) followed by wr.writerow([12, "Kalesh", 480])
Reading rows: details = csv.reader(file) followed by
for rec in details:
    print(rec)
Sample Program-10: Write a program to open the CSV file students.csv and display its details.
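The handbook presents Sample Programs 6 to 10 as screenshots. The sketches below reconstruct them under stated assumptions: the contents of the list in Sample Program-7, the dictionary in Sample Program-9 and the columns of students.csv are illustrative choices, not the handbook's originals.

# Sample Program-6: even numbers between 100 and 110 and their squares
for num in range(100, 111):
    if num % 2 == 0:
        print(num, num ** 2)

# Sample Program-7: display each element of a list and its type
lst = [25, 15.6, "car", "XY"]          # illustrative list
for word in lst:
    print(word, type(word))

# Sample Program-8: split a string into a list of words and display each word
sentence = input("Enter a sentence: ")
for word in sentence.split():
    print(word)

# Sample Program-9: display the values stored in a dictionary
capitals = {"India": "New Delhi", "Japan": "Tokyo", "France": "Paris"}   # illustrative dictionary
for value in capitals.values():
    print(value)

# Sample Program-10: open students.csv and display its details
import csv
file = open("students.csv", "r")
details = csv.reader(file)
for rec in details:                    # each rec is a list of field values, e.g. [rollno, name, mark]
    print(rec)
file.close()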
INTRODUCING LIBRARIES
A library in Python typically refers to a collection of reusable modules or functions that provide specific functionality. Libraries are designed to be used across projects and simplify development by providing pre-written code for common tasks. The concept of a library is easy to understand: in Python, functions are organized within libraries in much the same way that books in a library are arranged by subject, such as physics, computer science, and economics. For example, the math library contains numerous functions such as sqrt(), pow(), abs(), and sin(), which facilitate mathematical operations and calculations. To use a library in a program, it must be imported. For example, if we wish to use the sqrt() function in our program, we include the statement "import math". This allows us to access and use the functionality provided by the math library.
Python offers a vast array of libraries for various purposes, making it a versatile language for different domains such as web development, data analysis, machine learning, scientific computing, and more. Now, let us explore some libraries that are incredibly valuable in the realm of Artificial Intelligence.
NUMPY
NumPy, which stands for Numerical Python, is a powerful library in Python used for numerical computing. It is a general-purpose array-processing package. NumPy provides the ndarray (N-dimensional array) data structure, which represents arrays of any dimension. These arrays are homogeneous (all elements are of the same data type) and can contain elements of various numerical types (integers, floats, etc.).
Where and why do we use the NumPy library in Artificial Intelligence?
Suppose you have a dataset containing the exam scores of students in various subjects, and you want to perform some basic analysis on this data. You can use NumPy arrays to store the exam scores for the different subjects efficiently. With NumPy's array operations, you can perform various calculations, such as calculating the average score for each subject, finding the total score for each student, calculating the overall average score across all subjects, and identifying the highest and lowest scores. NumPy's array operations streamline these computations, making them both efficient and convenient. This makes NumPy an indispensable tool for data manipulation and analysis in data science applications.
NumPy can be installed using Python's package manager, pip: pip install numpy
Creating a NumPy Array
Arrays in NumPy can be created in multiple ways, for example:
Using a list of tuples
Using values from the user (using empty() – the empty() function returns a new array of a given size)
(A sketch of these operations is given below.)
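A short sketch of the ideas above. The exam-score figures are made-up illustrations, and the user-input example is one plausible reading of the handbook's screenshot; note that np.empty() returns an uninitialized array whose initial contents are arbitrary, so it is normally filled with values afterwards.

import numpy as np

# Creating an array from a list of tuples
arr = np.array([(1, 2, 3), (4, 5, 6)])
print(arr)                               # a 2 x 3 ndarray

# Creating an empty array of a given size and filling it with values from the user
n = int(input("How many scores? "))
scores = np.empty(n)                     # contents are arbitrary until we assign values
for i in range(n):
    scores[i] = float(input("Enter a score: "))
print(scores)

# Basic analysis of exam scores (illustrative data)
maths = np.array([78, 85, 92, 67, 74])
science = np.array([88, 79, 95, 72, 81])
print("Average maths score:", maths.mean())
print("Total score of each student:", maths + science)
print("Highest science score:", science.max())
print("Lowest maths score:", maths.min())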
PANDAS
The name "Pandas" is a reference to both "Panel Data" and "Python Data Analysis". Pandas is a powerful and versatile library that simplifies data manipulation tasks in Python. Pandas is built on top of the NumPy library, which means that many NumPy structures are used or replicated in Pandas. Pandas is particularly well suited to working with tabular data, such as spreadsheets or SQL tables. Its versatility and ease of use make it an essential tool for data analysts, scientists, and engineers working with structured data in Python.
Where and why do we use the Pandas library in Artificial Intelligence?
Suppose you have a dataset containing information about the various marketing campaigns conducted by a company, such as campaign type, budget, duration, reach, engagement metrics, and sales performance. We use Pandas to load the dataset, display summary statistics, and perform group-wise analysis to understand the performance of the different marketing campaigns. We can then visualize the sales performance and average engagement metrics for each campaign type using Matplotlib, a popular plotting library in Python. Pandas provides powerful data manipulation and aggregation functionality, making it easy to perform complex analysis and generate insightful visualizations. This capability is invaluable in AI and data-driven decision-making processes, allowing businesses to gain actionable insights from their data.
Pandas can be installed using: pip install pandas
Pandas provides two main data structures for manipulating data: Series and DataFrame.
Series
A Series is a one-dimensional array containing a sequence of values of any data type (int, float, list, string, etc.) which by default have numeric data labels starting from zero. The data label associated with a particular value is called its index. We can also assign values of other data types as the index. We can imagine a Pandas Series as a single column in a spreadsheet.
DataFrame
In data science, we often encounter datasets with two-dimensional structures. This is where Pandas DataFrames come into play. A DataFrame is used when we need to work on multiple columns at a time, i.e., when we need to process tabular data, for example the results of a class, the items on a restaurant's menu, or the reservation chart of a train. A DataFrame is a two-dimensional labelled data structure, like a MySQL table. It contains rows and columns and therefore has both a row index and a column index. Each column can hold a different type of value, such as numeric, string or Boolean, as in the tables of a database.
Creation of DataFrames
There are several methods to create a DataFrame in Pandas, but here we will discuss two common approaches:
Using NumPy ndarrays
Using a list of dictionaries
➔ Dictionary keys become column labels by default in a DataFrame, and the lists become the rows.
➔ NaN (Not a Number) is inserted if a corresponding value for a column is missing.
➔ Pandas provides the isnull() function to identify NaN values in a DataFrame.
Dealing with Rows and Columns
The operations below relate to a DataFrame named Result, whose columns are student names and whose rows are subjects (the handbook shows this DataFrame and the output of each operation as screenshots).
Adding a new column to a DataFrame: we can add a new column, 'Fathima', simply by mentioning the column name and assigning its values.
Adding a new row to a DataFrame: we can add a new row using the DataFrame.loc[ ] method, for example to add the marks for the English subject in Result.
Deleting rows and columns from a DataFrame: during data analysis, the DataFrame.drop() method is used to remove rows and columns. We need to specify the labels to be dropped and the axis from which they are to be dropped: to delete a row, the parameter axis is assigned the value 0, and to delete a column, axis is assigned the value 1. For example, we can delete the row "Hindi", or delete the columns labelled 'Rajat', 'Meenakshi' and 'Karthika'.
Accessing DataFrame Elements
Data elements in a DataFrame can be accessed in different ways. Two common ways are loc and iloc: DataFrame.loc[ ] uses label names for accessing elements, and DataFrame.iloc[ ] uses index positions. (A sketch of these Series and DataFrame operations is given below.)
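A sketch of the Series and DataFrame operations described above. The handbook's Result table is shown only as an image, so the student names below come from the operations it describes while the marks themselves are illustrative assumptions.

import pandas as pd

# A Series: one-dimensional labelled data, like a single spreadsheet column
marks = pd.Series([40, 35, 42], index=["Maths", "Science", "Hindi"])
print(marks)

# Creating a DataFrame from a list of dictionaries
# (dictionary keys become column labels; a missing value becomes NaN)
listDict = [{"Rajat": 90, "Meenakshi": 92}, {"Rajat": 85, "Meenakshi": 81, "Karthika": 95}]
print(pd.DataFrame(listDict))

# An illustrative 'Result' DataFrame: columns are students, rows are subjects
Result = pd.DataFrame({"Rajat": [90, 85, 88],
                       "Meenakshi": [92, 81, 79],
                       "Karthika": [89, 95, 91]},
                      index=["Maths", "Science", "Hindi"])

Result["Fathima"] = [93, 84, 90]            # adding a new column
Result.loc["English"] = [80, 82, 84, 86]    # adding a new row with DataFrame.loc[]

Result = Result.drop("Hindi", axis=0)                              # deleting a row (axis=0)
Result = Result.drop(["Rajat", "Meenakshi", "Karthika"], axis=1)   # deleting columns (axis=1)

print(Result.loc["Maths", "Fathima"])    # label-based access  -> 93
print(Result.iloc[0, 0])                 # position-based access -> 93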
Understanding Missing Values
Missing data (or "not available" data) can occur when no information is provided for one or more items or for a whole unit. During data analysis, it is common for an object to have some missing attributes: if data is not collected properly, it results in missing data. In a DataFrame, a missing value is stored as NaN (Not a Number). For example, while collecting data, some people may not fill in all the fields of a survey, and sometimes some attributes are simply not relevant to everyone. Pandas provides the isnull() function to check whether any value is missing in a DataFrame; it checks every attribute and returns True where the value is missing and False otherwise.
We can now explore different operations related to missing values based on a DataFrame named StudCCA (created from a list of dictionaries), whose columns are Dance, Music and Painting and whose rows are the classes X, XI and XII. StudCCA.isnull() returns:
       Dance   Music   Painting
X      True    True    False
XI     True    True    True
XII    False   False   True
Finding any missing value in a column ➔ StudCCA["Music"].isnull().any() → True
Finding the number of NaN values in each column ➔ StudCCA.isnull().sum() (use StudCCA.isnull().sum().sum() for the total count)
Deleting every row that contains a NaN value ➔ StudCCA.dropna()
Replacing NaN values (here by 1) ➔ StudCCA.fillna(1)
Attributes of DataFrames
Attributes are the properties of a DataFrame that can be used to fetch data or any information related to a particular DataFrame. The syntax for writing an attribute is: DataFrame_name.attribute
Let us understand the attributes of DataFrames with the help of a DataFrame named Teacher.
Displaying row indexes – Teacher.index
Displaying column indexes – Teacher.columns
Displaying the data type of each column – Teacher.dtypes
Displaying the data as a NumPy array – Teacher.values
Displaying the total number of rows and columns, as (rows, columns) – Teacher.shape
Displaying the first n rows (here n = 2) – Teacher.head(2)
Displaying the last n rows (here n = 2) – Teacher.tail(2)
Importing and Exporting Data between CSV Files and DataFrames
We can create a DataFrame by importing data from a CSV file. Similarly, we can store or export the data in a DataFrame as a .csv file.
Importing a CSV file into a DataFrame: using the read_csv() function, you can import tabular data from a CSV file into a Pandas DataFrame by specifying the file name as a parameter.
Syntax: pd.read_csv("filename.csv")
Example: reading the file students.csv. read_csv() is used to read the CSV file from its correct path. The parameter sep specifies whether the values are separated by a comma, semicolon, tab, or any other character; the default value of sep is a comma. The parameter header marks the start of the data to be fetched: header=0 implies that the column names are inferred from the first line of the file, and by default header=0.
Exporting a DataFrame to a CSV file: we can use the to_csv() function to save a DataFrame to a text or CSV file. For example, to save the DataFrame Teacher into the CSV file resultout, we write Teacher.to_csv(path_or_buf='C:/PANDAS/resultout.csv', sep=','). When we open this file in a text editor or a spreadsheet, we will find the data along with the row labels and the column headers, separated by commas. (A sketch of these missing-value checks, attributes and CSV operations is given below.)
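A sketch tying together the missing-value checks, DataFrame attributes and CSV import/export described above. The StudCCA values and the Teacher table are illustrative (the handbook shows its own tables only as images), and the file names are placeholders; the StudCCA data is chosen so that isnull() reproduces the True/False table shown earlier.

import pandas as pd
import numpy as np

# StudCCA: co-curricular scores for classes X-XII, with some missing entries
StudCCA = pd.DataFrame({"Dance": [np.nan, np.nan, 20],
                        "Music": [np.nan, np.nan, 18],
                        "Painting": [15, np.nan, np.nan]},
                       index=["X", "XI", "XII"])

print(StudCCA.isnull())                  # True wherever a value is missing
print(StudCCA["Music"].isnull().any())   # is any value missing in this column?
print(StudCCA.isnull().sum())            # number of missing values in each column
print(StudCCA.dropna())                  # drop every row that contains NaN
print(StudCCA.fillna(1))                 # replace NaN values with 1

# DataFrame attributes, shown on an illustrative Teacher DataFrame
Teacher = pd.DataFrame({"Name": ["Anita", "Ravi"], "Subject": ["AI", "Maths"]})
print(Teacher.index, Teacher.columns, Teacher.dtypes, sep="\n")
print(Teacher.shape)      # (rows, columns)
print(Teacher.head(1))    # first n rows
print(Teacher.tail(1))    # last n rows

# Importing from and exporting to CSV (file names/paths are placeholders)
# students = pd.read_csv("students.csv", sep=",", header=0)
# Teacher.to_csv(path_or_buf="resultout.csv", sep=",")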
Scikit-learn (Sklearn)
Note for Teachers: This topic can be taught after teaching the Machine Learning unit.
Scikit-learn (Sklearn) is the most useful and robust library for machine learning in Python. It provides a selection of efficient tools for machine learning and statistical modeling via a consistent interface in Python. Sklearn is built on (and relies heavily on) NumPy, SciPy and Matplotlib.
Key features:
Offers a wide range of supervised and unsupervised learning algorithms.
Provides tools for model selection, evaluation, and validation.
Supports various tasks such as classification, regression, clustering, dimensionality reduction, and more.
Integrates seamlessly with other Python libraries like NumPy, SciPy, and Pandas.
Install scikit-learn using the statement: pip install scikit-learn
load_iris (in sklearn.datasets)
The Iris dataset is a classic and widely used dataset in machine learning, particularly for classification tasks. It comprises measurements of various characteristics of iris flowers, such as sepal length, sepal width, petal length, and petal width, along with the corresponding species of iris to which they belong. The dataset includes three species: setosa, versicolor, and virginica.
from sklearn.datasets import load_iris – imports the iris dataset loader
iris = load_iris() – calls the load_iris() function to load the iris dataset
X = iris.data – X is a variable assigned the feature vectors; the feature vectors contain the input data for the machine learning model
y = iris.target – y is a variable assigned the target variable; the target variable contains the output, i.e. the variable we want to predict with the model
Sample output – first rows of X: each row represents a sample (an iris flower), and each column represents a feature (a measurement of the flower). For example, the first row [5.1 3.5 1.4 0.2] corresponds to an iris flower with the following measurements:
Sepal length: 5.1 cm
Sepal width: 3.5 cm
Petal length: 1.4 cm
Petal width: 0.2 cm
train_test_split (in sklearn.model_selection)
Datasets are usually split into a training set and a testing set. The training set is used to train the model and the testing set is used to test the model. The most common splitting ratio is 80:20 (training 80%, testing 20%).
from sklearn.model_selection import train_test_split – imports the train_test_split function
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=1)
X_train, y_train – the feature vectors and target values of the training set respectively
X_test, y_test – the feature vectors and target values of the testing set respectively
test_size=0.2 – specifies that 20% of the data will be used for testing, and the remaining 80% will be used for training
random_state=1 – ensures reproducibility by fixing the random seed, so that the same split is produced every time the code is run
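Putting the pieces above together, here is a minimal runnable sketch; the print statements are added for illustration and are not part of the handbook's screenshots.

from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split

iris = load_iris()       # load the iris dataset
X = iris.data            # feature vectors: 150 rows x 4 measurements
y = iris.target          # target variable: species labels 0, 1, 2

print(X[:10])            # first 10 rows of X

# 80:20 split into training and testing sets
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=1)
print(X_train.shape, X_test.shape)    # expected: (120, 4) (30, 4)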
