IT Era: Lesson 1 - Computers and Programming

Document Details

Uploaded by FreedAgate3463

University of the East

Tags

information technology computer history programming languages IT era

Summary

This document provides a comprehensive overview of the evolution of information and communication technology (ICT), from early counting tools to modern computers and programming languages. It outlines the historical progression of these technologies and their impact on modern society. The lesson covers eras of technological change, emphasizing pivotal moments like the creation of crucial inventions such as the Abacus and the Analytical Engine.

Full Transcript

REVIEWER FOR I.T. ERA

What is Information and Communications Technology?
The integration and use of technology to manage and facilitate the exchange of information and communication.

Information – processed data or knowledge that is communicated or stored in various forms, such as text, images, audio, or video.
Communication – the transmission and sharing of information between individuals or systems.
Technology – the tools, devices, and systems that support the creation, transmission, and management of information and communication, such as computers, mobile phones, software, and networks.

Applications of ICT
Business – managing operations, customer interactions, and e-commerce.
Healthcare – telemedicine and electronic health records.
Government – e-governance services and public information systems.
Everyday Life – social media, online banking, and smart home systems.

Impact of ICT on Society
Positive impacts of ICT: access to information, improved education, new tools and new opportunities, communication, information management, security, distance learning.
Negative impacts of ICT: job loss, reduced physical activity, reduced personal interaction, cost, competition.

History of Computers

Early Counting Tools (pre-1600s)
Tally Sticks (c. 35,000 BCE) – used for recording and counting numbers; one of the earliest mathematical tools.
Abacus (c. 3000 BCE) – a counting device used in Babylon, China, and Rome to perform arithmetic operations.
Napier's Bones (1617) – a manually operated tool invented by John Napier to simplify multiplication and division.
Slide Rule (1622) – invented by William Oughtred; this logarithmic device was widely used for mathematical calculations before digital computers.

Mechanical Computing Devices (1600s-1800s)
Pascaline (1642) – designed by Blaise Pascal; one of the first mechanical calculators, capable of performing addition and subtraction.
Stepped Reckoner (1673) – created by Gottfried Wilhelm Leibniz; capable of performing all four arithmetic operations.
Arithmometer (1820) – invented by Thomas de Colmar; the first commercially successful mechanical calculator.
Difference Engine (1822) – conceptualized by Charles Babbage to automate polynomial calculations.
Analytical Engine (1837) – designed by Charles Babbage; considered the first general-purpose mechanical computer.
Scheutzian Calculation Engine (1853) – created by Per Georg Scheutz; an improved version of the Difference Engine.
Tabulating Machine (1890) – developed by Herman Hollerith; used punched cards to process census data, influencing early data processing.

Electromechanical and Early Electronic Computers (1930s-1940s)
Zuse Z1 (1936-1938) – created by Konrad Zuse; the first programmable mechanical computer, using binary arithmetic.
Atanasoff-Berry Computer (ABC) (1937-1942) – built by John Atanasoff and Clifford Berry; regarded as the first electronic digital computer.
Colossus (1943) – used by British cryptographers to break German codes during WWII.
Harvard Mark I (1944) – an electromechanical computer designed by Howard Aiken and IBM, used for military calculations.
ENIAC – Electronic Numerical Integrator and Computer (1946) – the first fully electronic general-purpose computer, used for military ballistic calculations.

First Generation Computers (1940s-1950s)
EDVAC – Electronic Discrete Variable Automatic Computer (1949) – designed with John von Neumann's involvement; introduced the stored-program concept, a fundamental advancement in computing.
UNIVAC I – Universal Automatic Computer (1951) – the first commercially produced computer, used for business and government applications.

Second Generation Computers (1950s-1960s)
IBM 650 (1953) – one of the first mass-produced computers, widely used in businesses and universities.
IBM 1401 (1959) – a successful commercial business computer, contributing to the rise of data processing.
Third Generation Computers (1960s-1970s)
IBM System/360 (1964) – introduced by IBM; one of the first computer families to use integrated circuits.
DEC PDP-8 (1965) – a minicomputer that made computing more affordable for small businesses and universities.

Fourth Generation Computers (1970s-1980s)
Intel 4004 (1971) – the first commercially available microprocessor, revolutionizing computing by integrating a CPU on a single chip.
Altair 8800 (1975) – considered the first personal computer; it inspired hobbyist programmers, including Bill Gates and Paul Allen.
Apple I (1976) – designed by Steve Wozniak and Steve Jobs; one of the first personal computers with built-in video output.
Apple II (1977) – a widely successful personal computer that helped establish Apple as a major technology company.
IBM PC (1981) – introduced by IBM; became the standard for personal computing.
Osborne 1 (1981) – the first commercially available portable computer, introduced by Osborne Computer Corporation.

Fifth Generation Computers (1990s-Present)
Advancements in artificial intelligence (AI), cloud computing, and quantum computing define this era.
World Wide Web (1991) – created by Tim Berners-Lee; transformed computing by connecting computers globally.
IBM Deep Blue (1997) – the first computer to defeat a reigning world chess champion, showcasing early AI capabilities.
Google Search (1998) – revolutionized information retrieval and data processing.
Smartphones and Mobile Computing (2000s-Present) – devices like the iPhone (2007) transformed computing into a mobile-first experience.
Quantum Computing (2010s-Present) – companies like IBM, Google, and D-Wave are developing quantum processors to solve certain complex problems faster than classical computers.

History of Programming

1840s
First Program (1843) – Ada Lovelace wrote an algorithm for Charles Babbage's Analytical Engine to calculate Bernoulli numbers, often regarded as the first computer program.

1940s
Assembly Language (1949) – a more human-readable form of machine code, introduced to make programming somewhat easier.
EDSAC (1949) – the Electronic Delay Storage Automatic Calculator, one of the first stored-program computers, built at the University of Cambridge, England, under the leadership of Maurice Wilkes.

1950s
Fortran (1957) – developed by IBM, Fortran (short for Formula Translation) was the first high-level programming language. It aimed to make scientific and engineering calculations easier using English-like commands.
LISP (1958) – LISP (List Processing) was designed at MIT for symbolic reasoning and artificial intelligence. It introduced the concept of symbolic expressions and played a significant role in the development of programming languages.
COBOL (1959) – COBOL (Common Business-Oriented Language) was created for business data processing. It aimed to be easily readable and understood by non-programmers.

1960s
ALGOL (1958-1960) – ALGOL (Algorithmic Language) was a family of programming languages that influenced many others, introducing block structures and lexical scoping. The most influential programming language of its time, it is the basis for Pascal, C, and C++.
BASIC (1964) – Beginners' All-purpose Symbolic Instruction Code was designed to make programming accessible to non-experts and became popular on early personal computers. BASIC was the basis for Microsoft BASIC.

1970s
Pascal (1970) – created by Niklaus Wirth, Pascal aimed to encourage good programming practices. It was widely used in education and for software development, including early Apple development.
C (1972) – developed by Dennis Ritchie at Bell Labs, C was influential in the development of Unix and later served as the basis for languages like C++, C#, and Objective-C.
SQL – Structured Query Language is a programming language that manages data in relational databases.
It was developed by IBM in the 1970s.

1980s
C++ (1983) – an extension of C, C++ introduced object-oriented programming (OOP) features and became popular for systems programming and application development. It is used in applications such as Adobe Photoshop.
MATLAB (1984) – a proprietary multi-paradigm programming language and numeric computing environment developed by MathWorks.
Perl (1987) – a high-level, interpreted programming language developed by Larry Wall. Originally designed for text processing and report generation, Perl has since evolved into a powerful language used in system administration, web development, network programming, and bioinformatics.

1990s
Python (1991) – created by Guido van Rossum, Python focused on readability and simplicity, often requiring fewer lines of code than other languages. It became widely used for web development, data science, and artificial intelligence. Its name comes from Monty Python, not the snake.
Java (1995) – developed by Sun Microsystems, Java was designed to be platform-independent: its "write once, run anywhere" philosophy lets programs run on any Java Virtual Machine, and billions of devices run Java. This made it popular for web applications.
JavaScript (1995) – initially created for web browsers (reportedly in about ten days), JavaScript has become a versatile scripting language, enabling dynamic content and interactivity on websites.
Ruby (1995) – created by Yukihiro Matsumoto, Ruby is known for its elegant syntax and focus on developer happiness. It gained popularity especially with the Ruby on Rails web framework.

2000s
C# (2000) – developed by Microsoft, C# is a versatile, object-oriented language used for building a wide range of applications on the .NET platform, supporting both desktop and web development.
Go (2007, released 2009) – developed by Google, Go is a statically typed, compiled language designed for simplicity, efficiency, and scalability, particularly for building concurrent systems and cloud-based applications.

2010s
Rust (2010) – developed at Mozilla.
A systems programming language designed for safety, performance, and concurrency, Rust prevents memory-related issues such as null pointer dereferencing and buffer overflows. It is well suited to building fast and reliable software, particularly for systems-level programming, WebAssembly, and blockchain technologies.
Dart (2011) – developed by Google; a client-optimized programming language used for building mobile, desktop, and web applications. It is most commonly associated with Flutter, a UI framework for building natively compiled applications. Dart offers fast development cycles with features like hot reload and is designed to be highly performant.
TypeScript (2012) – developed by Microsoft; a superset of JavaScript that adds static typing to the language, making it easier to catch errors early during development. It is often used for building large-scale web applications, offers features such as interfaces and decorators, and is compatible with existing JavaScript code.
Swift (2014) – developed by Apple; a fast, modern programming language primarily used for developing iOS, macOS, watchOS, and tvOS applications. It is designed to be easy to use, safe, and efficient, providing a powerful alternative to Objective-C. Swift supports object-oriented and functional programming paradigms.
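The reviewer states that SQL manages data in relational databases. A minimal sketch of what that means, using Python's built-in sqlite3 module (the table name, columns, and sample rows are illustrative, not from the lesson):

```python
import sqlite3

# An in-memory relational database; nothing is written to disk.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# SQL statements define a table, insert rows, and query them.
cur.execute("CREATE TABLE students (id INTEGER PRIMARY KEY, name TEXT, course TEXT)")
cur.executemany(
    "INSERT INTO students (name, course) VALUES (?, ?)",
    [("Ana", "IT Era"), ("Ben", "IT Era"), ("Cara", "Calculus")],
)

# SELECT retrieves only the rows matching the WHERE condition.
cur.execute("SELECT name FROM students WHERE course = ? ORDER BY name", ("IT Era",))
names = [row[0] for row in cur.fetchall()]
print(names)  # ['Ana', 'Ben']
conn.close()
```

The same CREATE/INSERT/SELECT statements work, with minor dialect differences, in database systems such as MySQL, PostgreSQL, and IBM Db2.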
