GE IT Chapter 1: Introduction to ICT
Nueva Vizcaya State University
Summary
This document is the first chapter of a course on information technology. It covers the use of ICT in daily life, its impact on society, and its applications in various fields such as education and employment. The chapter also discusses the history of computers and key technology trends. It is part of an instructional module for students.
Full Transcript
**COLLEGE OF ARTS AND SCIENCES** **Bayombong Campus**
**DEGREE PROGRAM** BSBA/BSCE **COURSE NO.** GE IT **SPECIALIZATION**
**COURSE TITLE** Living in the IT ERA **YEAR LEVEL** 2nd
**TIME FRAME** 6 hrs. **WK NO.** 1-2 **IM NO.** 01

I. **LESSON TITLE** - Chapter I - INTRODUCTION TO ICT
A. Uses of ICT in Our Daily Lives
B. Impact of ICT on Society
C. Applications of ICT (Computers) in Our Daily Lives
D. History of Computer: Basic Computing Periods
E. Key Technology Trends

II. **LESSON OVERVIEW**
The quickening pace of evolution in technology is very evident in this era; it seems to be progressing faster than ever. From year to year, the evolution of technology is one of staggering promise and opportunity, as well as uncertainty. Technology has been around for a long time, and as long as there are people, information technology will be there too, because there have always been ways of communicating through the technology available at a given point in time. The future may be unknown, but digital advancement continues to reshape our world in ways that encourage people to form new habits, find new ways to work together, and become better human beings. In most cases, these changes translate into a range of opportunities and disruptions across every industry. Humans have always been quick to adapt technologies for better and faster communication.

III. **DESIRED LEARNING OUTCOMES**
At the end of the lesson, the student should be able to:
1. illustrate the importance of ICT and its impact on our daily life
2. recall the people, companies and personalities behind computer technology development
3. describe the antiquity and development of computer technology

IV. **LESSON CONTENT**

A. **[Uses of ICT in Our Daily Lives]**

[Communication]
We all know that ICT plays a major role in the way we communicate. In the past, our parents used to write letters and send them via post mail, but now, with the help of ICT, it is much easier to communicate with our loved ones. We can use cellular phones designed to let us talk with other people even when they are miles away. Nowadays people keep in touch with the help of ICT: through chat, e-mail, voice mail and social networking, people communicate with each other, and it is the cheapest means of communication. ICT allows students to monitor and manage their own learning, think critically and creatively, solve simulated real-world problems, work collaboratively, engage in ethical decision-making, and adopt a global perspective towards issues and ideas. It also provides students from remote areas access to expert teachers and learning resources, and gives administrators and policy makers the data and expertise they need to work more efficiently.

[Job Opportunities]
In the employment sector, ICT enables organizations to operate more efficiently, so employing staff with ICT skills is vital to the smooth running of any business. Being able to use ICT systems effectively allows employees more time to concentrate on areas of their job role that require soft skills. For example, many pharmacies use robot technology to assist with picking prescribed drugs. This allows highly trained pharmaceutical staff to focus on jobs requiring human intelligence and interaction, such as dispensing and checking medication. Nowadays, employers expect their staff to have basic ICT skills. This expectation even applies to job roles where ICT skills may not have been an essential requirement in the past.
Nowadays, finding a job is different: you can simply use your smartphone, laptop, desktop or any other gadget available in the comfort of your home.

[Education]
Information and Communications Technology (ICT) can impact student learning when teachers are digitally literate and understand how to integrate it into the curriculum. Schools use a diverse set of ICT tools to communicate, create, disseminate, store, and manage information. In some contexts, ICT has also become integral to the teaching-learning interaction, through such approaches as replacing chalkboards with interactive digital whiteboards, using students' own smartphones or other devices for learning during class time, and the "flipped classroom" model where students watch lectures at home on the computer and use classroom time for more interactive exercises. When teachers are digitally literate and trained to use ICT, these approaches can lead to higher order thinking skills, provide creative and individualized options for students to express their understanding, and leave students better prepared to deal with ongoing technological change in society and the workplace.

[Socializing]
Social media has changed the world. The rapid and vast adoption of these technologies is changing how we find partners, how we access information from the news, and how we organize to demand political change. The internet and social media provide young people with a range of benefits, and opportunities to empower themselves in a variety of ways. Young people can maintain social connections and support networks that otherwise wouldn't be possible and can access more information than ever before. The communities and social interactions young people form online can be invaluable for bolstering and developing young people's self-confidence and social skills. As ICT has become ubiquitous, faster and increasingly accessible to non-technical communities, social networking and collaborative services have grown rapidly, enabling people to communicate and share interests in many more ways. Sites like Facebook, Twitter, LinkedIn, YouTube, Flickr, Second Life, Delicious, blogs, wikis and many more let people of all ages rapidly share their interests of the moment with others everywhere. But Facebook seems to be the leading site where people communicate and share their opinions. What a change! "Nothing is permanent, but change" (as Heraclitus said in the 5th century BC). The internet can be seen as the international network of interconnected computer networks; its main purposes are the quest for information (i.e. browsing), electronic mail, newsgroups, file transfer, and access to and use of other computers. Socialization can be seen as a process by which a child adopts the behavior needed to be an effective member of society, which can only be achieved through learning or education.

B. **[Impact of ICT on Society]**

Security: ICT solves or reduces some security problems; for example, encryption methods can keep data safe from unauthorized people, both while it is being stored and while it is being sent electronically.
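To make the encryption point concrete, here is a minimal Python sketch (illustrative only; it assumes the third-party cryptography package is installed) showing a message being encrypted before it is stored or sent, and decrypted only by someone who holds the key.

```python
# Minimal illustration of symmetric encryption (assumes: pip install cryptography)
from cryptography.fernet import Fernet

key = Fernet.generate_key()        # secret key; only authorized people should hold this
cipher = Fernet(key)

message = b"Student grades: confidential"
token = cipher.encrypt(message)    # safe to store or send electronically
print(token)                       # unreadable without the key

original = cipher.decrypt(token)   # only possible with the correct key
print(original.decode())           # -> Student grades: confidential
```

Anyone who intercepts the token without the key sees only scrambled bytes, which is the sense in which encryption protects both stored and transmitted data.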
C. **[Applications of ICT (Computers) in Our Daily Lives]**
1. Business
2. Education
3. Healthcare
4. Retail and Trade
5. Government
6. Marketing
7. Science
8. Publishing
9. Arts and Entertainment
10. Communication
11. Banking and Finance
12. Transport
13. Navigation
14. Working from Home
15. Military
16. Social and Romance
17. Booking Vacations
18. Security and Surveillance
19. Weather Forecasting
20. Robotics

D. **[History of Computer: Basic Computing Periods]**

Earliest computers: originally, calculations were computed by humans, whose job title was "computer."
- The first use of the word "computer" was recorded in 1613, referring to a person who carried out calculations, or computations, and the word continued to be used in that sense until the middle of the 20th century.

a. Tally Sticks
- A tally stick was an ancient memory aid device used to record and document numbers, quantities, or even messages.
Figure 1.1 Tally Sticks

b. Abacus
- An abacus is a mechanical device used to aid an individual in performing mathematical calculations.
- The abacus was invented in Babylonia in 2400 B.C.
- The abacus in the form we are most familiar with was first used in China in around 500 B.C.
- It was used to perform basic arithmetic operations.
Figure 1.2 Abacus

c. Napier's Bones
- Invented by John Napier in 1614.
- Allowed the operator to multiply, divide and calculate square and cube roots by moving the rods around and placing them in specially constructed boards.
Figure 1.3 Napier's Bones

d. Slide Rule
- Invented by William Oughtred in 1622.
- Based on Napier's ideas about logarithms.
- Used primarily for multiplication, division, roots, logarithms and trigonometry.
- Not normally used for addition or subtraction.
Figure 1.4 Slide Rule

e. Pascaline
- Invented by Blaise Pascal in 1642.
- It was limited to addition and subtraction.
- It was too expensive.
Figure 1.5 Pascaline

f. Stepped Reckoner
- Invented by Gottfried Wilhelm Leibniz in 1672.
- A machine that could add, subtract, multiply and divide automatically.
Figure 1.6 Stepped Reckoner

g. Jacquard Loom
- A mechanical loom invented by Joseph-Marie Jacquard in 1801.
- An automatic loom controlled by punched cards.
Figure 1.7 Jacquard Loom

h. Arithmometer
- A mechanical calculator invented by Thomas de Colmar in 1820.
- The first reliable, useful and commercially successful calculating machine.
- The machine could perform the four basic mathematical functions.
- The first mass-produced calculating machine.
Figure 1.8 Arithmometer

i. Difference Engine and Analytical Engine
- Automatic, mechanical calculators designed to tabulate polynomial functions.
- Invented by Charles Babbage in 1822 and 1834, respectively.
- Regarded as the first mechanical computers.
Figure 1.9 Difference Engine & Analytical Engine

j. First Computer Programmer
- In 1840, Augusta Ada Byron suggested to Babbage that he use the binary system.
- She wrote programs for the Analytical Engine.
Figure 1.10 Augusta Ada Byron

k. Scheutzian Calculation Engine
- Invented by Per Georg Scheutz in 1843.
- Based on Charles Babbage's Difference Engine.
- The first printing calculator.
Figure 1.11 Scheutzian Calculation Engine

l. Tabulating Machine
- Invented by Herman Hollerith in 1890.
- Used to assist in summarizing information and accounting.
Figure 1.12 Tabulating Machine

m. Harvard Mark 1
- Also known as the IBM Automatic Sequence Controlled Calculator (ASCC).
- Invented by Howard H. Aiken in 1943.
- The first electro-mechanical computer.
Figure 1.13 Harvard Mark 1

n. Z1
- The first programmable computer.
- Created by Konrad Zuse in Germany from 1936 to 1938.
- Programming the Z1 required the user to insert punched tape into a punched tape reader, and all output was also generated on punched tape.
Figure 1.14 Z1
o. Atanasoff-Berry Computer (ABC)
- The first electronic digital computing device.
- Invented by Professor John Atanasoff and graduate student Clifford Berry at Iowa State University between 1939 and 1942.
Figure 1.15 Atanasoff-Berry Computer (ABC)

p. ENIAC
- ENIAC stands for Electronic Numerical Integrator and Computer.
- It was the first electronic general-purpose computer, completed in 1946.
- Developed by John Presper Eckert and John Mauchly.
Figure 1.16 ENIAC

q. UNIVAC 1
- The UNIVAC I (UNIVersal Automatic Computer 1) was the first commercial computer.
- Designed by John Presper Eckert and John Mauchly.
Figure 1.17 UNIVAC 1

r. EDVAC
- EDVAC stands for Electronic Discrete Variable Automatic Computer.
- The first stored-program computer, designed by von Neumann in 1952.
- It had a memory to hold both a stored program and data.
Figure 1.18 EDVAC

s. The First Portable Computer
- The Osborne 1 was the first portable computer.
- Released in 1981 by the Osborne Computer Corporation.
Figure 1.19 The First Portable Computer

t. The First Computer Company
- The first computer company was the Electronic Controls Company.
- Founded in 1949 by John Presper Eckert and John Mauchly.

**Basic Computing Periods - Ages**

a. Premechanical
Figure 2.1 Petroglyph

b. Mechanical
Figure 2.2 Difference Engine

c. Electromechanical
Figure 2.3 Harvard Mark 1

d. Electronic
Figure 2.4 Apple 2

**[Generations of Computer]**

***First Generation: Vacuum Tubes***
----------------------------------------------------------

The first computers used vacuum tubes for circuitry and [magnetic drums](http://www.webopedia.com/TERM/M/magnetic_drum.html) for [memory](http://www.webopedia.com/TERM/M/memory.html), and were often enormous, taking up entire rooms. These computers were very expensive to operate, and in addition to using a great deal of electricity, they generated a lot of heat, which was often the cause of malfunctions. First generation computers relied on [machine language](http://www.webopedia.com/TERM/M/machine_language.html), the lowest-level programming language understood by computers, to perform operations, and they could only solve one problem at a time. It would take operators days or even weeks to set up a new problem. Input was based on punched cards and paper tape, and output was displayed on printouts. The UNIVAC and [ENIAC](http://www.webopedia.com/TERM/E/ENIAC.html) computers are examples of first-generation computing devices. The UNIVAC was the first commercial computer delivered to a business client, the U.S. Census Bureau, in 1951.

***Second Generation: Transistors***
----------------------------------------------------------

The world would see transistors replace vacuum tubes in the second generation of computers. The transistor was invented at Bell Labs in 1947 but did not see widespread use in computers until the late 1950s. The transistor was far superior to the vacuum tube, allowing computers to become smaller, faster, cheaper, more energy-efficient and more reliable than their first-generation predecessors. Though the transistor still generated a great deal of heat that subjected the computer to damage, it was a vast improvement over the vacuum tube. Second-generation computers still relied on punched cards for input and printouts for output.

### *From Binary to Assembly*

Second-generation computers moved from cryptic binary machine language to symbolic, or assembly, languages, which allowed programmers to specify instructions in words. High-level programming languages were also being developed at this time, such as early versions of COBOL and FORTRAN.
These were also the first computers that stored their instructions in their memory, which moved from a magnetic drum to magnetic core technology. The first computers of this generation were developed for the atomic energy industry. Examples: UNIVAC III, RCA 501, Philco Transact S-2000, NCR 300 series, IBM 7030.
An early transistor

*Third Generation: Integrated Circuits **(1964-1971)***
-------------------------------------------------------

The development of the integrated circuit was the hallmark of the third generation of computers. Transistors were miniaturized and placed on silicon chips, called semiconductors, which drastically increased the speed and efficiency of computers. Instead of punched cards and printouts, users interacted with third generation computers through keyboards and monitors and interfaced with an operating system, which allowed the device to run many different applications at one time with a central program that monitored the memory. Computers for the first time became accessible to a mass audience because they were smaller and cheaper than their predecessors.

*Fourth Generation: Microprocessors **(1971-Present)***
--------------------------------------------------------

The microprocessor brought the fourth generation of computers, as thousands of integrated circuits were built onto a single silicon chip. What in the first generation filled an entire room could now fit in the palm of the hand. The Intel 4004 chip, developed in 1971, located all the components of the computer (from the central processing unit and memory to input/output controls) on a single chip. In 1981 IBM introduced its first computer for the home user, and in 1984 Apple introduced the Macintosh. Microprocessors also moved out of the realm of desktop computers and into many areas of life as more and more everyday products began to use microprocessors. As these small computers became more powerful, they could be linked together to form networks, which eventually led to the development of the Internet. Fourth generation computers also saw the development of GUIs, the mouse and handheld devices.

***Fifth Generation: Artificial Intelligence (Present and Beyond)***
--------------------------------------------------------------------

Fifth generation computing devices, based on artificial intelligence, are still in development, though there are some applications, such as voice recognition, that are being used today. The use of parallel processing and superconductors is helping to make artificial intelligence a reality. Quantum computation and molecular and nanotechnology will radically change the face of computers in years to come. The goal of fifth-generation computing is to develop devices that respond to natural language input and are capable of learning and self-organization.

**HISTORY OF WEB**

The web is a wonderful place. It connects people from across the globe, keeps us updated with our friends and family, and creates revolutions never before seen in our lifetime. It has certainly come a long way since its humble beginnings back in the early 1980's. The World Wide Web is built on three core technologies:
- HTML, the language web pages are written in
- HTTP, the protocol used to transfer them
- URLs, web addresses of the form protocol://host/path

***In the Beginning...***

In 1946, Murray Leinster wrote a short story describing how computers (which he referred to as 'Logics') lived in every home, with each one having access to a central device where they could retrieve information.
Although the story does have several differences to the way the web works today, it does capture the idea of a huge information network available to everyone in their homes. More than three decades later, in 1980, an English chap by the name of Tim Berners-Lee worked on a project known as 'Enquire'. **Enquire** was a simple database of the people and software who were working at the same place as Berners-Lee. It was during this project that he experimented with hypertext. Hypertext is text that can be displayed on devices which utilize hyperlinks. The Berners-Lee Enquire system used hyperlinks on each page of the database, each page referencing other relevant pages within the system.

Berners-Lee was a physicist and, needing to share information with other physicists around the world, found that there was no quick and easy solution for doing so. With this in mind, in 1989 he set about putting a proposal together for a centralized database which contained links to other documents. This would have been the perfect solution for Tim and his colleagues, but it turned out nobody was interested in it and nobody took any notice - except for one person. Tim's boss liked his idea and encouraged him to implement it in their next project. This new system was given a few different names, such as TIM (The Information Mine), which was turned down because the acronym spelled out Tim's own name. After a few suggestions, there was only one name that stuck: the World Wide Web.

**The First Browsers**

By December 1990, Tim had joined forces with another physicist, Robert Cailliau, who rewrote Tim's original proposal. It was their vision to combine hypertext with the internet to create web pages, but no one at that time could appreciate how successful this idea could be. Despite little interest, Berners-Lee continued to develop three major components for the web: HTTP, HTML and the world's first web browser. Funnily enough, this browser was also called "the World Wide Web" and it also doubled as an editor.

On June 8th, 1991, the World Wide Web project was announced to the world, where the man himself described it:

*the WWW project was started to allow high energy physicists to share data, news, and documentation. We are very interested in spreading the web to other areas, and having gateway servers for other data.*

A screenshot of the world's first web browser

On August 6, 1991 the world's first web page was launched. Boring, perhaps, but this is the world's first website. The page outlined the plans for the World Wide Web. It was also this year that HTML was born and the first publicly available description of HTML was released. Shortly afterwards, other browsers were released, each bringing differences and improvements.

**Hello JavaScript and CSS**

When the World Wide Web first started, web pages were simply text documents. Nowadays web pages are much more than documents; they now have the ability to be full-blown applications. Part of this ability is due to the additions of JavaScript and CSS. It was on August 19th, 1995 that JavaScript was first announced, originally code-named 'Mocha'. This addition to the World Wide Web was not fully supported by some people, including Robert Cailliau, who had worked with Berners-Lee on the WWW project. This addition to the browser came along with Netscape Navigator 2 and was developed by Brendan Eich.
Despite some people having reservations about it, JavaScript paved the way for the web to become less static and more dynamic and interactive. JavaScript made way for websites to think, but it was CSS that introduced the style and look of the web.

Stylesheets have been around in some way, shape or form since the early 80's. Cascading style sheets were introduced as a way to create a consistent styling approach for the web. They allowed the separation of the document content (HTML) from the styling of it. CSS1 was introduced in 1995 but had trouble being adopted due to the inconsistencies amongst browsers of the time. A year later, in 1996, CSS level 1 became an official recommendation of the W3C. Internet Explorer 5, released in March 2000, was the first browser to support the almost complete CSS1 specification (it covered 99% of it).

**Dot Com Boom**

It was between the years of 1995-2000 that a group of businesses started to change their focus onto the web. Investors started throwing money at anything related to the web; in many cases, if a company was seen to be on the web, then its stock price would shoot up. This was known as the internet boom, which marked the commercial growth of the Internet since the advent of the World Wide Web. However, as more and more money was pumped into these startups, lots of investors overlooked basic business fundamentals and instead placed their confidence in the advancements in technology, in the hope that they would one day see a return on their investments. Unfortunately, this wasn't the case, and the collapse of the dot com bubble between 2000-2001 was inevitable.

**Social Networks**

The original form of anything resembling the social networks that we know today was the bulletin board system (BBS). Social networks became especially popular on the web between the years of 1995-2000, and by the turn of the millennium the race was on to become the world's most popular social network. More importantly, an internet company in the States paved the way for social networks as they are known today. AOL had features that you might see on many modern social networks, such as member profiles and forums where users could chat about any kind of subject that they chose.

It wasn't until around 2002 that the race to become the world's most popular social network truly began. Sites like Friendster, LinkedIn and Myspace popped up. Friendster was arguably one of the most popular of the original sites, boasting three million users just a year after its launch. However, competitors soon overtook Friendster: Myspace launched in 2003 and was soon gaining popularity as the world's most popular social networking site.

If any social networking website has revolutionized the way that we socially interact on the web, that accolade has to go to Facebook. Facebook managed to set itself apart from its competitors by coming up with innovative features and executing smart business decisions. One of those smart business moves, also shared by Twitter, is the offering of an API which allows other developers to extend the sites' functionality and create apps that support the platforms. Decisions like this allowed the social web to become a major milestone in the history of the World Wide Web.

**The Web Goes Mobile**

In 2007 the iPhone was released, which revolutionized the mobile web as we know it today. One of the most recent milestones in the history of the World Wide Web is accessibility via mobile devices.
Until this point, accessing the web had fundamentally been from computers or laptops. Now the number of users accessing the web from mobile devices is growing rapidly and is set to overtake desktop access by 2015. Of course, people have been connecting to the web from mobile devices since the mid 90's, but this was in no way like the access that we are used to now. It was in 2007 that the iPhone first became available, revolutionizing the way that we access the web from our phones and introducing the concept of mobile apps. The World Wide Web was now able to understand whereabouts on the planet we were; it allowed us to upload a photo that we had just taken straight onto our social networking profile. The mobile web has added another layer to the already useful web.

**Brief History of Mobile Computing**

Mobile Computing = Computing + Mobility + Connectivity

Mobile computing is a technology that allows transmission of data, voice and video via a computer or any other wireless-enabled device without having to be connected to a fixed physical link. Mobile computing is an 'umbrella' term used to describe technologies that enable people to access network services anyplace, anywhere, anytime. The idea of mobile computing has only been around since the 1990s. Since then, mobile computing has evolved from two-way radios that used large antennas to communicate simple messages to three-inch personal computers that can do almost everything a regular computer does.

**MILESTONES**

1993 Apple Newton MessagePad - First hand-held computer. The display rotated 90, 180, or 270 degrees, depending on device orientation. The device ran the Newton operating system and used handwriting recognition powered by the Calligrapher handwriting recognition engine.
1996 US Robotics Palm Pilot - The Palm Pilot was called a PDA (Personal Digital Assistant).
1999 BlackBerry - The first BlackBerry device was a two-way pager.
2002 The commonly known convergent BlackBerry supported push email, mobile telephone, text messaging, Internet faxing, Web browsing and other wireless services.
2003 BlackBerry Quark - The first device with integrated email and phone.

iPhone
1980s Objective-C language created by Cox and Love at their software company Stepstone.
2007 First integrated smartphone; iPhone apps written in the Objective-C language.
2014 Swift language released for writing iPhone apps. Swift has been characterized as Objective-C without the C.
2019 Currently there are close to 1 billion iPhone and iPad devices in use.

Google and Android
1998 Google was founded by graduate students at Stanford.
2003 Android was founded by Rubin and Miner in Palo Alto, California.
2005 Google buys Android.
2007 Android operating system released, based on the Linux kernel.
2007 Open Handset Alliance formed - a consortium of 84 companies joined to develop Android as an open and free mobile platform. Members include Dell, Google, Intel, Motorola, Nvidia, Qualcomm, Sony, Sprint and T-Mobile. The member companies agreed to produce compatible devices.
2008 First Android device was the T-Mobile G1, complete with fold-out QWERTY keyboard.
2011 Apple sues Android device manufacturer Samsung for patent infringement. This resulted in a 1-billion-dollar settlement, which was reduced on appeal. The court battle was finally settled in 2018.
2019 Currently there are more than 2 billion Android devices in use, more than 4,000 different devices, and more than 400 different manufacturers.

E. **Key Technology Trends**
**1. Quantum Computing Achieves Real-World Application**

Quantum computing companies received [almost $1.02 billion in VC investment](https://www.defianceetfs.com/new-breakthroughs-in-2021-bring-quantum-closer-to-fulfilling-its-promise-and-hope-in-2022/) last year, a 68% jump relative to 2020. Further, the number of QC deals jumped from 37 in 2020 to a record of 54 in 2021. Here are some of the ways we can expect quantum computing technology to impact our lives over the coming years.

- **Hacker-Proof Encryption**

**2. Cleantech Continues to Grow**

In 2021 an impressive [$27 billion+ was invested into cleantech](https://www.privateequitywire.co.uk/2022/04/22/313961/pe-investment-us-clean-tech-rose-over-27-billion-2021-says-new-report) (sustainable technology). Fast forward one short year and that number skyrocketed to **$60 billion**, achieving the type of growth many industry insiders thought would take 5-10 years. In fact, this tech trend has gained so much momentum that **14% of all VC dollars now flow to cleantech companies.** Here's a small sample of how companies in the industry are creating meaningful change.

- **Eliminating Single-Use Plastics**

According to reports, a full [**50% of plastic products**](https://www.unep.org/interactives/beat-plastic-pollution/) were designed for a "single-use" purpose. With approximately one million water bottles being purchased each minute, and more than five trillion plastic bags distributed each year, the manufacturing of plastic is a hot topic. To combat this, a company known as Footprint is using technology to create bio-based, 100% biodegradable consumer product packaging. With more than 90 scientists and engineers, and 30+ US patents, Footprint uses a cutting-edge blend of fibers known as [**Barrier Technology**](http://www.footprintus.com/science) to mimic the durability, water resistance, and oxygenation-resistance characteristics that have made plastics so popular.

- **The Electrification of Cars**

With electric vehicle sales rapidly [**closing in**](https://electronics360.globalspec.com/article/17631/hybrid-electric-cars-reach-sales-record-in-2021) on the 25-year-old hybrid car market, interest in electric vehicles is at an all-time high. **Tesla** is the brand most consumers associate with EVs. However, a variety of start-ups are innovating in new and exciting ways. As an example, instead of attacking the entire car-building process, the company [**Magna**](https://www.magna.com/products/power-vision/electrified-powertrain-technologies/bev---battery-electric-vehicle-solutions) is innovating around the one problem that plagues EV car manufacturers most: **[powering the drivetrain](https://www.magna.com/products/power-vision/electrified-powertrain-technologies/bev---battery-electric-vehicle-solutions).** In particular, they work with auto manufacturers to [**add electric powertrains**](https://www.ces.tech/Articles/2021/April/Sustainability-Tech-Is-Trending-This-Earth-Day.aspx) to the front and rear axles of their ***existing*** combustion engine vehicles. Further innovating on electric vehicle technology is US-based Sonos. By combining traditional EV batteries with solar panels, Sonos hopes to solve what is arguably the largest problem EV drivers have to deal with: constantly needing to recharge their car's battery.
- **Carbon Capture and Storage**

It's well documented that carbon emissions are one of the [**largest drivers**](https://www.climate.gov/news-features/understanding-climate/climate-change-atmospheric-carbon-dioxide) of modern climate change. According to the IEA, carbon emissions surged by 1.5 billion tons in 2021. Further, while green initiatives - like planting trees and switching to electric vehicles - will reduce carbon emissions over time, [many experts believe](https://mindseteco.co/carbon-capture-companies/) the impact of such initiatives will be too little, too late. On the flip side, climate scientists are using what's known as Carbon Capture technology to make immediate progress toward reducing and even reversing emissions. The process involves working with super emitters - like power plants and concrete manufacturing facilities - to capture carbon molecules when they would normally be released into the air. From there, CC companies isolate and extract the carbon through a [variety of chemical processes](https://mindseteco.co/carbon-capture-companies/) before reselling it or depositing it deep into the earth (where it can be transformed back into stone).

**3. Early Detection Technology Advances**

While it's exciting to talk about cyber warfare and quantum computing, the one technology likely to benefit humanity the most is early disease detection, mainly because the earlier a disease is detected, the more likely it is that the patient survives. As an example, when ovarian cancer is detected but has not spread beyond the ovaries, [over 90% of patients survive five or more years](https://www.canaryfoundation.org/wp-content/uploads/EarlyDetectionFactSheet.pdf). Once it spreads, however, only 28% survive that long. The same goes for colon cancer, where early detection boosts five-year survival rates from 11% to 91%. In fact, according to the Canary Foundation, "five-year survival rates" are substantially higher for all types of cancer when diagnosed early.

Unfortunately, certain types of cancer - like ovarian and kidney - can be [difficult to detect early](https://www.thehealthy.com/cancer/cancer-screening/). Because of that, doctors and scientists are dedicating an increasing amount of resources to cutting-edge technologies designed to detect cancer cells earlier in their development. As an example, a team of researchers from Johns Hopkins released a ground-breaking study (https://www.ncbi.nlm.nih.gov/pmc/articles/PMC6080308/). In it, they explained how a brand-new technology known as **[CancerSEEK](https://www.nature.com/articles/nrclinonc.2018.21)** combined blood tests and MRI imaging technology to detect eight different types of cancer with 70% accuracy. Most recently, biotech company Grail attracted over $2 billion in [funding](https://www.crunchbase.com/organization/grail) for their pan-cancer test Galleri, which has shown the potential to [detect](https://grail.com/press-releases/grail-presents-interventional-pathfinder-study-data-at-2021-asco-annual-meeting-and-introduces-galleri-a-groundbreaking-multi-cancer-early-detection-blood-test/) **50+** types with a high degree of accuracy.

And it's not just cancer. In the United States, Alzheimer's is the sixth-leading [cause of death](https://pubmed.ncbi.nlm.nih.gov/33756057/) nationwide and the fifth for adults aged 65 and older.
Adding urgency to the need for early detection, it's expected that as baby boomers age, Alzheimer's among adults 65 and older [will increase from **5.8 million** cases in 2020 to **7.1 million** by 2025](https://www.usatoday.com/story/money/2020/08/10/alzheimers-disease-states-where-cases-will-rise-most-by-2025/42187567/). Sadly, because Alzheimer's is a neurodegenerative disease, early "symptoms" involve minuscule changes in the brain that can be difficult to detect. To address this, a team of researchers from the U.S. Department of Veterans Affairs is now [combining fiber-optics and spectroscopy](https://blogs.va.gov/VAntage/92688/new-scanning-technology-could-help-diagnose-alzheimers-disease-using-light/#:~:text=Alzheimer's%20is%20hard%20to%20definitively,analyzing%20brain%20tissue%20after%20death.) to examine how light passes through brain matter. In doing so, they hope to identify these minute brain alterations in a way that is more affordable, accessible, and easier for medical professionals to use. And while they're approaching early detection from a different angle, scientists in Lithuania are now combining AI with deep learning to ["predict the possible onset of Alzheimer's with 99% accuracy"](https://www.euronews.com/next/2021/09/06/this-technology-can-predict-early-alzheimer-s-with-near-100-accuracy).

**4. Blockchain Technology Goes Mainstream**

In 2021 blockchain technology went mainstream. In particular, 2021 was the year:

- VCs made record investments ([to the tune of $33 billion](https://blockworks.co/report-vcs-invested-33b-in-crypto-and-blockchain-startups-in-2021/)) into blockchain and/or cryptocurrency companies
- NFTs (which are based on crypto/blockchain technology) began to go mainstream
- Decentralized Finance (aka DeFi) achieved a **$100 billion** (and then **$200 billion**) Total Value Locked market cap
- Crypto companies launched aggressive, [mainstream marketing](https://www.latimes.com/business/story/2021-11-16/crypto-staples)
- And more

So, as blockchain technology achieves more practical use cases, here are just a few of the ways it will be used in the coming years.

- **Improved Logistics**

As today's global supply chains become increasingly complex, with more parties directly or indirectly involved in transporting goods, logistics companies face an ever-growing number of challenges. On the tech side, most errors happen as a result of one party's tech stack [not integrating with the tech stack](https://theloadstar.com/industry-leaders-call-for-supply-chain-interoperability-between-modes/) of a second party involved in the chain of custody (interoperability issues). Also, because many forms are filled out and typed into a computer manually, human error is a serious problem ([accounting](https://www.supplychainbrain.com/blogs/1-think-tank/post/33936-where-tech-outpaces-people-in-supply-chains-three-examples) for approximately **80%** of supply chain issues). Blockchain, however, holds the potential to solve both issues. First, by requiring participating providers to get on board with a specific, blockchain-based solution, every party involved in the transportation process would by definition be using the same tech stack. Second, because blockchain technology creates a linear chain of custody, the issue of there being "gaps" in the end-to-end visibility chain would become a thing of the past.
Instead, someone can search for an identifier on the blockchain (similar to today's tracking numbers) and observe where a shipment is and who came into contact with it (including when, where, and who signed off on the receipt/delivery), as the short sketch at the end of this section illustrates.

- **Blockchain Voting**

As we've seen in multiple presidential elections over the past two decades, voter fraud (more precisely, [the perception of voter fraud](https://www.pewresearch.org/journalism/2020/09/16/legitimacy-of-voting-by-mail-politicized-leaving-americans-divided/)) has become an increasingly controversial topic, in particular the ability of nefarious parties to submit duplicate votes, use the names and Social Security numbers of deceased persons to vote, and so on. Blockchain's properties, however, make it a great candidate for reducing - if not eliminating - the possibility of voter fraud, mainly because blockchain creates a permanent, publicly visible ledger that cannot be hacked or manipulated by a third party. Instead, governments could integrate a private blockchain network with the Social Security Administration database to verify that every living person over the age of 18 gets one vote (and one vote only). Admittedly, a blockchain voting system wouldn't be perfect. However, by eliminating human vote counters (and therefore human error), and reducing the possibility of both fraudulent and duplicate votes, blockchain-based voting would be a positive step towards eliminating voter fraud controversy (while at the same time delivering reliable results much faster).

- **Transparent Budgeting**

In both corporations and governments, the threat of corruption is a constant. From minor offenses to **Enron-level** accounting fraud, there's no shortage of ways bad actors can funnel funds into their own pockets. However, because blockchain creates an observable, permanent, and linear chain of custody, hiding illegal or inaccurate transactions becomes exponentially more difficult. Instead, blockchains record everything: the sending wallet, the amount, the date and time, and the receiving wallet. Because of that, by connecting an employee or politician's identity to a wallet (say a company treasury or city budget) - in which funds cannot be moved without a secret code - that person would assume full responsibility for each and every one of that wallet's transactions. In fact, while some media [pundits](https://www.businessinsider.com/treasury-warns-banks-russian-oligarchs-officials-evade-sanctions-cryptocurrency-2022-3) propose Russian oligarchs could be using crypto to evade sanctions, many have quickly pointed out that its public, transparent nature makes it an [ineffective](https://www.coindesk.com/policy/2022/03/08/crypto-still-isnt-helping-russian-oligarchs-evade-sanctions/) vehicle for doing so.
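To make the idea of a tamper-evident, linear chain of custody concrete, here is a minimal Python sketch (standard library only, with hypothetical shipment events rather than any real blockchain platform). Each record stores the hash of the previous record, so altering an earlier entry breaks every hash that follows, and a shipment identifier can be looked up to replay its full history.

```python
# Minimal hash-linked ledger sketch (illustrative only; real blockchains add
# consensus, signatures, and distribution across many independent nodes).
import hashlib, json, time

ledger = []  # an append-only list of records

def add_event(shipment_id, event):
    """Append a custody event, chaining it to the previous record's hash."""
    prev_hash = ledger[-1]["hash"] if ledger else "0" * 64
    record = {
        "shipment_id": shipment_id,
        "event": event,
        "timestamp": time.time(),
        "prev_hash": prev_hash,
    }
    # Hash the record's contents; this value becomes the next record's prev_hash.
    record["hash"] = hashlib.sha256(
        json.dumps(record, sort_keys=True).encode()
    ).hexdigest()
    ledger.append(record)

def history(shipment_id):
    """Replay every recorded custody event for one shipment identifier."""
    return [r for r in ledger if r["shipment_id"] == shipment_id]

# Hypothetical chain of custody for one parcel.
add_event("PKG-001", "picked up by carrier A")
add_event("PKG-001", "transferred to carrier B")
add_event("PKG-001", "delivered and signed for")

for record in history("PKG-001"):
    print(record["event"], record["hash"][:12])
```

Changing any earlier event would change its hash so that it no longer matches the prev_hash stored in the next record, which is what makes the ledger tamper-evident; a production blockchain adds agreement among many independent nodes on top of this basic structure.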
**5. Cyber Threats Grow More Advanced**

From attacks on random consumers to government-sponsored cyber warfare, cybercrime is a persistent and continually evolving threat. In fact, virtual crime has become so prevalent that the **$153 billion** cyber security industry is expected [to grow **139%** by 2028](https://www.globenewswire.com/news-release/2022/01/05/2361317/0/en/Cyber-Security-Market-to-Reach-USD-366-10-Billion-by-2028-Surging-Number-of-E-Commerce-Platforms-to-Amplify-Market-Growth-Says-Fortune-Business-Insights.html#:~:text=Pune%2C%20India%2C%20Jan.,12.0%25%20during%20the%20forecast%20period.) (to **$366 billion**). Here are just a few of the ways industry players are using cutting-edge technology to defend against - and get ahead of - cybercrime.

- **Work From Home Security**

Since March of 2020, the WFH/remote work movement has advanced dramatically. According to one survey, [4 out of 5 remote employees consider the practice "successful"](https://explodingtopics.com/blog/remote-work-statistics). And while the pandemic has been a boon for employees, it has created an unexpected surge in problems for cyber security experts, mostly because most home office set-ups [are nowhere near as secure](https://www.jdsupra.com/legalnews/the-five-biggest-work-from-home-23415/) as the office. First, many people are working from home on their own devices - devices that may not have the proper firewall and/or malware/ransomware/virus detection protocols in place. Second, working from home means working over what is likely a poorly secured broadband connection. From weak WiFi network passwords to insufficient encryption, most people's home network setups are nowhere near as secure as their company's office. Because of this, and because it looks like remote work is here to stay, 75-80% of security leaders [expected to ramp up security spending in 2022](https://www.cybersecuritydive.com/news/remote-work-security-spending/606999/).

- **AI-based Cyber Security**

Because cyber security involves reacting to new and incoming threats, it is very much a defensive game. Sadly, this means security professionals are always one step behind attackers. However, through the use of artificial intelligence, cyber security pros are [working](https://www.securitymagazine.com/articles/97059-three-top-of-mind-cybersecurity-trends-in-2022) to detect - and thwart - attacks much earlier in the process. A good analogy here is that of an earthquake warning system. By measuring minute vibrations on the earth's surface, earthquake warning systems can alert both scientists and the public to an incoming earthquake much earlier than would otherwise be possible. Along the same lines, AI programs can be programmed to [detect the early signs](https://www.computer.org/publications/tech-news/trends/the-use-of-artificial-intelligence-in-cybersecurity) of an incoming attack much more effectively than today's technology is capable of doing. The consequences of this for cyber security pros cannot be overstated. First, in addition to detecting the attack, AI programs can be configured to stop the attack before it gets out of control. Second, similar to the earthquake device, the AI can [alert](https://www.computer.org/publications/tech-news/trends/the-use-of-artificial-intelligence-in-cybersecurity) relevant personnel much earlier than is possible now. Because of that, industry professionals can go into defense mode faster, potentially protecting their database from what could become a disastrous attack.

**6. Robotic Process Automation Achieves Mainstream Corporate Adoption**

As the lines between artificial intelligence and machine learning continue to blur, businesses are finding an increasing number of ways to integrate automation into their processes. And one of the emerging technologies executives are most excited about is **[Robotic Process Automation](https://www2.deloitte.com/pt/en/pages/strategy-operations/articles/robotic-process-automation.html) (RPA)**. RPA involves training software programs to perform mundane, repetitive tasks.
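To give a feel for the kind of mundane, repetitive task RPA targets, here is a minimal Python sketch; the customer records and the invoice template are hypothetical, and a real RPA platform would drive actual business applications rather than plain text files.

```python
# Toy "software robot": turn structured records into invoice files,
# a task a person might otherwise do by copy-and-paste. Hypothetical data.
import csv, io, pathlib

TEMPLATE = "INVOICE for {name}\nItem: {item}\nQuantity: {qty}\nTotal due: ${total}\n"

# Stand-in for a spreadsheet exported from a sales system.
records = io.StringIO(
    "name,item,qty,unit_price\n"
    "Acme Corp,Widget,10,2.50\n"
    "Globex,Gadget,3,9.99\n"
)

out_dir = pathlib.Path("invoices")
out_dir.mkdir(exist_ok=True)

for row in csv.DictReader(records):
    total = int(row["qty"]) * float(row["unit_price"])
    invoice = TEMPLATE.format(
        name=row["name"], item=row["item"], qty=row["qty"], total=f"{total:.2f}"
    )
    # The "robot" files each invoice without any human copying data by hand.
    (out_dir / f"{row['name'].replace(' ', '_')}.txt").write_text(invoice)
    print("created invoice for", row["name"])
```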
Here are just a few practical use cases regarding how businesses will be integrating RPA over the coming years.

- **Automating Tech Stack Mergers**

As any IT professional knows, many of today's largest organizations - from corporations to governments - are using highly outdated legacy systems. And when two companies that have different legacy systems merge, the process of integrating multiple databases can become an IT nightmare. However, thanks to robotic process automation, software engineers can train a program to both [identify and fix](https://www.informationweek.com/digital-business/3-technologies-that-can-ease-the-m-a-process) any repetitive errors that pop up as part of the merger process. This, in turn, can save hundreds if not thousands of (potentially expensive) man-hours.

- And it's not just big data where RPA could prove useful. RPA can also be used to automate many of the mind-numbing tasks that are currently outsourced to minimum wage employees - employees whose skills and talents are most likely better applied elsewhere. As an example, by training a program to correctly identify form fields and handwritten text, RPA can be used [to convert text documents (e.g. paper-based mortgage applications) into electronic files](https://robocorp.com/docs/development-guide/pdf/how-to-fill-pdf-forms) without a human having to manually input that same information.

- While many of today's invoices are already automated, others require a human to cross-reference multiple data points (sales receipts, etc.) before an invoice can be created. On the flip side, software engineers can train an RPA program to check those same sources, [create the invoice](https://www.aspiresys.com/rpa-finance-and-accounting/usecase-invoice-processing-automation), and send it off (with, once again, no human effort required).

**Updating CRMs**

As any sales professional knows, having accurate prospecting information is critical to expediting the sales process, mainly because time spent double-checking phone numbers (for example) is time that could be better spent nurturing prospects or making pitches. With RPA, however, a program could be trained to constantly monitor a company's website (for its address) or an executive's LinkedIn profile (for their most recent email address).

**Improving Employee Happiness**

For years, we've heard how AI and robotics will replace tens of millions of jobs. In reality, studies show technologies like RPA are liberating employees from mundane tasks, allowing them to focus on more stimulating (and productive) work. In fact, **68%** of global workers believe RPA will make them [more productive](https://www.uipath.com/blog/digital-transformation/new-research-shows-workers-concerned-skills-gaps), while [**57%** of executives say RPA has increased employee engagement](https://www.uipath.com/resources/automation-analyst-reports/forrester-employee-experience-rpa).

**7. New Use Cases For Augmented Reality**

After decades of hype and promises, both augmented and virtual reality seem to be gaining steam. According to IDC, [global spending on AR/VR in 2020 was estimated to be up to **$18.8 billion**](https://www.idc.com/getdoc.jsp?containerId=prUS45679219), up **78.5%** from 2019. Further, demand for AR headsets (which use the same tech as VR headsets) [grew](https://www.idc.com/getdoc.jsp?containerId=prUS48969722) a stunning **92.1%** year-over-year in 2021 (with **11.2 million** units shipped).
And while virtual reality may hold more potential long-term, studies [show consumers are adopting AR at a much faster pace](https://deloitte.wsj.com/articles/augmented-reality-shifts-from-toy-to-practical-tool-01627066807). Here are just a couple of examples of how.

- While using GPS to drive has become commonplace, navigating large areas on foot can be daunting. From amusement parks to airports, and ski resorts to conference centers, augmented reality developers are working [to create the equivalent of GPS for foot traffic](https://mobidev.biz/blog/augmented-reality-trends-future-ar-technologies). As an example, imagine typing in the name of a product you want at the store and having an app direct you to the precise shelf where that product is located. Along the same lines, imagine an airport app that shows you where you can pick up your luggage and the fastest route to the Uber pickup area. In the future, all of this and more will become commonplace. Instead of relying on our phones to guide us, we'll eventually have foot traffic instructions overlaid on our field of vision thanks to AR goggles.

- Let's face it: most anyone who's shopped online has been disappointed once or twice. From furniture on a professionally designed set to clothes on a professional model, many purchases fail to live up to our expectations once they arrive on our doorstep. Thanks to AR, however, an increasing number of retailers are allowing virtual shoppers to virtually test their products before buying. In fact, as of 2020, **61%** of consumers said they [prefer](https://www.threekit.com/20-augmented-reality-statistics-you-should-know-in-2020) retailers that offer AR experiences. To accommodate them, companies like Macy's and Adidas are [investing resources](https://www.businessinsider.com/retailers-like-macys-adidas-are-turning-to-virtual-fitting-rooms-2020-8) into virtual fitting rooms that allow shoppers to "try on" clothes before making a purchase. The same goes for Target and Ikea, both of which are investing in similar technology that allows users to snap photos of their rooms so they can visualize what a new piece of furniture would look like before buying it.

- Over the past few years, auto manufacturers have taken steps to introduce more advanced Heads-Up Displays (HUDs). From alerting drivers to obstacles to overlaying directions on the road, using augmented reality to increase safety (while improving the driving experience) is a trend we are likely to see continue. As an example, Mercedes-Benz's [Augmented Reality HUDs](https://www.raycatenaunion.com/head-up-display-mercedes-benz/) not only display information but actually work to control the vehicle as well. In particular, their Active Lane Keeping Assist technology prevents swerving, while their Active Distance Assist technology prevents rear-end accidents. Along the same lines, Tesla's [self-driving beta](https://futurism.com/the-byte/tesla-full-self-driving-code-secret-augmented-reality-view) allowed "drivers" to watch as the company's AI program interacted with nearby obstacles to power the vehicle on its own (via an augmented reality display).

**8. Companies Build the Metaverse**

While many of the trends above have been percolating for years (if not decades), the idea of a metaverse was virtually unheard of prior to 2021.
Fast forward to 2022 and everyone is talking about it, with Facebook, Google, Microsoft, and others [making billion-dollar investments](https://www.makeuseof.com/companies-investing-in-metaverse/) into building their own metaverse. On the one hand, [some experts predict](https://news.yahoo.com/2021-was-the-year-of-the-metaverse-but-itll-take-years-before-its-a-reality-170559280.html) we won't have a fully functioning "metaverse" for at least a decade. On the other, some [argue](https://www.mondaq.com/italy/fin-tech/1153290/the-evolution-of-the-metaverse-from-simcitysecondlife-to-sandbox-decentraland-blocktopia-and-cryptovoxels) that primitive versions have been around for years already. Regardless of which way you see things, here are the hottest trends we see playing out over the coming years.

- As the COVID pandemic forced large gatherings to shut down worldwide, musicians desperate to generate revenue began exploring the idea of virtual concerts. And while it's unlikely the v1 experience was anything like the real thing, diehard fans still showed up in droves. In fact, in what was one of the first metaverse concerts, rapper Travis Scott reportedly [earned almost $20 million](https://cointelegraph.com/magazine/2021/12/27/vr-animal-concerts-metaverse-lead-next-wave-crypto-adoption) as a result of holding his virtual concert on the insanely popular Fortnite video game platform. As did Ariana Grande, who had a record-breaking [78 million](https://theface.com/music/ariana-grande-fortnite-rift-tour-performance-gaming-vr-mac-miller-travis-scott-lil-nas-x) fans show up to her virtual Fortnite concert in 2021. Admittedly, most users interacted with these experiences via their 2D devices (computers, phones, etc.). However, in the future, virtual/metaverse concerts will be fully immersive via virtual reality goggles (making them a true "metaverse" experience).

- While the fashion industry is not known for being "high-tech," industry players have warmed up to the metaverse in a big way. As an example, in March of 2022, dozens of high-profile fashion brands will be displaying their clothing at the [inaugural](https://wersm.com/decentraland-will-host-inaugural-virtual-fashion-week/) Virtual Fashion Week (hosted by industry-leading metaverse company Decentraland). Further, because the metaverse is meant to be an immersive, virtual world, some fashion brands are taking a fully digital approach. As an example, Nike [recently acquired an NFT collectibles studio](https://techcrunch.com/2021/12/13/nike-acquires-nft-collectibles-studio-rtfkt/) in a bid to create digital shoes collectors can use to dress their avatars in the metaverse. The same goes for luxury brand Balenciaga, which recently [partnered](https://www.lifestyleasia.com/ind/gear/tech/these-brands-already-have-a-presence-in-the-metaverse/) with Fortnite (above) to create virtual/metaverse clothes. In fact, in May of 2021, someone [spent](https://www.nytimes.com/2021/07/10/style/metaverse-virtual-worlds.html) $4,100 on a virtual Gucci bag (which is more than the physical item currently costs in stores).

- One of the largest problems with industry conferences is the fact that they require participants to travel. Between airfare, hotels and eating out, attending a conference can cost thousands of dollars above and beyond the price of an entry ticket. Because of this, companies like [Orbits](https://orbits.live/) and [VIBE Agency](https://www.bizbash.com/13410080) are taking steps to duplicate the live event experience online.
From virtual networking spaces to corporate-sponsored booths, [some players believe the future of industry events lies in the digital world](https://www.bizbash.com/13410080).

**9. More Developers Use Low/No Code Tools**

Given the constant release of new and exciting technologies, it's easy to assume tech companies are rolling out new products and services as fast as they can. However, there's a serious shortage of developers. In fact, according to the Bureau of Labor Statistics, [the industry will face a shortage of 1.2 million computer engineers](https://www.daxx.com/blog/development-trends/software-developer-shortage-us) by 2026. Further, the same report emphasizes that amongst existing applicants, only 39.6% of people will qualify for a given role. With a persistent and growing shortage of talent, creating solutions that allow developers to work more efficiently has become critical to moving initiatives forward. Fortunately, that's precisely what Low-Code and No-Code software programs can help with.

- By combining visual models with AI-powered tools, software developers will be able to skip (or dramatically expedite) the time-consuming process of writing thousands of lines of code from scratch. Instead, Low and No Code solutions allow developers to streamline the development of new SaaS applications.

- And it's not just full-time software developers that benefit. Because low-code and no-code solutions are by design meant to be user-friendly, an increasing number of non-developers are able to build software programs they otherwise would not have the skills for. In fact, one survey shows [60-70% of companies report](https://explodingtopics.com/topic/no-code-1) non-developers using low-code and no-code tools to build software systems the companies now use internally.

- At the core of low-code and no-code solutions are what are known as Low Code Development Platforms (aka [Low Code Application Platforms](https://en.wikipedia.org/wiki/Low-code_development_platform)). Like a carpenter's toolbox, LCAPs are what enable users to create software using the low/no-code approach. With an increasing shortage of software developers and an increasing number of people successfully building applications using LCAPs, it's safe to assume LCAP technology will continue to advance at a blistering pace. As evidence of this, in 2019, LCAPs [were a **$3.47 billion** industry](https://cyclr.com/blog/low-code-is-revolutionising-the-software-industry). Within two short years, however, the industry grew an impressive **66%** (to **$5.75 billion**).

**10. Ambient Computing Integrated into More Devices**

According to the Merriam-Webster [dictionary](https://www.merriam-webster.com/dictionary/ambient), the word "ambient" refers to something that is "existing or present on all sides." In short, it refers to something that's all around us - which is a perfect term for what ambient computing promises for the future: an AI-driven network of devices and software that runs in the background (all around us) with little to no human intervention required. With the potential to transform how we interact with everything from coffee makers to freight trucks, it comes as no surprise that the ambient intelligence industry [is expected to grow at an impressive **33%** CAGR through 2028](https://www.reportsanddata.com/report-detail/ambient-intelligence-ami-market).

- With that said, ambient computing differs from AI, machine learning and Robotic Process Automation.
**10. Ambient Computing Integrated into More Devices**

According to the Merriam-Webster [dictionary](https://www.merriam-webster.com/dictionary/ambient), the word "ambient" refers to something that is "existing or present on all sides" - in short, something that is all around us. It is a fitting term for what ambient computing promises: an AI-driven network of devices and software that runs in the background (all around us) with little to no human intervention required. With the potential to transform how we interact with everything from coffee makers to freight trucks, it comes as no surprise that the ambient intelligence industry [is expected to grow at an impressive **33%** CAGR through 2028](https://www.reportsanddata.com/report-detail/ambient-intelligence-ami-market).

- That said, ambient computing differs from AI, machine learning, and Robotic Process Automation, mainly because those three are - for the most part - entirely software-based (setting aside how they may instruct hardware to perform certain functions). Ambient computing, on the other hand, uses both AI and machine learning to interpret data gathered from ***physical*** devices (like smart thermometers) and make decisions on its own. It is this emphasis on interacting with real-world smart devices - known as the Internet of Things - that sets ambient computing apart.

- From The Jetsons to Minority Report, many portrayals of an advanced future involve devices that interact both with individuals and with one another. Some of these - like smartphone-controlled thermostats - have come to fruition. Others, like smart ovens that know precisely how long to cook a turkey, have yet to be invented. In the future, however, all of this and more should be possible thanks to ambient computing. For example, you will likely no longer need a garage door opener: your smartphone will communicate your location to a device in your home, which will open the garage door for you as you approach (a minimal sketch of this kind of presence-based trigger appears at the end of this section). As another example, a smart sensor could alert you when your pet has a fever or is walking with a slight limp. Admittedly, this is a young technology and these are hypothetical use cases, but ambient computing will most likely come to affect every device we interact with, from coffee makers to smart beds and more.

- On the commercial side, many companies have already begun incorporating (primitive) versions of ambient computing into their processes. For example, office buildings with [advanced LED lighting systems](https://www.interact-lighting.com/en-gb/customer-stories/the-edge) can use sensors to provide ideal lighting for employees while turning off lights in sections of the building where no one is present, thereby optimizing electricity usage and carbon footprint. In another fascinating use case, tech company Nuance is [using camera-activated smart speakers to transcribe interactions between doctors and patients](https://news.nuance.com/2018-01-22-Nuance-Unveils-AI-Powered-Virtual-Assistant-Solution-Designed-for-Healthcare-Providers). In doing so, they free up the doctor's attention so he or she can focus on the patient instead of taking notes.
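To make the garage-door example above concrete, here is a minimal sketch (in Python) of the kind of presence-based rule an ambient system could quietly run in the background. Every name, coordinate, and threshold in it is an illustrative assumption made for this module, not a real product API.

```python
import math

# Illustrative home coordinates and trigger radius (assumptions for the sketch).
HOME_LAT, HOME_LON = 16.4836, 121.1479   # e.g., somewhere in Bayombong, Nueva Vizcaya
TRIGGER_RADIUS_M = 50                     # act when the phone is within ~50 metres

def distance_m(lat1, lon1, lat2, lon2):
    """Approximate distance in metres between two GPS points (haversine formula)."""
    r = 6371000  # Earth radius in metres
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp, dl = math.radians(lat2 - lat1), math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def on_phone_location(lat, lon, garage_is_open):
    """Called whenever the phone reports a new location; decides whether to act."""
    near_home = distance_m(lat, lon, HOME_LAT, HOME_LON) <= TRIGGER_RADIUS_M
    if near_home and not garage_is_open:
        return "open_garage"    # in a real system this would call the door controller
    if not near_home and garage_is_open:
        return "close_garage"
    return "do_nothing"

# Example: the phone reports a position a few metres from home.
print(on_phone_location(16.4837, 121.1480, garage_is_open=False))  # -> open_garage
```

The point of the sketch is the design idea behind ambient computing: the person does nothing, while the system watches a stream of sensor updates in the background and acts only when a simple condition is met.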
**11. Edge Computing Adoption Grows**

In 2017, global data centers [processed approximately 100 exabytes of data per month](https://www.statista.com/statistics/267202/global-data-volume-of-consumer-ip-traffic/). Last year, that number skyrocketed to 267 exabytes. As the world becomes increasingly digital - and 5G networks enable larger data transfers at faster speeds - today's IT infrastructure requires more processing power than ever. In particular, growth in [data-intensive technologies](https://www.globenewswire.com/news-release/2021/03/08/2188582/28124/en/Data-Center-IP-Traffic-Estimated-to-Reach-20-6-Zettabytes-by-the-End-of-2021.html#:~:text=filingsmedia%20partners-,Data%20Center%20IP%20Traffic%20Estimated%20to%20Reach,by%20the%20End%20of%202021&text=One%20result%20of%20the%20pandemic,nearly%207%20ZB%20since%202016.) like remote health monitoring, telecommuting, and long-distance learning is expected to keep pushing global data processing requirements to record highs. To help process all that data at faster speeds, companies are pouring more of their budgets into Edge Computing. With a focus on speed and network distribution, edge computing is designed to "[improve response times and save bandwidth](https://en.wikipedia.org/wiki/Edge_computing)" by moving processing power physically closer to the source of the data.

- ![](media/image60.png)Over the past decade, cloud computing has gained widespread adoption (as of 2022, an estimated **94%** of [enterprise companies use the cloud](https://webtribunal.net/blog/cloud-adoption-statistics/#gref)). However, cloud computing is both [expensive](https://www.zdnet.com/article/cloud-computing-more-costly-complicated-and-frustrating-than-expected-but-still-essential/) and [resource-intensive](https://www.otava.com/blog/bandwidth-and-the-cloud/), especially for companies dealing with large amounts of data, given that enterprise-level cloud storage services base their fees on usage. Because of this, corporations are looking for ways to reduce their dependency on cloud computing by [moving to edge computing](https://www.forbes.com/sites/forbestechcouncil/2022/01/26/2022-predictions-edge-computing-takes-center-stage/?sh=32d3bbef38b0) instead. By leaning more heavily on the edge - and in particular its ability to [reduce bandwidth transmissions](https://www.dataversity.net/edge-computing-overview/) - companies can reduce their cloud computing bill while providing a faster experience to end users.

- While the cloud will continue to do the heavy lifting in terms of data storage, edge computing is expected to take care of data processing and speed issues. One way edge computing providers are working to enhance processing speeds is through what is known as [Data Reduction](https://en.wikipedia.org/wiki/Data_reduction) (also called data thinning). In simple terms, data thinning [involves identifying how important a piece of data is and moving it to the appropriate place](https://www.forbes.com/sites/forbestechcouncil/2022/01/26/2022-predictions-edge-computing-takes-center-stage/?sh=5788bce138b0) (server) to enhance processing speeds (a simple sketch of this kind of filtering appears at the end of this section). The technology behind edge computing is quite complex; an example of how data can be moved between servers to improve speeds can be found [here](https://www.forbes.com/sites/forbestechcouncil/2022/01/26/2022-predictions-edge-computing-takes-center-stage/?sh=d2ec85338b0c).

- In 2021, Meta (formerly Facebook) sold a [record number](https://screenrant.com/meta-double-quest-sales-2022-facebook-metaverse/) of virtual reality headsets. As concepts like the metaverse gain steam, a growing number of retailers are looking to incorporate augmented reality into their physical locations. Much like telehealth and long-distance learning, in-store, on-demand augmented reality applications (like [virtual dressing rooms](https://www.shopify.co.id/retail/virtual-fitting-rooms)) will require massive amounts of high-speed data processing. To address this, a growing number of retailers are turning to companies like [StorMagic](https://stormagic.com/) to help them integrate edge computing into their upcoming augmented reality applications.
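To illustrate the data-thinning idea mentioned above, the short sketch below shows how an edge node might forward only "important" readings to the cloud and reduce the rest to a small summary. The threshold, field names, and sample data are assumptions made up for the example; real platforms decide importance in far more sophisticated ways.

```python
from statistics import mean

ALERT_THRESHOLD = 40.0  # assumed cut-off (e.g., degrees Celsius) for "important" readings

def thin_at_the_edge(readings):
    """Split raw readings into (urgent records to forward now, local summary to send later)."""
    urgent = [r for r in readings if r["value"] >= ALERT_THRESHOLD]
    summary = {
        "sensor_count": len(readings),
        "average_value": round(mean(r["value"] for r in readings), 2),
        "max_value": max(r["value"] for r in readings),
    }
    return urgent, summary

# Example: one minute of temperature readings collected at the edge.
raw = [{"sensor": f"s{i}", "value": v} for i, v in enumerate([21.5, 22.0, 41.3, 21.8, 22.1])]
urgent, summary = thin_at_the_edge(raw)

print(urgent)   # only the 41.3 reading travels to the cloud immediately
print(summary)  # the rest is reduced to a small summary, saving bandwidth
```

The design choice is the same one described in the bullets above: process close to the source, and let only the data that really needs central attention cross the network.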
**12. Extended Reality Expands Beyond Entertainment**

As the lines blur between mixed reality, augmented reality, virtual reality, and so on, "Extended Reality" has become an umbrella term designed to encompass all of the above and more. From the metaverse to virtual concerts, here are ways extended reality is likely to be used over the next two to three years.

- - - - -

**13. Wearable Devices Get Smarter**

Since the launch of the Fitbit in 2007, wearable technology has become increasingly popular. From sleep-tracking rings to niche-specific devices (like Rip Curl's GPS-based surf watch), consumers have shown a growing interest in wearable technology - tech that allows them to track (and potentially improve) their health and fitness. In the early days, most wearables were limited in their abilities. However, as more sophisticated sensors become available, wearable devices are getting smarter every year.

- Until recently, most smart rings - in particular the best-selling Oura - focused on sleep tracking. Over the past few years, however, a variety of smart rings have launched with features that go well beyond sleep tracking. For example, the [McLear RingPay](https://mclear.com/) acts as a contactless payment device, allowing users to swipe their ring the same way they swipe a contactless card. RingPay is currently only available in the UK, but given how [popular](https://thefintechtimes.com/the-growth-of-contactless-payments-during-the-covid-19-pandemic/) contactless payment became during the pandemic, similar technology is expected to roll out in the United States and other Western economies in the coming months and years. With built-in [Near-field Communication](https://en.wikipedia.org/wiki/Near-field_communication) technology, the [NFC Opn](https://store.nfcring.com/products/opn?variant=500585103361) ring lets wearers share data (and even unlock smart-key doors) with a swipe of their finger. The ring includes [two data sensors](https://www.wareable.com/fashion/best-smart-rings-1340): the outer sensor holds less sensitive data (like the wearer's public email), while the inner sensor stores more sensitive data, requiring users to intentionally expose their finger/wrist to the device they would like to share information with.

- In 2021, wearable medical devices accounted for approximately [$20 billion in global sales](https://www.statista.com/statistics/1289674/medical-wearables-market-size-by-region/). By 2026, that number is expected to quadruple to nearly $84 billion. Admittedly, basic fitness trackers account for the majority of [health wearable sales](https://www.insiderintelligence.com/insights/top-healthcare-wearable-technology-trends/). Over the past two years, however, these devices have become increasingly sophisticated at tracking detailed health measurements. For example, Apple's [Series 7 Smart Watch](https://www.insiderintelligence.com/insights/top-healthcare-wearable-technology-trends/) now contains an FDA-approved electrocardiogram sensor, the ability to call 911 if the wearer experiences a fall, and even the ability to alert owners if the watch detects atrial fibrillation. Along the same lines, one study showed that [37% of buyers use their wearable device to monitor their heart health](https://www.insiderintelligence.com/static/664f96e6e6df670610aae99983608b8e/eMarketer-what-healthfitness-metrics-do-us-wearable-users-track-with-their-wearable-devices-of-respondents-march-2021-266822.jpeg), a practice some are using in conjunction with their doctors to [monitor heart issues more closely](https://www.insiderintelligence.com/insights/top-healthcare-wearable-technology-trends/) (a toy version of this kind of on-device check is sketched at the end of this section). Further, it is not just consumers who are interested in smart wearables: a [survey](https://www.business.att.com/learn/updates/the-prognosis-for-telehealth-and-connected-care.html) of hospital executives found that 47% of hospitals provide wearable devices to patients with chronic diseases for ongoing monitoring and tracking.

- While smart clothes are a sub-niche of wearables still very much in their infancy, the fact that they have more contact with the body (due to their surface area) opens up a variety of new and exciting use cases. For example, the [Nadi X yoga pants](https://www.wearablex.com/collections/nadi-x-smart-yoga-pants) use sensors and stretch bands to analyze and provide feedback on the wearer's posture. Just as long-distance runners were early adopters of smart fitness watches, they have also become early adopters of smart shoes - in particular, [high-tech shoes](https://crazyfit.tech/smart-running-shoes/) that track impact on the runner's knees and heels to help reduce pain associated with conditions like plantar fasciitis.
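As a rough illustration of the heart-monitoring features described above, the sketch below shows the general shape of an on-device check: compare incoming readings against simple limits and raise an alert when something looks off. The thresholds and the irregularity rule are assumptions chosen for teaching purposes only; they are not the actual algorithm any watch vendor uses, and they are not medical advice.

```python
RESTING_HIGH_BPM = 120          # assumed alert threshold for resting heart rate
IRREGULARITY_RATIO = 0.20       # assumed: beat-to-beat gaps varying by >20% look irregular

def check_heart_data(bpm_at_rest, beat_intervals_ms):
    """Return a list of alerts based on resting heart rate and beat-to-beat intervals."""
    alerts = []
    if bpm_at_rest > RESTING_HIGH_BPM:
        alerts.append(f"High resting heart rate: {bpm_at_rest} bpm")
    avg = sum(beat_intervals_ms) / len(beat_intervals_ms)
    if any(abs(gap - avg) / avg > IRREGULARITY_RATIO for gap in beat_intervals_ms):
        alerts.append("Irregular rhythm detected - consider consulting a doctor")
    return alerts

# Example: slightly elevated resting rate with one unusually long gap between beats.
print(check_heart_data(bpm_at_rest=125, beat_intervals_ms=[800, 810, 790, 1100, 805]))
```

A real wearable runs checks like this continuously on the device itself and only notifies the owner (or a paired app) when a rule is triggered, which is exactly why these devices are useful for the ongoing monitoring described above.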
**Conclusion**

From labor shortages to a computer chip crisis, tech companies across the globe have faced a variety of challenges over the last two years. Despite these challenges, the tech sector continues to grow at a blistering speed. With a focus on solving today's most pressing problems and a seemingly infinite pool of cash available to the companies that succeed, it is unlikely the tech sector will slow down anytime soon. Instead, the tech trends outlined above are set to play a large role in molding our future over the next five years.

V. **LEARNING ACTIVITIES**

![](media/image62.png)

Republic of the Philippines
**NUEVA VIZCAYA STATE UNIVERSITY**
3700 Bayombong, Nueva Vizcaya
**COLLEGE OF ARTS AND SCIENCES**

STUDENT NO:   COURSE/YR/SECTION:
NAME:         DATE SUBMITTED:

**General Directions:**

1. **READ AND UNDERSTAND the directions carefully and give what is asked.**
2. **ALWAYS use this template, properly filled out, in submitting your work.**
3. **SAVE this as a PDF file using your initials. (Example: CGS_LA1.pdf)**

**LEARNING ACTIVITY #1**
**Chapter 1**

**After reading Module 1, you are tasked to work on this learning activity. Use the rubric below as your reference in answering the questions that follow. Please take note of the deadline for this requirement in our MS Teams.**

**Rubric: Content (Did it answer what is asked?) -- 12 points**
**Originality (Must not be copied from classmates) -- 5 points**
**Presentation (punctuation, line spacing, justification, etc.) -- 2 points**
**References (Must cite sources/references) -- 1 point**
**Total = 20 points**

1. Aside from the positive and negative impacts of ICT mentioned above, think of another impact (positive or negative) that is personal to you. ***Discuss it by relating your own story***.

2. After reading the links below about the advantages and disadvantages of technology in agriculture, state your opinion or position on TECHNOLOGICAL ADVANCEMENT IN AGRICULTURE, focusing on the Philippine setting. (Make your statement brief and concise.)

[Advantages and Disadvantages of Modern Technology in Agriculture » Tech Stonz](https://techstonz.com/advantages-disadvantages-technology-agriculture/#:~:text=One%20of%20the%20worst%20disadvantages%20of%20technology%20in,production%20rate%20but%20slowly%20it%20damages%20soil%20fertility.)
[24 Advantages and Disadvantages of Technology in Agriculture - 1001 Artificial Plants](https://www.1001artificialplants.com/2019/06/06/24-advantages-and-disadvantages-of-technology-in-agriculture/)

3. Reflecting on the history of computers and how our technology evolved, what TWO distinct and unique words would you use to appreciate and, at the same time, describe the people involved in the evolution of technology? Explain why you chose those two words in just three sentences.

VI. **ASSIGNMENT**

**STUDENT NO:**   **NAME:**   **Course**

1. **READ AND UNDERSTAND the directions carefully and give what is asked.**
2. **ALWAYS use this template, properly filled out, in submitting your work.**
3. **SAVE this as a PDF file using your initials.**
4. **Only one of you will submit your assignment in our MS Teams, but you must include the names of all the members in the file/document.**

1.
2.
4.

*NOTE: ChatGPT may answer all these questions very well, but ALWAYS REMEMBER THAT AI IS NOT THE LEARNER HERE; **IT'S YOU***

![](media/image66.png)

VII. **EVALUATION** *See MS Teams for updates. / To be announced.*

VIII. **REFERENCES**

- https://ftms.edu.my/v2/wp-content/uploads/2019/02/csca0201_ch01.pdf
- https://www.sutori.com/story/history-of-ict-information-and-communications-technology-N7J51bQqSU7vLWcVfdn5M9qa
- https://www.livescience.com/20718-computer-history.html
- https://www.explainthatstuff.com/historyofcomputers.html