**Green IT and Sustainability (BCS456A)**

**Module-1**

**Green ICT - History, Agenda, and Challenges Ahead**

Introduction:

- The dawn of the Industrial Revolution and the growth of machine-based industries changed the face of our planet for good.
- The Industrial Revolution also fundamentally changed Earth's ecology and humans' relationship with their environment.
- One of the most immediate and drastic repercussions of the Industrial Revolution was the explosive growth of the world's population.
- The transformation from cottage industry and agricultural production to mass factory-based production led to the depletion of certain natural resources, large-scale deforestation, the depletion of gas and oil reserves, and the ever-growing problem of carbon emissions, mainly the result of our reckless use of fossil fuels and secondary products.
- The pollution that followed the Industrial Revolution damaged the planet's ozone layer and polluted its air, land, and water.
- The two world wars of the early twentieth century brought catastrophic human and natural disasters as well as rapid development of military technologies. These developments laid the groundwork for the emergence of what some call the Second Industrial Revolution.

**The Second Industrial Revolution - The Emergence of Information and Communication Technologies**

"Consider a future device for individual use, which is a sort of mechanized private file and library. It needs a name, and, to coin one at random, 'memex' will do. A memex is a device in which an individual stores all his books, records, and communications, and which is mechanized so that it may be consulted with exceeding speed and flexibility. It is an enlarged intimate supplement to his memory." (Vannevar Bush, "As We May Think", 1945)

In 1946, the first of a new generation of computers emerged from US military research. Financed by the United States Army, Ordnance Corps, Research and Development Command, the Electronic Numerical Integrator and Computer (ENIAC) was announced and nicknamed the "giant brain" by the press. In reality, ENIAC, compared with the smartphones of today, had very limited functionality and capabilities. According to Martin Weik (December 1955), ENIAC contained 17,468 vacuum tubes, 7,200 crystal diodes, 1,500 relays, 70,000 resistors, 10,000 capacitors, and around 5 million hand-soldered joints. It weighed more than 27 tons, measured roughly 8 × 3 × 100 ft (2.4 m × 0.9 m × 30 m), took up 1,800 ft² (167 m²), and consumed 150 kW of power. This led to the rumor that whenever the computer was switched on, the lights in Philadelphia dimmed. ENIAC stored a maximum of twenty 10-digit decimal numbers. Its accumulators combined the functions of an adding machine and a storage unit. No central memory unit existed per se; storage was localized within the functioning units of the computer.

**The Agenda and Challenges Ahead**

The key agenda items for "Green Information and Communication Technology" include the following:

- While there is a focus on big infrastructure and big data computing, the debate quite often overlooks the large number of existing "legacy" systems; even our current laptops can be viewed as "old legacy" systems. Statistics show that there are around 2 billion PCs currently in operation, and most of these systems suffer from old, power-hungry hardware designs.
It is not surprising that manufacturers such as Intel have spent billions of dollars designing the next generation of microprocessors ("Haswell") and moving toward fanless, less power-hungry systems.

- Another key challenge is how we measure ICT performance and sustainability and what tools we can use to provide reliable data. It is clear that unless we have a reliable methodology for measuring ICT sustainability, we will experience what environmental campaigners have experienced over the past decades: claims and counterclaims, with data dismissed as not accurate or substantial enough to corroborate our arguments.
- One of the most critical challenges facing green information technology is the legal framework within which system developers and providers need to work. Although the European Union recently (January 2015) introduced rules obliging new devices such as modems and internet-connected televisions to switch themselves off when not in use, we are still far from having robust sets of enforceable green IT regulations at the national or international level.
- While cloud computing was hailed as the "green way" of moving away from device-dependent, clunky, power-hungry applications and data storage approaches, and although it can be claimed that the use of virtualized resources saves energy, a typical cloud data center still consumes an enormous amount of energy. Also, because the cloud comprises many hardware and software elements deployed in a distributed fashion, it is very difficult to pinpoint any single area for energy optimization.
- Some of the most interesting areas of research in green information technology concern "energy harvesting" (or "energy scavenging") and the internet of things (IoT). Energy harvesting explores how we can take advantage of the various ambient power sources scattered all around us. The internet of things covers a series of technologies that enable machine-to-machine communication and machine-to-human interaction via internet protocols. Being able to use the World Wide Web and the global connectivity of billions of machines, smartphones, and tablets to monitor and control energy usage can have a tremendous impact on developing a much more environmentally friendly and sustainable computing world.

**MODULE 2**

**Emerging Technologies and Their Environmental Impact**

Introduction:

It is impossible to envisage life without things such as mobile phones, digital TV on demand, or computer programs and applications such as Google, eBay, and Facebook. It is also remarkable that these technologies have become such an integral part of everyday life (at least in "developed nations") over such a short period. A full discussion of capitalism and market economics is not appropriate here, but suffice it to note that the cost of investing in improved production techniques demands that the production system make more units of product and sell them at the demanded profit and, in turn, that new outlets and uses be found for them. In some cases, the improvement process is continuous, interrupted only by sudden changes (the replacement of human labor by robots in much car manufacturing is one such example). In others, new products overtake the old: many former cast metal products are now made in plastic. The number of devices provides an additional multiplying factor.
In common with many "developed" countries, the United Kingdom passed the mark, some years ago, of having more registered mobile phones than citizens. The billionth PC was shipped in 2008, and the data, in its many different formats, that this growing collection of devices generates and processes also grow year on year. The volume of stored data is growing even more quickly, as the twin factors of easier production and greater precision make it simpler to produce more, while the declining cost per unit of storage reduces the need to be selective about what is kept. The final factor to be considered is obsolescence. In some cases, obsolescence is physical, caused as technology "wears out": each time an on/off switch is used, wear (metal fatigue) occurs, and memory read/write operations can be carried out reliably only a limited number of times before the magnetic characteristics of the device become unreliable. The purpose of this chapter is to show that all these changes tend to increase the amount of computing processing and memory needed, and that delivering that processing and managing that memory lead to an increase in energy requirements.

**Number of Connected Devices**

The growth in Internet-connected devices is clear, and some of the numbers that capture this bear repetition:

- Of the world's population, 50% possesses a mobile phone (World Bank, 2014). In many countries, there is already more than one registered mobile phone per capita (Stonington and Wong, 2011), although the data do not show how many of these remain active, and it is likely that a significant proportion is unused, having been superseded but not disposed of.
- By the end of 2013, 1.78 billion PCs and 1.82 billion browser-equipped mobile devices were in use (Gartner, 2013).
- In the United Kingdom in 2012, there were an estimated 10 million office PCs, and an estimated 50% of the working population used a PC in their daily work; this is expected to increase to 70% by 2020 (1E, n.d.).
- The number of smartphones in use was predicted to pass the number of PCs in use during 2014, and sales of tablet devices are likely to exceed those of PCs during 2015 (Blodget, 2013).
- By 2020, there are predicted to be 50 billion Internet-connected devices; much of this will be a consequence of the internet of things explored in subsequent paragraphs, but almost all of the devices mentioned (phones, PCs, and tablets) possess some form of Internet connectivity (Chiu et al., 2010).

Most of these devices and most of the emerging applications are not stand-alone; they rely on technologies such as cloud computing, device connectivity, and information sharing to deliver their functionality. This makes it pertinent to include the growth in networking and storage within any consideration of the environmental impact of emerging technologies. The increase in the number of devices, the features each one possesses, and their inbuilt complexity all tend to increase the volume of data. In addition to the obvious fact that more devices are likely to produce more data, registering, tracking, interconnecting, monitoring, and securing large numbers of devices are all data-generating activities. Additional functionality also tends to result in more data: a one-color document (on screen or printed) is a rare sight in most offices, and people who have digital cameras in their smartphones use them to capture events that would have gone unrecorded by the previous generation of analog camera users.
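To make these multiplying factors concrete, the back-of-the-envelope calculation below combines purely illustrative assumptions (device count, photos per day, file size); none of these figures come from the text, but the compounding effect is the point.

```python
# Back-of-the-envelope sketch: how per-device data generation scales globally.
# All figures below are illustrative assumptions, not data from the chapter.

devices = 2_000_000_000            # assume ~2 billion camera-equipped smartphones
photos_per_day = 3                 # assume a modest 3 photos per device per day
mb_per_photo = 3                   # assume ~3 MB per compressed image

daily_mb = devices * photos_per_day * mb_per_photo
daily_pb = daily_mb / 1e9          # 1 PB = 10^9 MB (decimal units)
yearly_eb = daily_pb * 365 / 1000  # 1 EB = 1,000 PB

print(f"Daily photo data: {daily_pb:,.0f} PB")
print(f"Yearly photo data: {yearly_eb:,.1f} EB")
# Even with modest per-device assumptions, a single added function (the camera)
# generates exabytes of new data per year that must be stored and transferred.
```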
The widespread adoption of cloud storage means that the need to be selective about which data to retain, a need already reduced by the growth of on-board storage capacity, has now been pushed even further into the distance. Clouds offer almost limitless data storage at low additional cost, so there is little incentive for "housekeeping" to remove redundant, duplicate, or out-of-date information. The inevitable result of all this has been an increase in the overall volume of stored data: in 2011, it was estimated that a zettabyte (10²¹ bytes) of digital data existed in storage systems (Wendt, 2011); by the time this book is completed, it is expected that this will have quadrupled.

Consideration of data storage leads us to data centers, whose power consumption doubled over the four years between 2007 and 2011; in the single year of 2012 (a year of major investment in cloud technologies), this increased by an additional 63% (Venkataraman, 2013). Note that these figures relate to power consumption, not the actual number of data centers or the units installed within them. If we accept that efficiency improved over that time, we are forced to conclude that the growth in power consumption understates the growth in installed units, because each unit should require less power in 2012 than its equivalent did in 2007. Since then, growth has continued, albeit at a lower rate, but still in excess of 10% per annum. Not surprisingly in view of this, data centers are responsible for a significant portion of the total emissions of the IT sector (14% of the total in 2007), and it is estimated that this will rise to 18% by 2020. Greenpeace's 2011 estimate (Greenpeace, 2013) that, considered as a country, "the Internet" would rank fifth in electricity use, ahead of Russia and behind Japan, is a compelling illustration of the energy consumption required to support the world's demand for IT.

The future seems likely to be less one of new technologies and more one of the ever-increasing use of existing technologies to create more data. Cisco has dubbed the five years ending in 2018 "the zettabyte era," a recognition of the fact that global networks will carry that amount of traffic in the calendar year 2016, increasing to 1.6 ZB in 2018 (Cisco, 2014). In light of this, it is interesting to estimate the likely electricity requirement of the global telecommunications system of 2016. In a study of conventional wired communications technology (mixed fiber and copper), Coroama et al. (2013) offer a "pessimistic" estimate that overall Internet transmission uses 0.2 kWh of electricity per gigabyte (GB) of data. "Pessimistic" means that the estimate is a worst-case scenario, and improvements in energy efficiency will probably reduce the estimated use over time. However, even if we assume that this pessimism has doubled the actual figure, moving 1 ZB at 0.1 kWh/GB will require 100 TWh of electricity per year. An estimate by Raghavan and Ma (2014) uses a different definition of the "Internet" that includes end devices and full life cycle costing (both excluded by Coroama et al., whose focus is solely on the communication links). As a result, their estimate is significantly higher, at 8 kWh/GB; applying this figure to Cisco's "zettabyte Internet" would mean an energy requirement almost two orders of magnitude higher than the preceding figure. For comparison, the total domestic electricity consumption of the UK for 2010 was 112,856 GWh (Rose and Rouse, 2012).
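A short calculation makes the sensitivity of these estimates to the assumed energy intensity explicit. It simply applies the two per-gigabyte figures quoted above to one zettabyte of annual traffic; it is a sketch of the arithmetic, not an independent estimate.

```python
# Reproduce the rough estimates quoted above for moving one zettabyte per year.
ZB_IN_GB = 1e12                     # 1 ZB = 10^12 GB (decimal units)
annual_traffic_gb = 1 * ZB_IN_GB

# Energy-intensity estimates discussed in the text (kWh per GB transferred):
estimates = {
    "Coroama et al. (pessimistic, halved)": 0.1,   # communication links only
    "Raghavan and Ma (full life cycle)": 8.0,      # includes end devices
}

for label, kwh_per_gb in estimates.items():
    twh_per_year = annual_traffic_gb * kwh_per_gb / 1e9   # 1 TWh = 10^9 kWh
    print(f"{label}: {twh_per_year:,.0f} TWh per year")

# ~100 TWh/year versus ~8,000 TWh/year: the choice of system boundary changes
# the answer by almost two orders of magnitude, which is why the text is careful
# to state which definition of "the Internet" each estimate uses.
```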
Additional growth in what might be considered hidden connectivity will also occur. The internet of things is a concept in which almost any object can or will possess Internet capability. Much of this could be a very positive force for good, delivering reductions in energy use without requiring operator (human) intervention. Obvious examples are the washing machine that is able to determine the optimum combination of time, heat, and water use for its current load, or the in-vehicle systems that provide the most fuel-efficient route and driving patterns for current road and weather conditions. The very need to provide such connectivity means that each device effectively becomes a computer in its own right. Providing devices in this quantity will require the creation of more IT, with consequential increases in the demand for raw materials and the energy needed in operation, as well as the need for disposal. Allowing these devices to communicate with each other, whether wired or wirelessly, will make a significant contribution to the growth in traffic predicted by Cisco.

**Increased Functionality**

Added functionality is widely used as a selling point for new products. The development and upgrade cycle of consumer electronic products is the most apparent illustration; for example, improvements in camera resolution, creating higher-quality images, have been a selling point for new devices. A higher-resolution camera will generate more data (each pixel requires 3 bytes, so a 12-megapixel camera, at the lower end of the current market, produces 36 MB of uncompressed data per image; higher quality calls for more pixels, more information per pixel, and less compression). Another rapidly growing consumer product is in-vehicle navigation, where the same factors are apparent. New units are marketed on their performance, accuracy, and reliability: a more precise geolocation system requires more processing to calculate positions with greater precision (basic in-vehicle geolocation systems typically transfer less than 2.5 MB per day in constant use and offer a positioning accuracy of 10-15 m; higher levels of precision and accuracy call for increased sampling rates and more processing, for example, the use of "least squares" to minimize misfits between modeled and real locations [Bilich, 2006]). In both cases the result is the same: the amount of resource (energy or hardware) required is larger. More storage and transfer capacity are needed because the volume of data generated is larger, and/or there is more processing of this additional data to provide the higher levels of accuracy and precision.

**Increased Number of Separate Functions**

Multifunction devices (MFDs) are now commonplace, whether in the guise of the phone that also acts as a camera, personal organizer, Internet access device, entertainment center, and mapping and location system, or the office printer that also provides copying, scanning, and image processing. On public transport, a single device can issue and check tickets, provide a live timetable and journey-planning information, offer ATM functionality, and generate data relating to cash sales. Less obviously, single devices provide multiple functions in networking, connecting both wired and wireless devices, and a single vehicle system can support both engine maintenance and management and route control.
Initiatives in the field of wearable computing expand this use with devices that allow health monitoring to be added to the existing functions of the mobile phone. This could be viewed as a good thing, because it should reduce the number of separate devices needed by any one person, but the reality is rather less clear-cut. We argue that the increased number of functions also contributes to the negative environmental impact of technology, for the following reasons:

- A proportion (often significant) of the added functionality is unnecessary: many of the added functions are rarely used, if at all, but they still require support, which is rarely energy free. Simply running a background process on a device consumes CPU cycles and hence power. Unless the user chooses otherwise, it is quite likely that the application will be monitored and upgraded regardless of whether it is actually used.
- Unsuitability for multiple purposes: it is often difficult to combine the demands of different user interfaces into the design of a single device while retaining the comfortable and efficient (in human terms) use of any one function. This often leads to compromise, resulting in dissatisfaction. Although it might be almost acceptable to take holiday photographs with a regular tablet computer, using it as a "normal" telephone (without the addition of a hands-free earphone and microphone) would not be acceptable to most people. How this dissatisfaction is then manifested may have a further environmental impact: the user may discard the device in an attempt to find a "better" multifunctional device, or purchase more than one device, using each for the subset of roles the user finds most satisfactory (and then adding to the communication network load by synchronizing them); or other technology may be added in an attempt to make use more comfortable. As an example of the latter case, consider the growing market in detachable keyboards for tablet computers, which meet the need for a more "user-friendly" keyboard for sustained operation.
- Separation of ownership/primary use: a rush-hour journey on public transport will quickly reveal that many users have more than one example of what is ostensibly the "same" device, typically one mobile phone for work and one for personal use. This practice may be driven by individuals' wish to maintain a separation between their public and private lives or by security-driven management policies intended to avoid misuse of company-provided equipment.

**Increased Demand for Speed and Reliability**

Describing the communications between units of the British Army in South Africa during the military campaigns of 1899-1902, Arthur Conan Doyle gives us what was then clearly considered a remarkable example of speedy and reliable communication: "it is worthy of record that... at a distance of thirty miles [48 km], they succeeded in preserving a telephonic connection, seventeen minutes being the average time taken over question and reply" (Doyle, 1902). In the 112 years since then, we have reached the point where military (and more peaceful) operations are controlled remotely, in both sound and vision, over thousands of kilometers in real time. The use of the word "controlled" in the modern-day description is in deliberate contrast to "question and reply" in the 1902 example: whereas local commanders in the army of 1902 would be expected to make local decisions based on information received, the remote control of an operation now calls for instructions to be issued and received with 100% reliability and accuracy.
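To see how demands for near-perfect reliability translate into extra installed (and powered) hardware, consider the following availability arithmetic. The component figures are illustrative assumptions, not data from the text; the sketch anticipates the redundancy techniques (hot and cold standby) discussed below.

```python
# Illustrative availability arithmetic (all component figures are assumptions).
# End-to-end availability of components in series is the product of the
# individual availabilities; redundancy (parallel units) raises it.

def series(availabilities):
    result = 1.0
    for a in availabilities:
        result *= a
    return result

def parallel(a, n):
    """Availability of n redundant units, each with availability a."""
    return 1 - (1 - a) ** n

links = [0.999] * 10                          # assume 10 links, each 99.9% available
print(f"Single chain: {series(links):.4f}")   # ~0.990, i.e. ~87 hours down per year

redundant_links = [parallel(0.999, 2)] * 10   # duplicate every link
print(f"Duplicated chain: {series(redundant_links):.6f}")   # ~0.99999

# Pushing end-to-end availability from ~99% toward "always on" roughly doubles
# the installed (and powered) equipment, illustrating the resource cost of the
# demand for speed and reliability.
```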
Emerging trends in a number of fields are increasing the requirements for such systems. Smart city initiatives call for the monitoring of environmental characteristics, traffic patterns and traffic systems, power, water, and sewage flows, water and power grids, and building heating and ventilation units. The mapping and location requirements of navigation systems (in-vehicle or handheld) that guide users to specific locations will need real-time information about current location, and automated transport (exemplified by the recent announcements of driverless cars) will call for ever more precise and accurate location. At the moment, there is no suggestion that driverless cars be remotely controlled, although many mass transit systems already operate in a highly automated mode (e.g., the Docklands Light Railway in London), and this will be extended, whether through entirely new systems or through additions or modifications to existing ones. In applications in which systems whose incorrect functioning can present a risk to life and property are controlled remotely, the speed and reliability of information and commands are paramount.

Similar demands for speed, reliability, and accuracy now exist in commonplace activities; although they may not be essential, these capabilities have become desirable or expected. The consequence is the need for systems that deliver these three requirements. In turn, these demands are met by the use of higher-performance devices that typically require more memory and processing power. The standard approach to providing increased reliability is through redundancy, with stand-by units that are either hot (ready to go instantly) or cold (able to be powered up at short notice); through load sharing to avoid overloading any one component; and through system designs that seek to eliminate any single point of failure. It hardly needs to be stated that the overall result of attempts to meet the demands for speed, reliability, and accuracy in any single activity is often to increase the volume of resource needed to support that activity.

**Obsolescence - The Problem of Backward Compatibility**

The IT industry is one in which change and innovation are rapid and often fundamental. Consider the change from mainframe computers to minicomputers, to stand-alone PCs and then networked PCs, the client-server era, wireless networks, and the current trend toward cloud and mobile devices. These step changes, occurring roughly once per decade, have meant that otherwise fully operational and functional systems have been rendered obsolete before their designed lifespan has expired. Backward compatibility, allowing newer and older software and equipment to work together, may be achievable for a time but is not sustainable in the longer term without constricting development. The need to support a variety of different hardware types (old and new) also adds to the complexity of the software development process for applications, operating systems, and device drivers. After some time, the cost and complexity of maintaining backward compatibility become too high, and it is dropped. The majority of users typically will have already moved on to the newer product as part of normal update/renewal processes. Those who are left must then decide whether to continue with (now unsupported) equipment or to seek alternative applications that will run on their current equipment and are supported by the developer.
Note, however, that this approach brings the added cost of learning the new system and possibly the need to convert data formats. The speed of development and upgrading of technology accelerates this process, giving rise to the well-known situation of fully functional equipment being discarded because of obsolescence rather than because it is worn out. Within the PC environment, Microsoft's recent decision to cease support for the XP operating system is a classic example of a step change, and the user response is a classic example of what happens as a consequence (Covert, 2014). Although it is possible to install other operating systems, it is likely that many of the current XP devices will go out of use in relatively short order as application support, development, and (probably more significantly) security updates are discontinued. The difficulty, real or perceived, of transitioning to a newer operating system will add to the likelihood of accelerated obsolescence.

Another example of accelerated obsolescence, already mentioned in the introduction to this chapter, is the UK's decision to no longer provide analog TV signals. Here the driving force was government policy rather than manufacturer decisions. Although newer equipment was already being adopted by a significant proportion of users as a consequence of demand (or desire) for the additional features offered by cable and satellite provision, it is clear that the government's decision to reallocate the frequency bands used by analog TV in order to auction them to mobile operators created a step change in this process. The provision of set-top converter units offered the option to retain existing analog sets, but the added functionality of new TV sets, particularly HD, has led to an increase in sales of the newer ones. Whether the sets rendered redundant by this change are stored or disposed of is unclear.

**The Other Side of the Balance Sheet - Positive Environmental Impacts, or the "Other 90%"**

We have already seen that the likely result of continued development in the use, range, and application of IT will be an increased demand for the resources required to create, maintain, and dispose of the various hardware items in use. The use of resources here refers to everything from the raw materials (including rare earth metals) and other chemicals (acids for etching, water for cleaning) and the energy used in the manufacturing process, to electricity for powering the devices and the energy needed to handle them at the end of their lives. A wider definition would also include factors such as the energy required to transport both new and discarded raw materials and finished products. The case is frequently made that much of the focus so far has been on the greening of IT systems, that is, on making the technology itself more energy efficient, driving down the energy consumption of data centers, and so on. However, if we take the widely quoted figure that IT is responsible for 10% of the world's energy consumption, this leaves the question of the "other 90%." In addressing this, we consider the opportunities for greening by IT, that is, using the technology to reduce the energy use of other aspects of human activity. Most prominent among these opportunities are transport and building heating and lighting; they are chosen simply because they are the largest consumers of energy, and therefore even a small proportional reduction in them would be significant.
One possible approach to creating a balance sheet is to consider a particular activity that is performed in a particular way, which allows a calculation of the resource required to support that activity. Then consider how the activity would be undertaken using IT as an enabler, calculate the resource requirements for that mode of operation, and compare the two. One activity, replacing the intercontinental business trip with a videoconference, has become almost the de facto point of comparison.

Constructing a balance sheet for an activity such as a trip involves determining the energy used by a single air traveler on a return journey to conduct a face-to-face meeting and then working out the energy required to support a videoconference of the same length (a rough worked sketch follows below). Such a calculation makes a number of assumptions, including that a videoconference is as productive as a face-to-face meeting and that removing a number of passenger trips would necessarily lower the number of airplane trips, as opposed to the same number of planes traveling at reduced occupancy. Attempts to calculate the actual energy (or resource) used for a particular transaction result in widely variable outcomes; there are significant variations in the determination of which devices are considered to support a transaction. The problem is further complicated by the task of apportioning the resource use of any shared device that actually handles a specific transaction, and by the question of whether the device would have used any less resource had the specific transaction not taken place (e.g., if a particular call did not take place, would a router that was not required to route that call have used any less energy or resource?).

**Videoconference as an Alternative to Business Travel**

Perhaps unsurprisingly, most calculations of this form consider the use stage alone; there is no attempt to include the embedded energy of manufacturing a passenger airliner and then apportion a part of that embedded energy to a given passenger on a particular journey. Nor, it should be noted, do these calculations include any consideration of the energy costs of building and operating airports, or of the cost of operating the communications infrastructure necessary to direct air traffic. Even without these added embedded energy costs, the typical energy balance sheet is heavily in favor of the videoconference solution. For example, Raghavan and Ma (2014) calculate that replacing 25% of the current 1.8 billion air journeys by videoconference would "save about as much power as the entire Internet [consumes]." An added benefit is that the energy saved is energy that would otherwise be generated by burning aviation fuel.

**Dematerialization of the Product Chain**

Perhaps a more interesting and exciting opportunity is offered by dematerialization of the supply chain for a particular product item. Consider the situation in which some mechanical device (e.g., an office air-conditioning unit) fails. Currently, the sequence of operations is typically as follows: a maintenance engineer is called to the site and diagnoses that the problem requires the replacement of a particular component; the component is then delivered to the site, and the engineer returns to perform the repair.
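As flagged above, the following is a rough worked sketch of the trip-versus-videoconference balance sheet. Every figure (energy per passenger for a long-haul return flight, videoconferencing power draw, meeting length) is an illustrative assumption rather than a number from the text, and the result is indicative only.

```python
# Rough trip-vs-videoconference balance sheet (all figures are illustrative assumptions).

# Assumed long-haul return flight, per passenger:
flight_kwh_per_km = 0.3        # assumed fuel energy per passenger-km, expressed in kWh
return_distance_km = 12_000    # assumed intercontinental return distance

# Assumed two-site videoconference for a 3-hour meeting:
endpoint_power_kw = 0.5        # assumed per-site equipment (codec, screens, lighting)
network_share_kw = 0.2         # assumed apportioned share of network/data-center load
meeting_hours = 3
sites = 2

flight_kwh = flight_kwh_per_km * return_distance_km
vc_kwh = (endpoint_power_kw * sites + network_share_kw) * meeting_hours

print(f"Flight (one traveler, return): {flight_kwh:,.0f} kWh")
print(f"Videoconference (3 h, 2 sites): {vc_kwh:,.1f} kWh")
print(f"Ratio: ~{flight_kwh / vc_kwh:,.0f}:1 in favor of the videoconference")
```

Changing any of the assumed figures changes the ratio, but not the direction of the comparison, which is the point the balance-sheet approach is meant to make.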
Alternatively, the engineer may collect the component or, very rarely (or so it seems), have the required part on hand, but this comes only at the cost of maintaining a stock of such parts, which in turn means a larger vehicle carrying more weight. A 3-D printer on site would allow the replacement part to be produced locally and fitted in a single visit. Remote diagnostics (also supported by IT) would provide a further enhancement: the engineer could arrive knowing that the task was to replace a particular component and find the part waiting on arrival, thus saving time.

The potential for 3-D printing is enormous. Although the spare part example is one of the more trivial ones, it is easy to envisage other products being "delivered" in the same way; examples include clothing, food, and buildings, both terrestrial (BBC, 2014) and extraterrestrial. NASA and the European Space Agency are currently considering the possibility of constructing a moon base using 3-D printers that use lunar soil as the raw material; the cost of transporting the 3-D printers would be significantly less than that of transporting building components from Earth (NASA, 2014). Although it is still necessary for the raw materials to be available for the printer on site, and the limitations of choice and quality (particularly for 3-D printed food) may leave something to be desired with current technology, 3-D printing could clearly offer potential energy savings by reducing the transportation and packaging of finished goods.

Arguably more trivial examples of dematerialization are already with us: the decline in printed newspaper, magazine, and book volumes brought about by their replacement with e-readers is beginning to have a noticeable impact on sales and production methods. The change to online consumption of music and home entertainment has already led to significant declines in the supply of physical media such as CDs and DVDs, with a consequent decline of the high street outlets for these products. The last example is only one part of an even more significant disruptive use of IT: the online shopping boom and its impact on shopping habits. We do not believe that there has been any attempt to calculate the cost (in energy and resources) of the typical online shopping spree in the run-up to Christmas 2014, nor to compare that cost with the energy and resources that would have been needed to collect the same set of goods by traditional shopping. It seems likely, however, that savings of the kind quoted for videoconferencing would also arise for the online activity, given the economies of scale, the logistical improvements achievable by the major online retailers, and the opportunities for planning fuel-efficient delivery. If products such as clothing, toys, kitchen goods, and even food items were to be 3-D printed, the savings would be even higher. Whether we are prepared to accept the societal impacts of this (the inevitable disappearance of actual shops in towns and cities) or the loss to us as humans of the physical and mental activity of the shopping trip adds further complications to any such calculation.

**Travel Advice/Road Traffic Control**

Although the replacement of journeys and the physical movement of goods described here offers the potential to decrease some travel, the human need (or desire) to move around in vehicles of some form seems likely to continue.
Travel in more populated areas can be met by a mixture of private and public transport, and IT can make a positive difference to overall energy use: engine management systems and other devices ensure optimum use of fuel, and communications systems make information on traffic flows and road conditions available so that drivers can consider changing their travel plans to avoid congestion. Public transport vehicles also benefit from such devices, and a well-planned information system allows control of the whole network while keeping customers informed and allowing them to plan their journeys. Finally, citywide traffic control systems provide traffic management across a city, so that local changes to traffic flows, such as sequencing traffic lights and rerouting traffic, reduce delays and thus offer the opportunity to use less fuel. It is likely that these initiatives will develop further and expand to incorporate data such as weather forecasts (cold and rainy weather encourages the use of cars and buses rather than walking or cycling for essential journeys and reduces the number of nonessential journeys, both of which have predictable effects on transport patterns and demand). Other uses of ICT in traffic and travel management are more subtle: charging fees to persuade leisure users not to travel at peak times can be facilitated by tracking technologies such as automatic vehicle license plate recognition, and requiring changes of public transport mode (e.g., bus to train to bus) is made easier by integrated ticketing and timetables. Fuller integration of traffic management, journey planning, and timetabling, together with longer-term planning and modeling of city infrastructure (to allow easier access to places of work and leisure), brings us to the reality of smart cities, the potential of which is now beginning to be exploited.

**Intelligent Energy Metering**

In the United Kingdom and other countries, most domestic energy use (gas and electricity) was until recently recorded and paid for using a labor-intensive process. A "meter reader" would visit each household to take a reading physically (by reading the meter's dial and writing down the number on it). This reading then generates a bill, which is probably the first indication most customers have of their energy use. On receipt of the bill, some users might take action to reduce use, but without a positive feedback loop and accurate (and timely) information, this is often a piecemeal activity. The placement of meters in difficult-to-access locations was another disincentive to proactive energy awareness. So-called smart meters provide easier access to information about energy consumption in real time, making it much more obvious what changes could be made to current consumption. A web-enabled meter can communicate real-time usage to an end-user device (a smartphone or tablet) without requiring the user to locate and check the meter. Among other benefits, the time-consuming round of meter reading can be removed; if the data can be sent to the user's smartphone, it can also be communicated to the energy company. Emerging applications of this technology will include energy suppliers' ability to negotiate with smart meters to balance demand, allowing customers to choose between suppliers using a number of criteria.
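A minimal sketch of the smart-metering idea described above follows, assuming a hypothetical meter that reports half-hourly consumption and a hypothetical off-peak tariff; all names, intervals, and figures are invented for illustration.

```python
# Minimal sketch of a web-enabled smart meter (all names and figures hypothetical).
from dataclasses import dataclass
from datetime import datetime

@dataclass
class MeterReading:
    timestamp: datetime
    kwh: float          # energy used in the last half-hour interval

def report(reading: MeterReading) -> dict:
    """Package a reading for both the household display and the energy supplier."""
    return {"time": reading.timestamp.isoformat(), "kwh": reading.kwh}

def cheap_period(hour: int) -> bool:
    """Hypothetical off-peak tariff: overnight hours are cheaper."""
    return hour < 7 or hour >= 23

def should_run_appliance(now: datetime) -> bool:
    """Defer flexible loads (e.g. a dishwasher) to off-peak periods."""
    return cheap_period(now.hour)

reading = MeterReading(datetime(2015, 1, 12, 23, 30), kwh=0.42)
print(report(reading))
print("Run dishwasher now?", should_run_appliance(datetime(2015, 1, 12, 23, 30)))
```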
Moving the negotiation process further into the domestic arena, customers can determine the pattern of their energy costs for household devices, especially washing machines and dishwashers, and run them during periods of lower demand. More controversially, there is the prospect that the energy supplier could control operation and behavior more directly by restricting supply. We are not aware of this actually taking place, but it is indicative of the challenges arising from the opportunities for more control by more complex technology. The extent to which we are willing to allow technology to control our behavior and lifestyle is a topic worthy of a full research study in its own right but is beyond the scope of this book.

**Building Management Systems**

We are familiar with systems that control the temperature of the buildings in which we live and work, from domestic central heating controllers to office air-conditioning units. In large buildings, such as office blocks, hospitals, and universities and colleges, these control systems are being merged with sensors and other input systems in the form of a building management system (BMS). A complex BMS has the potential to provide very detailed control and operation not just of the heating, ventilation, and air conditioning (HVAC) system but also of lighting and security systems; it can even provide room ambience (mood music) appropriate to the occasion. By including inputs from weather forecasting systems, the BMS can adjust HVAC settings to prepare a building for coming weather conditions, for example, providing some heating while energy is at a cheap rate because the outside temperature is predicted to fall over a three- to five-day window, rather than waiting until the temperature falls below a set level after two days of cooler weather and then using more expensive energy to provide the required heating. Other BMS-related opportunities include linking room use to HVAC settings and adjusting heating to the number of occupants or their level of activity; more active occupants need less heat (or more cooling), with the less active needing the opposite. Security could also be enhanced via an intelligent BMS: movement detected in a room that is scheduled to be unoccupied would trigger an alarm, and the number of people counted entering a given room could be compared with the room's safe occupancy level, allowing action to be taken as necessary. As with energy metering, it is likely that the potential range of uses to which this technology is put will be limited as much by human factors as by any limitations in the technology itself. The degree to which we are prepared to accept the "intrusion" that many of these applications bring with them, in both monitoring behavior and adjusting the environment, remains to be seen. Once data are available relating to an individual's presence in a given location (or to the intensity with which the individual is moving within that location), legal and regulatory action is needed to prevent the use of that information for other purposes, such as monitoring active working hours.

**Saving IT Resources - A Drop in the Ocean?**

It is clear that individuals making the changes available to them will have a negligible (drop in the ocean) impact on the overall energy consumption of IT, whether current or emerging.
One person deciding to cull unwanted photographs stored "somewhere in the cloud" is not going to change the resource requirements of the cloud servers or the network that connects them. However, a sufficient number of individuals all doing the same thing would probably have a measurable impact. The likelihood of this is, of course, small, although workplace campaigns to raise awareness can have a limited but measurable effect. Campaigns to encourage office staff to turn off their computer systems on weekends typically have some impact; the problem is that the impact of the campaign, and hence the level of response, diminishes over a relatively short time scale. To make a major, lasting difference, legislators or providers need to take action; the most effective tool is financial, making it more expensive to produce and retain more data. Providers could also introduce a "delete by default" form of operation in which stored photographs are automatically deleted after a set time period unless the owner explicitly tags them for retention, thereby taking advantage of the inertia that affects the majority of people (a minimal sketch of such a policy appears at the end of this section).

Another "drop in the ocean" response to the call for greater efficiency in the IT sector relates to the relative size of its environmental footprint (10% of energy use, 2% of greenhouse gases) compared with other sectors, not only the airline industry but also heavy manufacturing such as steel and chemical works. This is in addition to the underlying inefficiency of much fossil fuel electricity generation. Although this is undoubtedly true, a counterargument is that if some proportion of that 2% (or 10%) can be reduced, the opportunity should be taken. With energy demands at the level discussed earlier for the zettabyte network, even a small relative change is significant in absolute terms.
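As flagged above, here is a minimal sketch of a "delete by default" retention policy, assuming a hypothetical photo store in which each item carries an upload date and an optional "keep" tag; the data model and retention period are invented for illustration.

```python
# Minimal "delete by default" sketch (hypothetical data model, not a real API).
from datetime import date, timedelta

RETENTION = timedelta(days=365)   # assumed default retention period

photos = [
    {"name": "holiday.jpg", "uploaded": date(2013, 6, 1),  "keep": True},
    {"name": "receipt.png", "uploaded": date(2013, 6, 1),  "keep": False},
    {"name": "meeting.jpg", "uploaded": date(2014, 11, 3), "keep": False},
]

def sweep(items, today):
    """Retain items explicitly tagged 'keep' or still inside the retention window."""
    return [p for p in items if p["keep"] or today - p["uploaded"] <= RETENTION]

print(sweep(photos, today=date(2015, 1, 1)))
# Only 'holiday.jpg' (tagged) and 'meeting.jpg' (recent) survive; untagged,
# out-of-date items are removed without any action from the owner, turning
# user inertia into storage savings rather than storage growth.
```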