REN R 105 Midterm 2 (2024) Lecture Notes

Document Details


University of Alberta

2024

Professor Nielsen


Summary

These lecture notes from REN R 105, Midterm 2, cover the basics of climate history and geography, focusing on topics like terrain-related microclimates, climate variations, and radiative forcing. The document also references the Gaia hypothesis and climate over the past 1500 years.

Full Transcript


REN R 105, Midterm 2 (2024): Notes from Professor Nielsen on topics to focus on (especially anything in bold)

Lecture 8, Physical Environment V: Basics of Climate & History (slide 21+)

Terrain-related microclimates
o Cold air is heavier, so it sinks while warm air rises; in some places, cold-air drainage can create frost pockets where trees cannot grow (too many summer frosts).
o Terrain creates slope-aspect relationships that alter the local microclimate; in the Northern Hemisphere, south-facing slopes are warmer than north-facing slopes. This effect can be substantial enough to change vegetation communities (in Edmonton's River Valley: grasslands on south-facing slopes, forests on north-facing slopes).
o Large, deep lakes can create substantial cool microclimates. Along Lake Superior, for example, the cold water creates arctic-alpine-like shoreline temperatures, making the shore a hotspot (refugium) for disjunct arctic-alpine plants.

Climate over deep history & Gaia hypothesis
o Over the last 500 Ma the Earth has swung between hothouse (no ice caps) and icehouse (ice caps present) conditions; the temperature swings are most apparent at high latitudes, while the tropics (equator) are relatively stable.
o Lovelock first proposed the Gaia hypothesis (later joined by Margulis), which states that the Earth is a self-regulating system whose temperature stays within a defined range due to negative feedback. It was later supported by the Daisyworld simulation, which focuses on changes in albedo (learn this definition) caused by changes in the colour of the planet's surface.

Climate variations from the Younger Dryas (YD) to the Little Ice Age
o Orbital variations can partly explain the end of the last glaciation and the more recent shift from the Holocene Thermal Optimum (10 ka to 5 ka) to the Neoglacial (5 ka to recent).
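The albedo feedback at the heart of Daisyworld can be illustrated with a zero-dimensional energy-balance calculation. This is a sketch, not the actual Daisyworld model; the albedo values are illustrative, and only the 1360 W/m2 solar constant comes from the notes.

```python
# Zero-dimensional energy balance: a brighter (higher-albedo) surface
# reflects more sunlight and equilibrates at a cooler temperature.
# This is the feedback Daisyworld exploits: white daisies cool the
# planet, dark daisies warm it. Albedo values below are illustrative.

SIGMA = 5.67e-8  # Stefan-Boltzmann constant, W/m^2/K^4
S = 1360.0       # solar constant just outside the atmosphere, W/m^2

def equilibrium_temp(albedo):
    """Planetary equilibrium temperature (K): absorbed = emitted,
    i.e. (S/4) * (1 - albedo) = SIGMA * T**4."""
    return ((S / 4) * (1 - albedo) / SIGMA) ** 0.25

t_dark = equilibrium_temp(0.1)    # mostly dark surface
t_earth = equilibrium_temp(0.3)   # roughly Earth-like albedo
t_bright = equilibrium_temp(0.6)  # mostly white surface

print(f"dark: {t_dark:.0f} K, Earth-like: {t_earth:.0f} K, bright: {t_bright:.0f} K")
# A brighter planet is cooler, so if warming favours white daisies
# (raising albedo), the system pushes temperature back down: negative
# feedback, as in Lovelock's simulation.
```

Raising the albedo monotonically lowers the equilibrium temperature, which is the negative feedback the lecture describes.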
The Holocene Optimum was warm enough to support higher treelines in the mountains (Alps, Canadian Rockies); the lowest treeline since the Younger Dryas occurred during the Little Ice Age.
o Before our current warming, the Little Ice Age was the coolest period since the end of the Younger Dryas, and it generally matches trends in obliquity well.

Climate over the past 1500 years:
o Two cool periods and two warm periods, in the order: Dark Ages (cold), Medieval Warm Period, Little Ice Age, Modern Warming. [I'm less concerned about you knowing dates/years than the order.] Many studies support the Medieval Warm Period, although some argue it was not so warm and focus instead on recent warming since the end of the Little Ice Age (the current reference is the Industrial Revolution to 1850, the coldest 500 years in the past 11 ka). This is also the period when temperature stations were started.

Lecture 9, GIS

G.I.S. stands for Geographical Information System
o Geographical relates to the geography of the data
o Information relates to the location, scale, and type (presentation) of the data
o System refers to the hardware/software used to organize/manage the data

Georeferenced data
o Data are tied to 2D or 3D coordinates; every datapoint must have location (x, y) coordinates. A database is the organized structure, while tables are used to store the information; an attribute is the description of the data values.

GIS data model
o GIS data models use either Vector or Raster formats. Vector data are discrete features (points, lines, polygons); Raster data are continuous values defined on a grid.

Lecture 10, Physical Environment VI: Recent Climate Change

Radiative forcing of the sun:
o The solar constant just outside our planet is 1360 W/m2.
o Top-of-atmosphere solar insolation, averaged over the Earth, is 340 W/m2.
o Solar insolation at the Earth's surface is 161 W/m2.

Be generally aware of Earth's energy balance.
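The jump from 1360 to 340 W/m2 is just geometry: the Earth intercepts sunlight over its cross-sectional disc (area pi*r^2) but the energy is spread over the whole sphere (area 4*pi*r^2), so the average insolation is the solar constant divided by 4. A quick check, using only the numbers from the notes:

```python
# Average top-of-atmosphere insolation: the Earth intercepts sunlight
# over a disc (pi*r^2) but spreads the energy over the full sphere
# (4*pi*r^2), so the average is the solar constant divided by 4.
solar_constant = 1360.0            # W/m^2, just outside the atmosphere
toa_average = solar_constant / 4   # W/m^2, top-of-atmosphere average
surface_average = 161.0            # W/m^2, reaching the surface (from notes)

print(toa_average)                               # 340.0
print(round(surface_average / toa_average, 2))   # fraction reaching the surface
```

So slightly under half of the average top-of-atmosphere insolation makes it to the surface; the rest is reflected or absorbed on the way down.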
o I won't expect you to remember all the number details, but I expect you to understand the interplay between incoming (shortwave) radiation and outgoing (longwave) radiation, with clouds having a big effect by either reflecting incoming radiation or absorbing outgoing radiation. CO2, water vapour & other GHGs (greenhouse gases) affect outgoing longwave radiation (but not incoming shortwave radiation); this relationship is referred to as the 'Greenhouse' effect.
o You don't need to remember Planck's curve, but know that the greenhouse effect of CO2 on longwave radiation starts to diminish as it saturates (at higher CO2 levels). Each successive doubling of CO2 adds less warming per unit of CO2, so warming is greatest for an initial doubling from low levels. Understand the overall levels and ranking of radiative forcing for different greenhouse gases: water vapour has the biggest greenhouse effect (50% of the total), clouds are next (25%), CO2 is third (20%), and other gases make up the rest (5%).
o Note that anthropogenic radiative forcing is larger than recent natural variations: volcanoes (erupting above the surface) have a cooling effect, while solar cycles (longer and shorter) have +/- effects (but both are still somewhat dwarfed by the recent CO2 effect).
o Most of the atmosphere is N2 (78%) and O2 (21%), with CO2 at 0.04%; quite small, but even at low levels it can have noticeable effects on climate (the greenhouse effect). Because it is so small, we don't express CO2 as a % of the atmosphere but rather in parts per million (ppm); CO2 is currently at 420 ppm, from the Keeling curve measured at Mauna Loa, Hawaii.
o Globally, we are releasing about 35 billion tonnes of CO2 each year, with per capita CO2 emissions for Canadians at about 15 tonnes per year. Recent decades show declining per capita emissions for Canadians, Americans, and Europeans (globalization & the closing of coal-burning plants), while emissions are rising for China, where industry is expanding rapidly.
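The 'diminishing effect per doubling' can be made concrete with the standard logarithmic approximation for CO2 radiative forcing, delta-F = 5.35 * ln(C/C0) W/m2. The 5.35 coefficient is the commonly cited simplified expression, not a number from the notes:

```python
import math

# Simplified logarithmic approximation for CO2 radiative forcing.
# The 5.35 W/m^2 coefficient is a widely used rule of thumb, not a
# value taken from the lecture notes.
def co2_forcing(c_new_ppm, c_ref_ppm):
    """Radiative forcing (W/m^2) from changing CO2 from c_ref to c_new."""
    return 5.35 * math.log(c_new_ppm / c_ref_ppm)

# Every doubling adds the same forcing...
print(round(co2_forcing(560, 280), 2))   # first doubling
print(round(co2_forcing(1120, 560), 2))  # second doubling: same value

# ...so the forcing *per added ppm* keeps shrinking (saturation):
print(round(co2_forcing(560, 280) / 280, 4))   # W/m^2 per ppm, 280 -> 560
print(round(co2_forcing(1120, 560) / 560, 4))  # W/m^2 per ppm, 560 -> 1120
```

Because the forcing is logarithmic, each doubling contributes the same increment, while each added ppm contributes less and less, which is the saturation effect the lecture describes.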
Measures of recent climate change:
o Over the past century or so, we have warmed about 1.5 °C (land + water); but land is warming faster (about 2 °C) than the oceans (about 1 °C).
o There is not much support for major global changes in precipitation, with the frequency of regional droughts, floods, and hurricanes about the same as in the past.
o Weather stations do have biases in geographic distribution and local siting (nearby objects warm them through radiation, with minimum temperatures most affected).
o There is some concern about stations becoming increasingly urbanized over time and thus measuring the Urban Heat Island effect (cities are often 3 °C warmer than rural areas), which is mixed with the effects of CO2 warming (land use-related warming + CO2 greenhouse warming). One study of rural-only stations showed a 0.55 °C/century change vs. 0.89 °C/century for all stations (don't worry about the numbers).
o Satellites provide an independent measure of global skin temperature without station biases, and they show that the broad trends agree, even if minor biases are introduced.
o Global dimming was a decades-long period during which air pollution reflected incoming radiation and kept the planet cooler than it otherwise would have been. Legislation to clean the air has been suggested as the explanation for the shift from an initial lack of warming with CO2 to more recent warming. This also includes the recent (2020) marine shipping fuel standards, which have reduced the ship-track clouds that were reflecting heat.
o Arctic amplification is the greater warming of the planet at higher latitudes, particularly in the Northern Hemisphere (Antarctica hasn't warmed nearly as much as the Arctic). It is thought to be due to melting sea ice (sea ice extent at the end of summer, not winter), which has been declining over the past few decades. Sea ice has a high albedo that keeps temperatures cool, so melting sea ice exposes darker water that absorbs heat, leading to more warming.
This amplification of warming is also observed at high altitudes.
o Recent warming can be decomposed into natural sources (natural cycles) and anthropogenic sources. Natural sources include El Nino warming, the solar cycle, and the Hunga Tonga eruption (water vapour); anthropogenic sources include marine fuel pollution reduction and greenhouse gases. In some years the sum of natural factors exceeds the man-made contribution, but cumulatively man-made warming exceeds natural variation, particularly since natural variation swings cyclically between cool and warm.
o Scale is super important when interpreting current warming, as we are nearly always comparing to some reference ('Normal') period and reporting an anomaly from it. In the short term, we are much warmer than in the late 1800s but cool relative to the past millions of years (even relative to the Medieval Warm Period). Changes are rapid today, but there have been rapid changes in the recent past, too (Younger Dryas, etc.).
o Climate-related deaths from extreme weather are at record lows over the commonly reported climate change timescale (late 1800s/early 1900s).
o Bjorn Lomborg is a Danish economist who has run several projects ranking wicked problems facing humanity; on a cost-benefit basis, climate change ranks low given its high cost for little overall change. He doesn't say climate change isn't a problem (it is), just that it is an expensive one where limited dollars don't do much relative to the biggest problems for humanity (education, health, war/conflicts, disease, etc.).

Lecture 11, Physical Environment VII: Future Climate Change & Implications

GCM is a General Circulation Model, while RCM is a Regional Climate Model.
IPCC is a UN working group on climate change that develops periodic reports on the state of climate change and possible adaptations/mitigations. We are currently on the 6th report (Assessment Report 6, AR6).
o The IPCC develops a framework for standardizing GCMs and puts them into scenarios in which human population and transitions in energy are simulated. The old scenario set was called RCP (Representative Concentration Pathways); it is now being replaced by SSPs (Shared Socioeconomic Pathways).
o SSPs of changes in energy and geopolitics are labelled with the level of radiative forcing (warming) in units of W/m2. These range from 1.9 (unlikely, too low) to 8.5 (unlikely, too high); most suggest a moderate warming scenario is most likely (2.6 to 4.5), resulting in 1.8 to 2.7 °C of warming by 2100.
o Because there is a lot of uncertainty in GCMs even for a single SSP, the most common prediction method is an ensemble model: the average of multiple models (avg. temperature, avg. precipitation), which reduces uncertainty.
o A key limitation of the SSPs is pinning down long-term changes in human populations and energy transitions. For populations, we have been revising the projected peak downward and later into the century, to a little over 10 billion (a decade ago we were projecting 12 billion). This is partly why 8.5 W/m2 is unlikely. The reduction is due to declining fertility rates: we are now at 2.3 children per woman, so still growing at the global scale (though many countries are below replacement). A stable population requires a fertility rate of 2.1 children per woman.

Net Zero will be expensive and quite challenging to achieve in the short term (mid-century), but it is a focus of many environmental organizations and, increasingly, governments. Lomborg examines this relative to what it would take to keep temperatures in check.
o Specifically, Lomborg did an assessment of climate change risk relative to risks to human welfare (note that he ignores ecological issues like biodiversity loss, etc.).
He takes an optimization approach, minimizing the total cost of (a) policy cost and (b) climate cost (disasters & mitigations), resulting in an optimal cost of about $125 trillion to keep warming below 3.5 °C. We are already 'baked in' to warm another 1.5 °C with the CO2 currently in the atmosphere; keeping warming to 2.2 °C would cost about $225 trillion. [I don't expect you to memorize all these numbers.] These are big numbers and require much bigger changes than any current climate agreement (e.g., the Paris Agreement). Lomborg also circles back to what the public, in UN surveys, ranks as the highest priorities for public expenditure on human welfare (again, this ignores biodiversity issues, etc.): the public consistently ranks climate lower than education, health, etc. So there is a disconnect between policy folks and the public in priorities and, thus, in where public dollars are currently invested.
o Lomborg also illustrates in these calculations the need to consider that natural disasters of similar intensity (say, floods of the same size) are causing larger and larger damages. This is not necessarily due to changes in the frequency or intensity of natural disasters (which may be changing in some locales but globally not much) but rather to the increasing size of urban areas that overlap areas of risk (say, a flood plain). This concept of urban areas expanding over existing risky places is called the 'Bull's-eye effect': the bull's eye (urban area) of risk (say, flood risk) keeps getting bigger due to geography.

Example applications of risk to the ecology of species
o The velocity of climate change reflects the fact that geography and terrain vary within a region, making the distance to the nearest similar climate in the future variable.
Basically, the velocity of climate change measures how fast you would need to move to 'keep up with climate change', i.e., to migrate and maintain some kind of climate equilibrium. Velocities are much lower in the mountains or near coastal areas than in flat areas of a continent's interior. [I don't expect you to remember any of the case studies of Alberta and rarity.]
o The refugia concept, particularly climate refugia (or climate change refugia), is somewhat related to the velocity of climate change in that it looks for areas of low velocity; but instead of using only climate data, it uses the climate niches of individual species in current vs. future periods to find places where species will persist through time. It has also been applied to historic refugia, such as glacial refugia, where species persisted during the last glacial maximum.

Lecture 12, Biosphere I: Biomes to Niches

Biosphere definitions
o The biosphere is the place on Earth where life dwells.
o There are several concepts for defining its types; the first, from Holdridge, defines Life Zones using precipitation, potential evapotranspiration, and humidity levels.
o Whittaker's biome concept is popular and even simpler, using only mean annual temperature and precipitation and mapping the dominant vegetation type within different zones.
o Whittaker also noticed something interesting: climatically, large areas of the planet should be forested but are not (they are climatically suitable for forest). He called these places, less forested than predicted, 'Ecosystems Uncertain'. Bond suggested the solution: consumer-controlled ecosystems (herbivory and/or fire).
o Ecoregions use the biome concept but add a hierarchy (3 levels of detail) and supplement the climatic criteria of biomes with dominant substrates for further subdivision.

Case study of the Temperate Deciduous Forest biome:
o Disjunct species are common between East N.
America and East Asia. In fact, there are so many commonalities that Gray suggested East North America's flora should be more similar to East Asia's than to Western North America's (Gray's hypothesis). This is generally supported. But how? It is thought to be due to the evolution of the ecosystem at a time in Earth's history when the continents were close together at the north polar region during a hothouse period (66 Ma).
o Not only were biomes in different places on the planet over longer periods, but in recent times, at the end of the last glacial maximum, biomes were shifted and in some cases contained communities that are non-analog to today's biomes/vegetation (not like anything today) due to unique climates and the migration histories of species. In North America, glacial refugia are thought to have been, in the east, the southern Appalachian Mountains and, in the west, a complicated series of places in the inter-mountain west and coastal California, plus, to a lesser extent, Beringia in Alaska.
o Some post-glacial changes in vegetation communities were due to differential migration of species, with trees the most widely studied, demonstrating some paradoxes with initial expectations. Botanists assumed light, wind-dispersed species should migrate faster than heavier-seeded species, but the opposite was the case (hence the paradox). Reid named this phenomenon (Reid's paradox): larger-seeded species like oaks migrated faster thanks to animal dispersal agents.

Some common biogeographic rules
o Rapoport's Rule: range sizes increase with latitude (more variable climates).
o Bergmann's Rule: the body size of animals increases in colder environments/higher latitudes (greater efficiency in dealing with cold temperatures).
o Range limits north vs. south: abiotic factors most affect range limits at the northern edge of a species' range, while biotic factors (competition) most affect range limits in the south.
Concepts of Niche
o Grinnellian (the ecological role, 'habitat', or 'needs' of a species), Eltonian (place in the biotic environment, such as trophic level), Hutchinsonian (the multiple dimensions of the environment in which a species persists).
o Hutchinson's niche can be subdivided further into the (1) Fundamental niche (all possible conditions under which a species can persist, including conditions not observed on Earth, so largely unknown but determinable in growth chambers, etc.); (2) Potential niche (where a species would occur in the absence of biotic factors & dispersal limitations); and (3) Realized niche (the place/environment actually occupied by the species, as shaped by interactions with other species, etc.).
o Niche conservatism is the retention of ecological traits over time; it is assumed in climate change risk assessment models for species and is largely supported by studies matching the paleo-history of species locations to reconstructed climates of those times.
o The BAM Venn-diagram approach breaks the (Hutchinson-like) niche idea into Biotic, Abiotic, and Movement (BAM). It can be considered under two conditions: (1) Equilibrium distributions, where there is no limitation on movement/migration; or (2) Non-equilibrium distributions, where we assume species are not at equilibrium with the climate and its locations on the planet (limitations in migration, etc.).
o The Eltonian Noise hypothesis assumes A = B in BAM and that species are at equilibrium with the environment (no dispersal/movement limitations). Many species distribution models for climate change assume this.
o Hutchinson's duality points out, however, that E-space (environment space) does not map one-to-one onto G-space (geographic space), nor are they reciprocal [non-reciprocal: one G-space location has only one E-space, but one E-space can map to multiple G-spaces].
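The BAM idea can be sketched as set intersections over grid cells. This is a toy illustration of the Venn-diagram logic, not a real species distribution model; the cell IDs are invented:

```python
# Toy BAM illustration: treat the landscape as numbered grid cells, and
# B (biotic), A (abiotic), and M (movement/accessible) as the sets of
# cells where each condition is satisfied. Cell IDs are made up.
B = {1, 2, 3, 4, 5, 6}  # cells where biotic conditions allow persistence
A = {4, 5, 6, 7, 8}     # cells with suitable abiotic (climate) conditions
M = {2, 4, 5, 9}        # cells the species can actually reach

occupied = B & A & M    # realized distribution: all three overlap
potential = B & A       # potential distribution, ignoring dispersal limits

print(sorted(occupied))              # cells where the species actually occurs
print(sorted(potential - occupied))  # suitable but unreachable cells: these
                                     # become invasion sites once humans
                                     # transport the species there
```

Under the equilibrium assumption (no movement limitation, M covers everything), the occupied and potential sets coincide; the non-equilibrium case is exactly the gap between them, which is also why suitable-but-unreached environments can host invasive species.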
This can help explain why species distributions are not really at equilibrium with the global locations of their climate: the occupied environment space can occur in wildly different places on the planet that the species had no historical ability to colonize, despite good environmental conditions. It also helps explain why we have so many invasive species (the environment was suitable, but they could not get there until humans transported them)!

Lecture 13, Biosphere II: Trophic Systems

Basic trophic levels:
o Species occur within trophic levels, starting at primary producers (1st level), moving to a second level of primary consumers (herbivores/heterotrophs), and up to apex predators at the top. The rule of thumb is that 10% of energy is retained (transferred) to the next level. So, in a simple 3-level plant-herbivore-carnivore system, only 1% of the energy is retained at the 3rd level.
o Although energy is lost at each higher level, toxins accumulate. This is referred to as bioaccumulation.

Green world and trophic cascades:
o The Green-world hypothesis tries to explain why we are surrounded by mostly uneaten plants (if plants and herbivores were at equilibrium, we should see far more vegetation eaten). The explanation is that apex predators control herbivores, and with reduced herbivore numbers we see greater maintenance of plants (the primary producer layer). Terborgh tested this on BCI in Panama and, more specifically, on islands in Venezuela, where he proposed that 'big things/predators run the world', a 'top-down' perspective rather than the long-standing assumption that the world is run bottom-up. Schmitz further tested this in an old-field plant-grasshopper-spider trophic system.
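The 10% rule of thumb reduces to a one-line calculation; here is a sketch of the rule exactly as stated in the notes (the 100-unit starting value is just an example):

```python
# 10% rule of thumb: each trophic level retains about 10% of the energy
# of the level below it, so energy at level n is P * 0.1**(n - 1), where
# P is the energy fixed by the primary producers (level 1).
def energy_at_level(primary_energy, level, transfer=0.10):
    """Energy retained at a given trophic level (level 1 = producers)."""
    return primary_energy * transfer ** (level - 1)

# Simple 3-level plant-herbivore-carnivore system, with 100 energy units
# fixed by the plants:
print(round(energy_at_level(100, 1), 6))  # plants: all 100 units
print(round(energy_at_level(100, 2), 6))  # herbivores: 10% of the original
print(round(energy_at_level(100, 3), 6))  # carnivores: 1% of the original
```

The same function also shows why long food chains are rare: by a 5th level, less than 0.01% of the original energy remains.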
The impact of removing the top predator on the rest of the ecosystem is often called a trophic cascade: a small change in total biomass at a higher level (loss of predators) can cause dramatic changes in all the levels below. Yellowstone National Park is perhaps the most widely known example.
o Although the top-down perspective is popular and documented in places, this doesn't mean it dominates everywhere. Plant defenses (physical, like thorns, or more commonly chemical) can restrict herbivory, making plants less sensitive to herbivore numbers or to the loss of apex predators.
o Note that an odd vs. even number of trophic levels makes a big difference in the effects on the primary producers (the base). In a 3-level system, loss of the 3rd-level predator results in loss of the base plants, while in a 4-level system, loss of the top level does not, since effects alternate between levels.

Case study of white-tailed deer hyperabundance:
o Hyperabundant herbivores like white-tailed deer in eastern North America, where their primary predator (wolves) is gone, have had dramatic impacts on the ecosystem (loss of forest regeneration, biodiversity loss, even extirpation of a carnivore).
o There are massive impacts on humans (deer are the #1 most dangerous wildlife to human life), including Lyme disease.
o Hunting is an obvious way to manage populations, but in some regions where deer numbers are high there has NOT been public support for changing hunting methods to control deer numbers (such as focusing on killing female deer; the tradition is trophy hunting for large bucks). Case study of Pennsylvania and Gary Alt's job.
o Quotes and ideas from Aldo Leopold put forward the importance of predators and possible trophic cascades decades before the idea became common.
