The ongoing, decades-long integration of non-atmospheric components into climate science takes many forms. This reflects increasing recognition of the two-way interactions by which climate impacts both modulate and are modulated by culturally based decisions, demographic vulnerabilities, societies' financial resources and technical capabilities, and so on. Simultaneously, the emphasis on mean change that characterized the early IPCC reports has given way to a diverse bouquet of illustrations of how rare and geographically patchy conditions cause the lion's share of impacts in just about every way that we care about: health, ecosystems, economies, and societies. Even among extreme events, it is often multifaceted ones (or combinations of them) that are responsible for the most severe damages. These 'compound' events can result from intrinsic physical processes (as with the tight relationship between heat and drought in many places, or a single storm affecting coastal water levels in several ways), or from the more or less random co-occurrence of two events in close spatial or temporal proximity (e.g. wildfires followed at a lag of weeks or months by heavy rain).

Sector-specific analyses, e.g. for food, hydrology, or insurance, have gone a step beyond physical hazards by diving deep into the relevant locations, actors, and relationships that affect risks. Along with work by the natural-hazards community, these studies have refined the concept of 'risk' into a nuanced portrait of how individual and sociocultural characteristics ('societal drivers') and physical characteristics ('climatic drivers') combine and interact to produce the observed impact from an observed event.

Inspired by such ideas, in May 2019 I and others hosted a workshop at Columbia University exploring the dual frontiers of (1) the characteristics of interacting extreme events and (2) how they affect human societies. Our new paper builds on some of the main threads of the workshop, but adds what had been an overlooked element: how human decisions can amplify impacts relative to what they might otherwise have been, and in some cases can even feed back onto the severity of the event itself. In the terminology that we propose, these societal drivers transform an event (or event sequence) from being 'compound' to 'connected'; that is, interactions between events or event components can occur regardless of whether their physical hazards compound or not.

The example we describe in the most detail -- because it was recent, well-documented, and catastrophic for those who were shortchanged -- revolves around Hurricane Maria's impacts on Puerto Rico in 2017. The island's existing vulnerabilities in infrastructure and emergency-response systems had already been laid bare when Hurricane Irma struck weeks earlier; Category 5 Maria added to the misery. But beyond that, bureaucratic inefficiencies and resource limitations on the part of the Federal Emergency Management Agency made for a slow and painful recovery, ultimately magnifying the long-term costs.

Once the (admittedly complex) prototype is understood, examples abound. Multiple-breadbasket failure, for instance? Its likelihood is implicitly defined by, and thus modulated by, the places where crops are grown (and this could be changed in the future). Large wildfires primed by naturally dry years and climate-change-aggravated summertime heat? They're also primed by untrimmed vegetation, aging equipment, and land-use policies that allow or even encourage population growth in canyons and hills.
In fact, the figure below from the paper illustrates how peaks in the time series of damages from four major types of extreme events are all straightforward to explain using this framework.

Paper figure 2: Major losses caused by extreme climate events over 1980-2019 and their connective elements, among tropical cyclones (green), floods (blue), droughts (orange), and wildfires (red). Annotations in high-loss years state some of the (first row) climatic and (second row) societal drivers that shaped the total impacts. Loss data are from Aon, Catastrophe Insight Division.

In the course of writing the paper, we realized that intricacies of the type described above are not at all rare. They may only seem so because data limitations, and the intimidating nature of the multiple feedback loops that can be present, have restricted the number of reports that directly reference such interactions. However, it does not take much supposition to conclude that connected extreme events are an important and fertile area spanning the intersection of climate science, engineering, and the social sciences, and one for which we recommend some principles for effective research, based on the research and practitioner expertise of our author team.

Chief among these is the recognition that, to be actionable, climate information must be provided in a way that is directly congruent with existing decision-making pathways (whatever the intended application). Studies focusing on even a slightly different area, timescale, or variable are nearly impossible for stakeholders to integrate, considering that they have many responsibilities and considerations outside of the climate domain, as well as legal mandates and financial limitations. In other words, collaborations that lead to important impact-relevant research share many characteristics with collaborations among scientists: they must be carefully and deliberately constructed; not suck up all of either party's time; rest on a solid foundation of trust; and be flexible enough to recognize each other's interests and strengths, yet firm enough to mutually shape them. Storylines, stress testing, and ethnographic surveys are some of the promising yet still sparsely employed techniques for identifying research questions that sit at the junction of what can happen, what can be found, and what a decision-maker needs to know when it comes to connected extreme events.
Notwithstanding the topic's early stage, we conclude the paper on a note of optimism, relating that similar challenges have been successfully met in other contexts, and that it takes only a sufficient incentive for people to come together and devise improvements to a functional but suboptimal status quo. The increasing severity and frequency of many types of climate extremes, resulting in a continual stream of new combinations of impactful and hard-to-predict connections, will hopefully provide that incentive. Even without improved forecasts or projections per se, we are convinced that investing in better awareness of, and allocation of resources for, connected extreme events is more than worthwhile if it enables responses that result in recovery, innovation, and increased resilience going forward, rather than degradation of human and financial capital and a perverse widening of inequalities.
In the virtual pages of Science Advances today appears a study by Tom Matthews, Radley Horton, and me reporting advances in knowledge of the historical record of heat-humidity extremes. In this post I'll discuss some of the key findings, my thoughts about how they fit into the broader scientific context, and the kinds of future work that they help incentivize.

The study is making some headlines, probably because the concept of a wet-bulb temperature so high that it physiologically cannot be withstood for prolonged periods captures people's imaginations, but we have been careful to point out that the bulk of the work (as well as the novelty) rests in the story about values that are still extreme but decisively below this level. For instance, a wet-bulb temperature of 33C corresponds to an air temperature of 40C accompanied by a dewpoint of 31C, and even a wet-bulb temperature of 31C (which has occurred about 7,500 times globally over 40 years) exceeds the all-time maximum in the notoriously humid Washington, DC metropolitan area. These values preferentially occur in South Asia and select portions of Mexico, the Middle East, Australia, and East Asia.

As published, the station dataset HadISD actually contains significantly more such extreme values (including dozens of exceedances of 35C), but we progressively eliminated many of these with additional quality-control measures, ranging from temporal and spatial consistency checks (against the station itself, other stations, and reanalysis) to removing any dewpoint temperatures above the widely reported 35C dewpoint recorded in Dhahran, Saudi Arabia in July 2003. We also dug deep into possible instrumental and observer errors. The result is what we consider a conservative estimate of global frequency and intensity -- keeping in mind that many of the stations record at 3- or 6-hourly intervals rather than hourly, and also the paucity of good-quality observations in many global hotspots such as the Sahel; Iran and Pakistan; and East Africa.

Every threshold value except for the exceedingly rare 35C has experienced at least a doubling in frequency since 1979, underscoring the tight and nonlinear relationship between global mean temperature and extreme heat. This creates an alarming prospect in a world where temperatures are rising rapidly. Expanding the reach of artificial cooling (in the form of air conditioning) serves as an obvious solution in the short term, but looking forward, the inevitability of further increases is a sobering reminder of the importance of both mitigation of the temperature rise and adaptation to what's already in store. Architecture, urban and regional planning, and national policies around farming and land use could play important roles on the adaptation side, but they need to be guided by highly region- and season-specific knowledge in order to be effective. For instance, in South Asia prior to the onset of the summer monsoon, wet-bulb extremes are driven primarily by excessive temperatures (~45C or higher), whereas afterwards they occur in conjunction with greater relative humidity but lower temperatures (more like 37C). This difference could lead to policies that prioritize reducing heat in spring and reducing humidity in summer, while remaining cognizant of the many constraints on behavioral changes, such as the need of farmers to irrigate at specific times within the growing season.
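As an aside for readers who want to translate between these quantities, here is a minimal sketch of how a temperature-dewpoint pair maps onto a wet-bulb temperature. It uses the Magnus formula for vapor pressure and Stull's (2011) empirical fit, which is a rough sea-level-pressure approximation rather than the more exact calculation used in the paper, so treat it as illustrative only.

```python
import math

def sat_vapor_pressure(t_c):
    """Saturation vapor pressure (hPa) via the Magnus formula."""
    return 6.112 * math.exp(17.62 * t_c / (243.12 + t_c))

def wet_bulb_stull(t_c, td_c):
    """Approximate wet-bulb temperature (C) from air temperature and dewpoint
    using Stull's (2011) empirical fit, valid roughly for 5-99% relative
    humidity and -20C to 50C at sea-level pressure."""
    rh = 100.0 * sat_vapor_pressure(td_c) / sat_vapor_pressure(t_c)  # RH in %
    return (t_c * math.atan(0.151977 * math.sqrt(rh + 8.313659))
            + math.atan(t_c + rh) - math.atan(rh - 1.676331)
            + 0.00391838 * rh ** 1.5 * math.atan(0.023101 * rh)
            - 4.686035)

# The pairing quoted above: T = 40C with a dewpoint of 31C
print(round(wet_bulb_stull(40.0, 31.0), 1))  # ~33C wet-bulb
```

Running this for the example above returns roughly 33C, matching the pairing quoted in the text; Stull reports the fit as accurate to within about a degree over its stated validity range.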
Global trends in extreme humid heat. (A-D) Annual global counts of TW exceedances above the thresholds labeled on the respective panel, from HadISD (black, right axes, with units of station-days) and ERA-Interim grid points (gray, left axes, with units of grid-point-days). We consider only HadISD stations with at least 50% data availability over 1979-2017. Correlations between the series are annotated in the top left of each panel, and dotted lines highlight linear trends. (E) Annual global maximum TW in ERA-Interim. (F) The line plot shows global mean annual temperature anomalies (relative to 1850-1879) according to HadCRUT4, which we use to approximate each year's observed warming since pre-industrial; circles indicate HadISD station occurrences of TW exceeding 35°C, with radius linearly proportional to global annual count.

This study does not so much increase our knowledge about the future as about the past. Specifically, it improves our understanding of the historical baseline and the details of the brief 'spikes' that it contains. While extremes of any variable are more intense when looking at small spatial and temporal scales, we were particularly surprised by the steepness of the horizontal and vertical gradients of wet-bulb temperature in places like the Persian Gulf and Gulf of California. During the most severe events, the dropoff seen by radiosondes ascending from 995 to 975 mb (150 to 310 m) averages 5C or more! This is consistent with the expectation from the two powerful competing forces that shape extreme heat in these arid regions: a source of moisture, itself at high temperatures, sitting underneath a near-constant large-scale high-pressure system which acts to trap heat and moisture close to the surface. Under the right meteorological conditions (e.g. a continuous onshore flow, plus perhaps some other anomalies which future work will need to establish), the sun beats down and moist static energy builds and builds in a shallow boundary layer with no means of escape. The small spatial scales and brief durations at which this occurs -- before being dissipated by some modest amount of horizontal or vertical mixing, for example -- make it very difficult for models or reanalyses to capture these extremes with appropriate severity. ERA5 does a much better job than ERA-Interim, but its underestimates still generally exceed 2C. Where the spatiotemporal scales of the extremes are larger, such as in the interior eastern United States, these reanalyses are right on the money.

Meteorological conditions when TW=35°C. (Top) ERA-Interim composite of 10-m winds and 2-m TW on the n=4 days when TW=35°C was recorded at Ras al-Khaimah, UAE (blue square). Resolution of plotted data is 0.5°x0.5° and 6-hourly. (Bottom) Same as top, but for ERA5. Resolution of plotted data is 0.25°x0.25° and 1-hourly. Mean TW daily maximum near the station location is 28.7°C for ERA-Interim and 30.8°C for ERA5.

Vertical profiles of coastal extreme humid heat. Radiosonde temperature (diamonds) and TW (circles) for Abu Dhabi International Airport, United Arab Emirates at 12Z on all days between 1983 and 2019 with a lowest-level TW value greater than the annual 99.75th percentile (red, 10 days); between the annual 97.5th and 99.75th percentiles (orange, 90 days); between the annual 90th and 97.5th percentiles (green, 298 days); and between the annual 50th and 90th percentiles (blue, 1593 days). Profiles are truncated at 850 hPa for visibility.
Vectors on the right-hand side indicate composite wind speed and direction on these days for each height bin, where available; the map on the left-hand side is a 2.5°x2.5° box indicating the location of the launch site.

There remains much we do not know about the ingredients and impacts of these exceptionally rare events. How are they affected by irrigation or other land-cover change? By urbanization? By short-term variations in sea-surface temperature? By meteorological conditions such as monsoon progression or passing weather systems? Given the generally larger importance of humidity in producing the 'spikes' that we see, should more attention be focused on limiting extreme humidity rather than extreme temperature? Our paper hints at some hypotheses about these factors, but does not definitively establish or quantify them, especially in terms of how they may work together in varying combinations to affect extreme wet-bulb temperatures in one region versus another.
Ultimately, speaking for Radley and Tom as well as myself, we see our paper fitting neatly within the structure of the running conversation begun by hot-water-immersion tests in the early and mid 20th century, emerging definitively in the climate literature with Sherwood and Huber 2010, and continuing recently with papers by Pal and Eltahir, Im et al., and others. We hope that this robust conversation broadens and continues, evolving such that even as the leading edge of these extremes pushes beyond levels seen in the Holocene, human ingenuity and compassion find ways to forestall disasters in the near term, and to develop policies and technologies in the medium and long term that look holistically at this 'wicked' problem and chip away at the socioeconomic vulnerabilities, consumption habits, and development geographies that exacerbate it.

A personal, subjective attempt at summarizing the top climate stories and advances of the decade that has been:

Climate models continued to prove very good at predicting global temperature change from greenhouse-gas forcings, and total emissions continued to track the top-line emissions scenario. Temperature increases led to 8 of the decade's 10 years ranking among the 10 warmest since 1850. A recent review paper looking back at studies from the late 20th century found that even the simpler, coarser-resolution models then in use predicted temperature changes entirely consistent with what has since been observed. The similarity across generations of models gives further confidence to global-average statistics such as mean annual temperature changes. It is sobering but not at all surprising, given the strength of the economic, social, and political status quo, that efforts such as the Paris Agreement of 2015 have not yet made any appreciable dent in the irrepressible upward track of greenhouse-gas *emissions* (not to mention concentrations), and thus that global-average temperature records continue to be broken left and right.

A variety of severe extreme events, often distinguished by their long durations, inspired new efforts to understand and mitigate them. Several such events made their mark on the arc of history by striking wealthy and/or populated areas, rather than by their geophysical rarity. These included the 2010 Russian heatwave, 2010 Pakistan floods, 2010-11 Queensland floods, 2011 Thailand floods, 2012 Midwest drought and heatwave, 2012 Hurricane Sandy, 2014 and 2015 US Midwest and Northeast cold snaps, 2015 and 2018 European heatwaves, 2017 Hurricanes Harvey, Irma, and Maria, 2017-2019 California wildfires, and the ongoing 2019 Australia bushfires. Some were notable mainly for highlighting aspects of climate variability or change that physical scientists had previously underappreciated (such as the much larger rainfall amounts associated with slow-moving tropical cyclones, or the mid-latitude effects of Arctic sea-ice melt), while others made headlines for their dramatic economic or ecological effects (such as the vulnerability of international supply chains to floods or storms). The now-ubiquitous Internet, and in particular social media, enhanced the power of some events by making the visual evidence of them compelling and unavoidable. Areas from agriculture to international trade to urban planning were increasingly shaped by the recognition that these kinds of extreme weather and climate events pose major (and in many cases growing) risks which it is imperative to address.
Weather and climate computer models enabled qualitative as well as quantitative improvements in representing the Earth system, across a spectrum from basic research to public-facing operational forecasts. Probabilistic forecasts of storm surge and fluvial flooding, hour by hour and house by house. Continuous global 3-km hourly weather forecasts. Quantifications of how the land surface affects the development of individual severe storms. Near-real-time estimates of the fractional contribution of anthropogenic effects to the characteristics of a natural disaster. Robust partitioning of observed regional climate changes into deforestation, irrigation, urbanization, global greenhouse gases, and major modes of variability. All of these were well beyond the limit of scientific and computational capability before the 2010s, but have now come into their own. A safe bet is that the 2020s will see many more such successes, each of which allows us to see 'around a corner' that had previously been blind. The pipeline from research to operations moves haltingly, but on a decadal basis the progress is clear, even if in ways that don't garner much publicity.

And now, on to the 2020s! They present at once the largest-ever opportunity for human development and for furthered understanding of the physical climate (and of how the two-way links between it and societies function), and the largest-ever risks from the potential misallocation of financial and intellectual resources in the face of rapid ongoing changes. As the world continues to become larger and more complicated, a certain (even minimal) level of harmony and collaboration within and among the international research and policy communities would greatly ease our ability to constructively manage the enormous task that we have effectively set out for ourselves as a species: to be, for the foreseeable future, active and conscientious guardians of an entire planet.
The notion of 'long-term means' is fundamental to climate science. Averaging over time and/or space constitutes the very definition of climate, according to authoritative sources such as the WMO. It's baked into our sense of the world as humans -- that if we wait long enough, all reasonably likely things will occur, and, as a corollary, that our memories and lived experiences are good proxies for the probability of occurrence of certain events and the range of possibilities. But in times of rapid change, this sense (which goes by the technical term 'stationarity') can be undermined, and with it associated ideas such as anomalies. The question then becomes: if each decade is statistically different from the decades on either side of it, how are we to conceptualize the climate system -- as crystallized in, for example, major decisions about where to live, where to make investments, or whether to buy certain kinds of insurance (and how much)? Recent evidence points to an approximately 5-year window over which people tend to adjust to climate regimes, which creates some cause for concern that rapid climate change will not be perceived as such, and that we will not fully appreciate the environmental damage it creates. Although there is clearly inherent tension between physical and psychological realities and the usage of 'averages', there is no good answer for many of these issues, which is likely why 'averages' and their ilk continue to be regularly cited and discussed in many contexts. (The nomenclature of 'climate normal', as used by the National Weather Service and others, is particularly prone to misinterpretation.)

An especially striking case in point involves the most recent set of Climate Prediction Center seasonal outlooks. For three-month-average periods going out 15 months, each part of the country is shaded according to whether above-average, near-average, or below-average temperatures are expected, along with the confidence of these predictions. Remarkably, throughout the entire period (out to boreal autumn 2020), every part of the country is expected to see near- or above-average conditions, with above-average accounting for the vast majority of that. These categories are based on a tripartite splitting of the 1981-2010 temperature distribution, which made me curious as to how the probabilities may have shifted in the 20-25 years since the midpoint of that period.

I used the difference between the 2013-2017 and 1981-2010 season-average CPC temperature data as a rough approximation of recent seasonal warming, and added this value to the 1981-2010 33rd percentile, to see just where what used to be a 33rd-percentile season now falls. Since seasonal averages vary fairly little from year to year, a large percentile change can result from a modest mean warming -- and this is what I find in the below figure. Over much of the country, 20 years of warming have turned what was a 33rd-percentile season into one in the warm half of the 'normal' distribution, and up to the 75th percentile of 'normal' in the most rapidly warming regions and seasons. In other words, a '75' on the map indicates that values in the bottom 75% of the 'normal' distribution now occur only 33% of the time. I've made a small spreadsheet illustration of this shift, which appears below the map. Clearly 5 years is much too short to make any kind of reliable climate statement; perhaps the next 5 years will bring something a bit different.
But the purpose of this analysis is more of a proof of concept: it shows how much 'normals' are being eroded by ongoing rapid warming, and that the lack of 'below-normal' conditions projected over the entire continental U.S. for the entire next year is not nearly as improbable as it might at first seem.
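For anyone who wants to reproduce the gist of this calculation, the sketch below walks through it with synthetic stand-in numbers (the real analysis uses CPC season-average temperature grids, which I'm not bundling here): estimate the recent warming, add it to the 1981-2010 33rd percentile, and ask where that shifted threshold ranks within the 1981-2010 distribution.

```python
import numpy as np

# Synthetic stand-in for one location and one season; replace with
# actual CPC season-average temperatures to redo the real analysis.
rng = np.random.default_rng(42)
base = 20.0 + rng.normal(0.0, 0.7, 30)    # 1981-2010 seasonal means (C)
recent = 21.0 + rng.normal(0.0, 0.7, 5)   # 2013-2017, roughly 1 C warmer

warming = recent.mean() - base.mean()      # rough recent seasonal warming
old_p33 = np.percentile(base, 100.0 / 3)   # old 33rd-percentile threshold
shifted = old_p33 + warming                # today's '33rd-percentile' season

# Percentile rank of the shifted value within the 1981-2010 distribution
rank = 100.0 * (base < shifted).mean()
print(f"warming: {warming:.2f} C; old 33rd percentile now ranks near the {rank:.0f}th")
```

Because the year-to-year spread of seasonal means is small relative to the warming signal, the shifted threshold lands well above the 33rd percentile, which is the same arithmetic behind the large values on the map.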