A set of recent articles on climate change research uncovers some new insights that might be helpful to gardeners, although the studies’ findings suggest a future almost universally unhelpful to the plants. So, for this “Lab Report,” we will step out of the lab to look at our new climate, and what it means for our plant friends.
Fewer days with above-normal temperatures!
By the very title of their article, “Warming is clearly visible in new US ‘climate normal’ datasets,” authors Bollinger and Schumacher (2021) give away the punchline—which is good, because they are trying to convey two seemingly contradictory points. The first is that the “normal” temperatures—what we expect to feel in our garden on a particular day (or in a particular season)—are not set in stone, but are calculated from large datasets of daily measurements. In other words, they represent a long-term average, and the exact date range for this calculation matters quite a bit. The previous normals were based on data spanning 1981 to 2010 (roughly, from the formation of Pet Shop Boys to Coldplay’s “Mylo Xyloto”); the updated normals span the period from 1991 to 2020 (Nirvana’s “Nevermind” to Beyoncé’s “Black Parade”).
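The arithmetic behind a “normal” is worth seeing concretely. Below is a minimal Python sketch—using invented annual temperatures with a gentle warming trend, not NOAA data—of how sliding the 30-year averaging window forward shifts the baseline itself:

```python
# A "climate normal" is simply the mean over a fixed 30-year window,
# so moving that window forward (1981-2010 -> 1991-2020) changes the
# baseline itself. The temperatures below are made-up illustrative values.

def climate_normal(yearly_means, start, end):
    """Average of the annual mean temperatures for years start..end (inclusive)."""
    window = [t for (year, t) in yearly_means if start <= year <= end]
    return sum(window) / len(window)

# Hypothetical annual mean temperatures (deg F) warming 0.03 F per year.
yearly = [(year, 57.0 + 0.03 * (year - 1981)) for year in range(1981, 2021)]

old_normal = climate_normal(yearly, 1981, 2010)
new_normal = climate_normal(yearly, 1991, 2020)
print(round(new_normal - old_normal, 2))  # -> 0.3: the baseline shifts upward
```

With a steady warming trend, the recalculated normal rises even though no single year changed—only the window did.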
The difference is noticeable: the average “normal” temperature rose by 0.5°F between the two periods, and is a full 1.2°F higher than the twentieth-century average. For precipitation, the data are more complex, but in the West (especially south of the Oregon–California border), there is a clear drying trend, with the month of November showing the greatest decline in precipitation. Not coincidentally, California’s most destructive wildfire, the 2018 Camp Fire, occurred in early November.
But there is a paradox implied in this recalculation. As the range of what is considered normal has shifted (upward for temperature, downward for precipitation), the nightly news will announce fewer above-normal days—even when a day has been uncommonly hot—and fewer below-normal precipitation events—even when the rainfall seems scarce—because those hot, dry days now fall within the normal range. Our plants, however, having been planted in the “old-normal” climate (and many having evolved in the very-old normal), are unlikely to find this explanation helpful as they try to cope with increased water demand and increased damage from high temperatures, to mention just a couple of the effects.
Get ready for some whiplash
[caption id="attachment_28677" align="alignright" width="331"] Swain et al Fig 4[/caption]
Let us also have another look at the precipitation side of climate change. We already noted the Diffenbaugh and colleagues article presenting the modeled future in which all droughts become “hot droughts.” Since that article was published, we also endured a spectacularly wet winter (2016–2017) and a historic atmospheric river this October. A couple of recent articles provide new insight into both sides of that unhappy drought-flood coin: let’s take the wet weather extremes first.
Daniel Swain and colleagues (2018) start by reminding us that, when it comes to predicting the effects of climate change on precipitation in California, “a cohesive picture has yet to emerge” (in contrast to predicting the temperature, where the trend is very clear—see above). But they then describe three interesting outputs of their long-term model. The first modeled outcome is a nearly 100 percent increase in the frequency of extremely wet periods across California, both at the winter-long scale (events that used to be seen only once in 25 years) and at the sub-season scale (say, a spectacularly wet month that is not part of a wet winter—a rarer situation, previously seen only once every 200 years or so). The second, parallel and paradoxical result is a 50 percent increase in extremely dry winters (those that used to occur only once every century), although this shift seems to be more pronounced in Southern California. The third result you might be able to predict from the preceding two: an increased frequency of “precipitation whiplash”—a dry year (<20th percentile of precipitation) followed immediately by a very wet year (>80th percentile), something like what happened when the 2012–2016 drought gave way to the 2016–2017 winter floods. This whiplash is now predicted to happen 50 percent more often in the coming century than in centuries past. Combined, the authors note, these effects result in very little change to the average precipitation—things just get more extreme at the extreme wet and dry ends.
As if these three unhappy predictions were not enough, the authors also note another modeled outcome: the wet season is projected to become a bit more “concentrated,” in that the shoulder months (September, October, April, and May) will become a bit drier, whereas the dead of winter, November to March, will become a bit wetter. In other words, our usual annual dry summer is likely to get a bit longer.
[caption id="attachment_28678" align="aligncenter" width="356"] Swain et al Fig 5[/caption]
But back to droughts
Our final article (Williams et al. 2020) also gives away the plot in its title, using the dreaded term “megadrought” to describe the 2000–2018 period. Williams and colleagues first remind us of the immense natural variability of our climate (echoing the preceding two articles), and also of the two moisture reservoirs that are especially important to anyone with a garden in the West: the soil and the snowpack.
[caption id="attachment_28683" align="aligncenter" width="757"] Williams et al Fig 1[/caption]
Interestingly, the authors use tree ring data to model these two moisture parameters. To indicate a drought, they use the soil moisture “anomaly” measure—the deviation from average soil moisture—and point out that from the year 800 to 2018, there were about 40 very large droughts, of which four were megadroughts. While the historical megadroughts were longer than anything we have seen in this century (thus far), our own mini-megadrought covered far more territory than the previous large droughts, affecting the entire Southwest at one time or another between 2000 and 2018. Our 2000–2018 drought was (is?) about as bad as they get: it is one of the top two by intensity, exceeded only by the late-1500s megadrought (specifically, the 1575–1593 period).
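A soil moisture “anomaly” is just a deviation from the long-term mean, commonly standardized by the standard deviation. A minimal Python sketch, with made-up moisture values standing in for the tree-ring reconstruction:

```python
# Soil moisture "anomaly": each year's value expressed as its standardized
# deviation from the series mean (a z-score). Negative = drier than average.
# The moisture values below are invented placeholders, not reconstructed data.
import statistics

def anomalies(values):
    """Standardized deviation of each value from the series mean."""
    mean = statistics.mean(values)
    sd = statistics.pstdev(values)
    return [(v - mean) / sd for v in values]

# Hypothetical soil-moisture index for a run of years.
moisture = [1.2, 0.8, 1.0, 0.4, -0.9, -1.3, -1.1, 0.6, 1.3]
for year_offset, a in enumerate(anomalies(moisture)):
    print(2000 + year_offset, round(a, 2))
```

By construction the anomalies average to zero over the reference period, which is what makes a sustained negative run—a drought—stand out so clearly in the record.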
[caption id="attachment_28684" align="aligncenter" width="747"] Williams et al Fig 2[/caption]
And climate change? All of the models the authors used showed that climate change made the drought worse. And not a little worse: the effects of global warming accounted for as much as 47 percent of the soil water anomaly, turning what would otherwise have been a serious drought into a megadrought comparable to the worst one ever recorded.
The findings described in these three articles continue the themes suggested by earlier research in the West, and it is striking how the unrelated articles weave together some common themes. First, the climate has gotten warmer everywhere and will continue to do so—thus plants will need more water. Second, our average precipitation will not get dramatically wetter or dramatically drier, although our rainy season will become somewhat compressed into the winter months at the expense of the fall and spring months. Finally, the frequency of very-wet and very-dry periods (years, seasons) will increase, as will the chance that an “ordinary” drought will feel more like a “megadrought.” The added stress of a globally warmed planet means plants will need more water at the same time that less water is available both in the snowpack (melted early) and the soil (evapo-transpired away).