While the U.S. suffers a historic cold spell featuring Arctic birds flocking to Florida, airports colder than the South Pole, and the President using the weather to plug Obamacare, the nation’s news outlets are bursting with stories lecturing us that global warming can lead to extreme cold temperatures.
Why don’t the talking heads ever lecture us when it’s hot out that global cooling can lead to extreme hot temperatures?
Back in the 1980s and 1990s, many scientists warned that human-generated carbon dioxide emissions were causing an unprecedented increase in global surface temperature that could be halted only by drastic reductions in energy production, i.e., by dramatically slowing the pace of industrial civilization.
Throughout the 1990s, mean global surface temperature rose a few tenths of a degree Celsius. Doom-saying climatologists felt vindicated and started making ever-wilder predictions, including an increased rate of global warming and widespread environmental damage and wildlife extinction over the next 15 years.
(Spoiler alert: The Earth stopped warming for the next 15 years.)
Around the mid-2000s, when mean global surface temperature was slightly lower than it was in 1997, references to “global warming” were replaced with references to “climate change,” a more all-encompassing term that seemed less likely to be scoffed at by laymen who could clearly see the world failing to boil over around them.
By the late 2000s, when mean temperature was still stagnating, the public started to suspect that international spokesmen on climate change were crackpots. As Lord Christopher Monckton noted, “Globally-averaged land and sea surface absolute temperature has not risen since 1998… The models heavily relied upon by the Intergovernmental Panel on Climate Change had not projected this multidecadal stasis in ‘global warming’… nor 50 years’ cooling in Antarctica and the Arctic; nor the absence of ocean warming since 2003… nor the active 2004 hurricane season; nor the inactive subsequent seasons… nor the consequent, precipitate decline of ~0.8 °C from January 2007 to May 2008 that has canceled out almost all of the observed warming of the 20th century.”
In other words, global warming models predicted nothing.
In the face of a decade's worth of colder-than-normal winters, increased snowfall, and thickening Antarctic ice, alarmed alarmists stretched the definition of "climate change" to cover extreme temperatures, their hastily concocted theory being that increased carbon emissions could cause temperatures to oscillate wildly. Around the same time, climate change activists, less confident of their ability to snooker the public with temperature-related warnings, predicted a rapid increase in hurricanes, tornadoes, and extreme weather events.
Then the frequency of extreme weather events declined to historic lows.
Yet the scaremongers persist. Take a look at this widely distributed NASA graph, which shows the global land-ocean temperature anomaly from 1880 to 2012. Pretty frightening, huh? Now find the high dot around 1998 and compare it to the final dot, from 2012, which is lower. Or watch what happens to the red trend line from 2003 to 2012: it declines. Not by much, but it does go down. And not one of the 73 United Nations global warming computer models predicted this decline during the 2000s.
Are climate scientists crazy? The theory that carbon dioxide emissions can lead to extreme high and low temperatures is not, on its face, raving lunacy. Perhaps some climatological pattern involving feedback amplification could produce extreme heat in summer and extreme cold in winter. But the point is that this is just a theory, improbably and self-servingly built on the very 10 years of data that climatologists failed to anticipate.
When you confidently predict accelerated temperature increases, a melting Arctic, and the extinction of snow, and your predictions turn out wildly off-base, you're allowed to take stock and come up with another theory. What you're not allowed to do, as any real scientist will tell you, is use the same observational evidence both to form your hypothesis and to confirm it. Go ahead and develop a new climate model; make specific, testable, falsifiable predictions; then wait for more data to amass. If what you predicted happens over, say, the next decade, then you've earned some vindication for your theory. That's still not enough evidence to justify upending industrial civilization (I would prefer 30 years' worth of on-the-money predictions), but at least you've generated some honest empirical support.
But that's too long a wait for leftists who want government to stop rich oil companies and dirty coal manufacturers and greedy electricity producers from doing their jobs right now. It's also too long for scientists competing for millions of dollars in federal and private grant money, few of whom can afford to go out on a limb and do open-minded research that might take a decade or two to prove a controversial theory right.
The next time someone lectures you about how global warming can lead to extreme cold and heavy snowfall, tell them, “That’s not what you people were saying 10 years ago. Come back in a decade and we’ll see if you were right.”