Wednesday, December 12, 2007

Cool New Warming Research

Though the global warming consensus is supposedly etched in stone (we're all gonna die!!!), there's been quite a bit of new research in the last few months that calls some of it into question (maybe we'll live after all).

Much of the general freakout is due to the belief that the 20th century was the warmest of the last millennium. This belief is based on reconstructions of past temperatures using tree-ring data. However, Loehle recently combined a group of other, non-tree-ring temperature reconstruction proxies and got significantly different results. As shown in the figure above, the 20th century wasn't anomalous at all according to Loehle's temperature reconstruction. Indeed, it was almost exactly average for the last 2000 years.

Loehle's results were published in the lowly (according to the consensus scientists) journal Energy and Environment and have been mostly ignored. However, his results differ enough from the current reconstructions, and his methods, while not perfect, are sound enough, that at least some scientists are taking notice. I wouldn't be surprised if Loehle's results prompt some revisions to climate theory.

In a second publication, Monckton asks:
The fact of warming tells us nothing of the cause. Yet the scientific consensus is that, though the rapid climatic warming from 1906 to 1940 was a natural recovery from the historically low temperatures of the Little Ice Age, it is we who are chiefly to blame for the equally rapid warming from 1975 to the present. Since some climatologists challenge this consensus, can we settle the debate by predicting with models and then detecting by observation a characteristic “signature” in the climate data that allows us definitively to distinguish between anthropogenic and natural warming of the Earth’s atmosphere?
The answer according to Monckton, as you may have guessed, is yes. Furthermore, the IPCC models seem to all agree on the "signature":
The UN’s fourth assessment report on climate change (IPCC, 2007) confirms that computer modeling predicts the existence of a unique and distinct signature or fingerprint of anthropogenic warming caused by our emissions of greenhouse gases. That signature is the instantly-recognizable tropical, mid-troposphere “hot spot” about 10km above the Earth’s surface. In the “hot spot”, the models predict that the rate of increase in atmospheric temperature, measured in degrees Celsius per decade, will be two or three times greater than at the Earth’s surface. In IPCC (2007), this predicted “hot-spot” signature of anthropogenic greenhouse warming is clearly visible on plots of modeled greenhouse forcing and of all forcings including the dominant greenhouse forcing, but is not visible on plots of solar, volcanic, tropospheric and stratospheric ozone, or sulphate aerosol forcings. The UN’s models accordingly distinguish clearly between greenhouse warming and other climate forcings: at least five separate general-circulation computer models of the climate all predict the existence of the “hot-spot” signature of anthropogenic greenhouse warming in the tropical mid-troposphere.
In other words, anthropogenic greenhouse warming should show different and predictable warming trends at different altitudes and latitudes. So how do the trends predicted by the models match real world data? Apparently, very poorly:
Yet in the plot from the Hadley Centre’s radiosondes, showing actual, observed temperatures in the troposphere, presented in the same altitude-vs-latitude fashion as the predictions made by the five computer models, the computer models’ repeatedly-predicted “hot-spot” signature of anthropogenic greenhouse warming is entirely absent. Indeed, very nearly all observational data on mid-tropospheric temperature trends over the past half-century show no tropical “hot-spot” at all; and, in the one record that shows it at all, the magnitude of the observed effect is insufficient to justify the UN’s choice of a very high central estimate of climate sensitivity to anthropogenic enhancement of the greenhouse effect. Our own small experiment also fails to demonstrate even the existence of the “hot-spot” fingerprint of anthropogenic warming, still less a magnitude sufficient to justify the IPCC’s high climate sensitivity. These surprising results present a very real difficulty for the conventional “global warming” theory – a difficulty that is not resolved either in CCSP (2006) or in IPCC (2007).
What Monckton is basically saying is that there is absolutely no sign of the greenhouse (or anthropogenic) component of global warming predicted by all of the models that have been used to predict catastrophic global warming.

The response of the global warmenists is that the observed data must be wrong (or very noisy) but their climate models are valid. Monckton's response:
Thorne et al. (2007) have attempted to resolve this difficulty by suggesting that the error-bars in the observational datasets are so large that they could in theory encompass the model-predicted “hot-spot”, that the datasets are not designed to identify small temperature trends, and that the outputs are exceptionally sensitive to the choice of limiting dates. However, it is on the basis of the observed data that the models are contrived, and, if the observed data are inadequate for drawing conclusions about whether the characteristic fingerprint of anthropogenic greenhouse warming exists, then a fortiori the outputs from theoretical models founded upon those data will be inadequate, and no conclusion about the magnitude of the temperature response to anthropogenic enhancement of the natural greenhouse effect can be legitimately drawn.
In other words, garbage in, garbage out: either the models are based on faulty data, and they are therefore crap, or they are based on good data and are therefore crap (because they fail to predict that good data). Either way, the models, upon which all predictions of catastrophic global warming rest, are crap, especially since there are alternative explanations:
The very close correlation between anomalies in tropical outgoing long-wave radiation and anomalies in global lower-troposphere temperatures, taken with the near-total absence of correlation between monotonic increases in CO2 concentration and chaotic temperature anomalies, suggests that it is the computer models, not real-world observations that are likely to be at fault.
Maybe the seas won't boil after all:
On this analysis, “global warming” is unlikely to be dangerous and extremely unlikely to be catastrophic.
Other new research sheds light on parts of the climate models that might be crap:
The widely accepted (albeit unproven) theory that manmade global warming will accelerate itself by creating more heat-trapping clouds is challenged this month in new research from The University of Alabama in Huntsville.

Instead of creating more clouds, individual tropical warming cycles that served as proxies for global warming saw a decrease in the coverage of heat-trapping cirrus clouds, says Dr. Roy Spencer, a principal research scientist in UAHuntsville's Earth System Science Center.

That was not what he expected to find.

"All leading climate models forecast that as the atmosphere warms there should be an increase in high altitude cirrus clouds, which would amplify any warming caused by manmade greenhouse gases," he said. "That amplification is a positive feedback. What we found in month-to-month fluctuations of the tropical climate system was a strongly negative feedback. As the tropical atmosphere warms, cirrus clouds decrease. That allows more infrared heat to escape from the atmosphere to outer space."
Well, that'll make a difference. From the increase in CO2 expected over the next century alone, the corresponding temperature increase would be about 1.2 C. The current models assume significant cloud amplification, roughly tripling that estimate to about 3.6 C over the next hundred years. If Spencer's observations are right, the feedback is negative and the warming due to anthropogenic greenhouse gases will actually be substantially less than 1.2 C. In other words, the current models might be overpredicting warming by nearly an order of magnitude.
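The arithmetic above can be sketched with the standard zero-dimensional feedback relation, dT = dT0 / (1 - f), where dT0 is the no-feedback warming and f is the net feedback factor. This is a toy illustration of the numbers in the paragraph, not output from any climate model; the specific f values are assumptions chosen to reproduce the "factor of three" and a hypothetical negative feedback.

```python
def equilibrium_warming(dT0: float, f: float) -> float:
    """Warming after feedbacks, given no-feedback warming dT0 (deg C)
    and net feedback factor f (must satisfy f < 1)."""
    return dT0 / (1.0 - f)

dT0 = 1.2  # deg C per CO2 doubling with no feedbacks (value cited above)

# A positive feedback of f = 2/3 triples the response, matching the
# roughly 3.6 C the models predict:
print(equilibrium_warming(dT0, 2.0 / 3.0))  # 3.6 (approximately)

# A hypothetical net-negative feedback (f = -0.5) shrinks the response
# well below the no-feedback value instead:
print(equilibrium_warming(dT0, -0.5))  # 0.8
```

The point of the sketch is that the sign of f, not the CO2 physics itself, is what separates a ~3.6 C projection from a sub-1.2 C one.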

I've guessed that the feedback loops involved with greenhouse gases would be negative feedback loops. If they weren't, it seems unlikely to me that the climate would have stayed stable enough for life to survive to this point.

But here we are.

8 comments:

Susan's Husband said...

When one side looks at data, and the other makes excuses that are obviously mendacious regardless of what's actually happening, I tend to trust the former more than the latter.

Bret said...

Prior to this year I've been neutral on the IPCC predictions. However, with all this new information coming out, I'm growing increasingly skeptical.

I guess that it'll probably take at least 5 years before the mainstream media reports these new developments.

Harry Eagar said...

Monckton is wrong about the models' being 'contrived' on the basis of observational data.

Naively, you'd think that would be how they do it, but the modelers say no, that historical data are not part of their toolkit.

So, Monckton is right, the models are 'contrived,' but not out of data. They are contrived out of (heh, heh) thin air.

Bret said...

My understanding is that model parameters are tweaked to make the models match the data.

Not that being contrived out of thin air is any better.

Howard said...

Yes, but what chance do reasonable arguments stand against the power of faith?

Hey Skipper said...

Bret:

It seems to me that tweaking the models so they conform to data is precisely what they should be doing.

After all, that is how we reconciled the contradiction between aerodynamic models showing bumblebees cannot fly, and flying bumblebees.

Which is a lot different than contriving things from the rectal data bank.

That first para contains a gigantic, implicit, if, of course.

I am not agnostic on global warming; the evidence the earth's climate is getting warmer is undeniable.

What is also undeniable is that it has been doing so for roughly the last 300 years (glaciers in North America have, with almost no exceptions, been retreating for at least as long as anyone has cared to measure them).

What is also undeniable, and acknowledged without a hint of irony at the UN Global Warming conference this week, is that the planet is the warmest it has been

in

1100 years.

The difference between a mountain and a molehill is one of perspective.

Would that those who approach the IPCC on bended knee get some.

Susan's Husband said...

Yes, one should be tweaking models to make them conform to the data. But one can't then go back and say "the data is garbage, but my model is good".

Hey Skipper said...

SH:

I could pedantically quibble with your use of "can" vs. "may".

But not your point.