Comments on: Forecast Verification for Climate Science, Part 2
http://cstpr.colorado.edu/prometheus/?p=4306

By: Coyote
Wed, 09 Jan 2008 09:23:37 +0000
http://cstpr.colorado.edu/prometheus/?p=4306&cpage=1#comment-9314

A couple of thoughts come to mind:

1. When comparing 100-year forecasts against only the first 5 years, it is really hard to show much divergence from the forecast, even if the forecast is really bad (see the trend sketch after this list). It might be more interesting to go back to the first or second assessment and compare those forecasts to reality.

2. The convergence of surface temperature measurements with satellite measurements should be a source of skepticism, not confidence. We know that the surface temperature measurement system is immensely flawed: there are still many station quality issues in the US, like urban biases that go uncorrected, and the rest of the world is even worse. There are also huge coverage gaps (read: oceans). The fact that this system correlates with satellite measurement feels like the situation where climate models, many of which take different approaches, some of them demonstrably wrong or contradictory, all correlate well with history. It makes us suspicious that the correlation is a managed artifact, not a real outcome.

3. Satellite temperature measurement makes immensely more sense – it has full coverage (except for the poles) and is not subject to local biases. Can anyone name one single reason why the scientific community does not use the satellite temps as the standard EXCEPT that the “answer” (i.e., lower temperature increases) is not the one they want? Consider the parallel example of measuring Arctic ice area. My sense is that before satellites, we got some measurements of Arctic ice extent from fixed observation stations and ship reports, but these were spotty and unreliable. Now satellites make this measurement consistent and complete. Would anyone argue for ignoring the satellite data in favor of spotty surface observations? No, but this is exactly what the entire climate community seems to do for temperature.
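
On point 1, here is a minimal Monte Carlo sketch of why a 5-year window says so little. The underlying trend (0.02 C per year, roughly 0.2 C per decade) and the interannual noise level (0.1 C) are assumed illustrative numbers, not figures from the post:

import numpy as np

rng = np.random.default_rng(0)
true_trend = 0.02   # degrees C per year; assumed illustrative value
noise_sd = 0.1      # interannual noise, degrees C; assumed illustrative value

def trend_spread(n_years, n_trials=10000):
    # Standard deviation of least-squares trend estimates over windows of n_years
    t = np.arange(n_years)
    slopes = np.empty(n_trials)
    for i in range(n_trials):
        y = true_trend * t + rng.normal(0.0, noise_sd, n_years)
        slopes[i] = np.polyfit(t, y, 1)[0]
    return slopes.std()

for n in (5, 10, 20):
    print("%2d-year window: trend uncertainty ~ %.2f C/decade" % (n, 10 * trend_spread(n)))

With these assumed numbers, the spread of 5-year trend estimates comes out around 0.3 C per decade, larger than the trend being tested, which is why a 5-year comparison can neither confirm nor refute a century-scale forecast.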

By: Harry Haymuss
Tue, 08 Jan 2008 21:18:19 +0000
http://cstpr.colorado.edu/prometheus/?p=4306&cpage=1#comment-9312

Shouldn’t we use the best information available? Using ground-based observations skews the spatial coverage toward (subjectively “removed”) UHI effects. Satellite is the best information we have, correct?

Also, as Luboš Motl points out, 2007 is the coldest year this *century*. Why would you start at 2000? Try just this century, which started in 2001. What kind of trend do we get then? Interestingly, we get exactly what the solar-influence people have been saying – a cooling Earth… specifically, about 2.5 degrees per century (see the trend sketch after the link below).

http://bp3.blogger.com/_4ruQ7t4zrFA/R33-zmGiC4I/AAAAAAAAAPc/qHr4TPICyfM/s1600-h/rss-msu-anomaly.JPG
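
A minimal sketch of the start-year sensitivity being described, assuming the monthly anomalies have been saved locally as a CSV; the filename and column names ("year", "month", "anomaly") are placeholders, not an actual export of the linked chart:

import csv
import numpy as np

def load_anomalies(path="rss_msu_anomaly.csv"):
    # Assumed columns "year", "month", "anomaly"; adjust to the actual file layout.
    years, vals = [], []
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            years.append(int(row["year"]) + (int(row["month"]) - 0.5) / 12.0)
            vals.append(float(row["anomaly"]))
    return np.array(years), np.array(vals)

def trend_from(start_year, t, y):
    # Least-squares trend, converted from degrees C per year to degrees C per century
    mask = t >= start_year
    return 100.0 * np.polyfit(t[mask], y[mask], 1)[0]

t, y = load_anomalies()
for start in (1998, 2000, 2001):
    print("trend since %d: %+.1f C/century" % (start, trend_from(start, t, y)))

Because these windows are only a few years long, the fitted trend swings substantially with the chosen start year; that sensitivity is the point at issue, not evidence for any particular long-term rate.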
