Monday 7 October 2013

Climate changes in weather variability

Scientific summary, Priority research program (DFG-SPP) proposal

This SPP will study changes in the variability of the weather using daily climate data. An important reason to study variability is that changes in extremes can be caused by changes in the mean as well as by changes in the variability. Variability matters most for the most extreme events, whereas the mean dominates moderate extremes. It is thus not clear whether results for moderate extremes, which are studied most, extrapolate to the rare extremes that are important for many climate change impacts. Furthermore, the mean state has been studied in much more detail and is thus likely more reliable. Variability is also important for the modelling of nonlinear processes in climate models.
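
To make this concrete, here is a minimal numerical sketch (assuming Gaussian daily values, which real weather only approximates; the thresholds and perturbation sizes are arbitrary illustrative choices) of how a mean shift and a variance increase affect moderate and rare exceedances differently:

    # Sketch: effect of a mean shift vs. a variance increase on exceedance
    # probabilities of a standard Gaussian baseline. Illustrative values only.
    from scipy.stats import norm

    for u in (2.0, 4.0):                    # moderate and rarer threshold
        base = norm.sf(u)                   # baseline exceedance probability
        shifted = norm.sf(u, loc=0.5)       # mean shifted by +0.5 std
        widened = norm.sf(u, scale=1.2)     # standard deviation increased 20 %
        print(f"u={u}: mean shift x{shifted / base:.1f}, "
              f"more variance x{widened / base:.1f}")

For the moderate threshold the mean shift increases exceedances more (a factor of about 3 versus 2); for the rarer threshold the wider distribution dominates (about 14 versus 7). This is why results for moderate extremes need not extrapolate to true extremes.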

In the first three-year period, the SPP will study weather variability with a focus on the quality of the observational data and on the performance of the modelling and analysis tools. This phase will concentrate on Europe, which has the longest and best observational records available. In the second phase, the focus will shift to understanding the mechanisms that cause changes in variability. Such studies need to be performed on the global scale.

First phase


1. Weather variability needs to be analysed for a range of different climatic variables and phenomena, at various temporal and spatial scales, as well as for different measures of variability, seasons and regions, and for their relations with climate modes. To learn the most from these studies, they should be performed in a way that eases intercomparison.

2. We will improve the analysis methods to study changes in the spatial and temporal dependence of variability over a range of spatio-temporal scales. For the comparability of studies, it is important that this range of scales is well defined. These methods will analyse the full probability distribution, or multiple variability measures, rather than just one or a few; a minimal sketch of such a multi-measure analysis follows after this list.

3. Non-climatic changes due to changes in monitoring practices are especially important when it comes to changes in variability. We will thus develop quality control and (stochastic) homogenization methods for the probability distribution of daily data and estimate the uncertainties due to remaining data problems.

4. We will investigate the properties of inhomogeneities in the essential climatic variables in various climate regions. Two methods for this are 1) using parallel measurements with historical and modern set-ups and 2) studying the adjustments made by homogenization methods.

5. An attractive alternative to creating homogenized datasets is the development of analysis methods that use data from homogeneous subperiods (similar to what the Berkeley Earth Surface Temperature (BEST) project has done for the mean temperature).

6. We will validate climate models with respect to variability at various temporal and spatial scales. Because observations and models represent different spatial averaging scales, this includes the study of downscaling and gridding methods.
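
A minimal sketch of the multi-measure analysis mentioned in point 2, assuming daily data in a pandas Series with a datetime index; the decadal windows and the particular measures are placeholder choices, not a prescription:

    # Sketch: several variability measures per decade instead of a single one.
    import numpy as np
    import pandas as pd

    def variability_measures(daily: pd.Series) -> pd.DataFrame:
        """Standard deviation, interquartile range and 5-95 % range per decade."""
        by_decade = daily.groupby(daily.index.year // 10 * 10)
        return pd.DataFrame({
            "std": by_decade.std(),
            "iqr": by_decade.quantile(0.75) - by_decade.quantile(0.25),
            "p5_p95": by_decade.quantile(0.95) - by_decade.quantile(0.05),
        })

    # hypothetical usage with synthetic daily "observations"
    days = pd.date_range("1951-01-01", "2010-12-31", freq="D")
    daily = pd.Series(np.random.default_rng(0).normal(size=len(days)), index=days)
    print(variability_measures(daily))

Tracking how several such measures evolve together, rather than the standard deviation alone, is what makes studies of the full distribution comparable across groups.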

Second phase


7. The methods developed in the first phase will have to be made robust enough to be applied to large global datasets, so that changes in weather variability can be studied for all climate regions of the Earth.

8. We will validate climate models globally, for various climate regions, with respect to variability at various temporal and spatial scales.

9. The mechanisms that determine natural and man-made changes in variability will be studied in global models and datasets.

Climatologists, statisticians and time series analysts working on extreme weather, quality control, homogenization, model validation or downscaling likely have the skills to participate in this SPP.

The SPP is focused on improving our understanding of the climate system. While impact studies will strongly benefit from the results, they are not part of this SPP. Studies on changes in extremes are welcome if they analyse the extremes together with other variability measures. Research on long-term (climatic) changes in the mean does not fit in this SPP on weather variability.

Saturday 5 October 2013

Five statistically interesting problems in homogenization

For many, the term homogenization is associated with dusty archives. Good metadata on station histories is certainly important for achieving the best results, but homogenization is much more than that: it is a very exciting statistical problem. It provides a number of problems that are of fundamental statistical interest.

Most of the work in homogenization has focused on improving the monthly and annual means, for example to allow for accurate computations of changes in the global mean temperature. The recent research focus on extreme and severe weather and on weather variability has made the homogenization of daily data and its probability distribution necessary. Much recent work goes in this direction.

As I see it, there are five problems for statisticians to work on. The first ones are of general climatological interest and thus also matter for the study of weather variability; the later ones are increasingly specific to it.

Problem 1. The inhomogeneous reference problem
Neighboring stations are typically used as a reference. Homogenization methods should take into account that this reference is also inhomogeneous.
Problem 2. The multiple breakpoint problem
A longer climate series will typically contain more than one break. Methods designed to take this into account are more accurate than ad hoc solutions based on single-breakpoint methods; a toy detection example is sketched below this list.
Problem 3. Computing uncertainties
We do know about the remaining uncertainties of homogenized data in general, but need methods to estimate the uncertainties for a specific dataset or station.
Problem 4. Correction as a model selection problem
We need objective selection methods for the best correction model to be used.
Problem 5. Deterministic or stochastic corrections?
Current correction methods are deterministic. A stochastic approach would be more elegant.
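
To make problems 1 and 2 concrete, here is a toy sketch of the classical single-break building block: a standard normal homogeneity test (SNHT) statistic applied to a candidate-minus-reference difference series. Genuine multiple-breakpoint methods optimize over all combinations of breaks instead of applying such a test greedily, and must allow for the reference itself being inhomogeneous; the example data below are synthetic:

    # Toy sketch: SNHT-type statistic for the most likely single mean shift
    # in a difference series (candidate minus reference).
    import numpy as np

    def snht_break(diff):
        """Return the index after which a mean shift is most likely, and T_max."""
        z = (diff - diff.mean()) / diff.std(ddof=1)
        n = len(z)
        t = np.array([k * z[:k].mean() ** 2 + (n - k) * z[k:].mean() ** 2
                      for k in range(1, n)])
        k_best = int(np.argmax(t)) + 1
        return k_best, float(t[k_best - 1])

    rng = np.random.default_rng(1)
    diff = rng.normal(size=1000)
    diff[600:] += 0.8          # insert an artificial break after index 600
    print(snht_break(diff))    # reports a break near index 600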


Friday 4 October 2013

A database with daily climate data for more reliable studies of changes in extreme weather

(Repost from Variable Variability)

In summary:
  • We want to build a global database of parallel measurements: observations of the same climatic parameter made independently at the same site
  • This will help research in many fields
    • Studies of how inhomogeneities affect the behaviour of daily data (variability and extreme weather)
    • Improvement of daily homogenisation algorithms
    • Development of more robust daily climate datasets for analyses
  • Please help us to develop such a dataset

Introduction



One way to study the influence of changes in measurement techniques is by making simultaneous measurements with historical and current instruments, procedures or screens. This picture shows three meteorological shelters next to each other in Murcia (Spain). The rightmost shelter is a replica of the Montsouris screen, in use in Spain and many European countries in the late 19th and early 20th century. In the middle is a Stevenson screen equipped with automatic sensors; leftmost, a Stevenson screen equipped with conventional meteorological instruments.
Picture: Project SCREEN, Center for Climate Change, Universitat Rovira i Virgili, Spain.


We intend to build a database with parallel measurements to study non-climatic changes in the climate record. This is especially important for studies on weather extremes, where the distribution of the daily data employed must not be affected by such changes.

There are many parallel measurements from numerous previous studies analysing the influence of different measurement set-ups on average quantities, especially average annual and monthly temperature. Increasingly, changes in the distribution of daily and sub-daily values are also being investigated (Auchmann and Brönnimann, 2012; Brandsma and Van der Meulen, 2008; Böhm et al., 2010; Brunet et al., 2010; Perry et al., 2006; Trewin, 2012; Van der Meulen and Brandsma, 2008). However, the number of such studies is still limited, while the number of questions that can and need to be answered for daily data is much larger.
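
As a sketch of the kind of analysis such parallel data allow, assuming two co-located, simultaneous daily series as NumPy arrays (the names and the synthetic bias model below are hypothetical), one can compare the set-ups quantile by quantile; differences in the tails that are invisible in the mean are exactly what matters for extremes, and such quantile differences are also the basis of percentile-based correction methods:

    # Sketch: quantile-by-quantile comparison of two parallel daily series,
    # e.g. a historical screen next to a Stevenson screen.
    import numpy as np

    def quantile_differences(historical, modern, probs=np.arange(0.05, 1.0, 0.05)):
        """Difference (historical minus modern) at a set of empirical quantiles."""
        return np.quantile(historical, probs) - np.quantile(modern, probs)

    rng = np.random.default_rng(2)
    modern = rng.normal(20.0, 5.0, size=3650)          # ten years of daily values
    historical = modern + 0.5 + 0.3 * (modern - 20.0)  # bias growing in the warm tail
    print(np.round(quantile_differences(historical, modern), 2))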

Unfortunately, it is currently not common practice to share parallel measurements, and analyses have thus been limited to smaller national or regional datasets, in most cases to a single station with multiple measurement set-ups. Consequently, there is a pressing need for a large global database of parallel measurements on a daily or sub-daily scale.

Datasets from pairs of nearby stations, while strictly speaking not parallel measurements, are also interesting for studying the influence of relocations. In particular, typical types of relocation, such as the move of weather stations from urban areas to airports, could be studied this way. Pairs of nearby stations can likewise be used to study the influence of urbanization.

Thursday 3 October 2013

A real paper on the variability of the climate

(Reposted from Variable Variability)

I am searching for papers on the variability of the climate: its natural variability and its possible changes due to climate change. They are hard to find.

The New Climate Dice

This weekend I was reading a potential one: the controversial paper by James Hansen et al. (2012), popularly described as "The New Climate Dice". Its results suggest that variability is increasing. After an op-ed in the Washington Post, this article attracted much attention, with multiple reviews on Open Mind (1, 2, 3), Skeptical Science and RealClimate. A Google search finds more than 60 thousand webpages, including rants by the climate ostriches.

While I was reading this paper, the Berkeley Earth Surface Temperature group sent out a newsletter announcing that they had also written two memos about Hansen et al.: one by Wickenburg and one by Hausfather. At the end of the Hausfather memo there is a personal communication from James Hansen stating that the paper did not intend to study variability. That is a pity, but it at least saves me the time of trying to understand the last figure.

Reinhard Böhm

That means that the best study I know of on changes in variability is a beautiful paper by Reinhard Böhm (2012), who unfortunately passed away recently, an enormous loss. His paper is called "Changes of regional climate variability in central Europe during the past 250 years". It analyses the high-quality HISTALP dataset. This dataset for the greater Alpine region contains many long time series; many of the earliest observations were performed in this region. Furthermore, this dataset has been very carefully homogenized.

Reinhard Böhm finds no change in variability, not for pressure, not for temperature and not for precipitation. His main conclusions are:
  • The first result of the study is the clear evidence that there has been no increase of variability during the past 250 years in the region.
  • We can show that also this recent anthropogenic normal period [1981-2010, ed.] shows no widening of the PDF (probability density function) compared to preceding ones.
  • It shows that interannual variability changes show a clear centennial oscillating structure for all three climatic elements [pressure, temperature and precipitation, ed.] in the region.
  • For the time being we have no explanation for this empirical evidence.