2015, Rhines et al., "Decoding the precision of historical temperature observations," Quarterly Journal of the Royal Meteorological Society
DOI: 10.1002/qj.2612
"Abstract
Historical
observations of temperature underpin our ability to monitor Earth's
climate. We identify a pervasive issue in archived observations from surface stations, wherein the use of varying conventions for units and
precision has led to distorted distributions of the data. Apart from the
original precision being generally unknown, the majority of archived
temperature data are found to be misaligned with the original
measurements because of rounding on a Fahrenheit scale, conversion to
Celsius, and re-rounding. Furthermore, we show that commonly-used
statistical methods including quantile regression are sensitive to the
finite precision and to double-rounding of the data after unit
conversion. To remedy these issues, we present a Hidden Markov Model
that uses the differing frequencies of specific recorded values to
recover the most likely original precision and units associated with
each observation. This precision-decoding algorithm is used to infer the
precision of the 644 million daily surface temperature observations in
the Global Historical Climate Network database, providing more accurate
values for the 63% of samples found to have been biased by
double-rounding. The average absolute bias correction across the dataset
is 0.018°C, and the average inferred precision is 0.41°C, even though data are archived at 0.1°C
precision. These results permit better inference of when record
temperatures occurred, correction of rounding effects, and
identification of inhomogeneities in surface temperature time series,
amongst other applications. The precision-decoding algorithm is generally applicable to rounded observations, including surface
pressure, humidity, precipitation, and other temperature data, thereby offering the potential to improve quality control procedures for many
datasets." via Hockey Schtick
..........................
http://onlinelibrary.wiley.com/doi/10.1002/qj.2612/abstract;jsessionid=58E811B777611C4C5CE7DFC6D48F717B.f01t03
.....................................
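As an illustration (not code from the paper), the double-rounding described in the abstract can be reproduced in a few lines: a reading recorded to the nearest whole degree Fahrenheit, converted to Celsius, and re-rounded to the archive's 0.1 °C precision ends up offset from a direct conversion of the same reading. The station value and function names below are hypothetical.

```python
# Illustrate the double-rounding bias described in the abstract:
# a reading rounded to whole degrees F, converted to C, then
# re-rounded to 0.1 C can differ from the directly converted value.

def f_to_c(f):
    return (f - 32.0) * 5.0 / 9.0

def archived_value(true_f):
    """Whole-degree-F rounding, F-to-C conversion, re-rounding to 0.1 C."""
    rounded_f = round(true_f)        # observer records whole degrees F
    celsius = f_to_c(rounded_f)      # unit conversion for the archive
    return round(celsius, 1)         # archive stores 0.1 C precision

true_f = 70.4                        # hypothetical true reading
direct_c = f_to_c(true_f)            # 21.333... C
archived_c = archived_value(true_f)  # 70 F -> 21.111... C -> 21.1 C
print(direct_c, archived_c, archived_c - direct_c)
```

The archived value differs from the direct conversion by more than 0.2 °C here, larger than the 0.1 °C resolution the archive nominally provides.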
Below, by Rhines et al., from the AGU Fall Meeting, Dec. 2014:
The precision technique "can alter the apparent frequency of record-breaking events."
"Title: Decoding the Surface Temperature Record
Authors: Rhines, A. N.; Tingley, M.; McKinnon, K. A.; Huybers, P. J.
Affiliation: AA(Harvard University, Cambridge, MA, United States arhines@fas.harvard.edu), AB(Pennsylvania State University Main Campus, University Park, PA, United States martin.tingley@gmail.com), AC(Harvard University, Cambridge, MA, United States mckinnon@fas.harvard.edu), AD(Harvard University, Cambridge, MA, United States phuybers@fas.harvard.edu)
Publication: American Geophysical Union, Fall Meeting 2014, abstract #GC13H-0769
Publication Date: 12/2014
Origin: AGU
Keywords: 1616 Climate variability; 3252 Spatial analysis; 3270 Time series analysis; 3309 Climatology
Bibliographic Code: 2014AGUFMGC13H0769R
Abstract
Historical temperature observations from surface stations have been recorded using a variety of units and levels of precision, with metadata that are often incomplete. As a result, the amount of rounding applied to these observations is generally unknown, posing a challenge to statistical methods that are sensitive to the use of discrete data. Methods used to infer distributional changes often assume that data are continuously distributed and can only be reliably applied when the specific discreteness of each sample is known. We present a new technique, termed 'precision-decoding,' that identifies the original precision and units of time series data. Applying it to the GHCND database, we identify temporal and spatial patterns in the precision and units used by surface stations. We show that many archived values have been offset from the original observations due to double-rounding in the presence of conversion between Fahrenheit and Celsius, and provide additional metrics to identify stations in need of further quality control. While the discreteness of the data is unlikely to have influenced global mean temperature trends, we show that it can affect higher-order moments of the temperature distribution such as the variance or skewness, and that it can alter the apparent frequency of record-breaking events."
..........................
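The paper's Hidden Markov Model is not reproduced here, but the signal it exploits, the differing frequencies of specific recorded values, can be sketched directly: whole-degree Fahrenheit data converted to Celsius and re-rounded land on a characteristic set of tenths digits, while native half-degree Celsius data hit only .0 and .5. The function names below are illustrative, not the paper's.

```python
# Sketch (not the paper's HMM) of the fingerprint precision-decoding
# exploits: each recording convention leaves a distinct signature in
# the tenths digits of values archived at 0.1 C precision.
from collections import Counter

def tenths_digit(c):
    """Tenths digit of a value archived at 0.1 C precision."""
    return int(round(c * 10)) % 10

def fingerprint(values):
    """Relative frequency of each tenths digit, 0-9."""
    counts = Counter(tenths_digit(v) for v in values)
    n = len(values)
    return {d: counts.get(d, 0) / n for d in range(10)}

# Hypothetical station archive: whole-degree-F readings converted to C
# and re-rounded (the double-rounding case from the abstract).
whole_f = [round((f - 32) * 5 / 9, 1) for f in range(32, 100)]
# Native 0.5-C readings archived at 0.1 C.
half_c = [round(0.5 * k, 1) for k in range(0, 60)]

print(sorted(d for d, p in fingerprint(whole_f).items() if p > 0))
print(sorted(d for d, p in fingerprint(half_c).items() if p > 0))
```

Since 9 °F equals 5 °C, the whole-degree-F series cycles through every tenths digit except 5, whereas the half-degree-C series produces only 0 and 5, so the digit histogram alone separates the two conventions.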
-----------------------------
6/18/15, "Global Historical Climatology Network - Daily (GHCN-Daily), Version 3," NOAA, catalog.data.gov. Metadata Updated: Jun 18, 2015
"The Global Historical Climatology Network-Daily (GHCN-Daily) dataset integrates daily climate observations from approximately 30 different data sources. Version 3 was released in September 2012 with the addition of data from two additional station networks. Changes to the processing system associated with the version 3 release also allowed for updates to occur 7 days a week rather than only on most weekdays. Version 3 contains station-based measurements from well over 90,000 land-based stations worldwide, about two thirds of which are for precipitation measurement only. Other meteorological elements include, but are not limited to, daily maximum and minimum temperature, temperature at the time of observation, snowfall and snow depth. Over 25,000 stations are regularly updated with observations from within roughly the last month. The dataset is also routinely reconstructed (usually every week) from its roughly 30 data sources to ensure that GHCN-Daily is generally in sync with its growing list of constituent sources. During this process, quality assurance checks are applied to the full dataset. Where possible, GHCN-Daily station data are also updated daily from a variety of data streams. Station values for each daily update also undergo a suite of quality checks."
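For readers who want to inspect GHCN-Daily values themselves, here is a minimal parser sketch based on the fixed-width `.dly` record layout documented in NOAA's readme (station ID, year, month, element code, then 31 day slots of a value plus three flag characters); temperatures are stored in tenths of °C with -9999 marking missing days. The sample record is hand-built for illustration, not real station data.

```python
# Sketch of reading one record from a GHCN-Daily ".dly" file, per the
# fixed-width layout in NOAA's readme: station ID (cols 1-11), year
# (12-15), month (16-17), element (18-21), then 31 day slots of an
# 5-char value and 3 flag chars each.

def parse_dly_line(line):
    record = {
        "id": line[0:11],
        "year": int(line[11:15]),
        "month": int(line[15:17]),
        "element": line[17:21],   # e.g. TMAX, TMIN, PRCP
        "values": [],
    }
    for day in range(31):
        start = 21 + day * 8
        raw = int(line[start:start + 5])
        # Temperatures are archived in tenths of deg C; -9999 = missing.
        record["values"].append(None if raw == -9999 else raw / 10.0)
    return record

# Hand-built sample record: day 1 = 21.1 C (TMAX), remaining days missing.
slots = "  211   " + "-9999   " * 30
line = "USW00094728" + "2015" + "01" + "TMAX" + slots
rec = parse_dly_line(line)
print(rec["element"], rec["values"][0])   # TMAX 21.1
```

The flag columns are skipped here; a fuller reader would also capture the measurement, quality, and source flags that follow each value.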
...............