
Climate Change Denial at Its Worst?


Rep. Gary Palmer falsely claimed on a radio show that temperature data used to measure global climate change have been “falsified” and manipulated.

Alabama Rep. Gary Palmer’s claim that “we are building an entire agenda on falsified data” has no basis in evidence.

Palmer, a Republican from Alabama, cited the so-called Climategate episode of five years ago, in which emails written by climate scientists purportedly showed evidence of data manipulation, as well as a more recent accusation that climate scientists had tampered with data from temperature monitoring stations. Climategate has been the subject of several separate investigations, all of which cleared the scientists involved of any wrongdoing, and the latest data-manipulation charges mischaracterize standard, well-validated methods for adjusting temperature records to eliminate factors that could produce inaccurate readings.

‘Manipulating Data’

Radio host Matt Murphy in Birmingham, Alabama, asked for Palmer’s thoughts on the snowstorms in the Northeast and climate change:

Palmer, Feb. 10: I think it might be a matter of the report that came out last week about the government manipulating data and misleading people a little bit. But two feet of snow ought to get their attention. … It’s not the first time. I mean, I wrote about this a couple of years ago, when it came out that the scientists at East Anglia University in England had done this, and that was the data that the United Nations report was based on. It was a huge scandal, there were emails going around where they were, the scientists were literally talking about how they were going to change the data. We are building an entire agenda on falsified data that will have an enormous impact on the economy.

The “report” to which Palmer referred was actually a series of blog posts, written by climate change denier Paul Homewood, which were then highly publicized in two stories by Christopher Booker in the Daily Telegraph in London. Both writers focused on the adjustments made to temperature readings at certain monitoring stations around the world, and claimed that those adjustments throw the entire science of global warming into question. This is not at all the case, and those adjustments are a normal and important part of climate science.

The National Oceanic and Atmospheric Administration (NOAA), the U.S. agency responsible for monitoring national and global temperature trends, has addressed these types of adjustments several times before. NOAA addresses the subject in a Q&A on its website:

Q: What are some of the temperature discrepancies you found in the climate record and how have you compensated for them?

A: Over time, the thousands of weather stations around the world have undergone changes that often result in sudden or unrealistic discrepancies in observed temperatures requiring a correction. For the U.S.-based stations, we have access to detailed station history that helps us identify and correct discrepancies. Some of these differences have simple corrections.

NOAA maintains about 1,500 monitoring stations and accumulates data from more than a thousand other stations in countries around the world (many national and international organizations share this type of data freely). There are actually fewer monitoring stations today than there used to be; modern stations have better technology and are accessible in real time, unlike some older outposts no longer in use. The raw, unadjusted data from these stations are available from many sources, including the international collaboration known as the Global Historical Climatology Network.

As the years go by, all those stations undergo various types of changes: shifts in how monitoring is done, improvements in technology, or even just the construction or removal of nearby buildings.

For example, a new building constructed next to a monitoring station could cast a shadow over a station, or change wind patterns, in such ways that could affect the readings. Also, the timing of temperature measurements has varied over time. And in the 1980s, most U.S. stations switched from liquid-in-glass to electronic resistance thermometers, which could both cool maximum temperature readings and warm minimum readings.

Monitoring organizations like NOAA use data from nearby stations to adjust for these types of issues, raising or lowering the temperature readings for a given station as needed. This process is known as homogenization. The most significant adjustment worldwide, according to NOAA, is actually to temperatures taken over the oceans, and that adjustment lowers rather than raises the global temperature trend.
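To illustrate the basic idea, here is a simplified sketch in Python. It is not NOAA's actual pairwise homogenization algorithm, and the station data and threshold are hypothetical: a station's readings are compared with the average of its neighbors, the largest step change in the difference series is located, and that jump is removed from the later segment.

```python
import numpy as np

def homogenize(target, neighbors, threshold=0.5):
    """Very simplified, single-breakpoint homogenization sketch.

    target:    1-D array of monthly temperatures for one station
    neighbors: 2-D array (stations x months) for nearby stations
    threshold: smallest step (deg C) treated as an artificial discontinuity
    """
    reference = neighbors.mean(axis=0)   # regional expectation from neighbors
    diff = target - reference            # station-minus-neighbors series

    # Candidate breakpoint: the month where the mean of the difference
    # series before vs. after changes the most.
    best_k, best_step = None, 0.0
    for k in range(12, len(diff) - 12):  # ignore the very ends of the record
        step = diff[k:].mean() - diff[:k].mean()
        if abs(step) > abs(best_step):
            best_k, best_step = k, step

    adjusted = target.copy()
    if best_k is not None and abs(best_step) >= threshold:
        adjusted[best_k:] -= best_step   # remove the artificial jump
    return adjusted

# Hypothetical example: an instrument change at month 120 adds a spurious
# +0.8 degree C jump relative to the neighboring stations.
rng = np.random.default_rng(0)
neighbors = rng.normal(15.0, 0.3, size=(5, 240))
target = neighbors.mean(axis=0) + rng.normal(0.0, 0.1, 240)
target[120:] += 0.8
print(homogenize(target, neighbors)[118:123].round(2))
```

In the real station networks, comparisons of this kind are made pairwise against many neighbors, breakpoints are tested statistically, and documented station history (instrument changes, station moves, shifts in observation time) is used to confirm and date them.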


The homogenization methods used have been validated and peer-reviewed. For example, a 2012 paper in the Journal of Geophysical Research confirmed the effectiveness of the homogenization processes for NOAA’s network of stations, and even noted that “it is likely that maximum temperature trends have been underestimated.” In other words, there may have actually been more warming than NOAA has reported.

Another paper, from 2010, looked into the siting of U.S. monitoring stations in particular, and again found no problem with the homogenization methods. “[T]he adjusted [U.S. Historical Climatology Network] temperatures are extremely well aligned with recent measurements. … In summary, we find no evidence that the [conterminous United States] average temperature trends are inflated due to poor station siting.”

Berkeley Earth, a climate science nonprofit founded in early 2010 by scientists who were at the time skeptical about global warming, has also found no undue manipulation of temperature data in its own analyses. Its page on the Puerto Casado station in Paraguay, one of the stations Homewood cited, shows that the adjusted readings do in fact indicate a rise in temperature over time.

An October 2011 paper in the Journal of Geophysical Research provides an overview of the entire Global Historical Climatology Network temperature data set, including detailed information about adjustments. In total, at least one "bias correction" was applied to 3,297 of the 7,279 stations in use at some point since 1801, though most of these occurred from the 1950s through the 1980s, and the adjustments were split roughly evenly between the positive and negative directions.
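Where paired raw and adjusted series are published, the balance of adjustment directions described above is straightforward to check. A minimal sketch, assuming hypothetical dictionaries of raw and adjusted long-term station means keyed by station ID (real GHCN files would need to be parsed first):

```python
def tally_adjustment_directions(raw, adjusted):
    """Count how many stations' records are warmed, cooled or left unchanged.

    raw, adjusted: hypothetical dicts mapping station ID -> long-term mean (deg C)
    """
    warmed = cooled = unchanged = 0
    for station, raw_mean in raw.items():
        delta = adjusted[station] - raw_mean
        if delta > 0:
            warmed += 1
        elif delta < 0:
            cooled += 1
        else:
            unchanged += 1
    return warmed, cooled, unchanged

# Toy example with made-up numbers:
print(tally_adjustment_directions(
    raw={"A": 14.9, "B": 15.2, "C": 15.0},
    adjusted={"A": 15.1, "B": 15.0, "C": 15.0},
))  # -> (1, 1, 1)
```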

A spokesman for Palmer told us in an email that "it's very apparent that some of the temperature records have been mangled by the computer in an attempt to make them conform to certain standards." As the research described above shows, no such "mangling" or other manipulation is apparent. The spokesman cited a 2007 paper by an economist at the University of Guelph and a scholar at the Cato Institute that found correlations between temperature readings and socioeconomic data which, the authors argued, call into question the overall global temperature trend. A subsequent paper by a NASA climate scientist highlighted the problems with this finding, most notably that the correlations held for only a limited set of regions (primarily the U.S., Japan and Western Europe). He concluded that "there is no compelling evidence from these correlations of any large-scale contamination."

Scientists have criticized the Telegraph’s Booker (and by extension Homewood) for spreading misinformation on climate science. In a post on RealClimate.org, Norwegian Meteorological Institute senior researcher Rasmus Benestad quickly debunked the details of Booker’s and Homewood’s claims. He said of the Telegraph story, “a person who writes such a misleading story shows little respect for his readers.”

Climategate Revisited

The supposed manipulation of data by scientists at the University of East Anglia and elsewhere in the Climategate affair also proved to be completely unfounded, as we have written twice before.

Climate skeptics claimed that leaked emails between many climate scientists around the world showed there was a coordinated effort to inflate the global warming signal in temperature data. But several separate investigations, including by the U.S. Department of Commerce Inspector General and the U.S. Environmental Protection Agency, found no such wrongdoing or manipulation.

According to one independent international investigation, known informally as the Oxburgh Report: "We saw no evidence of any deliberate scientific malpractice in any of the work of the Climatic Research Unit and had it been there we believe that it is likely that we would have detected it." Palmer's spokesman said the congressman had no comment on his repetition of this claim despite the repeated exonerations.

Palmer's claim that "we are building an entire agenda on falsified data" has no basis in evidence. Even as these claims of data manipulation have resurfaced, there is now general agreement that 2014 was the hottest single year since temperature record keeping began, a conclusion reached independently by NOAA, NASA, the Japan Meteorological Agency and the World Meteorological Organization. The United Kingdom's Met Office ranked 2014 among the warmest years on record, along with 2010, but said measurement uncertainty makes it impossible to declare a single hottest year for certain. According to NASA, nine of the 10 warmest years on record have occurred since 2000, with 1998 the lone exception.


