A 130-meter-wide waterfall drains meltwater from the Nansen Ice Shelf into the ocean. Photo credit: Stuart Rankin via Flickr

By Tim Radford

Scientists poring over military and satellite imagery have mapped the unimaginable: a network of rivers, streams, ponds, lakes and even a waterfall flowing over the ice shelves of a continent with an annual mean temperature below -50°C.

In 1909 Ernest Shackleton and his fellow explorers on their way to the magnetic South Pole found that they had to cross and recross flowing streams and lakes on the Nansen Ice Shelf.

Antarctic Waterways

Now, U.S. scientists report in the journal Nature that they studied photographs taken by military aircraft from 1947 onward and satellite images from 1973 onward to identify almost 700 seasonal networks of ponds, channels and braided streams flowing on all sides of the continent, as close as 600 km to the South Pole and at altitudes of up to 1,300 meters.

And they found that such systems carried water as far as 120 km. A second research team, reporting a companion study in the same issue of Nature, identified one meltwater system with an ocean outflow that ended in a 130-meter-wide waterfall, big enough to drain the shelf's entire surface melt in a matter of days.

In a world warming rapidly as humans burn ever more fossil fuels and add ever more greenhouse gases to the atmosphere, researchers expect the volume of meltwater on the south polar surface to increase; melt rates have been predicted to double by 2050. What isn't clear is whether this will make the shelf ice around the continent, which slows the flow of glaciers from the polar hinterland, any less stable.

"This is not in the future—this is widespread now, and has been for decades," said Jonathan Kingslake, a glaciologist at Columbia University's Lamont-Doherty Earth Observatory, who led the research.

"I think most polar scientists have considered water moving across the surface of Antarctica to be extremely rare. But we found a lot of it, over very large areas."

The big question is: has the level of surface melting increased in the last seven decades? The researchers don't yet have enough information to make a judgment.

"We have no reason to think they have," Dr Kingslake said. "But without further work, we can't tell. Now, looking forward, it will be really important to work out how these systems will change in response to warming, and how this will affect the ice sheets."

Many of the flow systems seem to start in the Antarctic mountains, near outcrops of exposed rock, or in places where fierce winds have scoured snow off the ice beneath. Rocks are dark and the exposed ice is blue, and during the long days of the Antarctic summer both absorb more solar energy than white snow or ice. That would be enough to start the melting process.

The Antarctic is already losing ice, as giant floating shelves suddenly fracture and drift north. There is a theory that meltwater could be part of the fissure mechanism, as it seeps deep into the shelves.

Drainage Theory

But the companion study, led by the polar scientist Robin Bell, also of Columbia University's Lamont-Doherty Earth Observatory, suggests that drainage on the Nansen Ice Shelf might help to keep the ice intact, perhaps by carrying meltwater away through the dramatic waterfall the scientists had identified.

"It could develop this way in other places, or things could just devolve into giant slush puddles," she said. "Ice is dynamic, and complex, and we don't have the data yet."

A coastal glacier in southern Greenland mirrored in the sea. Photo credit: Claire Rowland via Flickr

By Tim Radford

By the century's end, some of Greenland's ice will have vanished forever.

New research shows that the coastal glaciers and ice caps are melting faster than ever before and may have passed the point of no return two decades ago. That is when they lost the ability to refreeze their own meltwater.

These peripheral glaciers and icecaps cover an estimated 100,000 square kilometers of the island. And when they have gone, the world's oceans will have risen by four centimeters.
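Those two figures can be cross-checked with simple geometry. The sketch below is a back-of-envelope calculation, not part of the study: the ocean surface area (~3.61 × 10⁸ km²) and the ice and water densities are assumed standard values, used only to see whether the article's numbers are mutually consistent.

```python
# Back-of-envelope consistency check (assumed values, not from the article):
OCEAN_AREA_KM2 = 3.61e8    # assumed global ocean surface area
GLACIER_AREA_KM2 = 1.0e5   # peripheral glaciers and ice caps (from the article)
SEA_RISE_M = 0.04          # four centimeters (from the article)
RHO_ICE, RHO_WATER = 917.0, 1000.0  # assumed densities, kg/m^3

# Meltwater volume needed to raise the oceans by 4 cm:
water_volume_km3 = OCEAN_AREA_KM2 * (SEA_RISE_M / 1000.0)   # meters -> km
# Equivalent volume of ice (ice is less dense than water):
ice_volume_km3 = water_volume_km3 * RHO_WATER / RHO_ICE
# Average thickness that volume implies over 100,000 km^2 of ice:
thickness_m = ice_volume_km3 / GLACIER_AREA_KM2 * 1000.0    # km -> meters
print(f"{water_volume_km3:,.0f} km^3 of meltwater, "
      f"implying ice ~{thickness_m:.0f} m thick on average")
```

An implied mean thickness on the order of 150 meters is plausible for coastal ice caps, so the two figures hang together.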

Body of Greenland Ice

But scientists reporting in Nature Communications journal said most of the Greenland ice—the biggest body of ice in the northern hemisphere—is still safe. Were all of its ice to melt, sea levels would rise by at least seven meters.

"Higher altitudes are colder, so the highest ice caps are still relatively healthy at the moment," said study leader Brice Noël, a PhD student of polar glaciology and Arctic climate modeling at the University of Utrecht in the Netherlands.

"However, we see melting occur higher and higher. That's a big problem, because that 'melting line' is moving towards the altitude where most of the ice mass is.

"The main ice sheet in the interior of Greenland is much more elevated and isn't doing too bad yet. But we can already see an increase in the altitude of the 'melting line' there as well."

The coastal research concentrated on the mechanics of ice loss. Normally, glaciers and ice caps grow because summer meltwater drains through into the deeper frozen snow and freezes again. The icecap retains its mass and even increases.

But 20 years ago the firn, the layer of older, compacted snow, became saturated and froze right through, so more summer meltwater now runs off to the sea. The increase in runoff varies regionally from 17 to 74 percent, and the ice caps are now losing three times as much mass each year as they were in 1997.

Concern about Greenland ice and glaciers being in retreat is not new. In fact, glaciers in both hemispheres are observed to be in retreat, and the Geological Society of America has just published telltale imagery and an analysis based on observations of more than 5,200 glaciers in 19 regions around the world, showing that the loss of ice mass this century is without precedent.

So Greenland's glaciers are just part of a bigger picture. But since Greenland is home to the second largest volume of ice on the planet, what happens there concerns the entire world.

Testimony to Climate Change

Researchers observed years ago that the rivers of Greenland ice are in spate and rates of melting are thought likely to accelerate. The latest report is another piece of testimony to climate change in the far north.

"These peripheral glaciers and ice caps can be thought of as colonies of ice that are in rapid decline, many of which will likely disappear in the near future," said Ian Howat, a glaciologist at Ohio State University in the U.S. and a co-author of the report.

"In that sense, you could say that they're 'doomed.' However, the ice sheet itself is still not 'doomed' in the same way. The vast interior ice sheet is more climatologically isolated than the surrounding glaciers and ice caps.

"Also, since this 'tipping point' was reached in the late 1990s before warming really took off, it indicates that these peripheral glaciers are very sensitive and, potentially, ephemeral relative to the timescales of response of the ice sheet."


By Paul Brown

Any lingering hope that a worldwide nuclear power renaissance would contribute to combating climate change appears to have been dashed by U.S. company Westinghouse, the largest provider of nuclear technology in the world, filing for bankruptcy, and the severe financial difficulties of its Japanese parent company, Toshiba.

After months of waiting, Toshiba still could not get its auditors to approve its accounts this week. But it went ahead anyway and reported losses of nearly $5 billion for the nine months from April to December, in order to avoid being delisted from the Tokyo Stock Exchange.

The company admitted it too could face bankruptcy, and is attempting to raise capital by selling viable parts of its business.

In a statement, it said: "There are material events and conditions that raise substantial doubt about the company's ability to continue as a going concern."

Nuclear Reactors

The knock-on effects of the financial disasters the two companies face will be felt across the nuclear world, but nowhere more than in the UK, which was hoping Westinghouse was about to start building three of its largest nuclear reactors, the AP1000, at Moorside in Cumbria, northwest England.

The UK's Conservative government will be particularly embarrassed because, in late February, it won a critical parliamentary by-election in the seat that would be home to the Moorside plant, on the guarantee that the three reactors would be built—a pledge that now seems impossible to keep.

"I think the day of the large-scale nuclear power station is over," said Martin Forwood, campaign co-ordinator for Cumbrians Opposed to a Radioactive Environment. "There is no one left to invest anymore because renewables are just cheaper, and these prices are still going down while nuclear is always up."

Toshiba and Westinghouse are in deep trouble because the reactors they are currently building—the same design as the ones planned for Cumbria—are years late and billions of dollars over budget. Even if the companies can be refinanced, it seems extremely unlikely they would risk taking on new reactor projects.

Both the UK and Toshiba have looked to the South Korean nuclear giant KEPCO to take over the Moorside project, but the company is unlikely to want to build the Westinghouse design and would want to put forward its own reactor, the APR-1400.

This would delay the project for years, since the whole safety case for a new type of reactor would have to be examined from scratch.

But the company is already under pressure from within South Korea, where Members of Parliament have urged KEPCO not to take on a risky project in the UK. Twenty-eight members of the Republic of Korea's "Caucus on Post-Nuclear Energy" have called on KEPCO not to invest in Moorside.

The other nuclear giant present in Britain, the French-owned Électricité de France (EDF), is in serious difficulties of its own. It is already deep in debt, and its flagship project to build a prototype 1,600-megawatt reactor at Flamanville in northern France is six years behind schedule and three times over budget at €10.5 billion.

Originally due to open in 2012, its start date is now officially the end of 2018, but even that is in doubt because an investigation into poor quality steel in the reactor's pressure vessel is yet to be completed.

Despite this, the company and the UK government are committed to building two more of these giant reactors in Somerset in southwest England, and have started pouring concrete for the bases to put them on. These reactors are due to be completed in 2025, but nobody outside the company and the UK government believes this is likely.

So, with troubles of its own, EDF is in no position to help Toshiba out of its financial difficulties. In the nuclear world, this leaves only the Chinese and the Russians who might be capable of taking on such a project.

The Russians will be ruled out on political grounds, and the Chinese are already helping out EDF with a large financial stake in the Somerset project. They also want to build a nuclear station of their own design at Bradwell in Essex, southeast England—another project that looks likely to take more than a decade to complete.

Vast Capital Costs

The problem for all these projects, apart from the vast capital cost and the timescales involved, is that the energy industry is changing dramatically. Solar and wind power are now cheaper ways of producing electricity across much of the world, and they are less capital-intensive and quicker to build.

Despite the fact that there are more than 430 nuclear reactors in operation worldwide and the industry still has great economic and political clout, it is beginning to look like a dinosaur—too big and cumbersome to adapt to new conditions.

Nuclear power now produces about 10 percent of the world's electricity, while 40 percent is from coal and 23 percent from renewables. The rest is mainly from natural gas.

"Nuclear lobbyists are abandoning the tiresome rhetoric about a nuclear power renaissance," said Jim Green, national nuclear campaigner with Friends of the Earth Australia. "They are now acknowledging that the industry is in crisis."

"The crisis-ridden U.S., French and Japanese nuclear industries account for half of worldwide nuclear power generation," he continued. "Renewable energy generation doubled over the past decade, and strong growth, driven by sharp cost decreases, will continue for the foreseeable future."

Reposted with permission from our media associate Climate News Network.


By Tim Radford

If humans burn all the fossil fuels at their disposal—and this could happen in the next two centuries—researchers predict that the planetary atmosphere would match the one that witnessed the days of the dinosaurs at the dawn of the Jurassic period, around 200 million years ago.

By the 23rd century, planetary temperatures would be as high as those at the end of the Silurian, 420 million years ago. In this baking environment, plants had yet to begin to colonize the land and almost all life was concentrated in the oceans.

This torrid forecast is not based on any one piece of research: It is the outcome of an analysis of 1,200 estimates of ancient atmospheres, based on evidence of fossilized plants and shells, over a timespan of almost half a billion years.

The consequence is that, if humans exhaust the resources of coal, oil and natural gas, conditions will follow that have no precedent in 420 million years of evolution.

Planetary Temperatures

And the agency at work is the ratio of carbon dioxide in the atmosphere, which has hovered at around 280 parts per million (ppm) for almost all human history.

Carbon dioxide is a greenhouse gas that was once present in the atmosphere at far greater levels.

Once humans started to burn coal and oil—based on plant material sequestered during the Carboniferous period—they also started to return ancient CO2 to the atmosphere, creating a heat trap. Carbon dioxide ratios have risen to more than 400 ppm and planetary average temperatures have risen by almost 1°C.

The latest research, published in Nature Communications journal, contains a grim warning for humankind—but it was driven at least in part by curiosity about the coupling of atmosphere and evolution during the emergence of complex life.

"We cannot directly measure CO2 concentrations from millions of years ago," said Gavin Foster, professor of isotope geochemistry at the University of Southampton in the UK, who led the study. "Instead we rely on indirect 'proxies' in the rock record.

"In this study, we compiled all the available published data from several different types of proxy to produce a continuous record of ancient CO2 levels."

During the half billion years, planetary temperatures alternated between extended cold snaps with low CO2 levels and intense "greenhouse" temperatures at which CO2 levels rose to 3,000 ppm.

But these changes were immensely slow, and the study emphasizes the speed of human impact in what geologists would like to call the Anthropocene epoch.

Research like this is fundamental: It tells climate scientists something about the dynamics of atmosphere and sunlight over the millennia. And one of the puzzles of evolution is that, in the early days of life, the Sun must have been fainter than it is now.

"Due to nuclear reactions in stars, like our Sun, over time they become brighter," explained Dan Lunt, professor of climate science at the University of Bristol, UK and a co-author of the report.

"This means that, although carbon dioxide concentrations were high hundreds of millions of years ago, the net warming effect of CO2 and sunlight was less. Our new CO2 compilation appears on average to have gradually declined over time by about 3-4 ppm per million years.

"This may not sound like much, but it is actually just about enough to cancel out the warming effect caused by the Sun brightening through time, so in the long term it appears the net effect of both was pretty much constant on average."

So the coincidence of a greenhouse atmosphere and a cooler Sun created conditions in which life emerged, evolved and adapted to its environment. Plants consumed and sequestered carbon dioxide and animals benefited from the oxygen released in the process.

Future CO2 Levels

Planetary temperatures began to stabilize—until humans launched the Industrial Revolution in the late 1700s. The world's nations, meeting at the UN climate conference in Paris in 2015, pledged to cut fossil fuel use and contain global warming to a maximum of 2°C.

The past enshrines a horrific warning. The return of all that prehistoric carbon—preserved in fossil fuels—back into the atmosphere would mean that, by 2250, CO2 levels would reach 2,000 ppm. This has not been seen for 200 million years.

"However, because the Sun was dimmer back then, the net climate forcing 200 million years ago was lower than we would experience in such a high CO2 future," Prof. Foster said.

"So not only will the resultant climate change be faster than anything the Earth has seen for millions of years, the climate that will exist is likely to have no natural counterpart, as far as we can tell, in at least the last 420 million years."

Reposted with permission from our media associate Climate News Network.


By Tim Radford

The Colorado River is dwindling, and climate change is officially to blame. In the first 14 years of this century, the flow declined to only four-fifths of the 20th century average, according to new research. The water lost would have been enough to supply two million people for a whole year.

Altogether, the river supplies water to seven U.S. states and two in Mexico, and 40 million people rely on it for their water. But the entire Colorado River basin has been experiencing sustained drought since 2000. And somewhere between one sixth and one half of this liquid loss can be put down to global warming, scientists said.

They publish their findings in the journal Water Resources Research. "This paper is the first to show the large role that warming temperatures are playing in reducing the flows of the Colorado River," said Jonathan Overpeck, professor of geosciences and of hydrology and atmospheric sciences at the University of Arizona.

"We're the first to make the case that warming alone could cause Colorado River flow declines of 30 percent by mid-century and over 50 percent by the end of the century if greenhouse gas emissions continue unabated."

His co-author Bradley Udall, a climate scientist at Colorado State University, said, "The future of the Colorado River is far less rosy than other recent assessments have portrayed. A clear message to water managers is that they need to plan for significantly lower river flows."

The two scientists began by looking at the drought years of 2000-2014. The river starts with precipitation in the upper regions of its drainage basin, in Wyoming, Utah, Colorado and New Mexico.

They found that in the first decade and a half of this century, average temperatures in the region were 0.9°C higher than the average for the past 105 years. That is, very roughly, the amount by which the globe as a whole has warmed over the last century, under a warming regime driven by greenhouse gases from fossil fuel combustion.

But there is another factor to consider. The U.S. Southwest has a climate history characterized by intermittent megadroughts—periods of much lower rainfall over spans of 20 to 60 years.

Researchers have proposed that the risk of megadroughts is likely to increase in any climate change scenario. What actually will happen is uncertain, but the scientists are betting that as greenhouse gas emissions rise, so will the difficulties of water supply.

"Even if the precipitation does increase, our work indicates that there are likely to be drought periods as long as several decades when precipitation will still fall below normal," said Overpeck.

According to Udall, "Current planning understates the challenge that climate change poses to the water supplies in the American Southwest. My goal is to help water managers incorporate this information into their long-term planning efforts."

Reposted with permission from our media associate Climate News Network.

Winds of change … a storage system for energy generated by renewables is closer to being realized. Photo credit: Sheila Sund / Flickr

By Kieran Cooke

It is the holy grail of the renewable energy sector—a cheap and efficient battery system that can store energy generated by renewables such as wind and solar.

These days there are few who doubt the potential of renewables, apart from a few diehards on the fringes of the fossil fuel industry.

According to the International Energy Agency (IEA)—the main body monitoring developments in the global energy sector—renewables are surging ahead.

Investment in Renewables

In 2015, investments in oil and gas—fossil fuels that, along with coal, are the main drivers of global warming—declined by 25 percent, while energy produced from renewables rose by 30 percent.

Renewables are becoming increasingly competitive with fossil fuels in many sectors: According to the IEA, in the five years to the end of 2015 the price of solar energy dropped by 80 percent and wind power by a third.

Fast-developing countries—China and India, in particular—are investing millions of dollars in the renewable sector.

The big problem with renewables development has been storage. In order to operate a commercially viable power plant, a reliable flow of fuel is needed. In the case of oil, coal or gas this is relatively straightforward as supplies can quickly be replenished.

In the case of nuclear, as long as there is a readily available supply of uranium isotopes, power can continue to be generated.

Solar and wind power supply is far more varied—dependent on sunshine and wind speeds—and cannot be stored or used in the same way as so-called conventional fuels.

For years, scientists have struggled to develop storage systems capable of handling the peaks and troughs of renewable power so that an even supply can be guaranteed.

Researchers at Harvard University's John A. Paulson School of Engineering and Applied Sciences in the U.S. said in an article published in ACS Energy Letters that they have developed a long-lasting flow battery for storing renewable power that could operate for up to 10 years with minimal maintenance.

A flow battery is a cross between a conventional battery and a fuel cell. Flow batteries store energy in liquid solutions in external tanks and are regarded as one of the primary ways of storing renewable energy. The bigger the tanks, the more energy can be stored.

But flow batteries are costly. Most use expensive polymers that can cope with the potent chemicals inside the battery.

Battery Capacity

The battery's components and materials, such as membranes and electrolytes, have to be frequently replaced in order to retain capacity.

The Harvard team modified molecules used in the electrolyte solutions to make them soluble in water and so vastly increase the battery's ability to retain power.

"Because we were able to dissolve the electrolytes in neutral water, this is a long-lasting battery that you could put in your basement," said Roy Gordon, a professor of chemistry and materials science and a leading member of the research team.

"If it spilled on the floor, it wouldn't eat the concrete and, since the medium is non-corrosive, you can use cheaper materials to build components of the batteries, like the tanks and pumps," Gordon added.

Reducing the cost of the battery is vital. The U.S. Department of Energy said that in order to make stored energy from wind and solar competitive with fossil fuels, a battery needs to be able to store energy for less than $100 per kilowatt hour.

"If you can get anywhere near this cost target then you can change the world," said Michael Aziz, another lead researcher in the battery project and a professor of materials and energy technologies at Harvard.

"It becomes cost effective to put batteries in so many places—this research puts us one step closer to reaching that target," said Aziz.

Reposted with permission from our media associate Climate News Network.

The proposed tidal lagoon would enclose Swansea Bay with a rock wall incorporating 16 turbines and generating 320 MW of electricity. Photo credit: Tidal Lagoon Power

By Richard Sadler

Ambitious plans have been drawn up for a network of "tidal lagoons" around the UK coast that could provide up to a quarter of the country's electricity—and there is potential to roll out the technology in many parts of the world.

Tidal lagoons work by using a wall to impound a body of water, in the open sea or a tidal estuary, as it is pushed in on the rising tide. The water drives turbines as the tide comes in; then, as the tide falls, the turbines are reversed and the energy of the falling water is harnessed again.
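The physics of that cycle can be sketched with a simple energy estimate. Every number in the snippet below is an illustrative assumption (a lagoon area and tidal range loosely at the scale discussed for Swansea Bay, and a guessed overall efficiency), not a figure from the developers: when a full lagoon drains, the water's center of mass falls by half the tidal range, so the released potential energy is ½ρgAh².

```python
RHO = 1025.0       # seawater density, kg/m^3 (assumed)
G = 9.81           # gravitational acceleration, m/s^2
AREA_M2 = 11.5e6   # illustrative lagoon area, m^2 (assumption)
RANGE_M = 8.0      # illustrative mean tidal range, meters (assumption)
EFFICIENCY = 0.40  # guessed overall water-to-wire efficiency (assumption)

# Draining a full lagoon drops the water's center of mass by half the
# range, releasing E = 1/2 * rho * g * A * h^2 of potential energy.
energy_per_drain_j = 0.5 * RHO * G * AREA_M2 * RANGE_M ** 2

# Bi-directional turbines generate twice per ~12.4-hour tidal cycle,
# once on the flood and once on the ebb.
cycle_s = 12.42 * 3600
avg_power_mw = 2 * EFFICIENCY * energy_per_drain_j / cycle_s / 1e6
print(f"~{avg_power_mw:.0f} MW average output")
```

With these assumed numbers the average output lands in the tens of megawatts, well below a plant's installed capacity, which is why tidal schemes are usually quoted with both a peak rating and a much lower average figure.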

As Geoffrey Chaucer, one of the earliest English poets, put it: "Time and tide wait for no man." Unlike wind and solar, the energy produced by the tides is predictable months in advance, and tidal power is now being recognized as a major renewable resource.

More Tidal Lagoons

Planning approval has already been given for a £1.3 billion pathfinder project at Swansea Bay, South Wales, described by developers as "a scalable blueprint for a new, global, low-carbon power industry." Another nine lagoons are planned around tidal hotspots in the Severn Estuary and North-West England/North Wales. These would have the potential to generate 25,000 MW of electricity—enough to provide 12 percent of the UK's electricity needs.

The company behind the proposals, Tidal Lagoon Power, already has teams working in Northern France and India and is studying opportunities in Mexico and Canada's Atlantic coast. Further tidal lagoon markets may exist in South America, China, South-East Asia and Oceania.

Tidal power is recognized by the EU's Joint Research Centre as a key contributor to the continent's future energy mix. Its main attraction is that, unlike other renewable energy sources, it does not require the wind to blow or the sun to shine.

An oceanographer at Southampton University, Dr. Simon Boxall, said the technology had improved to the point where tidal energy was a "no-brainer," with the latest bi-directional turbines capable of generating power on both incoming and outgoing tides. He said that with sufficient investment it could provide up to a quarter of UK electricity needs within 20 years.

"We can always rely on tides—they come in and they go out and they will continue doing so for thousands of years. Parts of the UK have tidal ranges in excess of 15 meters, so that's a heck of a drop of water and that's happening twice a day—or four times a day when you count the water coming in and going out," said Dr. Boxall.

"The other great advantage is that the tides aren't the same in different locations, so if you've got a network of tidal power stations you are always generating electricity: 24 hours a day, seven days a week," added Dr. Boxall.

In December a former UK energy minister, Charles Hendry, published an independent review, concluding: "Power from tidal lagoons could make a strong contribution to UK energy security, as an indigenous and completely predictable form of supply."

He said the UK was well placed to take a global lead and that, with economies of scale and the mass manufacture of turbines, turbine housings and other components, costs could be substantially reduced.

Cheap Electricity

To be viable the new industry would require subsidies, with a guaranteed premium price for electricity generated. However, Hendry calculated that in the long term tidal lagoons will work out cheaper than wind and "significantly less expensive" than nuclear. And they could go on generating for 140 years—providing clean, subsidy-free energy long after other energy plants have been decommissioned.

The technology is not without its drawbacks. Artificial lagoons can increase the silting-up of shipping lanes. Tidal estuaries are also important for wading birds, marine mammals and migratory fish, and conservation groups have warned that the ecological impacts of tidal lagoons are not well understood and that any roll-out of lagoons in the UK should be conditional on the Swansea project being tried and tested. Backers of the technology say management practices can be adapted to address such concerns, and they point out that lagoons can provide environmental benefits, acting as artificial reefs for marine wildlife.

The UK government is expected to announce a final decision on the Swansea Bay project within the next few months.

Reposted with permission from our media associate Climate News Network.

By Richard Sadler

The Internet is fast becoming a major source of global carbon emissions—and the main cause is the increasing popularity of "real-time" streamed video content.

Video streaming to Internet-enabled TVs, game consoles and mobile devices already accounts for more than 60 percent of all data traffic—and the latest forecasts suggest this will rise to more than 80 percent by 2020.

Facebook's Prineville data center in Oregon: Demand just keeps on growing. Photo credit: Tom Raftery / Flickr

Increasingly, viewers across the world are watching films and TV series in real time through subscriptions to Netflix or Amazon, while social media platforms such as Facebook and Twitter are offering more and more streamed video content for free.

This is driving a dizzying increase in the amount of information that needs to be stored and transmitted by power-hungry data centers. Up until 2003 the world had accumulated a total of five exabytes—five billion gigabytes—of stored digital content. By 2015 that amount was being consumed every two days, as annual consumption reached 870 exabytes.

As more video is streamed and more of the world's population goes online, annual data traffic is forecast to reach 2,300 exabytes by 2019.
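The arithmetic behind those consumption figures is easy to reproduce. A minimal sketch, assuming a 365-day year:

```python
ANNUAL_EB_2015 = 870.0  # annual data consumption in 2015, exabytes (from the article)
PRE_2003_EB = 5.0       # all digital content accumulated to 2003 (from the article)

eb_per_day = ANNUAL_EB_2015 / 365
days_per_5_eb = PRE_2003_EB / eb_per_day   # how often 5 EB is now consumed
print(f"{eb_per_day:.1f} EB/day, so 5 EB every {days_per_5_eb:.1f} days")
```

The result, roughly 2.4 exabytes per day, matches the article's claim that the pre-2003 total is now consumed every two days.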

Pressure for Renewables

The IT sector already consumes around 7 percent of electricity worldwide and as data traffic rises, demand from data centers alone could reach 13 percent of global electricity consumption by 2030.

Now leading video content providers are coming under increasing pressure to show what proportion of their power derives from fossil fuels.

A recent report by Greenpeace USA acknowledges that social media platform Facebook has made significant progress towards its target for 100 percent of its electricity to come from renewables, following support from millions of its users for Greenpeace's 2011 "Unfriend coal" campaign. Google and Apple receive praise for progress towards similar commitments made in 2012.

However, major providers of video streaming content including Netflix, Amazon Prime and Hulu are criticized for sourcing more than half of their energy from coal or natural gas.

Cloud computing market leader Amazon Web Services is credited for taking important steps towards renewables but censured for lack of transparency and heavy reliance on new data centers in the state of Virginia powered mainly by fossil fuels.

Elsewhere, the lack of access to renewable energy from monopoly utilities in East Asia is seen as a major obstacle to creating a renewably powered Internet in the region.

The report concludes: "The dramatic increase in the number of data centers … dominated by utilities that have little to no renewable energy is driving a similarly dramatic increase in the consumption of coal and natural gas."

Attempting to express the effect of increasing Internet traffic in terms of emissions is fraught with difficulty, but one study, published in the journal Environmental Research Letters, has calculated that in 2011 Americans streamed 3.2 billion hours of video.

This would have consumed 25 petajoules of energy (estimated at about the annual consumption of 175,000 U.S. households), resulting in 1.3 billion kilograms of CO2 emissions.
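Dividing those totals through gives a rough per-hour footprint. The conversion below uses only the study's headline figures plus the standard identity 1 kWh = 3.6 MJ; the per-hour numbers are a back-of-envelope derivation, not values reported by the study itself.

```python
HOURS_STREAMED = 3.2e9  # US video hours streamed in 2011 (from the article)
ENERGY_PJ = 25.0        # total energy, petajoules (from the article)
CO2_KG = 1.3e9          # total emissions, kg CO2 (from the article)

kwh_total = ENERGY_PJ * 1e15 / 3.6e6        # joules -> kWh (1 kWh = 3.6 MJ)
kwh_per_hour = kwh_total / HOURS_STREAMED   # end-to-end, incl. data centers and networks
co2_g_per_hour = CO2_KG * 1e3 / HOURS_STREAMED
print(f"~{kwh_per_hour:.1f} kWh and ~{co2_g_per_hour:.0f} g CO2 per hour streamed")
```

That works out to a little over 2 kWh and roughly 400 grams of CO2 for every hour of video, counting the whole delivery chain rather than just the viewer's device.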

Efficiency Limits

The lead author, Arman Shehabi, a research scientist at Lawrence Berkeley National Laboratory in California, said the IT sector had so far managed to offset its soaring electricity needs by designing more energy-efficient data centers. But there was a limit to how far energy efficiency could go.

"The growth in video streaming is enormous just based on the size of the companies that are providing these services—but they are still reaching only a small part of the global population and we can imagine that's going to just keep increasing," he said.

"You're still going to have this growth of more and more servers needed. We've seen some good efficiency measures but we're getting close to the end of that—we can't go out much further—and with video streaming there's no end in sight."

He added that another major driver of future growth in data traffic would be the Internet of Things—remote digital sensors, devices and driverless cars connected to the Internet.

Richard Sadler, a former BBC environment correspondent, is a freelance environment and science journalist. Reposted with permission from our media associate Climate News Network.
