Thursday, February 28, 2013

Latest harebrained geoengineering scheme proposes Saturn-like rings in Earth's orbit

A forthcoming paper proposes the latest harebrained geoengineering scheme to allegedly offset global warming. The authors suggest that transporting at least 1 trillion kilograms of dust into space to form a "Sun-pointing elliptical Earth ring" would shield the Earth from solar radiation. The paper joins other geoengineering schemes, including a proposal to use a pipe tethered to an airship to spew sulfuric acid into the upper atmosphere, all to "fix" the non-existent problem of man-made global warming.

The authors are apparently unaware that clouds act as the Earth's negative-feedback cooling mechanism and have maintained relatively stable Earth temperatures for millions of years without any evidence of net positive feedback "tipping points." Not to mention, the scheme could not be undone if solar activity were to enter a lull such as the Maunder or Dalton minima, and the dust ring would pose a permanent hazard to satellites and space travel. 


Heliotropic dust rings for Earth climate engineering

Publication date: 1 April 2013 [any coincidence that this is April Fools' Day?]

Source: Advances in Space Research, Volume 51, Issue 7

This paper examines the concept of a Sun-pointing elliptical Earth ring comprised of dust grains to offset global warming. A new family of non-Keplerian periodic orbits, under the effects of solar radiation pressure and the Earth's J2 oblateness perturbation, is used to increase the lifetime of the passive cloud of particles and, thus, increase the efficiency of this geoengineering strategy. An analytical model is used to predict the orbit evolution of the dust ring due to solar-radiation pressure and the J2 effect. The attenuation of the solar radiation can then be calculated from the ring model. In comparison to circular orbits, eccentric orbits yield a more stable environment for small grain sizes and therefore achieve higher efficiencies when the orbit decay of the material is considered. Moreover, the novel orbital dynamics experienced by high area-to-mass ratio objects, influenced by solar radiation pressure and the J2 effect, ensure the ring will maintain a permanent heliotropic shape, with dust spending the largest portion of time on the Sun-facing side of the orbit. It is envisaged that small dust grains can be released from a circular generator orbit with an initial impulse to enter an eccentric orbit with Sun-facing apogee. Finally, a lowest estimate of 1×10¹² kg of material is computed as the total mass required to offset the effects of global warming.

Highlights

► The feasibility of an Earth ring for climate engineering is analysed.
► Orbital dynamics of high area-to-mass ratio dust grains around Earth are modelled.
► The perturbations of solar radiation pressure and the J2 effect are included.
► Stable Sun-pointing elliptical orbits are utilised to form an Earth ring.

Clouds/aerosols control the climate, not man-made CO2

A paper published today in the Journal of Climate notes that "global dimming" from clouds/aerosols occurred from the 1950s to the 1980s at a rate of -3.5 W m-2 per decade, followed by "global brightening" from 1992 to 2002 at a rate of 6.6 W m-2 per decade. By way of comparison, the IPCC claims* CO2 forcing since the 1950s was 1.18 W m-2, and only 0.25 W m-2 from 1992 to 2002, roughly 26 times smaller than the effect of "global brightening" over the same period. The global temperature record shows that temperatures track these periods of global dimming and brightening rather than the slow, steady rise of CO2 levels. It is thus apparent that clouds/aerosols are the "control knob" of climate, not man-made CO2.
The rate of warming increased by a factor of 3.8 from 1992 to 2002, corresponding to the period of "global brightening," and was followed by global cooling.
*CO2 forcing since 1959 calculated using the IPCC simplified formula: 5.35 × ln(393/315) = 1.18 W m-2
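As a check on the arithmetic, here is a minimal Python sketch of the calculation. The 315 and 393 ppm values come from the footnote above; the roughly 356 and 373 ppm values assumed for 1992 and 2002 are approximate Mauna Loa annual means, used here only for illustration.

```python
import math

def co2_forcing(c_ppm, c0_ppm):
    """IPCC simplified expression for CO2 radiative forcing, in W/m2."""
    return 5.35 * math.log(c_ppm / c0_ppm)

# Forcing since 1959, as in the footnote above (315 ppm -> 393 ppm)
print(round(co2_forcing(393, 315), 2))   # ~1.18 W/m2

# Forcing over 1992-2002, assuming ~356 ppm -> ~373 ppm (approximate Mauna Loa means)
f_1992_2002 = co2_forcing(373, 356)      # ~0.25 W/m2
print(round(f_1992_2002, 2))

# Ratio of the 6.6 W/m2 "brightening" over that decade to the CO2 forcing over the same decade
print(round(6.6 / f_1992_2002))          # ~26
```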

Measurement Methods Affect the Observed Global Dimming and Brightening

Kaicun Wang,1 Robert E. Dickinson,2 Qian Ma,1 John A. Augustine,3 and Martin Wild4
1 State Key Laboratory of Earth Surface Processes and Resource Ecology, College of Global Change and Earth System Science, Beijing Normal University, Beijing, 100875, China
2 Department of Geological Sciences, The University of Texas at Austin, Austin, TX 78712, U. S.
3 NOAA/Earth System Research Laboratory, Boulder, CO 80305, U.S.
4 Institute for Atmospheric and Climate Science, ETH Zürich, 8092 Zürich, Switzerland
Abstract
Surface incident solar radiation (G) determines our climate and environment. G has been widely observed with a single pyranometer since the late 1950s. Such observations have suggested a widespread decrease between the 1950s and 1980s (“global dimming”), i.e., at a rate of -3.5 W m−2 per decade (or -2% per decade) from 1960 to 1990. Since the early 1990s, the diffuse and direct components of G have been measured independently and a more accurate G was calculated by summing these two measurements. Data from this summation method have suggested that G has increased at a rate of 6.6 W m−2 per decade (3.6% per decade) from 1992 to 2002 (“brightening”) at selected sites. The brightening rates from these studies were also higher than those from a single pyranometer. In this paper, we used 17 years (1995-2011) of parallel measurements by the two methods from nearly 50 stations to test whether these two measurement methods of G provide similar long-term trends. Our results show that although measurements of G by the two methods agree very well on a monthly time scale, the long-term trend from 1995 to 2011 determined by the single pyranometer is 2-4 W m−2 per decade less than that from the summation method. This difference of trends in the observed G is statistically significant. The dependence of trends of G on measurement methods uncovered here has an important implication for the widely reported “global dimming” and “brightening” based on datasets collected by different measurement methods, i.e., the dimming might have been less if measured with current summation methods.

New paper finds no increase in tropical cyclones over past 200 years

A paper published today in Palaeogeography, Palaeoclimatology, Palaeoecology reconstructs tropical cyclones in Queensland, Australia and finds that "tropical cyclone frequency did not increase in the past 200 years." The paper adds to many other peer-reviewed publications demonstrating that, contrary to the claims of climate alarmists, tropical cyclones have not increased, and in many cases have decreased due to warming, and are projected to further decrease in the future. 

Reconstructing tropical cyclone frequency using hydrogen isotope ratios of sedimentary n-alkanes in northern Queensland, Australia

  • a Department of Earth Sciences, Geochemistry, Faculty of Geosciences, Utrecht University, Budapestlaan 4, 3584 CD Utrecht, The Netherlands
  • b Department of Marine Organic Biogeochemistry, NIOZ Royal Netherlands Institute for Sea Research, PO Box 59, 1790 AB Den Burg, The Netherlands
  • c Palaeoecology, Department of Physical Geography, Faculty of Geosciences, Utrecht University, 3584 CD, Utrecht, The Netherlands
  • d Department of Marine Geology, NIOZ Royal Netherlands Institute for Sea Research, PO Box 59, 1790 AB Den Burg, The Netherlands
  • e Marine Biogeosciences, Alfred Wegener Institute for Polar and Marine Research, Am Handelshafen 12, D-27570 Bremerhaven, Germany

Abstract

A peat record from Quincan Crater (Queensland, Australia), spanning the past 200 years, was used to test if hydrogen isotope ratios of leaf wax long-chain n-alkanes derived from higher plants can be used to reconstruct past tropical cyclone activity. Queensland is frequently impacted by tropical cyclones, with on average 1–2 hits per year. The most abundant n-alkanes in the peat are C29 and C31. Possible sources for long chain n-alkanes in the peat core are ferns and grasses, which grow directly on the peat layer, and the tropical forest growing on the crater rim. Hydrogen isotope ratios of C27, C29 and C31 n-alkanes vary between −155 and −185‰ (VSMOW), with the largest variability in the upper 30 cm of the record. For the period 1950–2000 AD the variability in δD of C29 alkanes resembles a smoothed record of historical tropical cyclone frequency occurring within a 500 km radius from the site. This suggests that the high number of tropical cyclones occurring in this period strongly impacted the δD signal and on average resulted in more depleted values of precipitation. In the period before 1900 AD, the variability in the hydrogen isotope record is relatively small compared to the period 1950–2000 AD. This might be the result of lower variability of tropical cyclones during this time period. More likely, however, is that it results from the increasing age span per sampled interval resulting in a lower temporal resolution. Average δD values between 1900 and 2000 AD are around −167‰, which is similar to average values found for the period between 1800 and 1900 AD. This suggests that on average tropical cyclone frequency did not change during the past 200 years. This study demonstrates the potential of stable hydrogen isotope ratios of long chain n-alkanes for the reconstruction of past tropical cyclone frequency.

Highlights

► δD ratios of long chain sedimentary n-alkanes from a peat core in northern Queensland were analyzed
► δD of the past 50 years resembles the historical record of tropical cyclone frequency in the area
► The results suggest that tropical cyclone frequency did not increase in the past 200 years

New paper finds IPCC models underestimate cooling effect of clouds

A paper published today in the Journal of Climate finds that IPCC climate models underestimate the cooling effect of clouds in the Arctic. The paper adds to many other peer-reviewed publications demonstrating that climate models exaggerate warming by underestimating the net cooling and negative feedback from clouds.


Sensitivity of CAM5 Simulated Arctic Clouds and Radiation to Ice Nucleation Parameterization

Shaocheng Xie,1 Xiaohong Liu,2 Chuanfeng Zhao,1 and Yuying Zhang1
1 Lawrence Livermore National Laboratory, Livermore, California, USA
2 Pacific Northwest National Laboratory, Richland, Washington, USA
Abstract
Sensitivity of Arctic clouds and radiation in the Community Atmospheric Model version 5 to the ice nucleation process is examined by testing a new physically based ice nucleation scheme that links the variation of ice nuclei (IN) number concentration to aerosol properties. The default scheme parameterizes the IN concentration simply as a function of ice supersaturation. The new scheme leads to a significant reduction in simulated IN number concentrations at all latitudes while changes in cloud amount and cloud properties are mainly seen in high latitudes and middle latitude storm tracks. In the Arctic, there is a considerable increase in mid-level clouds and a decrease in low clouds, which result from the complex interaction among the cloud macrophysics, microphysics, and the large-scale environment. The smaller IN concentrations result in an increase in liquid water path and a decrease in ice water path due to the slow-down of the Bergeron-Findeisen process in mixed-phase clouds. Overall, there is an increase in the optical depth of Arctic clouds, which leads to a stronger cloud radiative forcing (net cooling) at the top of the atmosphere.

The comparison with satellite data shows that the new scheme slightly improves low cloud simulations over most of the Arctic, but produces too many mid-level clouds. Considerable improvements are seen in the simulated low clouds and their properties when compared to Arctic ground-based measurements. Issues with the observations and the model-observation comparison in the Arctic region are discussed.

Wednesday, February 27, 2013

California faces electricity crisis due to green energy


California Girds for Electricity Woes

Increased Reliance on Wind, Solar Power Means Power Production Fluctuates

SAN FRANCISCO—California is weighing how to avoid a looming electricity crisis that could be brought on by its growing reliance on wind and solar power.
Regulators and energy companies met Tuesday, hoping to hash out a solution to the peculiar stresses placed on the state's network by sharp increases in wind and solar energy. Power production from renewable sources fluctuates wildly, depending on wind speeds and weather.
California has encouraged growth in solar and wind power to help reduce greenhouse-gas emissions. At the same time, the state is running low on conventional plants, such as those fueled by natural gas, that can adjust their output to keep the electric system stable. The amount of electricity being put on the grid must precisely match the amount being consumed or voltages sag, which could result in rolling blackouts.
At Tuesday's meeting, experts cautioned that the state could begin seeing problems with reliability as soon as 2015.
California isn't the only state having trouble coping with a growing share of renewables. Texas also needs more resources, such as gas-fired power plants, that can adjust output in response to unpredictable production from wind farms.
Renewable power has seen a boom in both states. On Feb. 9, wind farms in Texas set a record for output, providing nearly 28% of the state's supply for the day. Production hasn't hit that level yet in California, but the state's goal is to get one-third of its electricity from renewable resources by 2020.
"I think we're going to end up closer to 40%," said Robert Weisenmiller, chairman of the California Energy Commission, the state's policy and planning agency for electricity.
A decade ago, California was hit by an electricity crisis marked by price surges and rolling blackouts, stemming from market manipulation and tightening electricity supplies in a newly deregulated market. To prevent a recurrence, state regulators passed rules requiring utilities to line up enough energy to meet even high power demand, with a special emphasis on in-state renewable resources.
"California has been well served by the procurement process since the crisis," said Steve Berberich, chief executive of the California Independent System Operator, which runs the state's grid. "The problem is we have a system now that needs flexibility, not capacity."
Changes in California's market have attracted lots of new generation; the state expects to have 44% more generating capacity than it needs next year. Grid officials say they expect the surplus to fall to 20% by 2022, though it will remain high for about a decade.
However, the surplus generating capacity doesn't guarantee steady power flow. Even though California has a lot of plants, it doesn't have the right mix: Many of the solar and wind sources added in recent years have actually made the system more fragile, because they provide power intermittently.
Electricity systems need some surplus, so they can cover unexpected generator outages or transmission-line failures, but having too much can depress the prices generators can charge for electricity. In part because of low power prices, many gas-fired generation units aren't profitable enough to justify refurbishments required by pending federal regulations under the Clean Water Act. That means they are likely to be shut by 2020, adding to the state's power woes.
By July, state officials hope to have a plan in place addressing the problem. Turf issues among state and federal regulators could complicate the process.
Michael Peevey, president of the California Public Utilities Commission, which regulates utilities, said action is clearly needed, but he isn't sure whether the market needs "small adjustments or a major overhaul."
Utility executives are calling for immediate action, pointing to the risk of rolling blackouts. "We see the issue hitting as soon as 2013, 2014, 2015," said Todd Strauss, the head of planning and analysis for PG&E Corp., a big utility serving Northern California, who attended Tuesday's meeting. "If we thought it was far out, we wouldn't be here."

Tuesday, February 26, 2013

New paper finds sea levels rose twice as fast to 8 meters higher than the present during last interglacial

A new paper published in Geophysical Journal International finds that during the last interglacial, global sea level rose at more than twice the present rate, reaching more than 8 meters above present levels. According to the authors, the maximum 1000-year-average rate of sea level rise during the last interglacial exceeded 6 mm/yr, roughly double the rate of 3.1 mm/yr claimed by the IPCC and 5 times the rate of ~1.2 mm/yr claimed by NOAA. The paper adds to many other peer-reviewed studies demonstrating there is nothing unusual, unnatural, or unprecedented about current sea level rise, and that there is no evidence of a human influence on sea levels.
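The comparison reduces to a unit conversion and two ratios; a minimal sketch follows (the 6 mm/yr figure is the estimate derived from section 3.1 of the paper, as noted below):

```python
# Rates in the paper are quoted in m per kyr; 1 m/kyr = 1 mm/yr
rate_lig  = 6.0   # mm/yr, estimated 1000-yr-average rate during the last interglacial (see note below)
rate_ipcc = 3.1   # mm/yr, rate claimed by the IPCC
rate_noaa = 1.2   # mm/yr, approximate rate claimed by NOAA

print(round(rate_lig / rate_ipcc, 1))   # ~1.9, i.e. roughly double
print(round(rate_lig / rate_noaa, 1))   # 5.0, i.e. 5 times
```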

Full paper here

Note: highstand figures and the 1000-yr-average global sea level rise are derived from the text of section 3.1 using the average of the two most likely estimates.

A probabilistic assessment of sea level variations within the last interglacial stage

Robert E. Kopp,1 Frederik J. Simons,2 Jerry X. Mitrovica,3 Adam C. Maloof2 and Michael Oppenheimer2,4

Author Affiliations
1Department of Earth and Planetary Sciences and Rutgers Energy Institute, Rutgers University, Piscataway, NJ 08854, USA. E-mail: robert.kopp@rutgers.edu
2Department of Geosciences, Princeton University, Princeton, NJ 08544, USA
3Department of Earth and Planetary Sciences, Harvard University, Cambridge, MA 02138, USA
4Woodrow Wilson School of Public and International Affairs, Princeton University, Princeton, NJ 08544, USA

Summary

The last interglacial stage (LIG; ca. 130–115 ka) provides a relatively recent example of a world with both poles characterized by greater-than-Holocene temperatures similar to those expected later in this century under a range of greenhouse gas emission scenarios. Previous analyses inferred that LIG mean global sea level (GSL) peaked 6–9 m higher than today. Here, we extend our earlier work to perform a probabilistic assessment of sea level variability within the LIG highstand. Using the terminology for probability employed in the Intergovernmental Panel on Climate Change assessment reports, we find it extremely likely (95 per cent probability) that the palaeo-sea level record allows resolution of at least two intra-LIG sea level peaks and likely (67 per cent probability) that the magnitude of low-to-high swings exceeded 4 m. Moreover, it is likely that there was a period during the LIG in which GSL rose at a 1000-yr average rate exceeding 3 m kyr−1, but unlikely (33 per cent probability) that the rate exceeded 7 m kyr−1 and extremely unlikely (5 per cent probability) that it exceeded 11 m kyr−1. These rate estimates can provide insight into rates of Greenland and/or Antarctic melt under climate conditions partially analogous to those expected in the 21st century.

Sequestration may slow down Obama's EPA agenda

Sequestration already appears to have significant positive effects for the economy, as the massive 2% cut in the EPA budget this year leads to the rogue agency warning that staff furloughs "could create slowdowns in some of the agency's ongoing projects at a time when Obama has signaled that federal agencies will play a leading role in carrying out his promise to respond to the threat of climate change."


Acting EPA chief warns staff of furloughs

2:20pm EST   2/26/13
WASHINGTON (Reuters) - The acting head of the U.S. Environmental Protection Agency (EPA) warned staff on Tuesday that it may place an unspecified number of jobs on temporary furlough if across-the-board federal budget cuts take effect at the end of this week.
Bob Perciasepe, acting administrator of the EPA, wrote in an email that despite taking early measures to cut agency spending on contracts, grants and administration in recent months, furloughs are inevitable.
"Even with these actions, the arbitrary nature of the required budget cuts of sequestration would force us to implement employee furloughs over the remainder of the fiscal year, ending on September 30, 2013," Perciasepe wrote.
President Barack Obama and lawmakers in Congress have yet to resolve how to avoid the deep [2% is "deep"?] automatic spending cuts due on March 1, known as "sequestration."
Perciasepe said the agency will provide employees with 30 days notice before any furlough process begins. The EPA will try to minimize the burden on staff while trying to meet its regulatory obligations, he added.
The agency is also meeting with its national unions to prepare a plan, Perciasepe said.
The Energy Department warned workers of furloughs on February 7.
Furloughs at the EPA could create slowdowns in some of the agency's ongoing projects at a time when Obama has signalled that federal agencies will play a leading role in carrying out his promise to respond to the threat of climate change.
Among other things, the EPA is due to finalize rules to reduce greenhouse gas emissions from new power plants within a few months.

Cape Cod community considers taking down wind turbines after illness, noise




Published February 26, 2013 | FoxNews.com

Two wind turbines towering above the Cape Cod community of Falmouth, Mass., were intended to produce green energy and savings -- but they've created angst and division, and may now be removed at a high cost as neighbors complain of noise and illness.

"It gets to be jet-engine loud," said Falmouth resident Neil Andersen. He and his wife Betsy live just a quarter mile from one of the turbines. They say the impact on their health has been devastating. They're suffering headaches, dizziness and sleep deprivation and often seek to escape the property where they've lived for more than 20 years.

"Every time the blade has a downward motion it gives off a tremendous energy, gives off a pulse," said Andersen. "And that pulse, it gets into your tubular organs, chest cavity, mimics a heartbeat, gives you headaches. It's extremely disturbing and it gets to the point where you have to leave."

The first turbine went up in 2010 and by the time both were in place on the industrial site of the town's water treatment facility, the price was $10 million. Town officials say taking them down will cost an estimated $5 million to $15 million, but that is just what Falmouth's five selectmen have decided to move toward doing.

"The selectmen unanimously voted to remove them. We think it's the right thing to do, absolutely," Selectman David Braga said. "You can't put a monetary value on people's health and that's what's happened here. A lot of people are sick because of these."

Now the matter will go to a town meeting vote in April and could ultimately end up on the ballot during the municipal elections in May.

"It's highly likely that what the voters will be determining is are they willing to tax themselves at an appropriate amount to cover the cost and dismantle and shut down the turbines?" Falmouth Town Manager Julian Suso said.

In the meantime, the turbines are being run on a limited schedule as the selectmen respond to the concerns of nearby neighbors. The turbines only run during the day -- from 7 a.m. to 7 p.m. -- which means they're operating at a loss.
The dispute has been a bitter three-year battle in the seaside town where officials argue the project was thoroughly vetted, researched and put to public vote multiple times.

"To say 'let's let the voters decide' -- it sort of flies in the face of what we went through all these years," said Megan Amsler of the Falmouth Energy Committee.

Read more: http://www.foxnews.com/politics/2013/02/26/cape-cod-community-considers-taking-down-wind-turbines-after-illness-noise/

New study finds a carbon tax would have devastating impact on economy & jobs


Today, the National Association of Manufacturers (NAM) released a study conducted by NERA Economic Consulting that shows a carbon tax would have a devastating impact on manufacturing and jobs. The report, titled Economic Outcomes of a U.S. Carbon Tax, found that levying such a tax would impact millions of jobs and result in higher prices for natural gas, electricity, gasoline and other energy commodities. Manufacturing output in energy-intensive sectors could drop by as much as 15.0 percent and in non-energy-intensive sectors by as much as 7.7 percent.

“The notion that some policymakers have in Washington that an economy-wide tax of this nature is a good idea is flatly wrong,” said NAM President and CEO Jay Timmons. “Our nation’s economy and family budgets can’t take it. As consumers of one-third of our nation’s energy supply, manufacturers and our employees will struggle with higher energy prices. A carbon tax will severely harm our ability to compete with other nations.” 

Other key findings of the report include the following:
  • A carbon tax would lead to lower real wage rates because companies would have higher costs and lower labor productivity. Over time, workers’ incomes could decline relative to baseline levels by as much as 8.5 percent.
  • The impact on jobs would be substantial, with a loss of worker income equivalent to between 1.3 million and 1.5 million jobs in 2013 and between 3.8 million and 21 million by 2053.
  • Any revenue raised from the carbon tax would be far outweighed by the negative effects on the economy.
  • A carbon tax would have a negative effect on consumption, investment and jobs, resulting in lower federal revenue from taxes on capital and labor.
  • The increased costs of coal, natural gas and petroleum products due to a carbon tax would ripple throughout the economy, resulting in higher production costs and less spending on non-energy goods.
“For manufacturers, a carbon tax would cause a net negative impact on output and productivity as the higher energy costs it imposes would ripple through all their supply chains,” said NERA Senior Vice President Anne E. Smith who conducted the research for the NAM. “In turn, higher production costs and reduction in output would ripple through the rest of the economy, reducing household incomes and consumption. A carbon tax would negatively impact the U.S. economy as a whole under both scenarios examined in this study.”

The study looks at two carbon tax scenarios: one levied at $20 per ton increasing at 4 percent and the other designed to reduce carbon dioxide (CO2) emissions by 80 percent. Both cases would have a negative impact on the economy. Please click on the links for the executive summary and full report and for information on 10 hard-hit states.

Study finds increased CO2 will greatly enhance productivity of world's major crops

A review paper published in Climate Research finds that a 300 ppm increase in atmospheric CO2 would enhance productivity of the world's major crops by 34%-40% via CO2 fertilization. The paper adds to many other peer-reviewed publications finding, contrary to the claims of climate alarmists, that increased CO2 will be a boon to agricultural productivity and will confer resistance to climate change on rainforests. 

From the latest NIPCC Report

Field scale influence of CO2 on the world's major crops

Reference:

Vanuytrecht, E., Raes, D., Willems, P. and Geerts, S. 2012. Quantifying field-scale effects of elevated carbon dioxide concentration on crops. Climate Research 54: 35-47.

Working with peer-reviewed publications that report the results of Free-Air CO2-Enrichment (FACE) studies - which they acquired via searches of the ISI Web of Science citation database (Thomson) and the ScienceDirect citation database (Elsevier BV) - Vanuytrecht et al. (2012) conducted a meta-analysis of 529 independent observations of various plant growth responses to elevated CO2 that they obtained from 53 papers that contained relevant data in graphical or numerical format pertaining to the following major crops: wheat (Triticum aestivum L.), barley (Hordeum vulgare L.), rice (Oryza sativa L.), soybean (Glycine max L.), potato (Solanum tuberosum L.), sugar beet (Beta vulgaris L.), cotton (Gossypium hirsutum L.), maize (Zea mays L.) and sorghum (Sorghum bicolor L.), as well as the two major pasture species of perennial ryegrass (Lolium perenne L.) and white clover (Trifolium repens L.).

Considered en masse, Vanuytrecht et al. determined that for an approximate 200-ppm increase in the air's CO2 concentration (the mean enhancement employed in the studies they analyzed), water productivity was improved by 23% in the case of above ground biomass production per unit of water lost to evapotranspiration, and by 27% in the case of above ground yield produced per unit of water lost to evapotranspiration, which two productivity increases would roughly correspond to enhancements of 34% and 40% for a 300-ppm increase in the atmosphere's CO2 concentration.
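The 34% and 40% enhancements quoted above appear to follow from a simple linear scaling of the ~200 ppm FACE responses to a 300 ppm increase; a minimal sketch of that assumed scaling:

```python
# Water-productivity gains reported for a ~200 ppm CO2 enrichment in the FACE meta-analysis
biomass_gain = 23.0   # %, above-ground biomass per unit of water lost to evapotranspiration
yield_gain   = 27.0   # %, above-ground yield per unit of water lost to evapotranspiration

# Linear scaling to a 300 ppm increase, as implied in the text above
scale = 300.0 / 200.0
print(biomass_gain * scale)   # 34.5 -> "roughly 34%"
print(yield_gain * scale)     # 40.5 -> "roughly 40%"
```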

It is also important to note in this regard that although "the FACE technique avoids the potential limitations of (semi-) closed systems by studying the influence of elevated CO2 on crop growth in the field without chamber enclosure," as the team of Belgian researchers write, other studies have demonstrated a significant problem caused by the rapid (sub-minute) fluctuations of CO2 concentration about a target mean that are common to most FACE experiments, as described by Bunce (2011, 2012), who found most recently that total shoot biomass of vegetative cotton plants in a typical FACE study averaged 30% less than in a constantly-elevated CO2 treatment at 27 days after planting, while wheat grain yields were 12% less in a fluctuating CO2 treatment compared with a constant elevated CO2 concentration treatment.

Looking toward the future, getting higher crop yields per unit of water used in the process of obtaining them will be a key element of mankind's struggle to feed our ever-increasing numbers over the next four decades, when our food needs are expected to double (Parry and Hawkesford, 2010); and with both land and water shortages looming on the horizon, we are going to need all of the help we can possibly get to grow the extra needed food. Fortunately, the results of this meta-analysis coming out of Belgium point to one important avenue by which such very substantial help can come, but it will only come if the air's CO2 content is allowed to rise unimpeded.

Additional References:

Bunce, J.A. 2011. Performance characteristics of an area distributed free air carbon dioxide enrichment (FACE) system. Agricultural and Forest Meteorology 151: 1152-1157.

Bunce, J.A. 2012. Responses of cotton and wheat photosynthesis and growth to cyclic variation in carbon dioxide concentration. Photosynthetica 50: 395-400.

Parry, M.A.J. and Hawkesford, M.J. 2010. Food security: increasing yield and improving resource use efficiency. Proceedings of the Nutrition Society 69: 592-600.

Wind power takes another big blow

A new paper demonstrates that estimates of power production from large wind farms may be overblown by "roughly a factor of four." In addition, a recent study finds that wind farms last only 50% of the lifetime claimed by the wind power industry. Taken together, the cost of wind power per unit of energy actually delivered is roughly 8 times higher than previously claimed. 
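The factor of 8 comes from compounding the two claims: cost per unit of energy delivered scales inversely with both output and lifetime. A minimal sketch, treating the two factors as independent:

```python
# Cost per unit of delivered energy scales as 1 / (power output x operating lifetime)
power_overestimate    = 4.0   # output of large wind farms overblown by "roughly a factor of four"
lifetime_overestimate = 2.0   # turbines lasting only ~50% of the claimed lifetime

cost_multiplier = power_overestimate * lifetime_overestimate
print(cost_multiplier)        # 8.0 -> delivered energy costs roughly 8x the original claim
```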

Look out! PEAK WIND is COMING, warns top Harvard physicist

Last out of the windpower future turn out the lights … Oh

The realistic limits on wind power are probably much lower than scientists have suggested, according to new research, so much so that the ability of wind turbines to have any serious impact on energy policy may well be in doubt. Even if money were no object, the human race would hit Peak Wind output at a much lower level than has previously been thought.

This new and gloomy analysis for global wind power comes from Professor David Keith of the Harvard School of Engineering and Applied Sciences. The prof and his collaborator, Professor Amanda Adams of North Carolina uni, have weighed into a row which has been taking place for some years between crusading pro-wind physicists and their critics.

The pro-wind boffins, led by such figures as Harvard enviro-prof Michael McElroy and Mark Jacobson of Stanford, have long contended that if there is any upper limit on the amount of energy that could be extracted from the Earth's winds it is well above the amount the human race requires. They further contend that extracting these vast amounts of power from the atmosphere will not have any serious impact on the world's climate.

Both these assertions, however, have been called into doubt - and the first one, that there's plenty of wind power to meet all human demands, is particularly shaky as it ignores the thorny issue of cost. McElroy, Jacobson and their allies tend to make wild assumptions - for instance that it would be feasible to distribute massive wind turbines across most or even all of the planet's surface.

Professor Keith has some scathing criticism for these ideas. To start with, he says that most large-scale wind potential calculations thus far have simply ignored the problem that the possible massive wind farms of the future are going to result in much less powerful winds for long distances behind them. He and Professor Adams write:
Estimates of global wind resource that ignore the impact of wind turbines on slowing the winds may substantially overestimate the total resource. In particular, the results from three studies that estimated wind power capacities of 56, 72 and 148 TW respectively appear to be substantial overestimates given the comparison between model results and the assumptions these studies made about power production densities ... To cite a specific example, Archer and Jacobson assumed a power production density of 4.3 W m-2 ... production densities are not likely to substantially exceed 1 W m-2 implying that Archer and Jacobson may overestimate capacity by roughly a factor of four.

Peak Wind

Keith and Adams are referring to Archer and Jacobson's paper last year, in which they suggested that a "practical" windpower system of the future - employing 4 million wind towers spread all round the world to avoid damage to the environment (!) - might yield average output of 7.5 terawatts over time.

As we pointed out at the time - we not being top physicists here at the Reg, but at least knowing what a Watt is - this is actually far less energy than the human race now requires, and wildly less than the amount of energy it would require if it were to build and maintain a colossal worldwide grid of enormous steel and carbon towers sunk into heavy concrete foundations along with the necessary associated world-spanning interconnectors, grid extensions, transport access into remote wilderness etc etc.
Harvard uni now informs us:
Keith’s research has shown that the generating capacity of very large wind power installations (larger than 100 square kilometers) may peak at between 0.5 and 1 watts per square meter.
As opposed to the 4+ watts assumed by Archer and Jacobson. In other words we'll be hitting Peak Wind a lot sooner than anyone thought. Archer and Jacobson's ridiculous unbuildable world wind project - which seemed likely to cost substantially more than the entire human race's economic output - would actually produce as little as one-eighth of what they think: and that was only a quarter of the amount of power that the human race might reasonably ask for (ie, say two-thirds of what a present-day European uses for everyone). So it would be able to provide about 3 per cent of global energy requirements, or well under a terawatt.
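The Register's numbers can be reproduced roughly as follows; the ~30 TW figure for what "the human race might reasonably ask for" is inferred from the article's statement that 7.5 TW is only a quarter of it, and is an assumption for illustration.

```python
claimed_output_tw = 7.5        # TW, Archer and Jacobson's estimated average output
assumed_density   = 4.3        # W/m2, power production density assumed by Archer and Jacobson
keith_density_low = 0.5        # W/m2, low end of Keith's estimate for very large installations

# Scale the claimed output down by the ratio of realistic to assumed production density
realistic_output_tw = claimed_output_tw * keith_density_low / assumed_density
print(round(realistic_output_tw, 2))    # ~0.87 TW, i.e. roughly one-eighth of the claim

reasonable_demand_tw = 4 * claimed_output_tw   # ~30 TW, since 7.5 TW is "only a quarter" of demand
print(round(100 * realistic_output_tw / reasonable_demand_tw, 1))   # ~2.9 -> about 3 per cent
```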
And we have to bear in mind that in the real world things are much worse still for windy dreams. Professors Keith and Adams go on:
Total wind power capacity can — of course — be very large if one assumes that turbines are placed over the entire land surface or even over the land and ocean surface, but while these geophysical limits are scientifically interesting their relevance to energy policy is unclear.

More policy-driven wind power capacity estimates have restricted the area considered ... Yet these estimates have used power production densities that are several times larger than the wind power production limit of around 1 W m-2 ... It is therefore plausible that wind power capacity may be limited to an extent that is relevant to energy policy.
It should be made clear that Professor Keith is starting from the position that global warming is still definitely on (at some point, it has been stalled for well over a decade) and that humanity must go carbon-free or anyway carbon-very-low within a lifetime, generating "several tens of terawatts" of low to nil-carbon power on that timescale. The professor is merely pointing out that wind certainly can't do anything like the whole job in that scenario, and it may not actually be able to do very much at all.
“It’s worth asking about the scalability of each potential energy source," says the prof, "whether it can supply, say, 3 terawatts, which would be 10 percent of our global energy need, or whether it’s more like 0.3 terawatts and 1 percent."

It's definitely looking as though we would be hitting Peak Wind down at the low end of that range. And with wind very much the poster child of renewable power - it is cheap, scalable and practical compared to the other methods - that would seem to be the effective end for the dream of a renewables-powered future for humanity.

Professors Keith and Adams' paper can be read in full for free courtesy of the journal Environmental Research Letters, which is distributing it under the Creative Commons Attribution 3.0 Licence.

Monday, February 25, 2013

New paper finds Antarctic meltwater was much higher in the past

A new paper published in The Holocene finds the Antarctic interior was warmer than the present over a period lasting about 2000 years, from 4300 to 2250 years ago. The paper surveys 3 lakes within 10° of the South Pole and finds meltwater levels were up to 69.5 meters higher than the present, which the authors "interpreted as being the result of an increased number of meltwater events and/or degree-days above freezing, relative to the present." The paper adds to many other peer-reviewed papers demonstrating that Antarctica has warmed for prolonged periods to temperatures higher than the present, and that there is nothing unusual, unnatural, or unprecedented in regard to present day temperatures. 

Lake highstands in the Pensacola Mountains and Shackleton Range 4300–2250 cal. yr BP: Evidence of a warm climate anomaly in the interior of Antarctica

Dominic A Hodgson1 and Michael J Bentley2
1 British Antarctic Survey, UK
2 University of Durham, UK
Corresponding author: Dominic A Hodgson, British Antarctic Survey, Natural Environment Research Council, High Cross, Madingley Road, Cambridge CB3 0ET, UK. Email: daho@bas.ac.uk

Abstract

We surveyed and dated the former shorelines of one lake in the Shackleton Range and two lakes in the Pensacola Mountains, situated inland of the Weddell Sea embayment Antarctica between 80° and 85°S. These are amongst the highest latitude lakes in the Antarctic and are located in areas where there is little or no Holocene climate and hydrological information. Surveys of the lake shorelines show that past water levels have been up to 15.7, 17.7 and 69.5 m higher than present in the three study lakes. AMS radiocarbon dating of lake-derived macrofossils showed that there was a sustained period of higher water levels from approximately 4300 and until sometime after 2250 cal. yr before the present. This is interpreted as being the result of an increased number of meltwater events and/or degree-days above freezing, relative to the present. The closest comparable ice cores from the Dominion Range in the Transantarctic Mountains (85°S, 166°E) and the Plateau Remote ice core on the continental East Antarctic Ice Sheet (84°S, 43°E) also provide some evidence of a warmer period beginning at c. 4000–3500 yr BP and ending after 2000–1500 yr BP, as does a synthesis of oxygen isotope data from five Antarctic ice cores. This suggests that the well-documented mid- to late-Holocene warm period, measured in many lake and marine sediments around the coast of Antarctica, extended into these regions of the continental interior.

Friday, February 22, 2013

New paper finds climate models exaggerate warming in Australia by 'almost double'

A paper published today in the Journal of Geophysical Research finds that climate models exaggerate the warming of SE Australia over 1960-2000, with the modeled temperature trend "almost double the observed magnitude." The paper corroborates findings of Dr. John Christy that climate models exaggerate warming on a global scale by 1.5 times, and those of Dr. Roy Spencer that the latest climate models warm the tropics 3 times too fast.

A Score Based Method for Assessing the Performance of GCMs: A Case Study of Southeastern Australia

Guobin Fu et al

Abstract
A multi-criteria score-based method is developed to assess General Circulation Model (GCM) performance at the regional scale. Application of the method assessing 25 GCM's simulations of monthly mean sea level pressure (MSLP) and air temperature, and monthly and annual rainfall over the southeastern Australia region for 1960/1–1999/2000 indicate that GCMs usually simulate monthly temperature better than monthly rainfall and mean sea level pressure. For example, the mean observed annual temperature for the study region is 16.7 °C while the median and mean values of 25 GCMs are 16.8 and 16.9 °C respectively, and 24 GCMs (except BCC:CM1) can reproduce the annual cycle of temperature accurately, with a minimum correlation coefficient of 0.99. In contrast, the mean observed annual rainfall for the study region is 502 mm, whereas the GCM values vary from 195 to 807 mm, and 12 out of 25 GCMs produce a negative correlation coefficient of monthly rainfall annual cycle. However, GCMs overestimate trend magnitude for temperature, but underestimate for rainfall. The observed annual temperature trend is +0.007 °C/yr, while both the median and mean [modeled] values are +0.013 °C/yr, which is almost double the observed magnitude. The observed annual rainfall trend is +0.62 mm/a, while the median and mean values of 25 GCMs are 0.21 and 0.36 mm/a, respectively. This demonstrates the advantages of using multi-criteria to assess GCMs performance. The method developed in this study can easily be extended to different study regions and results can be used for better informed regional climate change impact analysis.
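The "almost double" characterization in the commentary above follows directly from the trend values in this abstract; a minimal check:

```python
observed_trend = 0.007   # deg C per year, observed annual temperature trend for SE Australia
modeled_trend  = 0.013   # deg C per year, median and mean trend of the 25 GCMs

print(round(modeled_trend / observed_trend, 2))   # ~1.86, i.e. "almost double the observed magnitude"
```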

Dr. John Christy explains why climate models greatly exaggerate global warming

A guest post today at a Dutch climate blog by Dr. John Christy notes that climate "models, on average, depict the last 34 years as warming about 1.5 times what actually occurred" and that the model predictions diverge from observations by even more [2.5 times] in the atmospheric layers most affected by greenhouse gases [the missing 'hot spot']. According to Dr. Christy, "Since this increased warming in the upper layers is a signature of greenhouse gas forcing in models, and it is not observed, this raises questions about the ability of models to represent the true vertical heat flux processes of the atmosphere and thus to represent the climate impact of the extra greenhouse gases we are putting into the atmosphere" and "models, on average, have been overly sensitive" to the effect of greenhouse gases.

Related from Dr. Roy Spencer:  Latest Climate Models Warm 3x Too fast
Excerpts:
CMIP5 models versus observations
Of equal importance here are the magnitudes of the actual trends of the surface and troposphere. The average global surface trend for 90 model simulations for 1979-2012 (Coupled Model Intercomparison Project 5 or CMIP-5 used for IPCC AR5) is +0.232 °C/decade. The average of the observations is +0.157 °C/decade. Therefore models, on average, depict the last 34 years as warming about 1.5 times what actually occurred. Santer et al. 2012 (for 1979-2011 model output) noted that a subset of CMIP-5 models produce warming in LT that is 1.9 times observed, and for a deeper layer of the atmosphere (mid-troposphere, surface to about 18 km) the models warm the air 2.5 times that of observations. These are significant differences, implying the climate sensitivity of models is too high.
Signature
All of the above addresses the two issues mentioned at the beginning.  First, global climate models on average depict a relationship between the surface and upper air that is different than that observed, i.e. models depict an amplifying factor into the upper air that is greater than observed.  Secondly, the average climate model depicts the warming rate since 1979 as much higher than observed with increasing discrepancies as the altitude increases (which is consistent with the first issue).
Since this increased warming in the upper layers is a signature of greenhouse gas forcing in models, and it is not observed, this raises questions about the ability of models to represent the true vertical heat flux processes of the atmosphere and thus to represent the climate impact of the extra greenhouse gases we are putting into the atmosphere. It is not hard to imagine that as the atmosphere is warmed by whatever means (i.e. extra greenhouse gases) that existing processes which naturally expel heat from the Earth (i.e. negative feedbacks) can be more vigorously engaged and counteract the direct warming of the forcing. This result is related to the idea of climate sensitivity, i.e. how sensitive is the surface temperature to higher greenhouse forcing, for which several recent publications suggest models, on average, have been overly sensitive.
References:

Christy, J.R., B. Herman, R. Pielke Sr., P. Klotzbach, R.T. McNider, J.J. Hnilo, R.W. Spencer, T. Chase and D. Douglass, 2010: What do observational datasets say about modeled tropospheric temperature trends since 1979? Remote Sens., 2, 2138-2169. doi:10.3390/rs2092148.
Douglass, D.H., J.R. Christy, B.D. Pearson and S.F. Singer, 2007: A comparison of tropical temperature trends with model predictions. International Journal of Climatology, 5 Dec 2007. doi:10.1002/joc.1651.
Klotzbach, P.J., R.A. Pielke Sr., R.A. Pielke Jr., J.R. Christy, and R.T. McNider, 2009: An alternative explanation for differential temperature trends at the surface and in the lower troposphere. J. Geophys. Res., 114, D21102, doi:10.1029/2009JD011841. http://pielkeclimatesci.wordpress.com/files/2009/11/r-345.pdf
Klotzbach, P.J., R.A. Pielke Sr., R.A. Pielke Jr., J.R. Christy, and R.T. McNider, 2010: Correction to "An alternative explanation for differential temperature trends at the surface and in the lower troposphere, J. Geophys. Res., 114, D21102, doi:10.1029/2009JD011841". J. Geophys. Res., 115, D1, doi:10.1029/2009JD013655.
Santer, B. and 26 others, 2012: Identifying human influence on atmospheric temperatures. Proceedings of the National Academy of Sciences. doi:10.1073/pnas.1210514109.