May 15, 2013 — Chances are you know how many miles your car logs for each gallon or tankful of gas, but you probably have only a foggy idea of how much energy your house consumes, even though home energy expenditures often account for a larger share of the household budget.
This disparity in useful energy data is just one of several information gaps that must be bridged as the United States transitions toward residences that generate as much energy as they use over the course of a year — so-called net-zero houses.
Gaps — and strategies to overcome them — are summarized in Strategies to Achieve Net-Zero Energy Homes: A Framework for Future Guidelines, a new publication* from the National Institute of Standards and Technology (NIST) based on the discussions at a 2011 workshop convened by the agency.
One such strategy, proffered by experts who attended the workshop, is to require that energy costs be listed in all real-estate transactions.
“This means incorporating energy in the appraisal process, and the valuation of principal, interest, taxes, and insurance (PITI), so that it incorporates energy cost considerations to become the valuation of principal, interest, taxes, insurance, and energy cost considerations (PITIE),” the report says.
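The PITIE idea lends itself to a back-of-the-envelope comparison. Here is a minimal sketch with hypothetical monthly figures (none of these numbers come from the NIST report):

```python
# Hypothetical comparison of two homes on PITIE (principal, interest,
# taxes, insurance, and energy) rather than PITI alone.

def pitie(principal_interest, taxes, insurance, energy):
    """Total monthly housing cost, including energy."""
    return principal_interest + taxes + insurance + energy

# A conventional home: lower mortgage payment, higher energy bills.
conventional = pitie(principal_interest=1400, taxes=250, insurance=100, energy=220)
# A net-zero home: higher mortgage payment, near-zero net energy cost.
net_zero = pitie(principal_interest=1550, taxes=250, insurance=100, energy=10)

print(conventional)  # 1970
print(net_zero)      # 1910
```

On a PITI basis the conventional home looks cheaper; folding energy into the valuation reverses the ranking, which is exactly the information gap the workshop participants wanted closed.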
The report breaks out three categories of challenges: design, technology and equipment, and the needs and behaviors of homeowners and the building industry.
With regard to design, one workshop recommendation is to establish a scoring system for new and used homes so that prospective buyers can “compare energy, durability, indoor air quality, accessibility, and other factors relative to their needs.”
In net-zero energy homes, energy loads will be substantially lower than what current heating and cooling equipment is built to deliver and what existing product performance standards are designed to test. According to the report, manufacturers will need new guidelines and underlying data to help them size their equipment offerings appropriately and align performance with the conditions and requirements of net-zero energy homes.
The behaviors and requirements of homeowners and builders may provide the most complex set of challenges. One clear need, the report says, is to help designers, builders and occupants understand how best to collect and analyze home energy data.
“Consumers require information that is useful, timely and understandable to be able to make the energy purchase and consumption decisions necessary to achieve net-zero energy for new and existing homes,” the report says.
According to Bloomberg, Warren Buffet’s MidAmerican Energy Holdings Co. is gearing up to drop $1.9 billion on new wind farms in Iowa. The investment might build as many as 656 new turbines by 2015, which would add as much as 1,050 megawatts of wind power capacity to the 2,285 megawatts the company already operates in the state.
The project could also herald a revival in American wind power in general. The anticipated expiration of the production tax credit for wind energy drove a spike in installations in 2012 and a lull heading into 2013; now that the fiscal cliff deal has extended the credit for another year, a ramp-up is anticipated.
And because the new extension merely requires projects to start construction by the end of the year to qualify — projects previously had to actually come online by the end of the year to benefit from the credit — GE now expects the full force of the revival to hit in 2014:
Wind-farm developers including NextEra Energy Inc. (NEE) and Invenergy LLC may install 3,000 megawatts to 4,000 megawatts of turbines in the U.S. this year and as much as 7,000 megawatts next year, Anne McEntee, GE’s vice president of renewable energy, said today in an interview.
The U.S. added a record 13,124 megawatts of turbines last year, outpacing natural gas installations for the first time, as wind developers raced to complete projects ahead of the Dec. 31 expiration of the production tax credit. Denmark’s Vestas Wind Systems A/S (VWS) and Spain’s Gamesa Corp Tecnologica SA (GAM) also expect new orders to pick up by the third quarter.…
GE has received orders this year for more than 1,000 megawatts of wind turbines, including one from NextEra for 100.3 megawatts announced today for a Michigan wind farm and Invenergy’s 215-megawatt deal announced last week for a project in Texas.
Also coming down the pike for wind power is the new version of GE’s Brilliant — a 2.5 megawatt wind turbine, featuring new smart systems and accompanying storage capacity. With both its own sensors and access to the internet, the Brilliant can take in weather forecast data, grid system information, and supply and demand patterns, and use all that to adjust everything from electronics operations to its blade positions. Combined with a taller tower and an increase in rotor diameter to 120 meters, these changes boost the new Brilliant’s efficiency by 25 percent over the last model.
The batteries will boast 50 kilowatt-hours of storage a pop, and be hooked up to the turbines from a nearby ground pad. The batteries will store up excess power generated when the wind is blowing the strongest and the turbines are operating at peak capacity, then distribute the power during off hours. This smooths out the power supply from the wind farms, thus avoiding a lot of the disruptions and reliability issues that came along with the fact that the wind does not always cooperate with the needs of us humans.
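The smoothing rule described above can be sketched as a toy dispatch loop. The hourly profile, battery size, and target output below are hypothetical, not GE specifications:

```python
# Toy model: charge the battery when wind output exceeds a target level,
# discharge it when output falls short, within the battery's capacity.

def smooth(output_kw, capacity_kwh, target_kw):
    """Return the delivered hourly output after battery smoothing."""
    stored = 0.0
    delivered = []
    for gen in output_kw:  # one reading per hour
        if gen > target_kw:                        # surplus: charge
            charge = min(gen - target_kw, capacity_kwh - stored)
            stored += charge
            delivered.append(gen - charge)
        else:                                      # shortfall: discharge
            discharge = min(target_kw - gen, stored)
            stored -= discharge
            delivered.append(gen + discharge)
    return delivered

wind = [80, 120, 30, 100, 10]  # gusty raw output in kW, hour by hour
print(smooth(wind, capacity_kwh=50, target_kw=70))  # [70, 80, 70, 70, 50]
```

The delivered series swings far less than the raw one, which is the reliability benefit the storage is meant to buy.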
All told, this would continue the roll wind power has already been on in the United States: 2012 saw the installation of wind capacity outpace all other forms of energy production, and the U.S. and China led the boom in global installations that same year.
May 9, 2013 — Advanced biofuels — liquid fuels synthesized from the sugars in cellulosic biomass — offer a clean, green and renewable alternative to gasoline, diesel and jet fuels. Bringing the costs of producing these advanced biofuels down to competitive levels with petrofuels, however, is a major challenge. Researchers at the U.S. Department of Energy (DOE)’s Joint BioEnergy Institute (JBEI), a bioenergy research center led by Berkeley Lab, have taken another step towards meeting this challenge with the development of a new technique for pre-treating cellulosic biomass with ionic liquids — salts that are liquids rather than crystals at room temperature. This new technique requires none of the expensive enzymes used in previous ionic liquid pretreatments, and makes it easier to recover fuel sugars and recycle the ionic liquid.
“Most of our ionic liquid efforts at JBEI have focused on using enzymes to liberate fermentable sugars from lignocellulosic biomass after pretreatment, but with this new enzyme-free approach we use an acid as the catalyst for hydrolyzing biomass polysaccharides into a solution containing fermentable sugars,” says Blake Simmons, a chemical engineer who heads JBEI’s Deconstruction Division and was the leader of this research. “We’re then able to separate the pretreatment solution into two phases, a sugar-rich water phase for recovery and a lignin-rich ionic liquid phase for recycling. As an added bonus, our new pretreatment technique uses a lot less water than previous pretreatments.”
Simmons is the corresponding author of a paper describing this research that has been published in the journal Biotechnology for Biofuels.
With the burning of fossil fuels continuing to add 9 billion metric tons of excess carbon dioxide to the atmosphere each year, the need for carbon neutral, cost-competitive renewable alternative fuels has never been greater. Advanced biofuels, produced from the microbial fermentation of sugars in lignocellulosic biomass, could displace gasoline, diesel and jet fuel on a gallon-for-gallon basis and be directly dropped into today’s engines and infrastructures without impacting performance. If done correctly, the use of advanced biofuels would not add excess carbon to the atmosphere.
Environmentally benign ionic liquids are used as green chemistry substitutes for volatile organic solvents. While showing great potential as a biomass pretreatment for dissolving lignocellulose and helping to hydrolyze the resulting aqueous solution into fuel sugars, the best of these ionic liquids so far have required the use of expensive enzymes. Recent studies have shown that Brønsted acid catalysts, such as hydrochloric acid, can effectively replace enzyme-based hydrolysis, but the subsequent separation of sugars and ionic liquids becomes a difficult and expensive problem that can require the use of significant amounts of water.
Guided by molecular dynamics simulations carried out at DOE’s National Energy Research Scientific Computing Center (NERSC), Simmons and his colleagues at JBEI solved this problem by deploying the ionic liquid imidazolium chloride in tandem with an acid catalyst.
“Imidazolium is the most effective known ionic liquid for breaking down lignocellulose and the chloride anion is amenable with the acid catalyst,” Simmons says. “The combination makes it easy to extract fermentable sugars that have been liberated from biomass and also easy to recover the ionic liquid for recycling. By eliminating the need for enzymes and decreasing the water consumption requirements of more traditional ionic liquid pretreatments we should be able to reduce the costs of sugar production from lignocellulose.”
Complete separation of the pretreatment solution into sugar-rich water and lignin-rich ionic liquid phases was attained with the addition to the solution of sodium hydroxide. The optimized sodium hydroxide concentration for both phase separation and sugar extraction was 15 percent, resulting in the recovery of maximum yields of 54 percent for glucose and 88 percent for xylose. The JBEI researchers believe these sugar yields can be increased by optimizing the process conditions and using more advanced methods of phase separation and sugar recovery.
“After optimizing the process conditions, our next step will be to scale the process up to 100 liters,” Simmons says. “For that work we will use the facilities at the Advanced Biofuels Process Demonstration Unit.”
This research was supported by the DOE Office of Science, which also supports NERSC.
Ning Sun, Hanbin Liu, Noppadon Sathitsuksanoh, Vitalie Stavila, Manali Sawant, Anaise Bonito, Kim Tran, Anthe George, Kenneth L Sale, Seema Singh, Blake A Simmons, Bradley M Holmes. Production and extraction of sugars from switchgrass hydrolyzed in ionic liquids. Biotechnology for Biofuels, 2013; 6 (1): 39 DOI: 10.1186/1754-6834-6-39
Ning Sun of the Joint BioEnergy Institute was lead author on a paper describing an enzyme-free ionic liquid pretreatment of biomass that can help boost the production of advanced biofuels. (Credit: Photo by Roy Kaltschmidt)
Wind power had a banner year in 2012, accounting for more new generating capacity than any other resource. Despite the boom in cheap natural gas, 42 percent of all new capacity last year was actually from wind, which clocked in at 13,131 MW in new installations for the year.
The economics for wind power have only gotten more compelling, helped in no small part by the Production Tax Credit and the Investment Tax Credit creating strong investment incentives. This boom in wind power has begun to transform electricity markets across the country, creating significant net benefits for consumers and providing low carbon power to homes nationwide.
A new report from the consulting firm Synapse and Americans for a Clean Energy Grid found that increases in wind power in the PJM Interconnection could save consumers $6.9 billion per year out to 2026, along with 14 percent reductions in CO2.
The PJM Interconnection is the world’s largest competitive wholesale electricity market, serving 60 million customers across 13 states and the District of Columbia. Currently, wind provides about 1.5 percent of the electricity to PJM’s customers, and accounts for 3.4 percent of installed capacity. Based just on the Renewable Portfolio Standards in states within the PJM region, renewables must provide 14 percent of all electricity by 2026. It is likely that about 11 percent of this will come from new wind generation.
The report from Synapse uses the projections from meeting Renewable Portfolio Standards as a reference case, and then takes a look at the effects of doubling the amount of wind power required by statute in the PJM region out to 2026. By modeling the production costs and capital investments, the authors can ascertain what the difference is between these two cases for revenue requirements (and therefore, the impact on ratepayers). The big difference between these two cases is that in the reference scenario, natural gas composes the majority of new generation beyond what the state Renewable Portfolio Standard requires—and wind takes that market share in the other. This illustrates the shortsightedness of the “cheap natural gas” narrative that has become conventional wisdom in mainstream reporting. The cheaper bet, actually, is increasing deployment of wind.
The report finds that doubling capacity of installed wind, from 32.1 GW in the base case to 65.4 GW in the wind case, would create net savings of $6.9 billion per year. This figure reflects $14.5 billion in production cost savings, offset by $7.6 billion in additional capital investment requirements. Remember that the capital investment numbers here represent the difference between what would be spent in the reference case ($17.4 billion) and what would be spent in the wind case ($25 billion).
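The arithmetic behind that headline figure is simple enough to lay out explicitly, using the report's own numbers:

```python
# Net savings = production cost savings minus the extra capital
# the high-wind case requires over the reference case ($ billions/year).

production_cost_savings = 14.5
capital_reference = 17.4   # reference-case capital investment
capital_wind = 25.0        # doubled-wind-case capital investment

incremental_capital = round(capital_wind - capital_reference, 1)
net_savings = round(production_cost_savings - incremental_capital, 1)

print(incremental_capital)  # 7.6
print(net_savings)          # 6.9
```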
The other benefit of more wind in PJM is lower wholesale prices. The report finds that the load-weighted average annual price for power drops from $80.27 per megawatt hour to $78.53 per megawatt hour.
The report notes that “the price differences are the greatest in non-summer months, when wind output is the highest, load is the lowest, and supply margins are the greatest.” These price differences are a function of how the electricity market works: the clearing price for all resources is set by the last marginal unit needed to meet load. More wind power in the system, whose bids are close to zero because wind has no fuel costs and low operational expenses, edges out the use of peaker plants. Since these peaker plants are often much more expensive, the net effect of more wind power in the market is a lower clearing price, saving consumers money while still providing the same electricity.
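That merit-order mechanism can be shown with a minimal sketch. The bids and capacities below are invented for illustration, not drawn from PJM data:

```python
# Merit-order dispatch: stack bids cheapest-first until load is met;
# the last unit dispatched sets the clearing price for everyone.

def clearing_price(bids, load_mw):
    """bids: list of (price_per_mwh, capacity_mw) pairs."""
    met = 0.0
    for price, capacity in sorted(bids):
        met += capacity
        if met >= load_mw:
            return price
    raise ValueError("insufficient capacity to meet load")

load = 1000
without_wind = [(20, 400), (35, 400), (90, 300)]  # baseload, mid-merit, peaker
with_wind = [(0, 300)] + without_wind             # near-zero-cost wind added

print(clearing_price(without_wind, load))  # 90 (the peaker sets the price)
print(clearing_price(with_wind, load))     # 35 (wind pushes the peaker out)
```

The same load is served in both cases; wind simply displaces the most expensive marginal unit, and every megawatt-hour clears at the lower price.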
While net prices are lower, the modeling shows that market prices are higher in the summer months due to variability of wind output. These higher prices are reflective of the fact that more peaking fossil fuel resources are used in those times when wind power isn’t as available, or when load is higher and more resources are required to meet demand. This dynamic could be altered by new energy storage technology, higher utilization of demand response, and increasing efficiency standards to drive demand reductions during peak hours.
On top of all that, the high wind case means that CO2 emissions will go down significantly from 3.2 million tons to 2.6 million tons. This is the equivalent of taking a million cars off the road, or forgoing 610 million gallons of gasoline.
The bottom line is that going beyond the Renewable Portfolio Standards in the PJM region would lower electricity prices, save ratepayers money, and lower emissions to help fight climate change.
When winds were at their strongest in California this month, wind turbines were providing the state with nearly twice as much electricity as nuclear reactors.
The Golden State saw a surge in new wind farms last year, taking its wind power capacity to 5,544 megawatts. That put it second in the nation behind Texas, which has more than 12,000 MW of installed wind capacity.
California also ranks second in the U.S. in the amount of employment associated with the wind industry, with more than 7,000 jobs, the [American Wind Energy Association] said.
Nationally, wind energy production grew 28% in the U.S. last year in what AWEA describes as the industry’s best year to date.
“We had an incredibly productive year in 2012,” said Rob Gramlich, interim chief executive of AWEA. “It really showed what this industry can do and the impact we can have with a continued national commitment to renewable energy.”
The wind isn’t blowing everywhere all the time, so actual electricity production from wind turbines is never as high as total capacity. But storms earlier this month pushed wind power generation in California above 4,000 MW. From Greentech Media:
Winds that reached over 90 miles per hour on mountain ridges blew down through the wind farms in California’s Altamont, San Gorgonio, and Tehachapi Passes and across the state’s wind installations, raising their outputs to a record-shattering 4,196 megawatts on [the evening of April 7], according to the California Independent System Operator …
Peak wind output came at 6:44 PM. Total system generation was 23,923 megawatts at the time, making wind 17.5 percent of the state’s electricity supply.
The total system peak output was 27,426 megawatts at 4:07 p.m. that afternoon. In the hour before that, with the total system producing 23,145 megawatts, California got 6,677 megawatts of its electricity, or 28.8 percent, from renewables.
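The quoted percentages check out against the reported megawatt figures:

```python
# Wind and renewables as shares of total system generation (MW).
wind_share = round(4196 / 23923 * 100, 1)
renewables_share = round(6677 / 23145 * 100, 1)
print(wind_share)        # 17.5
print(renewables_share)  # 28.8
```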
By comparison, the state has two nuclear power plants. Diablo Canyon’s twin reactors are capable of producing up to 2,200 MW of power. San Onofre hasn’t generated any electricity since January 2012, when radiation leaked into the ocean from damaged tubes, although regulators are considering allowing operations to resume soon at reduced capacity.
Air pollution, one of the “greatest hazards to human health,” is killing millions while causing climate change
- Andrea Germanos, staff writer
The UN has warned that air pollution is one of “the greatest hazards to human health.” (UN Photo/Kibae Park)

A global shift to clean energy could save millions of lives while also helping to rein in runaway greenhouse gases, which have set the planet on a path of disastrous warming, UN officials stated Tuesday.
The shift would bring about a dramatic reduction in air pollution, which the United Nations World Health Organization (WHO) warned was one of “the greatest hazards to human health.”
The UN pointed to the burning of fossil fuels—including oil and natural gas production and diesel engine exhaust—as culprits in outdoor air pollution, which Dr. Maria Neira, the WHO’s Director of Public Health and Environment, told a meeting of the UN Environment Programme’s (UNEP) Climate and Clean Air Coalition (CCAC) was responsible for “3.3 million deaths every year.”
“If we increase access to clean energy … the health benefits will be enormous,” Neira told Reuters, and warned that the number of deaths from air pollution would rise with continued dependence on fossil fuels.
In addition to the pollution outdoors, indoor air pollution was cited for its deadly effects.
Neira said “estimations we have now tell us there are 3.5 million premature deaths every year caused by household air pollution.”
Inefficient cook stoves are a major source of indoor air pollution, which the UN said can release “carbon monoxide and other pollutants at levels up to 100 times higher than the recommended limits set by WHO.”
The dangers of air pollution, indoor and out, are truly deadly and far worse than previously thought, the UN said.
Kandeh Yumkella, director general of the U.N. Industrial Development Organization, gave this sobering assessment: “Air pollution is causing more deaths than HIV or malaria combined.”
Solar power and other distributed renewable energy technologies could lay waste to U.S. power utilities and burn the utility business model, which has remained virtually unchanged for a century, to the ground.
That is not wild-eyed hippie talk. It is the assessment of the utilities themselves.
Back in January, the Edison Electric Institute — the (typically stodgy and backward-looking) trade group of U.S. investor-owned utilities — released a report [PDF] that, as far as I can tell, went almost entirely without notice in the press. That’s a shame. It is one of the most prescient and brutally frank things I’ve ever read about the power sector. It is a rare thing to hear an industry tell the tale of its own incipient obsolescence.
I’ve been thinking about how to convey to you, normal people with healthy social lives and no time to ponder the byzantine nature of the power industry, just what a big deal the coming changes are. They are nothing short of revolutionary … but rather difficult to explain without jargon.
So, just a bit of background. You probably know that electricity is provided by utilities. Some utilities both generate electricity at power plants and provide it to customers over power lines. They are “regulated monopolies,” which means they have sole responsibility for providing power in their service areas. Some utilities have gone through deregulation; in that case, power generation is split off into its own business, while the utility’s job is to purchase power on competitive markets and provide it to customers over the grid it manages.
This complexity makes it difficult to generalize about utilities … or to discuss them without putting people to sleep. But the main thing to know is that the utility business model relies on selling power. That’s how they make their money. Here’s how it works: A utility makes a case to a public utility commission (PUC), saying “we will need to satisfy this level of demand from consumers, which means we’ll need to generate (or purchase) this much power, which means we’ll need to charge these rates.” If the PUC finds the case persuasive, it approves the rates and guarantees the utility a reasonable return on its investments in power and grid upkeep.
Thrilling, I know. The thing to remember is that it is in a utility’s financial interest to generate (or buy) and deliver as much power as possible. The higher the demand, the higher the investments, the higher the utility shareholder profits. In short, all things being equal, utilities want to sell more power. (All things are occasionally not equal, but we’ll leave those complications aside for now.)
Now, into this cozy business model enters cheap distributed solar PV, which eats away at it like acid.
First, the power generated by solar panels on residential or commercial roofs is not utility-owned or utility-purchased. From the utility’s point of view, every kilowatt-hour of rooftop solar looks like a kilowatt-hour of reduced demand for the utility’s product. Not something any business enjoys. (This is the same reason utilities are instinctively hostile to energy efficiency and demand response programs, and why they must be compelled by regulations or subsidies to create them. Utilities don’t like reduced demand!)
It’s worse than that, though. Solar power peaks at midday, which means it is strongest close to the point of highest electricity use — “peak load.” Problem is, providing power to meet peak load is where utilities make a huge chunk of their money. Peak power is the most expensive power. So when solar panels provide peak power, they aren’t just reducing demand, they’re reducing demand for the utilities’ most valuable product.
But wait. Renewables are limited by the fact they are intermittent, right? “The sun doesn’t always shine,” etc. Customers will still have to rely on grid power for the most part. Right?
This is a widely held article of faith, but EEI (of all places!) puts it to rest. (In this and all quotes that follow, “DER” means distributed energy resources, which for the most part means solar PV.)
Due to the variable nature of renewable DER, there is a perception that customers will always need to remain on the grid. While we would expect customers to remain on the grid until a fully viable and economic distributed non-variable resource is available, one can imagine a day when battery storage technology or micro turbines could allow customers to be electric grid independent. To put this into perspective, who would have believed 10 years ago that traditional wire line telephone customers could economically “cut the cord?” [Emphasis mine.]
Indeed! Just the other day, Duke Energy CEO Jim Rogers said, “If the cost of solar panels keeps coming down, installation costs come down and if they combine solar with battery technology and a power management system, then we have someone just using [the grid] for backup.” What happens if a whole bunch of customers start generating their own power and using the grid merely as backup? The EEI report warns of “irreparable damages to revenues and growth prospects” of utilities.
Utility investors are accustomed to large, long-term, reliable investments with a 30-year cost recovery — fossil fuel plants, basically. The cost of those investments, along with investments in grid maintenance and reliability, are spread by utilities across all ratepayers in a service area. What happens if a bunch of those ratepayers start reducing their demand or opting out of the grid entirely? Well, the same investments must now be spread over a smaller group of ratepayers. In other words: higher rates for those who haven’t switched to solar.
That’s how it starts. These two paragraphs from the EEI report are a remarkable description of the path to obsolescence faced by the industry:
The financial implications of these threats are fairly evident. Start with the increased cost of supporting a network capable of managing and integrating distributed generation sources. Next, under most rate structures, add the decline in revenues attributed to revenues lost from sales foregone. These forces lead to increased revenues required from remaining customers … and sought through rate increases. The result of higher electricity prices and competitive threats will encourage a higher rate of DER additions, or will promote greater use of efficiency or demand-side solutions.
Increased uncertainty and risk will not be welcomed by investors, who will seek a higher return on investment and force defensive-minded investors to reduce exposure to the sector. These competitive and financial risks would likely erode credit quality. The decline in credit quality will lead to a higher cost of capital, putting further pressure on customer rates. Ultimately, capital availability will be reduced, and this will affect future investment plans. The cycle of decline has been previously witnessed in technology-disrupted sectors (such as telecommunications) and other deregulated industries (airlines).
Did you follow that? As ratepayers opt for solar panels (and other distributed energy resources like micro-turbines, batteries, smart appliances, etc.), it raises costs on other ratepayers and hurts the utility’s credit rating. As rates rise on other ratepayers, the attractiveness of solar increases, so more opt for it. Thus costs on remaining ratepayers are even further increased, the utility’s credit even further damaged. It’s a vicious, self-reinforcing cycle.
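The feedback loop can be made concrete with a toy model. Every number here is hypothetical; it illustrates the shape of the dynamic EEI describes, not its magnitude:

```python
# Fixed grid costs spread over a shrinking customer base raise per-customer
# rates; rising rates, in turn, drive more customers to defect to solar.

fixed_costs = 1_000_000_000   # $/year recovered through rates
customers = 1_000_000
sensitivity = 0.05            # hypothetical defection response to rates

base_rate = fixed_costs / customers
defections = []
for year in range(1, 6):
    rate = fixed_costs / customers
    # defections accelerate as rates climb relative to the starting point
    leaving = int(customers * sensitivity * (rate / base_rate) ** 2)
    customers -= leaving
    defections.append(leaving)
    print(f"year {year}: ${rate:,.0f} per customer, {leaving:,} defect")
```

Each year's defections exceed the last even as the customer base shrinks: the self-reinforcing spiral in miniature.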
One implication of all this — a poorly understood implication — is that rooftop solar fucks up the utility model even at relatively low penetrations, because it goes straight at utilities’ main profit centers. (It’s already happening in Germany.) Right now, distributed solar PV is a relatively tiny slice of U.S. electricity, less than 1 percent. For that reason, utility investors aren’t paying much attention. “Despite the risks that a rapidly growing level of DER penetration and other disruptive challenges may impose,” EEI writes, “they are not currently being discussed by the investment community and factored into the valuation calculus reflected in the capital markets.” But that 1 percent is concentrated in a small handful of utility districts, so trouble, at least for that first set of utilities, is just over the horizon. Utility investors are sleepwalking into a maelstrom.
(“Despite all the talk about investors assessing the future in their investment evaluations,” the report notes dryly, “it is often not until revenue declines are reported that investors realize that the viability of the business is in question.” In other words, investors aren’t that smart and rational financial markets are a myth.)
Bloomberg Energy Finance forecasts 22 percent compound annual growth in all solar PV, which means that by 2020 distributed solar (which will account for about 15 percent of total PV) could reach up to 10 percent of load in certain areas. If that happens, well:
Assuming a decline in load, and possibly customers served, of 10 percent due to DER with full subsidization of DER participants, the average impact on base electricity prices for non-DER participants will be a 20 percent or more increase in rates, and the ongoing rate of growth in electricity prices will double for non-DER participants (before accounting for the impact of the increased cost of serving distributed resources).
So rates would rise by 20 percent for those without solar panels. Can you imagine the political shitstorm that would create? (There are reasons to think EEI is exaggerating this effect, but we’ll get into that in the next post.)
If nothing is done to check these trends, the U.S. electric utility as we know it could be utterly upended. The report compares utilities’ possible future to the experience of the airlines during deregulation or to the big monopoly phone companies when faced with upstart cellular technologies. In case the point wasn’t made, the report also analogizes utilities to the U.S. Postal Service, Kodak, and RIM, the maker of Blackberry devices. These are not meant to be flattering comparisons.
Remember, too, that these utilities are not Google or Facebook. They are not accustomed to a state of constant market turmoil and reinvention. This is a venerable old boys network, working very comfortably within a business model that has been around, virtually unchanged, for a century. A friggin’ century, more or less without innovation, and now they’re supposed to scramble and be all hip and new-age? Unlikely.
So what’s to be done? You won’t be surprised to hear that EEI’s prescription is mainly focused on preserving utilities and their familiar business model. But is that the best thing for electricity consumers? Is that the best thing for the climate?
America could be powered almost entirely with wind turbines and solar systems by 2030 at a cost comparable to what we’re spending for dirty power today, a new study finds. The necessary approach would surprise most people, and it would generate enough economic activity to make any capitalist drool: Build, build, build … and then build some more.
The analysis … challenges the common notion that wind and solar power need to be paired with fossil fuel or nuclear generators, so utilities can meet electricity demand when it’s not windy or sunny.
The paper instead proposes building out a “seemingly excessive” amount of wind and solar generation capacity — two to three times the grid’s actual peak load. By spreading that generation across a wide enough geographic area, Rust Belt utilities could get virtually all of their electricity from renewables in 2030, at a cost comparable to today’s prices, it says.
At 2030 technology costs and with excess electricity displacing natural gas, we find that the electric system can be powered 90%–99.9% of hours entirely on renewable electricity, at costs comparable to today’s—but only if we optimize the mix of generation and storage technologies. …
We find that 90% of hours are covered most cost-effectively by a system that generates from renewables 180% the electrical energy needed by load, and 99.9% of hours are covered by generating almost 290% of need. Only [9 to 72 hours] of storage were required to cover 99.9% of hours of load over four years. So much excess generation of renewables is a new idea, but it is not problematic or inefficient, any more than it is problematic to build a thermal power plant requiring fuel input at 250% of the electrical output, as we do today.
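The overbuilding logic in the quoted passage can be illustrated with a crude hourly simulation. The availability profile below is synthetic (a made-up solar-and-wind-like waveform), so the coverage numbers are illustrative only:

```python
import math

# Fraction of hours covered directly (no storage) as build-out grows.
hours = 8760
load = 1.0  # flat normalized load
# Synthetic availability: a daily solar-like cycle modulated by slower swings.
availability = [
    max(0.0, 0.5 * math.sin(2 * math.pi * h / 24) + 0.5)
    * (0.6 + 0.4 * math.sin(2 * math.pi * h / 300))
    for h in range(hours)
]

coverage = {}
for overbuild in (1.0, 1.8, 2.9):  # 100%, 180%, 290% of load
    covered = sum(1 for a in availability if a * overbuild >= load)
    coverage[overbuild] = covered / hours
    print(f"{overbuild:.1f}x build-out covers {coverage[overbuild]:.0%} of hours directly")
```

Even in this crude toy, each increment of overbuilding covers more hours without any storage at all; storage then only has to fill the remaining gaps, which is the paper's central point.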
The findings support a growing awareness of the potential for renewable energy to power America — and a rejection of doomsayers and fossil fuel executives who say we must keep propping ourselves up with coal, natural gas, and oil.
So keep those wind and solar farms coming, America. And throw in a few batteries too.
The Republican minority in the Senate loves to obstruct confirmation of President Obama’s Cabinet nominees, but it isn’t saying boo about the man who appears set to become the nation’s next energy secretary.
President Obama’s pick to become the nation’s next secretary of energy is drawing criticism for his deep ties to the fossil fuel, fracking and nuclear industries. MIT nuclear physicist Ernest Moniz has served on advisory boards for oil giant BP and General Electric, and was a trustee of the King Abdullah Petroleum Studies and Research Center, a Saudi Aramco-backed nonprofit organization.
At the same time, Moniz has stressed the importance of moving away from coal and has promoted and called for more funding for renewable energy and energy efficiency. That’s earned him praise from the Natural Resources Defense Council. But other environmental and watchdog groups are campaigning against his nomination because of his industry ties.
[B]eyond his job in academia, Moniz has also spent the last decade serving on a range of boards and advisory councils for energy industry heavyweights, including some that do business with the Department of Energy. That includes a six-year paid stint on BP’s Technology Advisory Council as well as similar positions at a uranium enrichment company and a pair of energy investment firms.
Such industry ties aren’t uncommon for cabinet nominees, and Obama specifically praised Moniz for understanding both environmental and economic issues.
Still, Moniz’s work for energy companies since he served in President Clinton’s Energy Department has irked some environmentalists.
“His connections to the fossil fuel and nuclear power industries threaten to undermine the focus we need to see on renewables and energy efficiency,” said Tyson Slocum, director of the energy program at the consumer advocacy group Public Citizen.
Slocum pointed out that Moniz, if confirmed, will set research and investment priorities, including at the department’s network of national laboratories.
The Energy Department hands out billions of dollars in contracts and loan guarantees as it pushes energy research and development and administers the nation’s nuclear weapons stockpile and cleanup efforts.
Moniz is also coming under criticism for a big report on natural gas released by the MIT Energy Initiative in 2011. It called the environmental impacts of fracking “challenging but manageable,” endorsed natural-gas exports, and talked up gas as a “bridge fuel” that could help the country move away from dirtier fossil fuels and toward clean energy (a controversial notion).
MITEI and the study’s authors presented the study as independent, but did not disclose its authors’ significant financial ties to the oil and gas industry. … The MIT report … failed to disclose that a study co-chair, Anthony Meggs, had joined gas company Talisman Energy prior to the release of the study. Another study group member, John Deutch, has served on the board of the LNG company Cheniere Energy since 2006 and owns $1.4 million in Cheniere stock.
The PAI study also notes that the MIT study was funded by oil and gas industry sources, including a foundation closely linked to Chesapeake Energy. … The study was also advised by a committee dominated by oil and gas insiders.
The Senate Energy and Natural Resources Committee will hold a hearing on Moniz’s nomination on April 9. He’ll have to release a financial disclosure form by then, so we’ll soon learn more about how much money he’s made advising and consulting for energy companies. He’ll also need to submit an ethics agreement describing how he would avoid conflicts of interest.
A few Democratic senators might toss him hard questions at the hearing, but don’t look for Republicans to put up much of a fuss. Overall, Moniz is expected to be easily confirmed.
Mar. 25, 2013 — Scientists at the University of East Anglia have made an important breakthrough in the quest to generate clean electricity from bacteria.
Findings published today in the journal Proceedings of the National Academy of Sciences (PNAS) show that proteins on the surface of bacteria can produce an electric current by simply touching a mineral surface.
The research shows that it is possible for bacteria to lie directly on the surface of a metal or mineral and transfer electrical charge through their cell membranes. This means that it is possible to ‘tether’ bacteria directly to electrodes — bringing scientists a step closer to creating efficient microbial fuel cells or ‘bio-batteries’.
The team collaborated with researchers at Pacific Northwest National Laboratory in Washington State in the US.
Shewanella oneidensis is part of a family of marine bacteria. The research team created a synthetic version of this bacterium using just the proteins thought to shuttle the electrons from the inside of the microbe to the rock.
They inserted these proteins into the lipid layers of vesicles, small capsules of lipid membrane like those that make up a bacterial membrane. Then they tested how well electrons travelled between an electron donor on the inside and an iron-bearing mineral on the outside.
Lead researcher Dr Tom Clarke from UEA’s School of Biological Sciences said: “We knew that bacteria can transfer electricity into metals and minerals, and that the interaction depends on special proteins on the surface of the bacteria. But it has not been clear whether these proteins do this directly or indirectly through an unknown mediator in the environment.
“Our research shows that these proteins can directly ‘touch’ the mineral surface and produce an electric current, meaning that it is possible for the bacteria to lie on the surface of a metal or mineral and conduct electricity through their cell membranes.
“This is the first time that we have been able to actually look at how the components of a bacterial cell membrane are able to interact with different substances, and understand how differences in metal and mineral interactions can occur on the surface of a cell.
“These bacteria show great potential as microbial fuel cells, where electricity can be generated from the breakdown of domestic or agricultural waste products.
“Another possibility is to use these bacteria as miniature factories on the surface of an electrode, where chemical reactions take place inside the cell using electrical power supplied by the electrode through these proteins.”
Biochemist Liang Shi of Pacific Northwest National Laboratory said: “We developed a unique system so we could mimic electron transfer like it happens in cells. The electron transfer rate we measured was unbelievably fast — it was fast enough to support bacterial respiration.”
The finding is also important for understanding how carbon works its way through the atmosphere, land and oceans.
“When organic matter is involved in reducing iron, it releases carbon dioxide and water. And when iron is used as an energy source, bacteria incorporate carbon dioxide into food. If we understand electron transfer, we can learn how bacteria control the carbon cycle,” said Shi.
The project was funded by the Biotechnology and Biological Sciences Research Council (BBSRC) and the US Department of Energy.
Thomas A Clarke, Gaye White, Julea N Butt, David J Richardson, Zhi Shi, Liang Shi, Zheming Wang, Alice C Dohnalkova, Matthew J Marshall, James K Fredrickson and John M Zachara. Rapid electron exchange between surface-exposed bacterial cytochromes and Fe(III) minerals. Proceedings of the National Academy of Sciences, March 25, 2013