Wednesday, January 2, 2008

Energy Tech Information

Archive for the 'Energy' Category

Hydrogen Storage For Cars?

Monday, December 24th, 2007

Hydrogen is the fuel of the future. Unfortunately, one problem remains: hydrogen is a gas and cannot easily be pumped into a tank like gasoline. Solid hydrides, chemical compounds of hydrogen and a metal or semimetal, are good storage materials in principle, but have not been well suited to automotive applications. An American research team at the Ford Motor Company in Dearborn and the University of California, Los Angeles, has now developed a novel hydride that could be a useful starting point for the development of future automotive hydrogen-storage materials. As Jun Yang and his team report in the journal Angewandte Chemie, an “autocatalytic” reaction mechanism causes the composite, made of three different hydrides, to rapidly release hydrogen at lower temperatures and without dangerous by-products.

Certain hydrogen compounds, such as lithium borohydride (LiBH4) and magnesium hydride (MgH2), can release hydrogen and then take it up again. However, for automotive applications, they require temperatures that are too high to release hydrogen, the hydrogen release and uptake are far too slow, and decomposition reactions release undesirable by-products such as ammonia. In addition, these compounds can only be “recharged” under very high pressure and temperature conditions. The combination of two different hydrides (binary hydride) has previously been shown to improve things, as these compounds partly release hydrogen at lower temperatures than either of the individual components.

The researchers led by Yang went a step further and combined three hydrogen-containing compounds — lithium amide (LiNH2), lithium borohydride, and magnesium hydride — in a 2:1:1 ratio to form a ternary hydride. This trio has substantially better properties than previous binary materials.

The reason for this improvement is a complex sequence of reactions between the various components. The first reactions begin as soon as the starting components are ground together. Heating sets off further reactions, releasing the hydrogen. The mixture is “autocatalytic,” which means that one of the reactions produces seed nuclei for the product of the following reaction, which speeds up the entire reaction sequence. The result is a lower desorption temperature; the release of hydrogen begins at 150 degrees Celsius. In addition, the hydrogen is very pure because neither ammonia nor any other volatile decomposition products are formed. Recharging the ternary hydride with hydrogen can be accomplished under moderate conditions.
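For a rough sense of scale, here is a minimal Python sketch (not from the paper) that estimates the total hydrogen bound in a 2:1:1 LiNH2/LiBH4/MgH2 mixture from textbook molar masses; the fraction actually released by Yang's composite would be lower, since not all of the bound hydrogen desorbs.

```python
# Rough, illustrative estimate (not from the paper): total hydrogen bound in a
# 2 LiNH2 : 1 LiBH4 : 1 MgH2 mixture, an upper bound on what could be released.

M = {"H": 1.008, "Li": 6.941, "B": 10.811, "N": 14.007, "Mg": 24.305}  # g/mol

def molar_mass(formula):
    """formula: dict mapping element symbol -> atoms per formula unit."""
    return sum(M[el] * n for el, n in formula.items())

mixture = [  # (composition, moles in the 2:1:1 mix)
    ({"Li": 1, "N": 1, "H": 2}, 2),   # lithium amide
    ({"Li": 1, "B": 1, "H": 4}, 1),   # lithium borohydride
    ({"Mg": 1, "H": 2}, 1),           # magnesium hydride
]

total_mass = sum(molar_mass(f) * n for f, n in mixture)
hydrogen_mass = sum(f["H"] * M["H"] * n for f, n in mixture)
print(f"bound hydrogen: {100 * hydrogen_mass / total_mass:.1f} wt%")  # ~10.7 wt%
```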

[via Wiley InterScience]

Experiments Reveal Unexpected Activity Of Fuel Cell Catalysts

Thursday, December 13th, 2007

Researchers at the U.S. Department of Energy’s Brookhaven National Laboratory have unveiled important details about a class of catalysts that could help improve the performance of fuel cells. With the goal of producing “clean” hydrogen for fuel cell reactions in mind, the researchers determined why two next-generation catalysts, nanomaterials that combine gold with cerium oxide or titanium oxide, exhibit very high activity. Their results will be published online in the December 14, 2007, edition of the journal Science.

Fuel cells combine hydrogen and oxygen without combustion to produce direct electrical power and water. They are attractive as a source of power for transportation applications because of their high energy efficiency, the potential for using a variety of fuel sources, and their zero emissions. However, a major problem facing this technology is that the hydrogen-rich materials feeding the reaction often contain carbon monoxide (CO), which is formed during hydrogen production. Within a fuel cell, CO “poisons” the expensive platinum-based catalysts that convert hydrogen into electricity, deteriorating their efficiency over time and requiring their replacement.

“Fuel cell reactions are very demanding processes that require very pure hydrogen,” said Brookhaven chemist Jose Rodriguez. “You need to find some way to eliminate the impurities, and that’s where the water-gas shift reaction comes into play.”

The “water-gas shift” (WGS) reaction combines CO with water to produce additional hydrogen gas and carbon dioxide. With the assistance of proper catalysts, this process can convert nearly 100 percent of the CO into carbon dioxide. Rodriguez’s group, which includes researchers from Brookhaven’s chemistry department, the Center for Functional Nanomaterials (CFN), and the Central University of Venezuela, studied two “next-generation” WGS nanoscale catalysts: gold-cerium oxide and gold-titanium oxide.
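As a quick illustration of what the shift reaction buys, the sketch below works out its ideal stoichiometric yield; the molar masses are textbook values and the one-kilogram basis is an arbitrary example, not a figure from the Brookhaven study.

```python
# Ideal stoichiometry of the water-gas shift reaction described above:
#   CO + H2O -> CO2 + H2
# Molar masses are textbook values; the 1 kg basis is an arbitrary example.

M_CO, M_CO2, M_H2 = 28.010, 44.009, 2.016  # g/mol

kg_CO = 1.0                              # CO removed from the hydrogen stream
kg_H2_gained = kg_CO * M_H2 / M_CO       # extra hydrogen produced
kg_CO2_formed = kg_CO * M_CO2 / M_CO     # carbon dioxide by-product

print(f"per kg of CO shifted: +{kg_H2_gained:.3f} kg H2, {kg_CO2_formed:.2f} kg CO2")
# roughly 0.072 kg of additional H2 and 1.57 kg of CO2 per kg of CO converted
```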

“These nanomaterials have recently been reported as very efficient catalysts for the WGS reaction,” said Brookhaven chemist Jan Hrbek. “This was a surprising finding because neither bulk gold nor bulk ceria and titania are active as catalysts.”

To determine how these nanocatalysts work, the research team developed so-called “inverse model catalysts.” The WGS catalysts usually consist of gold nanoparticles dispersed on a ceria or titania surface — a small amount of the expensive metal placed on the inexpensive oxide. But to get a better look at the surface interactions, the researchers placed ceria or titania nanoparticles on a pure gold surface.

“For the first time, we established that although pure gold is inert for the WGS reaction, if you put a small amount of ceria or titania on it, it becomes extremely active,” Rodriguez said. “So although these inverse catalysts are just models, they have catalytic activity comparable to, and sometimes better than, the real deal.”

Using a technique called x-ray photoelectron spectroscopy at Brookhaven’s National Synchrotron Light Source, as well as scanning tunneling microscopy and calculations, the researchers discovered that the catalysts’ oxides are the reason for their high activity.

“The oxides have unique properties on the nanoscale and are able to break apart water molecules, which is the most difficult part of the WGS reaction,” Hrbek said. Added Brookhaven physicist Ping Liu: “After you dissociate the water, the reaction continues on to eliminate CO. But if you don’t have nanosized oxide particles, none of this will work.”

The researchers plan to continue their study of these catalysts at the NSLS and CFN in order to further explore the reaction mechanism and optimize its performance.

Methane From Microbes: A Fuel For The Future

Monday, December 10th, 2007

Microbes could provide a clean, renewable energy source and use up carbon dioxide in the process, suggested Dr. James Chong at a Science Media Centre press briefing today.

“Methanogens are microbes called archaea that are similar to bacteria. They are responsible for the vast majority of methane produced on earth by living things,” says Dr. Chong from the University of York. “They use carbon dioxide to make methane, the major flammable component of natural gas. So methanogens could be used to make a renewable, carbon neutral gas substitute.”

Methanogens produce about one billion tonnes of methane every year. They thrive in oxygen-free environments like the guts of cows and sheep, humans and even termites. They live in swamps, bogs and lakes. “Increased human activity causes methane emissions to rise because methanogens grow well in rice paddies, sewage processing plants and landfill sites, which are all made by humans.”

Methanogens could feed on waste from farms, food and even our homes to make biogas. This is done in Europe, but very little in the UK. The government is now looking at microbes as a source of fuel and as a way to tackle food waste in particular.

Methane is a greenhouse gas that is 23 times more effective at trapping heat than carbon dioxide. “By using methane produced by bacteria as a fuel source, we can reduce the amount released into the atmosphere and use up some carbon dioxide in the process!”

[Society for General Microbiology]

Dam The Red Sea And Release Gigawatts

Thursday, December 6th, 2007

Damming the Red Sea could meet the growing energy demands of millions of people in the Middle East through hydroelectric power and alleviate some of the region’s tensions over oil supplies. Equally, such a massive engineering project could cause untold ecological harm and displace countless people from their homes.

In the Inderscience publication International Journal of Global Environmental Issues, Roelof Dirk Schuiling of Utrecht University in The Netherlands and his colleagues discuss the costs and benefits of one of the potentially most ambitious engineering projects ever.

Present technology allows us to shift and shape the earth on a relatively large scale and to control lakes and reservoirs for hydroelectric power generation. In the near future, however, it might be possible to build dams large enough to separate a body of water as large as the Red Sea from the world oceans. A similar macro-scale engineering project is already planned for the Strait of Hormuz at the entrance of the Persian Gulf. This seawater barrier will exploit the evaporative cycle and influx of seawater to generate vast quantities of electricity.

Geochemical engineer Schuiling suggests that a dam across the Bab-al-Mandab strait could be used to stem the inflow of seawater into the highly evaporative Red Sea, with the potential of generating 50 gigawatts of power. By comparison, the Palo Verde nuclear power plant, the largest nuclear station in the US, has an output of just 3.2 gigawatts.
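The 50-gigawatt figure can be sanity-checked with a simple hydropower estimate. The sketch below uses illustrative inputs (an approximate Red Sea surface area, an assumed net evaporation rate, an assumed level drop behind the dam and an assumed turbine efficiency), none of which are taken from Schuiling's paper, yet it lands in the same ballpark.

```python
# Back-of-envelope check of the ~50 GW figure. Every input below is an
# illustrative assumption, not a number taken from Schuiling's paper.

rho = 1025.0        # seawater density, kg/m^3
g = 9.81            # gravitational acceleration, m/s^2
area = 4.4e11       # assumed Red Sea surface area, m^2 (~440,000 km^2)
evap = 2.0          # assumed net evaporation, metres per year
head = 200.0        # assumed level difference maintained behind the dam, m
eff = 0.9           # assumed turbine/generator efficiency

inflow = area * evap / (365.25 * 24 * 3600)  # replacement inflow, m^3/s
power_w = eff * rho * g * inflow * head      # hydroelectric power, watts

print(f"inflow ~{inflow:,.0f} m^3/s, power ~{power_w / 1e9:.0f} GW")  # ~50 GW
```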

“Such a project will dramatically affect the region’s economy, political situation and ecology, and their effects may be felt well beyond the physical and political limits of the project,” says Schuiling.

Schuiling and his colleagues point out that the costs and timescales involved in creating such a hydroelectric facility are far beyond normal economic considerations. It is inevitable that such a macro-engineering project will cause massive devastation of existing ecologies. However, it will also provide enormous reductions in greenhouse gas emissions as well as offering a viable, sustainable alternative to fossil fuels for future generations. The ethical and environmental dilemmas are on an international scale, while the impact on ecology, tourism, fisheries, transport and other areas could be felt globally.

The researchers point out that the precautionary principle cannot be applied in making a decision regarding the damming of the Red Sea. “If the countries around the Red Sea decide in favor of the macro-project, it is their responsibility to limit the negative consequences as much as possible,” they conclude.

Iowa State Engineer Develops Laser Technologies To Analyze Combustion, Biofuels

Wednesday, December 5th, 2007

Let’s say a fuel derived from biomass produces too much soot when it’s burned in a combustion chamber designed for fossil fuels.

How can an engineer find the source of the problem? It originates, after all, in the flame zone of a highly turbulent combustion chamber. That’s not exactly an easy place for an engineer to take measurements.

“It’s fairly obvious when a combustor is not running well and producing a lot of soot and other pollutants,” said Terry Meyer, an assistant professor of mechanical engineering at Iowa State University. “But then how do you solve that problem? To do that we can open up the black box and look inside the combustion chamber itself.”

The tools that Meyer is developing to do that are highly sophisticated laser-based sensors that can capture images at thousands and even millions of frames per second. Those images record all kinds of data about what’s happening in the flaming mix of fuel and air.

“The goal is to probe this harsh environment to provide the knowledge required to reduce pollutant emissions and enable the utilization of alternative fuels,” Meyer said.

By selecting lasers of different wavelengths, Meyer’s combustion sensors can record where pollutants such as soot, nitric oxide and carbon monoxide are being formed. The sensors can also look for unburned fuel and capture data about fuel sprays, fuel-air mixing and energy release.

Meyer’s lab is now working on a two-year project to develop and advance laser techniques that are expected to help engineers improve the combustion systems that move vehicles, produce power and heat buildings. An important goal of the project is to analyze and improve the performance of alternative fuels in modern combustion systems.

Meyer’s research is supported by an $87,000 grant from the Grow Iowa Values Fund, a state economic development program. This grant is supplemented by a contribution of products and engineering expertise from Goodrich Corporation’s Engine Components unit in West Des Moines, a producer of fuel system components for aircraft engines, auxiliary power units, power-generating turbines and home heating systems. ConocoPhillips, the third largest integrated energy company in the United States, is supporting similar projects in Meyer’s lab. The projects are part of ConocoPhillips’ eight-year, $22.5 million research program at Iowa State. Meyer’s research is also drawing interest and support from sources such as the National Aeronautics and Space Administration and the U.S. Air Force.

Meyer started working with laser diagnostics when he was a doctoral student in mechanical engineering at the University of Illinois at Urbana-Champaign. The work continued when he spent six years as a scientist developing and applying laser techniques for the Air Force Research Laboratory at Wright-Patterson Air Force Base in Dayton, Ohio.

He made the move to Iowa State in 2006 and is working to apply some of the military’s sophisticated laser technologies to civilian applications.

And so Meyer’s system of high-speed lasers, frequency conversion units, mirrors and cameras is being built in his Multiphase Reacting Flow Laboratory on the ground floor of Iowa State’s Black Engineering Building.

It’s work that could have impacts far beyond his lab. Meyer said his research aims to reduce reliance on fossil fuels, which currently account for 85 percent of the world’s energy use.

Thermoelectric Materials One Key To Energy Savings

Tuesday, November 20th, 2007

Breathing new life into an old idea, MIT Institute Professor Mildred S. Dresselhaus and co-workers are developing innovative materials for controlling temperatures that could lead to substantial energy savings by allowing more efficient car engines, photovoltaic cells and electronic devices.

Novel thermoelectric materials have already resulted in a new consumer product: a simple, efficient way of cooling car seats in hot climates. The devices, similar to the more-familiar car seat heaters, provide comfort directly to the individual rather than cooling the entire car, saving on air conditioning and energy costs.

The research is based on the principle of thermoelectric cooling and heating, which was first discovered in the early 19th century and was advanced into some practical applications in the 1960s by MIT professor (and former president) Paul Gray, among others.

Dresselhaus and colleagues are now applying nanotechnology and other cutting-edge technologies to the field. She’ll describe her work toward better thermoelectric materials in an invited talk on Monday, Nov. 26, at the annual meeting of the Materials Research Society in Boston.

Thermoelectric devices are based on the fact that when certain materials are heated, they generate a significant electrical voltage. Conversely, when a voltage is applied to them, they become hotter on one side and colder on the other. The process works with a variety of materials, and especially well with semiconductors - the materials from which computer chips are made. But it has always had one big drawback: it is very inefficient.

The fundamental problem in creating efficient thermoelectric materials is that they need to be very good at conducting electricity, but not heat. That way, one end of the apparatus can get hot while the other remains cold, instead of the material quickly equalizing the temperature. In most materials, electrical and thermal conductivity go hand in hand. So researchers had to find ways of modifying materials to separate the two properties.
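The article does not give the formula, but the standard way to quantify this trade-off is the dimensionless figure of merit ZT = S^2 * sigma * T / kappa, where S is the Seebeck coefficient, sigma the electrical conductivity and kappa the thermal conductivity. The sketch below plugs in rough, typical room-temperature values for a common thermoelectric material (bismuth telluride), purely as an illustration.

```python
# Standard thermoelectric figure of merit: ZT = S^2 * sigma * T / kappa.
# The values below are rough, typical numbers for bismuth telluride near room
# temperature (assumed for illustration; not taken from the MIT work).

S = 200e-6      # Seebeck coefficient, V/K
sigma = 1.0e5   # electrical conductivity, S/m
kappa = 1.5     # thermal conductivity, W/(m*K)
T = 300.0       # absolute temperature, K

ZT = S**2 * sigma * T / kappa
print(f"ZT ~ {ZT:.1f}")  # ~0.8; nanostructuring aims to lower kappa and raise ZT
```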

The key to making it more practical, Dresselhaus explains, was in creating engineered semiconductor materials in which tiny patterns have been created to alter the materials’ behavior. This might include embedding nanoscale particles or wires in a matrix of another material. These nanoscale structures - just a few billionths of a meter across - interfere with the flow of heat, while allowing electricity to flow freely. "Making a nanostructure allows you to independently control these qualities," Dresselhaus says.

She and her MIT collaborators started working on these developments in the 1990s, and soon drew interest from the US Navy because of the potential for making quieter submarines (power generation and air conditioning are some of the noisiest functions on existing subs). "From that research, we came up with a lot of new materials that nobody had looked into," Dresselhaus says.

After some early work conducted with Ted Harman of MIT Lincoln Labs, Harman, Dresselhaus, and her student Lyndon Hicks published an experimental paper on the new materials in the mid 1990s. "People saw that paper and the field started," she says. "Now there are conferences devoted to it."

Her work in finding new thermoelectric materials, including a collaboration with MIT professor of Mechanical Engineering Gang Chen, invigorated the field, and now there are real applications like seat coolers in cars. Last year, a small company in California sold a million of the units worldwide.

Other Potential Applications

The same principle can be used to design cooling systems that could be built right into microchips, reducing or eliminating the need for separate cooling systems and improving their efficiency.

The technology could also be used in cars to make the engines themselves more efficient. In conventional cars, about 80 percent of the fuel’s energy is wasted as heat. Thermoelectric systems could perhaps be used to generate electricity directly from this wasted heat. Because the amount of fuel used for transportation is such a huge part of the world’s energy use, even a small percentage improvement in efficiency can have a great impact, Dresselhaus explains. "It’s very practical," she says, "and the car companies are getting interested."

The same materials might also play a role in improving the efficiency of photovoltaic cells, harnessing some of the sun’s heat as well as its light to make electricity. The key will be finding materials that have the right properties but are not too expensive to produce.

Dresselhaus and colleagues are continuing to probe the thermoelectric properties of a variety of semiconductor materials and nanostructures such as superlattices and quantum dots. Her research on thermoelectric materials is presently sponsored by NASA.

Increased Domestic Production Won’t Make US Self-Sufficient In Natural Gas

Monday, November 19th, 2007

A new report by the Energy Forum at Rice University’s Baker Institute for Public Policy finds that the United States will continue to rely on imported natural gas even if areas that are currently restricted are opened up to drilling.

Natural gas is already an important fuel in the United States, representing 22 percent of total primary energy use in 2006. About 20 percent of that gas was imported, the vast majority from Canada. Liquefied natural gas (LNG) imports have risen from virtually zero in 1986 to just in excess of 0.5 trillion cubic feet (tcf), or 2.9 percent of total U.S. natural gas consumption in 2006. The United States imports LNG from a variety of countries, including Trinidad and Tobago, Egypt, Nigeria and Algeria.

According to the new study, under a business-as-usual scenario, where U.S. lands are not opened up for drilling, by 2030, U.S. consumers could be relying on LNG imports for as much as 30 percent of total supply. This has strong implications for security of natural gas supply, as the United States becomes more reliant on LNG from the Middle East and Africa. U.S. end-use natural gas demand is expected to climb to 23.9 tcf in 2015 and 26.9 tcf by 2025, up from 20.0 tcf in 2006, according to study forecasts. This represents a gain of about 1.3 percent per year.

"Studies of the market outlook show that our high cost domestic production will increasingly have to compete against a swath of more competitively priced imports," said Kenneth Medlock, fellow for energy studies at the Baker Institute and a key author of the study. "In the short term, the net impacts on U.S. supply security are not all that worrisome. But long term, as our demand grows, we will have to worry more about security of supply."

In recent years, environmental and land-use considerations have prompted the United States to remove from energy development significant acreage that was once available for exploration. Twenty years ago, nearly 75 percent of federal lands were available for private lease to oil and gas exploration companies. Since then, the share has fallen to 17 percent.

Given the importance of the changing outlook for North American natural gas supply and U.S. oil and natural gas prices, the Baker Institute embarked on a two-year study, "Natural Gas in North America: Markets and Security," to investigate the future development of the North American natural gas market and the factors that will influence security of supply and pricing.

The Baker Institute Energy Forum developed a world gas trade model. The Baker Institute World Gas Trade Model (BIWGTM) simulates future development of North American natural gas trade based on the economics of resource supply, demand and commodity transportation, and it determines a market-clearing price in the process.

To determine whether the United States and its allies will become vulnerable to the increasing market power of major international natural gas suppliers, such as Russia and the countries of the Middle East, and what role existing U.S. drilling restrictions play in this question, the study uses scenario analysis to examine the possible effects of a complete lifting of restrictions on drilling in the Rocky Mountains and on the Outer Continental Shelf (OCS). The aim of these scenarios is to examine whether the increase in natural gas production from these now blocked U.S. regions would reduce the monopoly power of any potential large supplier or group of large suppliers and, similarly, would ameliorate the impact of a major accidental disruption of natural gas supply.

The Baker Institute’s scenario analysis shows that opening restricted areas in the OCS and Rocky Mountains to drilling and natural gas resource development will not render the United States energy independent nor will it even lower U.S. dependence on liquefied natural gas (LNG) imports in 2015 by a significant volume. Price impacts are also limited, with U.S. prices only registering marginal reductions.

But longer term, the study concludes, an opening of restricted areas to drilling and the contribution of expanded OCS and Rockies natural gas production could, nonetheless, be geopolitically important in combating the rise of a cartel in the international natural gas market, a so-called "GasOPEC." According to the study, "Reducing U.S. demand for LNG helps lower global natural gas prices and enhances available supplies for other major buyers in Europe and Northeast Asia. The wider swath of alternative supplies for Europe and Northeast Asia translates into significantly reduced market power of producers in Russia and the Middle East. Furthermore, the higher elasticity of supply from alternative sources as a result of allowing greater access to resources in the United States also reduces market power in the sense that a larger reduction in cartel supply would be needed to achieve any given increase in price." The study also notes that development of alternative energy could play a similar role.

One more surprising key finding of the study is that opening restricted areas to natural gas drilling could have significant impacts on the flow of natural gas from Alaska to the lower 48 states. Under a scenario with greater access to resources in the lower 48, the study finds that development of the Alaska gas pipeline would be delayed, reducing Alaskan production by as much as 0.95 tcf in 2025 (a 40 percent reduction) relative to the case where access restrictions remain in place.

Young’s Experiment In A Hydrogen Molecule

Friday, November 16th, 2007

According to the authors, the research could prove to be of great importance for the future development of quantum computation. The experiment also illustrates the transition between the quantum and the macroscopic worlds. The conclusions appear in the latest issue of Science.

An international investigation involving the participation of the Consejo Superior de Investigaciones Cientificas (CSIC) has reproduced the experiment of Thomas Young in a molecule of hydrogen, the smallest molecular system that exists. In 1803 the English scientist observed a pattern of interference in light from a distant source after it passed through a “double slit” and was thereby diffracted. This finding confirmed the theory that light has wave properties. The authors of the current research, which appears in the latest issue of the journal Science, use electrons instead of light and the nuclei of the hydrogen molecule as the emitting slits.

CSIC researcher Ricardo Diez, vice-director of the Centre for Materials Physics (a joint centre of the CSIC and the University of the Basque Country in Donostia-San Sebastián) and co-author of the article, explains their experiment: "These interference patterns are the same as those produced, on a large scale, when sunlight passes through Persian blinds, casting patterns of light and shadow on the walls. This phenomenon is due to the fact that particles of light, as with electrons, can also have wave-like behaviour."

At much smaller sizes, atomic planes can create interferences in the transmission of X-rays, thus providing information about the internal structure of materials. This is the basis of experimental techniques such as X-ray diffraction, thanks to which the DNA double helix structure was discovered. Ricardo Diez explains, "The laws that predict, for example, the trajectory of a car at a certain speed are not those that govern the behaviour of atomic-sized particles. On a nanometric scale, sizes are measured in units a thousand million times smaller than a metre, and the behaviour of objects at this scale can prove to be surprising, almost magical even!"

The experiment
The researchers reproduced Young’s experiment in the smallest system existing — a molecule of hydrogen — which consists of two protons and two electrons. The research team used light generated by the large synchrotron accelerator at the Lawrence Berkeley National Laboratory (USA) to extract the two electrons from the molecule of hydrogen. The two protons play the role of the two electron-emitting apertures, separated by an extremely small distance of about one ten-thousand-millionth of a metre. On their journey to the detector, where they are collected, each of the electrons shows an interference pattern that suggests wave rather than particle behaviour, as if emission had taken place from the two points at the same time.
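A small back-of-envelope sketch (not part of the study) shows why electrons can play the role of the light here: at a few hundred electron-volts an electron's de Broglie wavelength is comparable to the proton-proton separation in H2, which is the condition for the two "slits" to produce visible fringes. The 200 eV kinetic energy below is an assumed example value, not one reported in the paper.

```python
# Illustrative only: for two-centre interference to show up, the electron's
# de Broglie wavelength must be comparable to the proton-proton separation in H2
# (~0.74 angstrom). The 200 eV kinetic energy is an assumed example value.

import math

h = 6.626e-34      # Planck constant, J*s
m_e = 9.109e-31    # electron mass, kg
eV = 1.602e-19     # joules per electron volt
d = 0.74e-10       # H2 bond length, m

E = 200 * eV
wavelength = h / math.sqrt(2 * m_e * E)   # de Broglie wavelength

print(f"lambda ~ {wavelength * 1e10:.2f} angstrom vs slit spacing {d * 1e10:.2f}")
# ~0.87 angstrom: the same order as the "slit" spacing, so fringes can appear
```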

The interference pattern of each one of the two electrons extracted from the molecule is conditioned by the presence and the velocity of the other: the greater the difference in their speeds, the less the interaction between them and the more visible the interference patterns. Under these conditions, the system is more of a quantum nature. "The analysis of the patterns as a function of velocity enables the investigation of the subtle mechanisms of the transition between classical physics and quantum physics. It is necessary to understand the quantum relationship between a small number of electrons, such as those of hydrogen, as it is the basis of concepts as sophisticated as quantum cryptography or of the future development of quantum computation," concluded the CSIC researcher.

The study was led by University of Frankfurt researcher Reinhard Dorner and also involved German, North American and Russian scientists.

Record-Breaking Hydrogen Storage Materials Discovered

Monday, November 12th, 2007

Scientists at the University of Virginia have discovered a new class of hydrogen storage materials that could make the storage and transportation of energy much more efficient — and affordable — through higher-performing hydrogen fuel cells.

Bellave S. Shivaram and Adam B. Phillips, the U.Va. physicists who invented the new materials, will present their finding at 8 p.m., Monday, Nov. 12, at the International Symposium on Materials Issues in a Hydrogen Economy at the Omni Hotel in Richmond, Va.

"In terms of hydrogen absorption, these materials could prove a world record," Phillips said. "Most materials today absorb only 7 to 8 percent of hydrogen by weight, and only at cryogenic [extremely low] temperatures. Our materials absorb hydrogen up to 14 percent by weight at room temperature. By absorbing twice as much hydrogen, the new materials could help make the dream of a hydrogen economy come true."

In the quest for alternative fuels, U.Va.’s new materials potentially could provide a highly affordable solution to energy storage and transportation problems with a wide variety of applications. They absorb a much higher percentage of hydrogen than predecessor materials while exhibiting faster kinetics at room temperature and much lower pressures, and are inexpensive and simple to produce.
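To see what the weight percentage means in practice, here is a toy comparison; the 5 kg hydrogen load is an assumed example for a fuel cell vehicle, not a figure from the U.Va. announcement.

```python
# Toy comparison of storage-material mass at 7 wt% versus 14 wt% hydrogen uptake.
# The 5 kg hydrogen load is an assumed example for a fuel cell vehicle, not a
# figure from the U.Va. announcement.

h2_needed_kg = 5.0

for wt_fraction in (0.07, 0.14):
    material_kg = h2_needed_kg / wt_fraction
    print(f"{wt_fraction:.0%} by weight -> ~{material_kg:.0f} kg of storage material")
# doubling the weight fraction roughly halves the mass of material a car must carry
```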

"These materials are the next generation in hydrogen fuel storage materials, unlike any others we have seen before," Shivaram said. "They have passed every litmus test that we have performed, and we believe they have the potential to have a large impact."

The inventors believe the novel materials will translate to the marketplace and are working with the U.Va. Patent Foundation to patent their discovery.

"The U.Va. Patent Foundation is very excited to be working with a material that one day may be used by millions in everyday life," said Chris Harris, senior licensing manager for the U.Va. Patent Foundation. "Dr. Phillips and Dr. Shivaram have made an incredible breakthrough in the area of hydrogen absorption."

Phillips’s and Shivaram’s research was supported by the National Science Foundation and the U.S. Department of Energy.

Energy From Hot Rocks

Thursday, November 8th, 2007

Two UC Davis geologists are taking part in the Iceland Deep Drilling Project, an international effort to learn more about the potential of geothermal energy, or extracting heat from rocks.

Professors Peter Schiffman and Robert Zierenberg are working with Wilfred Elders, professor emeritus at UC Riverside, Dennis Bird at Stanford University and Mark Reed at the University of Oregon to study the chemistry that occurs at high pressures and temperatures two miles below Iceland.

"We hope to understand the process of heat transfer when water reacts with hot volcanic rocks and how that changes the chemistry of fluids circulating at depth," Zierenberg said. "We know very little about materials under these conditions."

The university team, funded by the National Science Foundation, will drill up to 4 kilometers, or 2.5 miles, into the rock. It will be one of three boreholes sunk as part of the Iceland Deep Drilling Project, which is supported largely by Icelandic power companies.

The island nation generates more than half of its electrical power from geothermal energy. Hot water and steam from boreholes can be used to run turbines for electricity or directly to heat homes and businesses. Iceland meets the rest of its electricity needs from hydroelectric power, and imports fossil fuels only for transportation.

The U.S. has lots of potential for geothermal energy generation, Zierenberg said. There are several plants in California, including the Geysers region in the north and at Mammoth Lakes. Although its share of energy generation in the state is small, the Geysers is the largest geothermal field in the world, Zierenberg said. There are also numerous abandoned oil and gas boreholes around the country — including in the Central Valley — that could potentially access hot water that could be used for space heating.

That would, however, require something of a cultural change. In Iceland, geothermal heating is used at a community level: hot water is pumped up and circulated around a town or neighborhood. Americans are more accustomed to individual power delivery, Zierenberg said.

The team expects to begin drilling in the summer of 2008.

Green Roofs Offer Energy Savings, Storm-Water Control

Thursday, November 1st, 2007

An article in the November 2007 issue of BioScience describes the history and summarizes the benefits and challenges of green roofs — roofs with a vegetated surface and substrate.

Although more expensive to construct than a typical roof, a green roof can reduce energy costs during a building’s lifetime and control storm-water runoff. Green roofs also provide havens for wildlife. Such structures are currently less common in the United States than in Japan and some European countries, notably Germany, and proponents urge their more widespread adoption.

The authors of the article, Erica Oberndorfer and her colleagues, argue for further research into the functioning of green roof ecosystems and into which plant species are most beneficial to include in roof plantings. The researchers note that the development of improved cost-benefit models for green roofs could spur the more widespread adoption of the technology.

A photograph of the dramatic, almost-complete green roof on the new California Academy of Sciences building in San Francisco appears on the cover of the issue.

Fuel Cell Breakthrough Gearing Up To Power Auto Industry

Tuesday, October 30th, 2007

The average price for all types of gasoline is holding steady around $2.95 per gallon nationwide, but the pain at the pump might be short-lived as research from the University of Houston may eliminate one of the biggest hurdles to the wide-scale production of fuel cell-powered vehicles.

Peter Strasser, an assistant professor of chemical and biomolecular engineering, led the research team in discovering a method to make a fuel cell more efficient and less expensive. The initiative is one of four ongoing fuel cell projects in development at the Cullen College of Engineering at UH.

A fuel cell converts chemically stored energy directly into electricity and is already two to three times more efficient in converting fuel to power than the internal combustion engine usually found in automobiles.

"A fuel cell is a power generation device that converts energy into electricity with very high efficiencies without combustion, flame, noise or vibration," Strasser said. "If a fuel cell is run on hydrogen and air, as planned for automotive fuel cells, hydrogen and oxygen molecules combine to provide electricity with water as the only byproduct."

The key to making a fuel cell work is a catalyst, which facilitates the reaction of hydrogen and oxygen. The most common, but expensive, catalyst is platinum. Currently, the amount of platinum catalyst required per kilowatt to power a fuel cell engine is about 0.5 to 0.8 grams, or .018 to .028 ounces. At a cost of about $1,500 per ounce, the platinum catalyst alone would cost between $2,300 and $3,700 to operate a small, 100-kilowatt two- or four-door vehicle — a significant cost given that an entire 100-kilowatt gasoline combustion engine costs about $3,000. To make the transition to fuel cell-powered vehicles possible, the automobile industry wants something better and cheaper.
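The cost arithmetic above can be reproduced directly. The sketch below uses the quoted loading, engine size and platinum price (per troy ounce); the small differences from the quoted $2,300 to $3,700 range come down to rounding and the exact platinum price assumed.

```python
# Reproducing the platinum-cost arithmetic above. Platinum is priced per troy
# ounce (~31.1 g); the result lands near, not exactly on, the quoted range,
# which comes down to rounding and the exact platinum price assumed.

TROY_OZ_GRAMS = 31.103
pt_price_per_oz = 1500.0   # USD per troy ounce, as quoted
engine_kw = 100.0          # small 100-kilowatt vehicle, as quoted

for loading_g_per_kw in (0.5, 0.8):
    grams_pt = loading_g_per_kw * engine_kw
    cost = grams_pt / TROY_OZ_GRAMS * pt_price_per_oz
    print(f"{loading_g_per_kw} g/kW -> {grams_pt:.0f} g Pt -> ~${cost:,.0f}")
# ~$2,400 to ~$3,900; a catalyst four times as active would cut that bill fourfold
```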

"The automobile companies have been asking for a platinum-based catalyst that is four times more efficient, and, therefore, four times cheaper, than what is currently available," Strasser said. "That’s the magic number."

Strasser and his team, which includes Ratndeep Srivastava, a graduate student, Prasanna Mani, a postdoctoral researcher, and Nathan Hahn, a 2007 UH graduate, have met and, seemingly, exceeded this "magic number." The team created a catalyst that uses less platinum, making it at least four times — and up to six times — more efficient and cheaper than existing catalysts at comparable power levels.

"We have found a low platinum alloy that we pre-treat in a special way to make it very active for the reaction of oxygen to water on the surface of our catalyst," Strasser said. "A more active catalyst means that we get more electricity, or energy, for the amount of platinum used and the time it’s used for. With a material four to six times more efficient, the cost of the catalyst has reached an important target set by industrial fuel cell developers and the U.S. Department of Energy."

Although more testing of how the durability of this new catalyst compares to pure platinum is necessary, the preliminary results look promising.

"The initial results show that durability is improved over pure platinum, but only longer-term testing can tell," Strasser said. Long-term results may take some time, but industry expert Hubert Gasteiger, a leading scientist in fuel research with Aeta S.p.A. in Italy, is already excited.

"The automotive cost targets, which were developed several years ago, require that the activity of the available platinum catalysts would need to be increased by a factor of four to six," Gasteiger said. "The novel catalyst concept developed by Professor Strasser’s group has been demonstrated to provide an enhancement factor of greater than four, and, thereby, are very promising materials to achieve the platinum metals cost targets of typical hydrogen-oxygen automotive fuel cells. This is a very exciting and new development, even though more work is required to assure that the durability of these novel catalysts is equally superior to the current carbon-supported platinum catalysts."

Strasser’s preliminary results and research have been published in the October 2007 issues of Angewandte Chemie International Edition and Journal of the American Chemical Society.

With his research sponsored by $1.5 million in grants from the U.S. Department of Energy, the National Science Foundation, major automotive fuel cell developers and NASA through the Houston Advanced Research Center, Strasser hopes companies will begin introducing fuel cell-powered cars within the next decade.

The Race For Biofuels Driving Alternative Sources Of Biomass

Friday, October 26th, 2007

When will biofuels be available at all local fuel pumps, and where will they come from?

Researchers have been studying fuels from biomass for years. Now, with growing dependency on foreign oils and an energy-conscious society emerging, biofuels are fast becoming part of a fuel revolution that could reach pumps all across America.

Ethanol blends are already available at some gas stations. However, their availability varies from state to state, depending on the volume of ethanol produced. Sources of biomass for biofuel production in each state also vary widely.

"To see it everywhere, we have to make more of it on a regional basis," says Dr. Bill Rooney, professor of plant breeding and genetics, Soil & Crop Sciences Department, Texas A&M University. "The best source for biofuel in a region is contingent on the environment, growing season, water and fertility availability, stress resistance, and processing and conversion techniques. In any location, there will be several species grown for biomass."

Approximately 20 percent of grain sorghum is now used for ethanol production. Rooney is currently developing sorghum varieties specifically for bioenergy. He will discuss this topic on Wednesday, Nov. 7 during his talk, "Sorghum Breeding for Bioenergy Traits," at the International Annual Meetings of the American Society of Agronomy (ASA), Crop Science Society of America (CSSA) and Soil Science Society of America (SSSA). He will speak at 2:30 pm during the symposium "Breeding and Genomics of Crops for Bioenergy" at the Ernest N. Morial Convention Center in New Orleans, room 207.

Another presentation related to biofuels, "Sweet Fuel for the U.S.", will be given by Dr. Jorge Da Silva, associate professor of molecular genetics and plant breeding, Soil & Crop Sciences Department, Texas A&M University, on Tuesday, Nov. 6 at 10:15 am. His presentation will be during the symposium "Agronomic Aspects of Biofuel Crop Production" in room 214 of the Convention Center.

"Production of energy, such as ethanol, from sugar is more efficient than production from grains in both cost per unit and energy efficiency," Da Silva says. "Sugarcane is ranked first among all other crops for biomass production and can be a key component of biomass supply. Technology for producing ethanol from sugarcane is well established in tropical countries such as Brazil, where energy independence has been achieved."

Although there is no firm development timeline, there is clearly a race for biofuels as the cost of petroleum reaches previously unimaginable levels, reserves diminish, and environmental concerns soar. If won, this race could bring about a revolution as significant as Henry Ford’s creation of the Model T car.

Electricity Grid Could Become A Type Of Internet

Wednesday, October 24th, 2007

In the future, everyone who is connected to the electricity grid will be able to upload and download packages of electricity to and from this network. At least, that is one of the transformations the electricity grid could undergo. Dutch researcher Jos Meeuwsen (Technical University Eindhoven) developed three scenarios for the Dutch electricity supply in the year 2050. The starting point is that by that year, 50% of electricity consumption will originate from sustainable sources.

Because of the need for security of supply and the connection with the European market, electricity networks will always be necessary, says Meeuwsen. Further, because demand for electricity keeps increasing, it is important to include all possible energy options (including coal and nuclear energy) in the scenario development. The exact form of future networks will largely depend on the primary energy mix chosen. In all cases engineers face new and considerable challenges in network and system integration and in the development and implementation of new technology. Moreover, in all scenarios the total network capacity must increase. Small-scale networks will adopt characteristics of the current large-scale networks, such as the possibility of ‘two-way traffic’ and the responsibility to maintain a stable system.

Demand follows supply
In particular, the number of ways in which the total electricity supply system can be held in balance in the future will need to be expanded as more electricity is generated from sustainable sources. This might even mean a paradigm shift from the current ‘permanently matching supply to demand’ to ‘continuously matching demand to supply’. Meeuwsen foresees a step-by-step integration of energy technology, ICT and power electronics that might result in an electricity system that exhibits many similarities with the Internet. Everyone connected to the system could then, within certain limits, upload and download packages of ‘electrical energy’ whenever they want. An important condition is, however, the technical feasibility of the centralised and/or decentralised storage of large amounts of electricity.
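Purely as an illustration of the "demand follows supply" idea, and not anything from Meeuwsen's thesis, the toy sketch below shifts a block of flexible load into the hours with the largest renewable surplus; a real system would of course involve prices, forecasts and network constraints.

```python
# Toy illustration (not from Meeuwsen's thesis) of 'demand follows supply':
# a block of flexible load is shifted into the hours with the largest surplus
# of renewable generation. All numbers are made up for the example.

renewable_supply = [3.0, 8.0, 6.0, 2.0, 1.0, 7.0]   # available renewable MW per hour
must_run_demand = [2.0] * 6                         # inflexible MW per hour
flexible_mwh = 9.0                                  # shiftable energy to schedule

surplus = [s - d for s, d in zip(renewable_supply, must_run_demand)]
schedule = [0.0] * len(surplus)

# greedily fill the hours with the most spare renewable capacity first
for hour in sorted(range(len(surplus)), key=lambda h: surplus[h], reverse=True):
    take = min(flexible_mwh, max(surplus[hour], 0.0))
    schedule[hour] = take
    flexible_mwh -= take
    if flexible_mwh <= 0:
        break

print("flexible load per hour:", schedule)  # [0.0, 6.0, 0.0, 0.0, 0.0, 3.0]
```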

Three scenarios
Meeuwsen’s three scenarios for the future of the electricity grid differ mainly in the size of the electricity generation facilities. The ‘super networks’ scenario consists of large-scale production locations, transmission at high voltages, considerable imports of sustainable energy (in the form of biomass) and energy from offshore wind farms. The ‘hybrid networks’ scenario also includes large plants and high-voltage transmission fed by offshore wind parks and large biomass stations; additionally, small-scale generation takes place in and around cities and villages (wind, biomass and solar energy). Finally, in the ‘local’ scenario the number of local generators (micro-cogeneration units, solar energy panels, small-scale biomass plants at neighbourhood level and land-based wind turbines) is greatest, yet large industrial processes and small consumers still draw part of their electricity from large-scale production resources.

The postdoctoral research ‘Electricity networks of the future: Various roads to a sustainable energy system’ is part of the programme ‘Transition and transition paths: the road to a sustainable energy system’ funded by the NWO/SenterNovem Stimulation Programme Energy Research. The programme aims to develop knowledge in the natural sciences and humanities for the transition towards a sustainable energy supply.

The Industrial Space Age

Thursday, October 4th, 2007

Could the exploitation of space solve the earth’s environmental crises?

The Soviet Union successfully launched Sputnik I fifty years ago on October 4th, marking the beginning of our use of space for political, military, technological, and scientific ends. Since then we have launched hundreds of satellites, space probes, telescopes, moon missions, and planetary landers. Now, political scientist Rasmus Karlsson suggests that space could provide us with a sustainable future not possible from an exclusively earthbound perspective.

Writing in the October issue of the Inderscience publication International Journal of the Environment and Sustainable Development, Karlsson, a researcher at the University of Lund, Sweden, explains that over the years, two strands of thought on sustainable development have emerged: ecologism and environmentalism. Ecologism offers a solution by emphasizing the need for major socioeconomic reform aimed at a post-industrial era. Environmentalism, in contrast, focuses on the preservation, restoration, and improvement of the natural environment within the present framework.

However, Karlsson suggests that there is a third approach to sustainable development that has until now been excluded from the agenda - namely a large-scale industrial expansion into space.

He suggests that access to the raw materials found on the Moon, as well as unfiltered solar energy, could be used to dramatically increase our stock of resources and energy while providing unlimited sinks for pollutants. Such an approach would address two of the most demanding issues regarding sustainability: finding renewable energy sources and disposing of pollutants.

Resource scarcity, pollution and dwindling fossil fuels have become serious environmental concerns in the last few decades. As such, environmentalists have called for massive reductions in energy and material consumption. Seemingly unrelated but running in parallel, the promise of space exploration has largely been the province of technological optimists whose economic framework rarely acknowledges any such scarcity. Karlsson suggests that it is time to reconcile the politics of scarcity with this technological optimism and to devise a unified political vision for the 21st century that would lead to a truly sustainable planet by extending our reach into space.

Clean Cities Program Saves 375 Million Gallons Of Gas In 2006

Tuesday, October 2nd, 2007

Clean Cities coalitions around the nation displaced the equivalent of 375 million gallons of gasoline in 2006, according to a recent report from the U.S. Department of Energy’s (DOE) National Renewable Energy Laboratory (NREL). The amount of gasoline displaced in 2006 was 50 percent more than the 250 million gallons in 2005.

Clean Cities coalitions are on track to reach 3.2 billion gallons of gasoline displaced in 2020, exceeding their established goal by 700 million gallons.

Through its almost 90 coalitions, Clean Cities works with government and industry partners (local, state and federal agencies; public health and transportation departments; transit agencies and other government offices; as well as auto manufacturers, car dealers, fuel suppliers, public utilities, public and private fleets, community business groups and professional associations) to increase the nation’s economic, environmental and energy security by reducing petroleum consumption in the transportation sector.

According to the report:

  • Seventy-one percent of the 2006 gasoline displacement came from the use of alternative fuels. Thirty percent of that was from the use of compressed natural gas, mostly in heavy-duty vehicles.
  • The use of E85, a blend of 85 percent ethanol and 15 percent gasoline, grew substantially in 2006, largely because the number of E85 stations doubled — from 436 to 995 — in the year. E85 accounted for 24 percent of gasoline displacement from alternate fuels in 2006.
  • Coalitions reported acquiring almost 44,000 hybrid electric vehicles in 2006, a 61 percent increase over the 17,100 HEVs purchased in 2005. HEV use accounted for the displacement of approximately 9 million gallons of gasoline.
  • Idle reduction efforts displaced 8.4 million gallons in 2006, including 1.2 million gallons from truck stop electrification.
  • Almost 2 million gallons were saved by reducing the number of miles traveled.

“The significant progress Clean Cities made in 2006 shows impressive commitment by our coalition members,” DOE Clean Cities Director Dennis A. Smith said.

The study was compiled from voluntary reports that represent a subset of the activities going on throughout the nation and indicates the impact of the coalitions and their priorities.

The full NREL study is available online.

Clean Cities is part of the Office of Energy Efficiency and Renewable Energy’s Vehicle Technologies Program. The program addresses the challenge of moving the United States away from the infrastructure and practices that contribute to dependence on imported petroleum and toward energy independence and security. In support of this challenge, Clean Cities assists the nation in meeting its objectives for renewable and alternative fuels use. More information about Clean Cities is available online.

NREL is the U.S. Department of Energy’s primary national laboratory for renewable energy and energy efficiency research and development. NREL is operated for DOE by Midwest Research Institute and Battelle.

Engineered Eggshells To Help Make Hydrogen Fuel

Wednesday, September 26th, 2007

Engineers at Ohio State University have found a way to turn discarded chicken eggshells into an alternative energy resource.

The patented process uses eggshells to soak up carbon dioxide from a reaction that produces hydrogen fuel. It also includes a unique method for peeling the collagen-containing membrane from the inside of the shells, so that the collagen can be used commercially.

L.S. Fan, Distinguished University Professor of chemical and biomolecular engineering at Ohio State, said that he and former Ohio State doctoral student, Mahesh Iyer, hit upon the idea when they were trying to improve a method of hydrogen production called the water-gas-shift reaction. With this method, fossil fuels such as coal are gasified to produce carbon monoxide gas, which then combines with water to produce carbon dioxide and hydrogen.

The eggshell plays a critical role.

"The key to making pure hydrogen is separating out the carbon dioxide," Fan said. "In order to do it very economically, we needed a new way of thinking, a new process scheme."

That brought them to eggshells, which mostly consist of calcium carbonate — one of nature’s most absorbent materials. It is a common ingredient in calcium supplements and antacids. With heat processing, calcium carbonate becomes calcium oxide, which will then absorb any acidic gas, such as carbon dioxide.

In the laboratory, Fan and his colleagues demonstrated that ground-up eggshells could be used in the water-gas-shift reaction. Iyer performed those early experiments; recent graduate Theresa Vonder Haar also worked on the project for her bachelor’s degree honors thesis.

Calcium carbonate — a key ingredient in the eggshells — captures 78 percent of carbon dioxide by weight, Fan explained. That means, given equal amounts of carbon dioxide and eggshell, the eggshell would absorb 78 percent of the carbon dioxide.

That makes it the most effective carbon dioxide absorber ever tested.
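The 78 percent figure follows from simple stoichiometry: heat treatment converts the shell's calcium carbonate to calcium oxide, and each CaO can bind one CO2 to re-form the carbonate. The sketch below uses textbook molar masses, not measurements from the Ohio State work.

```python
# Stoichiometric check of the 78 percent figure (textbook molar masses, not data
# from the Ohio State study). Heat treatment turns the shell's calcium carbonate
# into calcium oxide: CaCO3 -> CaO + CO2, and each CaO then binds one CO2.

M_CaO, M_CO2 = 56.08, 44.01  # g/mol

capacity = M_CO2 / M_CaO     # mass of CO2 absorbed per unit mass of CaO
print(f"CO2 capacity of calcined shell: {capacity:.0%} of its own weight")  # ~78%

# For scale: roughly 5 g of shell per egg times ~91 billion eggs per year is in
# line with the ~455,000 tons of shell cited later in the article.
```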

Energy experts believe that hydrogen may become an important power source in the future, most notably in the form of fuel cells. But first, researchers must develop affordable ways to produce large quantities of hydrogen — and that means finding ways to deal with the byproducts of chemical reactions that produce the gas.

According to the United States Department of Agriculture, the country produced nearly 91 billion eggs in 2006. That equates to about 455,000 tons of shell per year that could potentially be used in hydrogen production.

Still, Fan said, even if all that shell were utilized, it would only provide a portion of what the United States would need to seriously pursue a hydrogen economy.

"Eggshell alone may not be adequate to produce hydrogen for the whole country, but at least we can use eggshell in a better way compared to dumping it as organic waste in landfills, where companies have to pay up to $40 dollars per ton disposal cost," he said.

Before they could grind up the eggshell, the engineers needed to remove the collagen-containing membrane that clings to the inside; they developed an organic acid that does the job. About 10 percent of the membrane consists of collagen, which sells for about $1,000 per gram. This collagen, once extracted, can be used in food or pharmaceuticals, or for medical treatments. Doctors use collagen to help burn victims regenerate skin; it’s also used in cosmetic surgery.

"We like that our technology can help the egg industry to dispose of its waste, and at the same time convert the waste to a useful product," Fan said.

"And in the long term, we’re demonstrating that carbon-based fuel sources, like coal or biomass, can be efficiently converted to hydrogen and liquid fuel. The goal is an energy conversion system that uses a dependable fossil energy source, but at the same time has very little environmental impact."

Fan is currently working with a major egg company to produce large quantities of the eggshell granules for testing. The university plans to license the technology for further development.

To Maximize Biofuel Potential, Researchers Look For Sorghum’s ‘Sweet Spot’

Wednesday, September 12th, 2007

Picture this — IV (intravenous) lines in a sorghum field. It’s not as far-fetched as it sounds. It’s one way that scientists at the Texas Agricultural Experiment Station are researching crops that may contribute to the biofuel revolution.

In Beaumont, Dr. Lee Tarpley, plant physiologist, and College Station colleague, Dr. Don Vietor, professor of crop physiology, have focused their research on sweet sorghum.

While sweet sorghum and sugarcane are close relatives, the researchers have shown that the two species have different ways of moving and storing sugar. Tracer sucrose is inserted into growing plants, using a system similar to an IV. Once the sucrose is inside the plants, the researchers can track the movement and distribution.

They found that, due to the plant’s physiology, sweet sorghum appears to be more efficient in reusing the stored sugar to support growth of other parts of the plant. The mechanisms in sugarcane, however, allow it to accumulate very high levels of sucrose.

"The differences are critical, and need to be understood for breeders to develop new varieties specifically for the biofuel industry," Tarpley said. Sweet sorghum and sugarcane are both well suited for this purpose.

"While sorghum is an annual and can fit well into a crop rotation, sugarcane is a suitable perennial for many areas," Tarpley said. But to maximize the potential of sweet sorghum as a biofuel crop, breeders need to understand the physiology of the plant and not use sugarcane as a model.

"There is a large body of research on sugarcane that was previously thought to apply equally well to sorghum. Instead, we need to fully understand how sorghum moves and stores sugar in order to elevate to the next level in our breeding efforts," Tarpley said.

The study results were published in the June 2007 issue of BMC Plant Biology.


Engineers Perfecting Hydrogen-Generating Technology

Tuesday, August 28th, 2007

Researchers at Purdue University have further developed a technology that could represent a pollution-free energy source for a range of potential applications, from golf carts to submarines and cars to emergency portable generators.

The technology produces hydrogen by adding water to an alloy of aluminum and gallium. When water is added to the alloy, the aluminum splits water by attracting oxygen, liberating hydrogen in the process. The Purdue researchers are developing a method to create particles of the alloy that could be placed in a tank to react with water and produce hydrogen on demand.
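The underlying chemistry is the aluminum-water reaction, 2Al + 3H2O → Al2O3 + 3H2. As a minimal sketch (not from the Purdue paper), the stoichiometry fixes how much hydrogen a given mass of aluminum can release and how much water the reaction consumes:

```python
# Stoichiometry sketch for 2Al + 3H2O -> Al2O3 + 3H2 (illustrative, not Purdue's code).
M_AL, M_H2, M_H2O = 26.98, 2.016, 18.015   # molar masses, g/mol

def hydrogen_from_aluminum(al_kg):
    """Return (kg H2 released, kg H2O consumed) for a given mass of aluminum."""
    mol_al = al_kg * 1000 / M_AL
    h2_kg = 1.5 * mol_al * M_H2 / 1000     # 3 mol H2 per 2 mol Al
    h2o_kg = 1.5 * mol_al * M_H2O / 1000   # 3 mol H2O per 2 mol Al
    return h2_kg, h2o_kg

h2, water = hydrogen_from_aluminum(1.0)
print(f"1 kg Al -> {h2:.2f} kg H2, consuming {water:.2f} kg water")  # ~0.11 kg H2 per kg Al
```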

The gallium is a critical component because it hinders the formation of an aluminum oxide skin normally created on aluminum’s surface after bonding with oxygen, a process called oxidation. This skin usually acts as a barrier and prevents oxygen from reacting with aluminum. Reducing the skin’s protective properties allows the reaction to continue until all of the aluminum is used to generate hydrogen, said Jerry Woodall, a distinguished professor of electrical and computer engineering at Purdue who invented the process.

Since the technology was first announced in May, researchers have developed an improved form of the alloy that contains a higher concentration of aluminum.

Recent findings are detailed in the first research paper about the work, which will be presented on Sept. 7 during the 2nd Energy Nanotechnology International Conference in Santa Clara, Calif. The paper was written by Woodall, Charles Allen and Jeffrey Ziebarth, both doctoral students in Purdue’s School of Electrical and Computer Engineering.

Because the technology could be used to generate hydrogen on demand, the method makes it unnecessary to store or transport hydrogen - two major obstacles in creating a hydrogen economy, Woodall said.

The gallium component is inert, which means it can be recovered and reused.

"This is especially important because of the currently much higher cost of gallium compared with aluminum," Woodall said. "Because gallium can be recovered, this makes the process economically viable and more attractive for large-scale use. Also, since the gallium can be of low purity, the cost of impure gallium is ultimately expected to be many times lower than the high-purity gallium used in the electronics industry."

As the alloy reacts with water, the aluminum turns into aluminum oxide, also called alumina, which can be recycled back into aluminum. The recycled aluminum would be less expensive than mining the metal, making the technology more competitive with other forms of energy production, Woodall said.

In recent research, the engineers rapidly cooled the molten alloy to make particles that were 28 percent aluminum by weight and 72 percent gallium by weight. The result was a "metastable solid alloy" that also readily reacted with water to form hydrogen, alumina and heat, Woodall said.

Following up on that work, the researchers discovered that slowly cooling the molten alloy produced particles that contain 80 percent aluminum and 20 percent gallium.

"Particles made with this 80-20 alloy have good stability in dry air and react rapidly with water to form hydrogen," Woodall said. "This alloy is under intense investigation, and, in our opinion, it can be developed into a commercially viable material for splitting water."

The technology has numerous potential applications. Because the method makes it possible to use hydrogen instead of gasoline to run internal combustion engines, it could be used for cars and trucks. Combusting hydrogen in an engine or using hydrogen to drive a fuel cell produces only water as waste.

"It’s a simple matter to convert ordinary internal combustion engines to run on hydrogen. All you have to do is replace the gasoline fuel injector with a hydrogen injector," Woodall said.

The U.S. Department of Energy has set a goal of developing alternative fuels that possess a "hydrogen mass density" of 6 percent by the year 2010 and 9 percent by 2015. The percent mass density of hydrogen is the mass of hydrogen contained in the fuel divided by the total mass of the fuel multiplied by 100. Assuming 50 percent of the water produced as waste is recovered and cycled back into the reaction, the new 80-20 alloy has a hydrogen mass density greater than 6 percent, which meets the DOE’s 2010 goal.
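That 6 percent figure can be roughly reproduced with the stoichiometry above. The accounting below is our own interpretation, not the researchers': the "fuel" is taken to be the alloy plus the water that must be carried on board, with half of the water produced when the hydrogen is later burned or run through a fuel cell recovered and reused, as the release describes.

```python
# Rough sketch (our accounting, not Purdue's): DOE-style hydrogen mass density
# for an Al-Ga alloy reacting with on-board water.
M_AL, M_H2, M_H2O = 26.98, 2.016, 18.015   # molar masses, g/mol

def mass_density(al_frac=0.80, water_recovery=0.50):
    alloy = 1.0                                # kg of alloy (Al + Ga)
    mol_al = al_frac * alloy * 1000 / M_AL     # moles of aluminum in the alloy
    h2 = 1.5 * mol_al * M_H2 / 1000            # kg H2 released (3 H2 per 2 Al)
    water_in = 1.5 * mol_al * M_H2O / 1000     # kg H2O consumed by the reaction
    water_out = h2 * (M_H2O / M_H2)            # kg H2O formed when the H2 is used
    carried = water_in - water_recovery * water_out
    return 100 * h2 / (alloy + carried)        # percent hydrogen by mass

print(f"80-20 alloy, 50% water recycling: {mass_density():.1f}%")          # about 6.4%
print(f"80-20 alloy, no water recycling: {mass_density(0.80, 0.0):.1f}%")  # about 5.0%
```

Under these assumptions the 80-20 alloy comes out at roughly 6.4 percent, consistent with the claim that it clears the DOE's 2010 target only when some of the waste water is recycled.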

Aluminum is refined from the raw mineral bauxite, which also contains gallium. Producing aluminum from bauxite results in waste gallium.

"This technology is feasible for commercial use," Woodall said. "The waste alumina can be recycled back into aluminum, and low-cost gallium is available as a waste product from companies that produce aluminum from the raw mineral bauxite. Enough aluminum exists in the United States to produce 100 trillion kilowatt hours of energy. That’s enough energy to meet all the U.S. electric needs for 35 years. If impure gallium can be made for less than $10 a pound and used in an onboard system, there are enough known gallium reserves to run 1 billion cars."

The researchers note in the paper that for the technology to be used to operate cars and trucks, a large-scale recycling program would be required to turn the alumina back into aluminum and to recover the gallium.

"In the meantime, there are other promising potential markets, including lawn mowers and personal motor vehicles such as golf carts and wheelchairs," Woodall said. "The golf cart of the future, three or four years from now, will have an aluminum-gallium alloy. You will add water to generate hydrogen either for an internal combustion engine or to operate a fuel cell that recharges a battery. The battery will then power an electric motor to drive the golf cart."

Another application that is rapidly being developed is for emergency portable generators that will use hydrogen to run a small internal combustion engine. The generators are likely to be on the market within a year, Woodall said.

The technology also could make it possible to introduce a non-polluting way to idle diesel trucks. Truck drivers idle their engines to keep power flowing to appliances and the heating and air conditioning systems while they are making deliveries or parked, but such idling causes air pollution, which has prompted several states to restrict the practice.

The new hydrogen technology could solve the truck-idling dilemma.

"What we are proposing is that the truck would run on either hydrogen or diesel fuel," Woodall said. "While you are on the road you are using the diesel, but while the truck is idling, it’s running on hydrogen."

The new hydrogen technology also would be well-suited for submarines because it does not emit toxic fumes and could be used in confined spaces without harming crew members, Woodall said.

"You could replace nuclear submarines with this technology," he said.

Other types of boats, including pleasure craft, also could be equipped with such a technology.

"One reason maritime applications are especially appealing is that you don’t have to haul water," Woodall said.

The Purdue researchers had thought that making the process competitive with conventional energy sources would require that the alumina be recycled back into aluminum using a dedicated infrastructure, such as a nuclear power plant or wind generators. However, the researchers now know that recycling the alumina would cost far less than they originally estimated, using standard processing already available.

"Since standard industrial technology could be used to recycle our nearly pure alumina back to aluminum at 20 cents per pound, this technology would be competitive with gasoline," Woodall said. "Using aluminum, it would cost $70 at wholesale prices to take a 350-mile trip with a mid-size car equipped with a standard internal combustion engine. That compares with $66 for gasoline at $3.30 per gallon. If we used a 50 percent efficient fuel cell, taking the same trip using aluminum would cost $28."

The Purdue Research Foundation holds title to the primary patent, which has been filed with the U.S. Patent and Trademark Office and is pending. An Indiana startup company, AlGalCo LLC, has received a license for the exclusive right to commercialize the process.

In 1967, while working as a researcher at IBM, Woodall discovered that liquid alloys of aluminum and gallium spontaneously produce hydrogen if mixed with water. The research, which focused on developing new semiconductors for computers and electronics, led to advances in optical-fiber communications and light-emitting diodes, making them practical for everything from DVD players to television remote controls and new types of lighting displays. That work also led to development of advanced transistors for cell phones and components in solar cells powering space modules like those used on the Mars rover, earning Woodall the 2001 National Medal of Technology from President George W. Bush.

Also while at IBM, Woodall and research engineer Jerome Cuomo were issued a U.S. patent in 1982 for a "solid state, renewable energy supply." The patent described their discovery that when aluminum is dissolved in liquid gallium just above room temperature, the liquid alloy readily reacts with water to form hydrogen, alumina and heat.

Future research will include work to further perfect the solid alloy and develop systems for the controlled delivery of hydrogen.

The 2nd Energy Nanotechnology International Conference is sponsored by the American Society of Mechanical Engineers and ASME Nanotechnology Institute.


‘Thin-Layer’ Solar Cells May Bring Cheaper ‘Green’ Power

Friday, August 24th, 2007

Scientists are researching new ways of harnessing the sun’s rays which could eventually make it cheaper for people to use solar energy to power their homes.

The experts at Durham University are developing light-absorbing materials for use in the production of thin-layer solar photovoltaic (PV) cells which are used to convert light energy into electricity.

The four-year project involves experiments on a range of different materials that would be less expensive and more sustainable to use in the manufacturing of solar panels.

Thicker silicon-based cells and compounds containing indium, a rare and expensive metal, are more commonly used to make solar panels today.

The research, funded by the Engineering and Physical Sciences Research Council (EPSRC) SUPERGEN Initiative, focuses on developing thin-layer PV cells using materials such as copper indium diselenide and cadmium telluride.

The project is now entering a new phase aimed at developing cheaper and more sustainable variants of these materials.

The Durham team is also working on manipulating the growth of the materials so they form a continuous structure which is essential for conducting the energy trapped by solar panels before it is turned into usable electricity. This will help improve the efficiency of the thin-layer PV cells.

It’s hoped that the development of more affordable thin-film PV cells could lead to a reduction in the cost of solar panels for the domestic market and an increase in the use of solar power.

Solar power currently provides less than one hundredth of one percent of the UK’s home energy needs.

The thin-layer PV cells would be used to make solar panels that could be fitted to roofs to help power homes, with any surplus electricity being fed back to the National Grid.

This could lead to cheaper fuel bills and less reliance on burning fossil fuels as a way of helping to generate electricity.

Professor Ken Durose, Director of the Durham Centre for Renewable Energy, who is leading the research, said: “One of the main issues in solar energy is the cost of materials and we recognise that the cost of solar cells is slowing down their uptake.

“If solar panels were cheap enough so you could buy a system off the shelf that provided even a fraction of your power needs you would do it, but that product isn’t there at the moment.

“The key indicator of cost effectiveness is how many pounds do you have to spend to get a watt of power out.”

“If you can make solar panels more cheaply then you will have a winning product.”
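Purely as an illustration of the pounds-per-watt metric Professor Durose describes (the panel price and rating below are hypothetical, not Durham figures):

```python
# Illustrative cost-per-watt calculation; the numbers are made up for the example.
def pounds_per_watt(panel_cost_gbp, rated_watts):
    """Cost-effectiveness metric: pounds spent per watt of rated output."""
    return panel_cost_gbp / rated_watts

# e.g. a hypothetical 180 W panel costing 400 pounds
print(f"{pounds_per_watt(400, 180):.2f} GBP per watt")   # ~2.22
```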

To aid its research the university has taken delivery of a £1.7 million suite of high-powered electron microscopes, funded by the Science Research Investment Fund, which have nano-scale resolution, allowing scientists to see the effects that currently limit the performance of solar cells.

One of the microscopes is the first of its kind in the UK and Professor Durose said: “This instrument will put the North East right out in front.

“We are working on new ideas in renewable energy and this opens up tremendous opportunities in research.”

Durham, Newcastle and Northumbria universities, The New and Renewable Energy Centre (NaREC), in Blyth, and the Centre for Process Innovation (CPI), in Wilton, have formed a consortium to bid to host the Energy Technologies Institute (ETI) in the North East.

The consortium bidding to host the ETI in North East England has been named one of three short-listed finalists to host the headquarters of this national centre, which will be responsible for allocating approximately £1bn of private and public research funds for renewable and low-carbon energy.

The North East Consortium will now face competition from Scotland and the Midlands at a final selection presentation in London on September 6th. The selection panel, made up of representatives from industry sponsors and Government, will then make its recommendation to the ETI board, with the host location to be formally announced in early October.

Mark Pearson, Energy and Process Innovation Manager at One NorthEast and a member of the ETI bid team, commented: “This announcement by Durham University highlights the strength in depth we have in energy research and development in the North East, and the opportunities and change this can generate in our regional economy and built environment.

“Our bid to host the ETI recognises our capability across the region to make this initiative a success for the UK.”



Source: http://www.lockergnome.com/news/category/energy/
[more links available from this site]



