In a paper published in the journal Science, researchers report a new membrane that separates closely related molecules and is far sturdier than existing alternatives. The newly designed material could lower chemical processing costs and find use in other separation applications.
Separating chemicals has been estimated to consume around 10 percent of the world's energy production. In creating fresh water, up to 60 percent of the energy cost goes to separating substances from the pure water. This particular membrane is the culmination of research dating back to the 1990s, focused on separating xylenes, a family of organic compounds, from one another.
The main problem with separating these compounds is that they have very similar properties: every member of the family has exactly the same molecular mass, and their boiling points are nearly identical. Even in physical size, "They differ in size by a tenth of a nanometer." To further complicate matters, the researchers wanted a process that works at room temperature, to reduce energy costs even further.
The final membrane begins as a commercial polymer that is spun into hollow fibers, linked together into mats, and then heated until only a carbon fiber membrane remains. In tests, the researchers found the membrane uses 10 to 20 times less energy than common methods of separating xylenes.
General Electric has been struggling with the decline in oil prices and, more broadly, with managing its conglomerate structure. Having shed GE Capital as well as its appliance business, it is refocusing on power production and energy distribution.
GE's power division expects demand for higher-output peak generators that can produce up to 55 gigawatts of power, and an upturn in market share for its H-class turbines. Overall demand for power generators is expected to rise substantially, with heavy-duty gas turbines making up the majority of those products.
Biofuel production still faces some obstacles before it begins powering our daily lives. The main issue is contamination from a variety of sources, and a little bit of biological engineering aims to reduce the need for filtering.
To make a batch of biofuel, plant matter and specific bacteria and yeasts are mixed together, then held under favorable conditions so that the microbes can eat the starch molecules and excrete a substance we can burn. One hang-up with this process is that a whole plethora of microbes are capable of digesting the raw plant matter, yet only a small portion of them produce fuels we can burn.
If fuel-making microbes can eat the food source, so can other bacteria or yeast. When other microbes start growing, they reduce the efficiency of the process.
Sterilization is nothing new to production facilities, but reaching the purity required for a production-sized batch would take a large amount of energy. Another option is to add antibiotics to control the unintended arrivals. Unfortunately, the material left over after the fuel is extracted is exported as livestock feed, which would increase the likelihood of drug-resistant bacteria in the future. So in lieu of these methods, a bit of engineering of the biological persuasion gives the proper microbes a fighting chance.
By engineering the bacteria and yeast into super-powered forms, scientists got the microbes to digest non-natural materials for biofuels. In tests deliberately contaminated with competing composting bacteria, the newly strengthened microbes flourished while the contaminating strains were severely hampered. The specifically engineered yeast and bacteria were able to outcompete the other microbes, eventually starving them out of the plant matter and leaving a purer biofuel.
Mineral oil has been the chief raw material for fuel additives like isooctane, but new processes will produce gasoline additives from biological sources. For the first time, additives that reduce premature ignition in engines will be produced from purely renewable resources.
These biobased additives will start from a biobased isobutene, originally sourced from sugar, as the feedstock for a new process. Isooctane is usually produced from isobutene, but because the starting resource is biobased, the new process must account for small differences in the properties of the two chemicals. The production teams are intimately aware of the challenge of contamination from the raw materials and will approach the entire process with overall cost-effectiveness in mind.
Click here for the full article by Fraunhofer-Gesellschaft.
California runs largely on natural gas, but wind and solar are rapidly gaining ground. Viable hydroelectric sites are dwindling, becoming scarcer and less efficient as more power plants are introduced. Yet even as these renewable sources have increased their share of California's power supply, natural gas has increased its share as well, by about 13% in just the last year. The reason is the unreliability of wind and solar generation: when those plants cannot keep up with demand, natural gas plants are run to make up the difference.
The French power group EDF will review the risks of its upcoming British nuclear power plants on July 28. So far, two billion euros have been spent of the budgeted 22 billion, and some of the initial partners have already pulled out of the project. EDF will most likely need to restructure and cut costs to finish funding the project, endangering existing jobs in pursuit of creating new ones.
In principle, EDF has a guaranteed price for Hinkley Point's future production for 35 years. But there is no guaranteed demand for all of that output, and 35 years is a long time in Britain to go without any changes to a major contract. While EDF could normally feed any excess production at Hinkley Point into the continental electricity grid, there are exchange-rate risks and perhaps market-access issues. If no other financial backers appear, whether UK funds or companies, EDF might do better to turn its back on Britain.
Click here for the full article by Marel Michelson.
The Haitian electric utility company E-Power S.A. began construction of a 30-megawatt power facility for Port-au-Prince in 2008. While it was under construction, the magnitude-7.0 earthquake of 2010 struck. Along with the immense toll on lives and well-being, the earthquake left Haiti's electric infrastructure severely damaged. Recovery was in danger.
“So much was destroyed in 2010 that we had no way for Hyundai to unload the equipment for the new plant near Port-au-Prince,” says Ludwig von Lignau, operations manager at E-Power. “There was also no bridge from the Dominican Republic that could handle this equipment, and so the U.S. Army agreed to build a makeshift dock near the site.”
E-Power's facility was originally commissioned to run outdated software as its control system, but to better aid the recovery effort, leaders implemented a custom redesign of the controls. By adapting nonstandard implementations and culling excessive operator interactions, the plant now provides 35% of the local power with remarkably fast response times.
“We’re able to respond very fast to changing electricity demand because we can start units in just 10 minutes. We can react quickly to high-frequency requirements.” -von Lignau
Click here to read the full article by Jim Montague.
It is estimated that the oceans hold 4 billion tons of uranium, enough to power the world's major cities for thousands of years; the trouble is getting it out of the water. Scientists have shown progress with a material that binds to uranium dioxide in seawater and can later be treated to remove the uranium. The process entails dragging braided polyethylene fibers coated with amidoxime through the ocean.
The process is still inefficient and expensive, but finding alternatives to uranium ore mining is a necessary step in planning for the future of nuclear energy.
Uranium is found in seawater at a concentration of only 3.3 micrograms per liter, which works out to roughly one uranium atom for every four billion water molecules. The material is inefficient in that only 6 grams of uranium are adsorbed per kilogram of material, an uptake of 0.6% by mass after 8 weeks of collection.
To run constant extraction by this method, a fleet would need roughly 693,000 kilograms of the material being dragged at all times, just to fuel a single gigawatt-scale nuclear power plant over the same duration.
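That 693,000-kilogram figure can be sanity-checked with back-of-the-envelope arithmetic. The annual fuel requirement is not stated in the article; the sketch below assumes roughly 27 tonnes of uranium fuel per gigawatt-year, a commonly cited figure:

```python
# Back-of-the-envelope check of the adsorbent mass needed to fuel a
# 1 GW nuclear plant. The annual fuel figure is an assumption, not
# taken from the article.

URANIUM_PER_GW_YEAR_KG = 27_000   # assumed annual fuel load, kg
ADSORBED_G_PER_KG = 6.0           # grams of U captured per kg of fiber
COLLECTION_WEEKS = 8              # one collection cycle, per the article

# Uranium needed during one 8-week collection cycle
uranium_needed_kg = URANIUM_PER_GW_YEAR_KG * COLLECTION_WEEKS / 52

# Fiber mass that must be in the water to capture that much uranium
fiber_mass_kg = uranium_needed_kg / (ADSORBED_G_PER_KG / 1000)

print(f"Uranium needed per cycle: {uranium_needed_kg:,.0f} kg")
print(f"Adsorbent fiber required: {fiber_mass_kg:,.0f} kg")
```

Run as-is, this lands within a fraction of a percent of the article's 693,000-kilogram figure, suggesting an assumption in that neighborhood was used.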
Click here for the full article by Jennifer Hackett.
After a successful test of the Radio-Frequency Quadrupole linac (RFQ), engineers and physicists from Lawrence Berkeley National Laboratory will use it to upgrade this superconducting linear accelerator. On its first trial run the RFQ accepted nearly 100% of the source beam without failure, a remarkable feat given the many complex control systems involved.
The upgrade will adapt the front end of the accelerator to produce high-intensity proton beams for experiments.
The lab's current RFQ, which sits at the beginning of the laboratory's accelerator chain, accelerates a negative hydrogen ion beam to 0.75 million electronvolts, or MeV. The new RFQ, which is longer, accelerates a beam to 2.1 MeV, nearly three times the energy. Transported beam current, and therefore power, is the key improvement with the new RFQ. The current RFQ delivers 54 watts of beam power; the new RFQ delivers 21 kilowatts, an increase by a factor of nearly 400.
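For a singly charged beam like H⁻, beam power is simply beam current times the accelerating voltage (numerically, the beam energy in electronvolts), so the quoted powers imply the beam currents. A quick check using only the figures above:

```python
# Implied beam currents from the quoted beam powers and energies,
# using P = I * V for a singly charged (H-) beam, where the voltage
# in volts equals the beam energy in electronvolts.

def beam_current_amps(power_watts: float, energy_ev: float) -> float:
    """Beam current in amperes for a singly charged particle beam."""
    return power_watts / energy_ev

old_current = beam_current_amps(54, 0.75e6)     # old RFQ: 54 W at 0.75 MeV
new_current = beam_current_amps(21_000, 2.1e6)  # new RFQ: 21 kW at 2.1 MeV

print(f"Old RFQ beam current: {old_current * 1e6:.0f} microamps")  # 72
print(f"New RFQ beam current: {new_current * 1e3:.0f} milliamps")  # 10
print(f"Power ratio: {21_000 / 54:.0f}x")                          # 389
```

The power ratio of about 389 is the "factor of nearly 400" quoted above; the current jumps from tens of microamps to 10 milliamps.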
The key innovation in the upgrade is a waveform machined into the positioning vanes inside the accelerator. The waveform is designed with progressively longer distances between peaks and troughs as the beam travels along the accelerator. This lengthening accounts for the beam's increasing speed, keeping the time between peaks and troughs equal for the entire journey through the RFQ.
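The stretching follows from basic kinematics: if the time between peaks is fixed, the distance between them must grow in proportion to the beam's velocity. As a rough illustration, comparing the two quoted beam energies (0.75 and 2.1 MeV) shows how much the spacing grows; the relativistic formula and the proton rest energy (the H⁻ ion's mass is essentially the proton's) are standard physics, not taken from the article:

```python
import math

# Relativistic beta (v/c) for a beam of kinetic energy T and rest energy E0.
PROTON_REST_MEV = 938.3  # H- rest energy is essentially the proton's

def beta(kinetic_mev: float, rest_mev: float = PROTON_REST_MEV) -> float:
    gamma = 1 + kinetic_mev / rest_mev
    return math.sqrt(1 - 1 / gamma**2)

beta_low = beta(0.75)   # beam at the old RFQ's output energy
beta_high = beta(2.1)   # beam at the new RFQ's output energy

# With a fixed RF period, peak-to-peak spacing scales with beta:
stretch = beta_high / beta_low
print(f"beta at 0.75 MeV: {beta_low:.4f}")
print(f"beta at 2.1 MeV:  {beta_high:.4f}")
print(f"spacing stretch factor: {stretch:.2f}x")
```

Even at these modest energies the beam is only a few percent of light speed, but the spacing between waveform peaks still has to grow by roughly two thirds between those two energies.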
Data is exploding around the world in every industry, and manufacturing is running into a logistics problem. With the increase in data points and measurements, how can we use them to efficiently organize and improve the performance of our processes?
Manufacturing makes up a significant portion of global energy consumption, but on a per-unit basis that figure has been decreasing continuously. Manufacturers are using data to work smarter and reduce their use of energy. The best use of data analysis for improving energy efficiency is identifying consumption at unexpected times and outright excess usage. It can also prioritize retrofits for specific machines that consume more energy per unit than newer models.
By gathering data as granular as machine-specific measurements, a production manager can see troublesome machinery in detail, and plant managers can coordinate repairs or replacements before breakdowns halt production. In certain energy production situations, catastrophe can be avoided before start-up by knowing the precise condition of pipelines and machinery.
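As a concrete sketch of the idea, flagging machines whose energy use per unit of output drifts well above their peers takes only a few lines of analysis. The machine names, figures, and the 25% threshold below are invented for illustration, not taken from the article:

```python
# Sketch: flag machines whose energy use per unit produced is well above
# the plant average, as candidates for repair or retrofit.
# All machine names and figures below are hypothetical.

machines = {
    # name: (kWh consumed, units produced) over the same period
    "press_A": (12_000, 10_000),
    "press_B": (11_500, 10_200),
    "press_C": (19_800, 9_700),   # suspiciously high consumption
    "lathe_1": (8_400, 7_900),
}

# Energy intensity: kWh per unit produced, for each machine
intensity = {name: kwh / units for name, (kwh, units) in machines.items()}
avg = sum(intensity.values()) / len(intensity)

# Flag anything consuming 25% more energy per unit than the plant average
flagged = [name for name, kpu in intensity.items() if kpu > 1.25 * avg]

for name in flagged:
    print(f"{name}: {intensity[name]:.2f} kWh/unit vs plant avg {avg:.2f}")
```

In practice the same comparison would run against live meter data, and the threshold would be tuned per machine class, but the principle is just this: normalize by output, compare against peers, and surface the outliers before they fail.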
Click here for the full article by Bill Kenworthy.