Burning wood instead of coal?

This blog has frequently commented on the now controversial use of coal as an energy source for power plants and other industrial uses. Since this became a "front-and-center" issue, a great deal of research and plant-scale experimentation has gone into how to keep coal as a fuel, an approach termed "Clean Coal" technology. This concept has now reached the end of the road: (a) Scrubbing flue gases with alkalis such as ammonia or caustic is too expensive and/or produces sludges that are difficult to dispose of. (b) Carbon dioxide capture (e.g. in underground caverns) is also too expensive and only makes sense when a large carbon dioxide demand (e.g. from tertiary oil recovery; see Sept 8th, 2014 post) is nearby, which is not the case for most large power plants. (c) Gasifying instead of burning the coal, which concentrates the CO2 stream, has led to the construction of two hugely expensive plants in Mississippi (see January 2nd, 2016 post) and Saskatchewan that, even if and when onstream at design capacity, will never be duplicated. There is also the Skyonic approach (see June 26, 2014 post), which makes sodium bicarbonate from the CO2 scrubbed from flue gases, but this technology is not scalable, since there is limited worldwide demand for sodium bicarbonate. China, which has much greater pollution from coal than the U.S. (particulates as well as carbon dioxide), still talks about Clean Coal, but seems willing to spend the large amount of extra money to, for example, gasify coal instead of burning it.

Recently, we have heard that Congress will soon pass a bill that would encourage the use of wood (and other biomass) as an energy fuel that does not add to the carbon footprint. This is conceptually true: the decay of wood releases the contained carbon into the atmosphere, much as burning it does, and the released carbon is then reabsorbed as a new tree takes up carbon to grow. So, what's wrong with this picture? Firstly, we need to restrict CO2 release now to deal with global warming, not many decades from now when the presumably (but not guaranteed) reforested trees start to absorb large amounts of CO2. Secondly, we would have to decimate vast tracts of forested land to make any appreciable impact on replacing coal with wood, leaving aside the logistical problem of getting massive amounts of wood to power plants. It has been pointed out that switchgrass and other waste biomass has a much shorter carbon cycle, but this again raises the question of getting such material in huge quantities to power plants.

The push for wood as an energy source comes from legislators from states with huge forests. Need I say more? But perhaps readers of this post will cheer that they are really not polluting the atmosphere when they burn wood in their fireplace, because somewhere a tree is being planted that will reabsorb all of the carbon going up their chimney.




Posted in Energy Industry

New Analysis on Antarctica doubles sea level rise projections


Source: Google Images

Continuing from my last post, I want to bring readers up to date on what many scientists now project for increases in global sea level caused by Greenhouse Gas emissions. Before getting to Antarctica, let's look at the latest EPA projections on atmospheric CO2 levels. Starting at the current level of 400 parts per million (ppm), CO2 levels would rise to 1300 ppm(!) by 2100 at current emission rates or, more likely, to 600-800 ppm as countries take steps to reduce carbon burning. The corresponding rises in average U.S. temperature vary from 12 degrees F down to 3 degrees, the latter figure corresponding to a dramatic future reduction in carbon emissions.
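To see roughly how ppm figures map onto temperature figures like these, climate models use an approximately logarithmic relationship: each doubling of CO2 adds a roughly constant warming increment. A back-of-envelope sketch (the 7-degrees-F-per-doubling sensitivity is my illustrative assumption, not a figure from the EPA study):

```python
import math

def warming_f(ppm, baseline_ppm=400.0, sensitivity_f=7.0):
    """Approximate warming (deg F) over the baseline, assuming a fixed
    temperature increment for each doubling of atmospheric CO2."""
    return sensitivity_f * math.log2(ppm / baseline_ppm)

# Scenarios from the post: reduced carbon burning vs. current emission rates
for ppm in (600, 800, 1300):
    print(f"{ppm} ppm -> ~{warming_f(ppm):.1f} deg F over the 400 ppm baseline")
```

With that assumed sensitivity, the 1300 ppm scenario lands near the 12 degree F figure above and the 600 ppm scenario near the low end; the actual EPA scenarios of course involve far more than this one-line formula.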

Sea level rise comes from two sources: (a) melting of Arctic, Greenland and other ice and (b) warming and consequent expansion of the sea water. Between 1870 and now, the sea level has risen about 7.5 inches. The EPA study projects levels to rise between 1 and 4 feet (depending on the scenario) by 2100. The base case shows New York City levels rising by 2.3 feet and Galveston, Texas by 3.5 feet. Other studies give broadly comparable projections. What climate change "deniers" point out is that these projections require you to accept an exponential (not linear) effect of CO2 accumulation in the atmosphere.
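The "exponential, not linear" point is easiest to see with a quick extrapolation. A minimal sketch (the 7.5-inch and 1-to-4-foot figures come from the text above; the straight-line comparison is mine):

```python
# Historical rise: about 7.5 inches between 1870 and today (~2016)
historical_rate = 7.5 / (2016 - 1870)          # inches per year, ~0.05

# If the historical rate simply continued (linear), by 2100 we'd add only:
linear_2100 = historical_rate * (2100 - 2016)  # inches

# EPA scenario range cited above: 1 to 4 feet
projected_low_in, projected_high_in = 12, 48

print(f"Linear extrapolation to 2100: ~{linear_2100:.1f} more inches")
print(f"EPA scenario range: {projected_low_in}-{projected_high_in} inches")
# The several-fold gap between the two is the assumed acceleration.
```

In other words, the historical rate alone would give only a few more inches by 2100; the 1-to-4-foot projections rest entirely on the melting and warming accelerating.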

A new study of Antarctic ice has come up with a hypothesis that would accelerate the melting of the huge ice sheet there. Observations of Greenland's glaciers have shown destabilizations termed "hydrofracture," in which water formed by the melting of snow and ice on top of a glacier causes it to break up and collapse. If this were to occur in Antarctica, massive ice sheets would crash into the sea as cliffs 100 meters or more above sea level become unstable and collapse.

Over the course of Earth's history, there have been two warm eras when seas were much higher than they are now. In the Pliocene period, about 3 million years ago, the atmospheric CO2 level was about where it is right now and sea levels were about 30 feet higher(!). In the Eemian period, about 120,000 years ago, sea levels were 20 to 30 feet higher and the global temperature was about where it is now. Interestingly, if the entire Antarctic ice sheet were to melt, seas would rise about 50 feet.

So, these are some numbers to ponder, particularly if you live in a coastal area and look out at the sea.




Posted in Energy Industry

Surprised? Scientists knew about Greenhouse gases long ago

It was fascinating to learn that the concept of "global warming" (or cooling) caused by the presence (or absence) of certain gases in our atmosphere was discovered about two hundred years ago, as discussed in a recent article in Distillations, the magazine of the Chemical Heritage Foundation. This is how scientists came to this knowledge.

Joseph Fourier, best known for his mathematical genius, made calculations to try to determine what set the temperature of the earth. He balanced the energy coming from the sun against the outgoing energy (in infrared form) and concluded that the earth's average temperature should be around zero degrees Fahrenheit. He didn't know about the effect of atmospheric gases trapping infrared radiation. Another French scientist, Claude Pouillet, speculated that water vapor and carbon dioxide might act to do this. A British scientist, John Tyndall, in 1859 set up an experiment to measure the amount of radiant heat absorbed by various gases. He demonstrated that oxygen, nitrogen and hydrogen are transparent to infrared radiation, while water vapor, carbon dioxide and methane absorb such radiation. Tyndall speculated, perhaps concluded, that water vapor was responsible for the higher-than-calculated (by Fourier) earth temperature and therefore created the beneficial climate of our planet.

Several decades later, Svante Arrhenius took up this question and continued Fourier's calculations, now also considering carbon dioxide. He recognized that the amount of water vapor in the air varies substantially with the seasons, while the amount of carbon dioxide is relatively constant, though very slowly increasing. He calculated that a doubling of carbon dioxide could increase the earth's temperature by 11-14 degrees F. (This is in the same range as current models, which postulate a 5.5 to 9 degree F increase for a doubling of CO2.) He also concluded that historical ice ages could have come about due to a large decrease in atmospheric CO2.

Living in a cold climate, Arrhenius did not worry about a possible rise in the earth's temperature. In fact, he suggested that an increase in atmospheric CO2 would beneficially affect the colder regions of the earth, bringing about more abundant crops, etc.

So, it seems that the scientific community has long been aware of the effect of Greenhouse gases. But there was more concern about the possibility of another ice age than about the melting of glaciers and the rise of ocean levels. This is now our problem.



Posted in Chemical Industry, Energy Industry

Intelligent textiles: Another technology breakthrough

In my previous post I discussed how chemical companies are trying to cope with how technology is changing the workplace: the need to train insufficiently skilled or educated workers to use the automated controls and robotics now increasingly found in plants that must compete in a globalized world. Now a broad initiative to help the U.S. create jobs and develop technology leadership is starting to make progress in another area: the production of specialized fabrics that weave in tiny ceramic, metal and fiberglass fibers as "semiconductors, LEDs, solar cells and sensors that can see, hear, communicate, store energy, warm or cool a person or monitor a wearer's health." Clothes that include sensors and chips will then become another form of "wearable technology," joining the Apple Watch and fitness monitors.

This is still in its early stages, but is receiving strong support. MIT, the Department of Defense and a number of textile and other companies are cooperating in a public-private consortium, Advanced Functional Fabrics of America (AFFOA), to accelerate innovation in high-tech, U.S.-based manufacturing involving fibers and textiles. The developers state that fabrics made from these fibers will have the ability to see, hear and sense their surroundings; communicate; store and convert energy; monitor health; control temperature; and change their color. The consortium, with $75 million in Federal funding out of total initial funding of $317 million, will focus on developing these new technologies and training the workforce required to operate and maintain the production systems. Two dozen start-up incubators are planned at different locations.

The aim is to create an entirely new industry, based on a number of breakthroughs in fiber technology and their use in the manufacture of fabrics. With a history of losing textile manufacturing, first to the Southern U.S. and then to China and elsewhere, the government of Massachusetts, which is a partner in AFFOA, claims that the consortium will unlock new advances in military technology and support the development of new manufacturing methods, bringing new employment opportunities back to the state.

In an example of how new technology of this kind is already being applied, Inman Mills, a South Carolina company founded in 1901, has successfully transitioned from making shirting and apparel lining – a business lost to overseas competition – to making flame-resistant fabrics, with fibers ranging from silica to fiberglass. The next step is to make these fibers "smart," leading to the technologies being developed by AFFOA.

Smart textiles. Credit: Bloomberg Business News


Posted in Chemical Industry, Manufacturing

Chemical Manufacturing: Worker education a new priority

First, the good news! The U.S., perhaps surprisingly to some, has now topped China in manufacturing competitiveness, according to a Deloitte study. The other news is not necessarily bad, but is worth noting. Jobs in manufacturing have been changing rapidly as automation has proceeded in almost every industry. And that has created a conundrum: lots of new jobs are being created, but a large "skills gap" has developed, with close to half of the estimated 3-4 million new jobs expected to go unfilled unless extensive training can be provided.

An article in the May 23rd issue of Chemical & Engineering News discussed what a number of chemical companies are doing to deal with this issue, with help from SOCMA and other organizations. There are basically two problems. First, a number of older workers are retiring, taking with them a great deal of knowledge. Second, a substantial part of the pool of potential new workers is sadly lacking in the STEM knowledge and skills needed to operate the increasingly sophisticated controls and machinery being installed in both existing and new plants. And some millennials are hesitant to apply for jobs in industry.

Turning to the chemical industry, SOCMA is offering, free of charge, a worker training curriculum called Chemical Operator Training (COT) that includes some of the necessary math, chemistry and work-process skills. The course doesn't guarantee quality instruction, but it has been helpful to a number of firms. Some small and community colleges that offer an associate's degree in industrial systems technology are considering grafting COT onto this program so that participants can obtain a degree that includes operator training. SOCMA is also working on grafting COT onto a course offered by the Manufacturing Skill Standards Council, which offers a certification program called Certified Production Technician, a 160-hour course accredited by the American National Standards Institute. People involved in developing these combined programs are enthusiastic about the potential for a "wholly new career path in manufacturing," including a degree.

Still, many millennials are not used to, or happy about, the prospect of working a five-day, eight-hour-a-day job. Manufacturing jobs are often not considered attractive career paths. This may in part account for the fact that there were an estimated 600,000 unfilled jobs in 2011!

Studies have shown that people with degrees earn more than those without. So, programs that offer degrees combining STEM education with operator training for chemical jobs may be a real sweet spot as the U.S. continues to pursue industrial competitiveness.


Posted in Chemical Industry, Manufacturing

U.S. Oil Self-sufficiency: Not all it’s “fracked” up to be

The recent $30-45 per barrel price range for crude oil is likely to last for quite some time, if prognosticators are to be believed. This is partly because Iranian oil has come on the market and the Saudis show no sign of giving up market share to their sworn enemy. But it is also because it is now clear that hydraulic fracturing of shale for crude oil is uneconomical for most operators at these price levels, a situation which has already reduced U.S. oil output by about a million barrels a day, with many rigs shutting down. For those who projected that the U.S. would become the world's "swing producer," controlling the price of oil the way OPEC has for a number of decades, it has now become clear that this could only occur if oil prices get back to the $70-80 per barrel range or higher and if Saudi production starts to decline with aging wells. Then the potentially unlimited amount of U.S. shale-based crude oil – given the country's prolific shale deposits – could put the U.S. in the driver's seat. But there is no reason for oil prices to reach a sustained level substantially above the current range unless or until world demand reaches a considerably higher level and other supply sources start to decline.

Furthermore, it seems that hydraulic fracturing of shale may not be the panacea that its advocates have promoted ever since the technology started to be widely employed ten years ago. While local opposition to fracking due to groundwater contamination, poor remediation practices and noise was more anecdotal than widespread, the bad publicity was prominent, and some states, notably New York, have banned fracking altogether. Leakage of gas into the atmosphere during the fracking operation has received increasing attention, as methane is a worse Greenhouse gas than carbon dioxide, though strict regulations have been enacted to deal with this problem. Still, the growing move to reduce carbon emissions from all fuel burning (industrial and automotive) has spawned increasing negativism toward fracking as a technology, since it facilitates the continued use of hydrocarbon fuels versus renewable energies; fracking has even become an issue in the presidential campaign.

And now the opponents have a new tool to aim at the fracking industry: the very rapid growth of low level earthquakes in Oklahoma, Texas and California.

Oklahoma quakes

As these graphics show, the number of magnitude 3 and greater earthquakes in Oklahoma has risen dramatically in the last several years, parallel to the rapid increase in hydraulic fracturing of shale in that state. Studies at Southern Methodist University and the University of Texas have shown that the disposal of spent fracking water by pumping it into wells thousands of feet down has, in some cases, caused earth faults to slip, resulting in small earthquakes. Disposal and injection wells have actually been known to induce seismic activity since the 1960s, but in mostly rare cases. Now there is good correlation between fracking water disposal wells and earthquake activity in locations like the DFW Airport. The U.S. Geological Survey has said that it can turn earthquakes on and off by injecting liquid into the ground.

What we can say is this: hydraulic fracturing of shale with horizontal drilling could, at some point in the future, make the U.S. an almost unlimited producer of crude oil at prices in the $60-100 per barrel range. However, price uncertainty and growing domestic opposition to fracking make this a theoretical option at this time.




Posted in Energy Industry

Promising Carbon Capture Technology backed by ExxonMobil

Technology approaches to reducing the amount of carbon dioxide released into the atmosphere from industrial sources – principally hydrocarbon-fueled power plants – have been evaluated in some of my earlier posts. It is now generally agreed that scrubbing flue gases with alkaline liquids, followed by stripping out the carbon dioxide, is uneconomical, except for the possible case of the Skyonic technology (see post dated June 26, 2014), which makes bicarbonate of soda and hydrochloric acid useful for fracking, but has limited broad-scale application (the bicarbonate market is small, and selling the HCl requires a large local market, such as nearby large fracking installations). I have also covered the approach where power plants do not burn coal, but instead make synthesis gas from gasified coal as a technique to produce a concentrated carbon dioxide stream (the Kemper lignite-based plant described in the post dated January 2nd, 2016). This technology, which is also being applied in a grassroots Canadian plant (in both cases using large government grants), is now also deemed to be uneconomical due to very high capital investment as well as high operating costs.

But now we come to an entirely new approach to carbon dioxide capture, namely the use of a special type of fuel cell, which has recently received a strong vote of confidence from ExxonMobil Research. The concept is very interesting and is described below. What is not yet clear is whether the economics work and whether the approach is truly scalable.


This fuel cell, as built and commercially used by Fuel Cell Energy, uses a high-temperature molten carbonate salt mixture. Reformed natural gas (i.e. hydrogen) and oxygen are reacted to generate power, producing carbon dioxide and water. In a typical application, the produced carbon dioxide is recycled, but in the carbon capture and sequestration (CCS) mode, the carbon dioxide-steam mixture is chilled to about 40 degrees below zero, where the carbon dioxide becomes a liquid and is separated and stored underground. The fuel cell then needs to replace the removed carbon dioxide and captures it from the incoming flue gas from the power plant, which substitutes for the air normally pumped into the fuel cell. Importantly, the system can also strip out 70 percent of the smog-producing oxides of nitrogen present in the power plant flue gas(!).

The company has installed relatively conventional fuel cells in fifty or so locations around the world. Now, Fuel Cell Energy wants to hook its carbonate cells up to power plants. The concept (fuel cells with CCS) has been proved out in relatively small-scale cells, with part of the funding coming from a $2.5 million DOE grant. Now a very major scale-up is planned, with more meaningful funds becoming available from ExxonMobil. Vijay Swarup, vice president for research and development at ExxonMobil Research and Engineering, says that while commercial application at power plants is years away, the CCS-oriented fuel cell application "could be a game changer."

Fuel Cell Energy claims that current CCS technologies, such as that used at the Kemper plant, nearly double the cost of power. Their approach uses considerably less so-called parasitic power (a term for the percentage of a plant's power needed to run the complete capture system) and therefore provides a strong economic incentive.
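To see why parasitic power dominates the economics: if the capture system consumes a fraction p of the plant's gross output, the same fuel and plant cost is spread over only (1 - p) of the power, so the cost per salable kWh rises by 1/(1 - p). A minimal sketch (the 30% and 10% parasitic loads are my illustrative assumptions, not Fuel Cell Energy or Kemper figures):

```python
def net_cost_multiplier(parasitic_fraction):
    """Cost increase per salable kWh when a fraction of gross output
    is consumed by the capture system itself."""
    return 1.0 / (1.0 - parasitic_fraction)

# Illustrative comparison only (assumed loads, not published figures)
for label, p in [("conventional scrubbing", 0.30), ("fuel cell CCS", 0.10)]:
    print(f"{label}: cost per net kWh rises x{net_cost_multiplier(p):.2f}")
```

Note that the "nearly double" figure cited for Kemper-style CCS also reflects added capital cost, which this one-line model ignores; the point is only that a lower parasitic load translates directly into cheaper net power.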

Posted in Energy Industry