NewEnergyNews: 10/01/2010 - 11/01/2010


Gleanings from the web and the world, condensed for convenience, illustrated for enlightenment, arranged for impact...

The challenge now: To make every day Earth Day.




    Founding Editor Herman K. Trabish



  • Some details about NewEnergyNews and the man behind the curtain: Herman K. Trabish, Agua Dulce, CA., Doctor with my hands, Writer with my head, Student of New Energy and Human Experience with my heart




      A tip of the NewEnergyNews cap to Phillip Garcia for crucial assistance in the design implementation of this site. Thanks, Phillip.




    Sunday, October 31, 2010


    Funding up for grabs in Carbon Trust OWA access systems competition
    Rikki Stancich, 22 October 2010 (Wind Energy Update)

    "In the UK’s bid to install 33GW of offshore wind power by 2020, some 6,000 turbines will need to be installed in UK waters within the decade…[T]he average size of wind farm [is] expected to rise from 30-100 turbines to as many as 2,500 per site by the end of the decade…[They will be] 65km – 285km from the coast…[in a far harsher] operating environment…[than] today’s offshore wind farms, which are less than 25km from shore…

    "The UK’s Carbon Trust, via its Offshore Wind Accelerator programme, is…offering several million in funding for the best solutions for transfer systems, vessel design and launch and recovery systems…to dramatically improve the availability of turbines and the safety of people…"


    [Phil de Villiers, Manager, Offshore Wind Accelerator program:] "The Offshore Wind Accelerator programme includes eight developers who have options to develop 30 GW in UK waters…[It is focused] on four key areas: foundations and structures (which account for up to 45% of capital expenditure); electrical systems (how to minimize losses and build redundancy into layouts); wake effects (maximizing yield through layout); and access systems for operations and maintenance…[T]he original equipment manufacturers are best positioned to deliver [wind turbine] cost benefits…"

    [Phil de Villiers, Manager, Offshore Wind Accelerator program:] "Today’s [100MW, 30 turbine] wind farms are less than 20 kilometres from shore…[and] require relatively simple access systems…[R]isks are manageable…[T]he next-generation wind farms could be as far as 65 kilometres from shore…Ocean conditions are more severe and the average wave height is much higher…[W]e need a new O&M strategy. We need a manned platform or mother ship on which a crew is positioned on a permanent basis, with turbine access vessels…"


    [Phil de Villiers, Manager, Offshore Wind Accelerator program:] "...[We need] to move away from friction-based [transfer] systems to avoid operations being restricted by the sea state…[Also, it] will require different transport systems…[that] have sea bridges – a gangway that locks into place on the turbine, with hydraulics that could heave-compensate the bridge…A likely future strategy is to have a mother ship with smaller vessels launched from the mother ship for operations and maintenance…The bigger the vessel, the more stable and easier it is to transfer, but larger vessels cost more to run, they are slower, and they are more difficult…[but] the smaller vessel compromises on transfer capacity…"

    [Phil de Villiers, Manager, Offshore Wind Accelerator program:] "...[W]e’ll need to build new service vessels…We estimate over the next ten years a global spend of around £2 billion on commissioning and building mother ships, service vessels and transfer ships. Around 50% of that will be spent in UK waters…[W]e need the certification companies and industry groups to continue to inform the regulators."


    South Africa: 5GW solar park to kick off CSP deployment
    Annabel Eaton, 15 October 2010 (CSP Today)

    "An ambitious undertaking in South Africa, industrial gateway to the African continent, is the proposed creation of a R150 billion solar park, which, on completion, would have the capacity to produce approximately 5 000 MW of renewable energy…[I]f implemented, [it] will be carried out in phases, with the initial phase having a capacity of 1 000 MW. The solar park would incorporate Eskom’s 100 MW concentrated solar power (CSP) plant, which has received part funding from the World Bank.

    "…South Africa, a country fraught by rolling blackouts, has to increasingly turn towards alternative energy sources to help counteract its dire energy shortage and decrease its dependence on coal-fired power stations. The latter cannot be relied upon given the unreliability of South Africa’s future coal reserves…Furthermore carbon emissions need to be reduced in order to adhere to international green principles and combat global warming."


    "Apart from lowering carbon emissions, a particular advantage of solar power is that it can be deployed relatively quickly and incrementally. Other advantages it offers are industrial development opportunities, job-creation and portfolio risk management…A prefeasibility study…carried out by the Clinton Climate Initiative (CCI) in conjunction with South Africa’s Department of Energy…indicated that Upington in the Northern Cape, the proposed location for the solar park, is ideal…

    "…[V]arious proven solar technologies, such as concentrated photovoltaic solutions, and CSP technologies including power tower and trough technologies, are [reportedly] being examined…According to Ministry of Energy special advisor Jonathan de Vries, photovoltaic, which turns sunlight directly into electricity, would be used for peak power, while CSP, which can store heat, would be used for base-load power…[T]he first phase for the production of 1 000 MW would be built in increments from a range of solutions, and this initial phase would be used to assess the performance of the various solar technologies…"


    "…Drivers for CSP in the Northern Cape…shows that apart from attracting international manufacturers to South Africa, CSP manufacturing would create opportunities for domestic production, with resultant skills development and job creation. CSP manufacturing could become a new, local industry…Depending on investor response, site preparation for the Upington solar plant could start as early as 2011, and the first power plants could start producing by the second half of 2012.

    "This proposed initiative is but one element of the South African government’s goal to diversify its energy sources, to help overcome the country’s power supply crisis. In early October the Department of Energy released proposals, which form part of its draft integrated electricity resource plan, to reduce coal dependency by almost half by 2030…[The] draft plan suggests that coal contribute 48% to the energy mix by 2030, followed by renewable energy (16%), then nuclear (14%), and finally a range of smaller alternative energy options…[with] 52 248 MW of new capacity by 2030…"


    3TIER Releases Q3, 2010 Wind Performance Maps for Europe and North America; Winds generally up in North America and down in Europe
    October 27, 2010 (3 Tier Group)

    "3TIER®…released wind performance maps for the third quarter of 2010 covering both Europe and North America. It is the first such map 3TIER has produced for Europe after strong interest in the analysis in North America. The maps reveal that much of North America experienced higher than average wind speeds during the quarter. Meanwhile virtually all of Europe experienced normal or below normal wind speeds with the exception of the UK and other small pockets that saw significantly elevated wind speeds."

    [Kenneth Westrick, founder/CEO, 3TIER:] "The maps highlight how short-term weather patterns can significantly disrupt normal climatic expectations…While the performance maps clearly illustrate the variability of wind resources…the good news is that we have the scientific expertise and technology to account for these fluctuations, incorporate them into a project's financials, and forecast their occurrence with a considerable degree of certainty."


    "In Europe, a prolonged high-pressure system over Russia caused an extreme heat wave and depressed wind speeds. This blocking event also depressed wind speeds below their long-term averages across most of central and northern Europe. Nonetheless, isolated regions saw wind speeds 10% or more above average including the UK, southern Sweden, a band from the Balkans through Romania, and along the Mediterranean coast of France and northern Italy.

    "North America experienced a less patch-worked pattern, with wind speeds reaching 10% above average or more across a wide band from Texas through the Great Lakes into eastern Canada and the northeastern US. Likewise, most of the Intermountain region and Rocky Mountains also saw elevated wind speeds."


    [Kenneth Westrick, founder/CEO, 3TIER:] "We can assess with a high degree of accuracy what the performance of a project or region will look like over a 40-year period as it is impacted both positively and negatively by normal climatic fluctuations…Banks, developers, and financial stakeholders look to 3TIER and its sophisticated information services to provide an independent and objective quantification of the potential resource in order to understand the long-term risk, and to maximize power production and profitability."

    "3TIER generated the Q3 Wind Performance Maps by combining observations and numerical weather prediction (NWP) modeling. The map illustrates departures from the long-term mean that range from -10% to +10%, showing a pattern that is indicative of the climate state during the quarter. It provides an indication of how wind projects should have performed relative to their long-term production average based on their location."


    Taiwan Takes Its Place Among Industry Leaders
    Bettina Weiss, October 2010 (The Grid/PV Group)

    "Taiwan's high-tech industry is one of the great success stories in the world economy…According to the PV Status Report 2010 announced by the European Commission in August, PV cell production capacity in Taiwan is projected to reach 3GW in 2010, making Taiwan the 3rd largest solar cell production site in the world. Last quarter solar sales rose 95 percent to $1.1 billion…

    "…Taiwan is making its mark contributing to global PV capacity…[and] contributing its share on the supply side…Last year, the Taiwan Legislative Yuan gave its final approval to the Renewable Energy Development Act to increase solar demand and bolster the development of Taiwan’s green energy industry…[It] authorized the Government to enhance feed-in tariff incentives for the development of solar energy and other programs to enhance industrial development. Among the goals of the law was the mandate to increase Taiwan’s renewable energy generation capacity by 6.5 GW to a total of 10 GW within 20 years."


    "Since the feed-in tariff was established, Taiwan's Ministry of Economic Affairs (MOEA) had received 981 photovoltaic (PV) system project applications for feed-in tariff subsidization, with total installation capacity of 151MWp, far exceeding the target of 64MWp for 2010…Global equipment and materials suppliers from around the world have been supporting Taiwan’s aggressive ramp in solar capacity…

    "Taiwan offers a useful model for high tech and solar industry development…Hsinchu Science Park, founded in 1980 to spur technological development by offering tax breaks and other incentives to lure investors…boasts over 400 companies, including famous names such as Taiwan Semiconductor Manufacturing Co., the world's biggest chip foundry…"


    "…[T]he government has played a key role in establishing special funds for national scientific development, working out long-term plans and organization of specific responsibilities, and completing experimental centers for academic research…[It] takes an active planning role in bolstering national competitiveness through science and technology legislation…[and has] devised policies to develop important technologies based on the recommendations of the national science and technology conferences, which solicit opinions from industry, government agencies, and academic institutions…

    "The Industrial Technology Research Institute (ITRI) is a non-profit research institute located in Taiwan under the supervision of the Republic of China Ministry of Economic Affairs. It conducts R&D and other programs to advance private sector growth, spur innovation and accelerate commercialization. ITRI has over 6,000 employees and an operating budget of about US$510 million, half of which comes from private sources…[ITRI is] highly flexible in creating partnerships with the private sector…[for] technology diffusion, technical assistance, research, special projects and technology transfer…[and works] to tie R&D and technology development to cost reduction and process efficiency in the fab…"

    Saturday, October 30, 2010

    One Way To Fight Climate Change (The Enforcer)

    It may someday come to this but for now there are alternatives. (See below) From Aaron Leder via Vimeo

    Another Way To Fight Climate Change (Solar Roadtrip3)

    The solar roadtrip visits the Rockies and the Sunbelt and shows off everything from solar power plants to a solar-powered winery and a solar-powered baseball park. It’s amazing what the gifts of this good earth have to offer, isn’t it? From TheSolarIndustry via YouTube

    And Another Way To Fight Climate Change (Wind Means Hometown Jobs)

    The enemies of New Energy are attempting to mislead the public into believing that incentive money provided to create a level playing field with Old Energy is being invested in foreign economies. It is not true but a misinformed electorate could lever its political leaders into withholding the New Energy incentives, thereby forcing a tragic further downturn in a truly domestic industry. The real losers would be those in the U.S. blue collar workforce. Just check out this story of how wind saved a town deserted by Maytag. From americanwindenergy via YouTube

    Friday, October 29, 2010


    A beautiful coalition against dirty energy
    Van Jones, 28 October 2010 (Grist)

    "New polls are showing that the majority of Californians reject Proposition 23, a November ballot initiative -- funded by Texas oil companies -- that would effectively repeal the state's landmark clean energy and environmental protection laws.

    "What the polls do not show, and what few news outlets are covering, is the striking diversity of voices that are demanding clean energy, and rejecting the false notion that protecting the planet and our public health will hurt the economy…Last week, the No on Prop 23 campaign experienced a surge of support from groups that included a council of inter-faith leaders, university academics, the wealthiest man in the country (Bill Gates), an award-winning Hollywood director (James Cameron), and even President Obama and former Vice President Al Gore."


    "They join the rank and file of a coalition that is rarely witnessed in our modern age of ultra-polarizing politics…This coalition includes social justice organizations of all creeds and colors, whose missions are to empower the voices of the working class and communities of color -- including immigrants. These groups understand that less smog means less asthma, fewer trips to the emergency room, and healthier neighborhoods for their children.

    "The coalition also includes a group of investors who represent more than $421 billion in assets, much of it in the clean tech sector. They make the case that clean energy technology is the next wave of the industrial revolution, and California is poised to become a leader in innovation, job creation, and commercialization of these technologies. However, they also warn that reversing course on policy -- precisely what Prop 23 aims to achieve -- will cause investment to flow elsewhere (mainly to places like China and parts of Europe)…"


    "What's happening in California is truly amazing. Hundreds of thousands, if not millions of voices, from literally every political, ethnic, faith, and socio-economic spectrum, all pulling for the same cause. This beautiful coalition gives us a glimpse of the green path forward toward clean energy, a prosperous sustainable economy, and a healthier planet…[T]he face of politics has begun to change, slowly…The green movement has suffered setbacks…But what we see happening in California gives the green movement a reason for continued optimism…

    "…The fight has unmasked the opponents of clean energy, as well as vetted their arguments…[and] this fight has revealed our friends and allies…[H]uge swaths of Californians, from all walks of life, can find common value in supporting cleaner air and a commitment to growing the clean technology sector…The fight is far from over, and with Election Day approaching in less than a week, the stakes are higher than ever. But victory in California can give us a model for the coalition that is needed to achieve a green growth victory in Washington, D.C. and the rest of the nation."


    Algae-Based Biofuels; Demand Drivers, Policy Issues, Emerging Technologies, Key Industry Players, and Global Market Forecasts
    Mackinnon Lawrence and Clint Wheelock, 4Q 2010 (Pike Research)

    "…In the face of increasing petroleum scarcity, oil prices, oil price volatility, and greenhouse gas emissions, algae remain an untamed resource, but full of potential…[but] they demand ideal growing conditions…[A]lgae can be considered both a prolific nuisance and a fickle partner.

    "Interest in algae as an energy source began in earnest in the 1970s through the U.S. Department of Energy’s Aquatic Species Program (ASP). The effort was discontinued in 1996 due to production challenges and oil prices at $10 a barrel. However, oil prices have been climbing…A variety of startups, the forging of strategic partnerships with large multinationals, and the organization of university-based research consortiums worldwide are evidence of a mounting global response to the challenge of domesticating algae for the production of biofuels…"


    "…[Algae have]…High per-acre productivity (capable of producing 2 to 20 times more oil per acre than leading oilseed crops)…[are a] Non-food-based feedstock…[can] grow on non-arable land (e.g., deserts)…[use] a wide variety of water resources (including wastewater and seawater)…[can produce] a wide range of fuels and valuable co-products…[but] the algae-based biofuels industry is still relatively young. The industry boasts tremendous success in the lab, but has yet to produce a drop of fuel for commercial sale.

    "On paper, algae could displace petroleum altogether; however, the industry will take time to develop…By 2020, Pike Research forecasts that the production of biofuels derived from crude algae oil will reach 61 million gallons per year…and a market value of $1.3 billion…[for] a robust compound annual growth rate (CAGR) of 72%…"
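    As a sanity check on the forecast above: a 72% CAGR ending at 61 million gallons per year in 2020 implies a very small starting base. The sketch below backs that base out; the 2010 start year and ten-year horizon are my reading of the forecast, not figures stated in the excerpt.

```python
# Backing out the production base implied by Pike Research's forecast:
# 61 million gallons/year by 2020 at a 72% CAGR, assuming a ten-year
# horizon from 2010 (an assumption, not stated in the excerpt).

def implied_base(final_value: float, cagr: float, years: int) -> float:
    """Starting value implied by compound annual growth."""
    return final_value / (1.0 + cagr) ** years

base = implied_base(61e6, 0.72, 10)
print(f"{base:,.0f} gallons/year")  # on the order of 270,000 gallons/year
```

    The tiny implied base is consistent with the report's point that the industry "has yet to produce a drop of fuel for commercial sale."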


    "With 50% of all algae activity, the United States is poised to ramp up production the earliest among world markets…The first commercial-scale facilities with a potential production capacity of 1 million gallons will likely come online in the 2014 to 2016 window…[A]lgae potential is greatest in regions where there is an abundance of land, water, and sunlight…Algae’s ultimate threat is over-hype…Producing algae-based biofuels is essentially an agricultural challenge…

    "Production consists of three key technological hurdles, the costs of which must decrease significantly…First, cultivating algae biomass consistently, across regions with varying climates and environments and at scale, has yet to be demonstrated…Second, harvesting and extracting algal biomass from the slurry remains a difficult challenge…Third, techniques for extracting the oils out of the algae are mainly suitable for analytical and laboratory-scale procedures, or for the recovery/removal of high-value products. To produce algal biofuels as a competitive bulk commodity, the extraction techniques must become more efficient and effective…"


    Ride for Renewables; 100% By 2020
    Tom Weiss, October 2010 (Ride for Renewables blog)

    October 23, 2010; Riverfront Bike Path (Greening the Grid One Home at a Time)
    "…On the way, saw a home near Litchfield with a residential wind turbine and solar tracker, so I had to stop. Pedaled down a side road to get to the house and knocked on the door to ask about their setup. Turns out the owner wasn’t home, but his daughter…was clearly very knowledgeable about the system, and was happy to give me a quick tour…"


    October 24, 2010; Springfield, Illinois (Coal in my Backyard)
    "…Was keeping my eyes on the Dallman coal plant looming over the shores of Lake Springfield…Rode down one street, but it was a dead end. Another was a residential neighborhood with no lakeside access…Later did a little research on the coal plant across the water and learned that the Sierra Club had settled a lawsuit allowing the construction of a new power plant with the lowest pollution rates in the nation; the purchase of 120 MW of wind power; closure of the Lakeside power plant (the #3 dirtiest plant in the country); increasing energy efficiency funding 10-fold; and cleaning up three other coal boilers to the lowest SO2/NOx emission rates for existing boilers nationwide. Better than nothing, for sure, but we don’t need less-polluting coal plants in our communities. It’s time to move beyond coal completely…"

    October 24, 2010; Springfield, Illinois (Coal in my Backyard)
    "…Cruising up the frontage road to Interstate 55 on the way to Springfield, saw a large single wind turbine and rolled down a gravel road to get a closer look. Discovered that the Gob Nob Wind Turbine sits right on top of the former Crown coal mine! Could not have a more fitting site to be generating green energy for the people of Illinois…"


    October 25, 2010; Springfield, Illinois (Appeal to the President)
    "…Got to the Capitol grounds and found the precise location where the President gave his speech, but the grounds were closed. So filmed the video outside the gate (as best I could, with so many people coming up and asking about the trike), making a heartfelt appeal to the President to follow through on the bold and inspiring proclamations he issued in his announcement speech…"


    Your Guide to Real—and Fake—Green Products
    Kimberly Palmer, October 27, 2010 (U.S. News & World Report)

    "It turns out that many of the so-called “green” products in our homes might not be so green after all. The latest study from TerraChoice, an environmental marketing firm, found that 95 percent of consumer products make some kind of false claim about their environmental-friendliness. Even allegedly BPA-free baby toys might contain that unwanted compound.

    "The problem, TerraChoice reports [in Greenwashing Report 2010; The Sins of Greenwashing], is that anyone can slap a “green” or “all-natural” label on a product. Many consumers don’t realize how unregulated and ill-defined these labels are, and pay more for the products that have them…"

    From the Greenwashing Report 2010

    "…[H]ow can consumers navigate the world of green—and not-so-green—products? …[L]ooking for third-party approvals, such as the Green Seal, can help you separate legitimate environmental-friendliness from the fakers, as can Internet searches of products and ingredients. E-mailing the company directly when answers prove elusive is another option…The Smart Mama [can] do a lot of that research…

    "Green-washing, as TerraChoice labels it, has also spread to the world of global travel. Resorts from Thailand to South Africa promise “green” getaways in the form of organic dining, local nature trips, and carbon offsets. Whether or not the trips are actually green, especially considering that they probably require a 14-hour jumbo jet ride, depends on the details."

    From the Greenwashing Report 2010

    "One eco-tour operator in Jackson Hole, Wyoming, in partnership with the nonprofit, fuels its vehicles with biodiesel, uses only existing hiking trails in order to minimize the impact of visitors on the environment, uses washable lunch plates, and serves organic and local produce. Concerned travelers can also offset their carbon footprint further by funding wind energy, reforestation, and renewable energy projects through the nonprofit. The nonprofit’s motto is that people should "reduce what you can, offset what you can't.""

    "…[T]he government is also taking a closer look at companies that falsely use its “Energy Star” label…[and] the U.S. Federal Trade Commission might tighten its own rules about green marketing…[Meanwhile, do] research before buying into any “green” labels, because on their own, they don’t mean much."


    Hub City Inc. Innovates “Green” High-Efficiency Gearbox
    Troy McQuillen, October 25, 2010 (Hub City Inc.)

    "HERA, or High Efficiency Right Angle gear drive, is a new [concept in gear drive units] that can efficiently convert 90% of the energy required to drive the unit to usable power. This figure is staggering compared to traditional worm drive units, which convert 50% of the drive energy, or less, to usable power…

    "Worm Drive gearboxes are found in nearly every sector of the economy wherever motion is required and transferred along a production system. Such systems can include any type of conveyor, manufacturing of products for recreation, transportation, military, packaging, construction, communication, material handling, medical and food processing equipment, and hundreds of other factory situations…"


    "Because HERA delivers more torque, smaller units can be used on existing systems. Smaller gearboxes cost less. And HERA lasts longer than standard boxes, so they need to be replaced less often, which is another cost-saving feature of HERA.

    "HERA’s unique (Patent Pending) technology uses a completely re-thought concept of delivering torque while increasing the unit’s efficiency. In fact, HERA can deliver twice the torque (more drive power) with less energy. When paired with a high-efficiency electric motor, or not, HERA actually allows the motor to work less while HERA delivers more power."


    "Typically in a production system, energy coming into the system is consumed first by the motor, upwards of 26%. After that, gearbox transmissions suck up another 24% of the energy and convert it to heat. In this scenario, 50% of the energy is wasted, and much of it is converted to heat, which causes air conditioning to run more. With HERA, 90% of the energy is effective, saving customers thousands of dollars through energy conservation.

    "Hub City, along with representatives from one of their distributors, recently conducted an audit of a large beverage food processing company. The audit concluded that, if they replaced every traditional worm drive gearbox within their plant with HERA, the energy saved would be worth more than $300,000! And the cost of the conversion would be recouped in 4 months…"
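    The arithmetic behind those claims can be laid out in a few lines. This is an illustrative sketch of the figures quoted in the release, not Hub City's own analysis; in particular, treating the $300,000 figure as an annual saving is an assumption.

```python
# Energy split quoted above: motor losses ~26%, worm-gearbox losses ~24%,
# leaving ~50% of input energy usable; Hub City claims ~90% with HERA.

def usable_fraction(*loss_fractions: float) -> float:
    """Fraction of input energy left after a chain of additive losses."""
    remaining = 1.0
    for loss in loss_fractions:
        remaining -= loss
    return remaining

worm_usable = usable_fraction(0.26, 0.24)  # ~0.50 with a worm drive
hera_usable = 0.90                         # claimed overall effectiveness

# If the quoted $300,000+ saving is annual (an assumption), the stated
# 4-month payback implies a conversion cost near $100,000:
annual_savings = 300_000
implied_cost = annual_savings * 4 / 12
print(round(worm_usable, 2), implied_cost)  # 0.5 100000.0
```

    Note the release adds the motor and gearbox losses directly, which is a simplification; in a real drivetrain the losses compound multiplicatively stage by stage.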

    Thursday, October 28, 2010


    Elevated Crustal Temperatures in West Virginia: Potential for Geothermal Power
    David Blackwell, Zachary Frone, and Maria Richards, October 2010 (Southern Methodist University)

    It turns out West Virginians are wasting their time tearing down their mountains, filling up their valleys and ruining their freshwater sources in pursuit of coal because, as documented in the new study below, the state has way more potential as a geothermal energy producer.

    Geothermal development offers plenty of jobs, plenty of power and causes none of the environmental degradation associated with the mining, delivery and burning of coal.

    Bonus to geothermal development in West Virginia: There would be plenty of mountaintop real estate left to build wind projects.

    It's hard to see the downside so it will be interesting to see what excuse Massey Energy uses to squirm out of this better choice for West Virginia.

    The good news: This report heralds a huge opportunity for geothermal developers willing to take on the coal-run state. Such developers will find a population of willing workers fed-up with coal and ambitious for New Energy.


    A large area in eastern West Virginia has been found to have elevated heat flow and upper crustal temperatures compared to the rest of the eastern United States. The high heat flow has been recognized based on interpretation of bottom-hole temperature (BHT) data from oil and gas drilling in the area. The high heat flow is located within the western part of the Appalachian Mountains. There are several possible explanations for the observed thermal regime given the existing data that can only be resolved by more accurate determination of the temperature and heat flow in this area. The temperatures are high enough to make this the most attractive area for Geothermal Energy development in the eastern 1/3 of the country and the heat in place is sufficient to support large scale development of Enhanced Geothermal Systems.


    A significant area of high temperature at depth due to high heat loss from the interior of the Earth has been identified in the eastern part of West Virginia. The finding is the result of detailed mapping and interpretation of bottom-hole temperature (BHT) data from oil and gas wells conducted as part of an ongoing project to update the Geothermal Map of North America and to revise previously calculated stored heat resource values for the United States. There are several geologic situations that could be responsible for the West Virginia elevated temperatures. The data suggest that West Virginia possesses a significantly higher thermal profile than previously estimated and at such quantities and temperatures as to be capable of supporting commercial geothermal energy production.
    As a result of the new data, the previous estimate of West Virginia’s geothermal resources between depths of 3-10 km is revised upwards to 113,300 EJ, a 78% increase from the Future of Geothermal Energy Report (Tester et al., 2006) calculation of stored heat. The revised estimate of geothermal potential from this stored energy is 18,800 MWe (electrical power) at a 2% recovery factor.
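    The stated 78% increase can be checked against the earlier FGE2006 number with one line of arithmetic; the sketch below simply backs out the implied prior estimate. This is my arithmetic, not a figure from the report.

```python
# Backing out the FGE2006 stored-heat figure implied by the revision:
# 113,300 EJ is stated to be 78% above the earlier estimate.
revised_ej = 113_300
prior_ej = revised_ej / 1.78
print(f"{prior_ej:,.0f} EJ")  # roughly 63,700 EJ implied for FGE2006
```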

    Updating the Geothermal Map of North America

    In 2008, the SMU Geothermal Laboratory began a multi-year project to update the Geothermal Map of North America (GMNA) (Blackwell and Richards, 2004) and estimates of U.S. Enhanced Geothermal Systems (EGS) resources. The 2004 GMNA and thermal data set have been used by SMU and others to estimate stored heat content, crustal heat flow, and EGS resource values (Tester et al., 2006, referred to subsequently as FGE2006). EGS resources are high-temperature (>150 ºC) geologic formations where development of power systems requires the enhancement or creation of a subsurface reservoir for fluid circulation.

    The GMNA was developed from roughly 3,600 heat flow and 12,000 BHT measurements along with regional thermal conductivity models. However, large areas of the Central and Eastern United States contain few data points and have been undersampled in all previous national geothermal resource assessments. Since the previous GMNA data sets were completed, approximately 7,500 new data points have been analyzed, and more are currently being processed for this project. The data were collected from oil, gas, water, and thermal gradient wells in New York, Pennsylvania, West Virginia, Ohio, Indiana, Illinois, Kentucky, Tennessee, and Michigan. As a result of the new heat flow determinations, estimates of heat content and MWe potential for Michigan, Pennsylvania, and West Virginia are substantially increased.

    Data Sources, Manipulation, and Density

    This study focused on aggregating the BHT data and developing new thermal conductivity models for estimating crustal heat flow for portions of the Eastern and Midwestern U.S. The parameters considered include: BHT, surface temperature, heat flow, thermal conductivity of the rocks, and heat production of the rocks at depth. The depths of measurements used to update the West Virginia geothermal resource range from 600 m to as deep as 6.15 km. These temperatures have been corrected for the disturbance due to the drilling process to represent the in situ Earth temperature at the point of measurement using the Harrison correction (Harrison et al., 1983).
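The Harrison correction mentioned above adjusts a raw BHT upward or downward as a function of depth to approximate the undisturbed formation temperature. A minimal sketch of the idea, using the commonly cited quadratic form attributed to Harrison et al. (1983); the coefficients below are illustrative and should be checked against the original publication before any real use:

```python
def harrison_correction(depth_m: float) -> float:
    """Approximate BHT correction (deg C) as a quadratic in depth (m).

    Coefficients follow the commonly cited Harrison et al. (1983) form;
    treat them as illustrative, not authoritative.
    """
    return -16.512 + 0.018268 * depth_m - 2.3449e-6 * depth_m ** 2

def corrected_bht(raw_bht_c: float, depth_m: float) -> float:
    """Add the drilling-disturbance correction to a raw bottom-hole temperature."""
    return raw_bht_c + harrison_correction(depth_m)

# Shallow readings are corrected downward, mid-depth readings upward:
print(round(harrison_correction(500), 2))   # negative at shallow depth
print(round(harrison_correction(3000), 2))  # positive at typical well depths
```

The sign change with depth reflects that drilling mud warms shallow formations but cools deep ones, so raw BHTs understate true temperature at the depths of interest here.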

    Thermal conductivities have been estimated from published geologic sections and thermal conductivity information for the rock types encountered in the wells (see Joyner, 1960). Geologic sections from the AAPG Correlation of Stratigraphic Units of North America were used to generate a thermal conductivity model for the eastern United States (AAPG-COSUNA, 1994). Thermal conductivity and temperature gradient were combined to calculate updated estimates of stored heat, heat flow, temperature-at-depth, and technical geothermal MWe generating potential for these states…The mapping has identified a large area in eastern West Virginia with elevated heat flow…
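Combining a thermal conductivity model with a temperature gradient to obtain heat flow is an application of Fourier's law, q = k·(dT/dz); for a layered stratigraphic section the effective conductivity is the thickness-weighted harmonic mean. A sketch with assumed, illustrative layer values (not the study's actual COSUNA-derived model):

```python
def effective_conductivity(layers):
    """Thickness-weighted harmonic mean conductivity of a layered section.

    layers: list of (thickness_m, conductivity_W_per_mK) tuples.
    """
    total_thickness = sum(t for t, _ in layers)
    resistance = sum(t / k for t, k in layers)  # thermal resistances add in series
    return total_thickness / resistance

# Illustrative section (NOT the study's model): shale, sandstone, limestone.
layers = [(1500, 1.5), (1000, 3.0), (500, 2.8)]
k_eff = effective_conductivity(layers)      # ~1.98 W/m-K
gradient = 0.030                            # 30 deg C/km, expressed in K/m
heat_flow_mw = k_eff * gradient * 1000      # ~59.5 mW/m^2
print(round(k_eff, 2), round(heat_flow_mw, 1))
```

With this assumed section, a 30 °C/km gradient corresponds to roughly 60 mW/m², which is exactly the heat flow threshold the study uses to flag anomalous wells.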


    A total of 1,455 wells with BHT data are included in the updated West Virginia heat flow and temperature data set. Of those, 55 wells have high thermal gradients and heat flows, i.e., above 30°C/km and 60 mW/m² respectively. In comparison, the 2004 GMNA contours in West Virginia are based on only four data points. Interpretations of temperature contours are better constrained where there are multiple wells in proximity to each other (central portion of WV, Figure 1). There is significant scatter in the BHT data, even within close proximity, due to the marginal conditions under which they are measured and recorded. Working with equilibrium temperature data is the most accurate method of resolving these differences and is one of the next steps for refining the state’s geothermal assessment.

    Methodology of Temperature and Heat Calculations

    The BHT data from hydrocarbon wells are measured as part of the drilling and logging process. A plot of the corrected BHTs from the hydrocarbon wells is shown…The highest corrected temperature in the data set, 152°C at 6.15 km, was in western West Virginia. This site is located outside the area of highest estimated temperatures, but it is useful for calibrating the deeper temperatures. The solid lines on Figure 2 are individual wells with multiple BHT values, representing single-well temperature-depth curves.


    The methodology for calculating the EGS stored heat resource follows the process used in the FGE2006. In order to determine thermal energy (or “heat”) in place, calculations use lithology, thermal conductivity, gradient, and thickness of sedimentary rock (depth to basement). Temperatures were calculated for the depths of 3.5 to 9.5 km at an interval of 1 km (Figure 3). Values, averaged for 1 km intervals, were used in the recoverable resource analysis in the FGE2006 (Chapter 2 Appendix). In this study the West Virginia heat-in-place was calculated based on 1 km x 1 km x 1 km blocks centered at depths of 3.5 to 9.5 km using the same assumptions and equations shown in FGE2006 and Blackwell et al. (2006). Synthesis of the new data yields a geothermal resource estimate of 113,300 EJ of stored energy between depths of 3 and 10 km. This is an increase of 78% over the previous estimate, based on limited data, in the FGE2006.
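For a single block, the heat-in-place arithmetic reduces to Q = ρc·V·(T − T_ref). A sketch with assumed values (a volumetric heat capacity of ~2.6 MJ/m³·K, typical for crystalline rock, and a 25 °C reference temperature; FGE2006 uses its own parameter set, so these numbers are illustrative):

```python
def stored_heat_ej(block_temp_c,
                   ref_temp_c=25.0,
                   vol_heat_capacity=2.6e6,  # J/(m^3 K), assumed typical for rock
                   volume_m3=1.0e9):         # one 1 km x 1 km x 1 km block
    """Thermal energy in place for one block, in exajoules (EJ)."""
    joules = vol_heat_capacity * volume_m3 * (block_temp_c - ref_temp_c)
    return joules / 1.0e18

# A block at 175 deg C holds roughly 0.39 EJ above a 25 deg C reference:
print(round(stored_heat_ej(175.0), 2))
```

At a few tenths of an exajoule per cubic kilometer, it takes on the order of hundreds of thousands of hot blocks to reach the 113,300 EJ statewide total, which is consistent with summing over a multi-kilometer depth range across the whole state.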

    Results 1, Temperature-at-Depth

    A series of maps of calculated temperature-at-depth for West Virginia for depths of 4.5, 5.5, 6.5, and 7.5 km are shown…These maps show the increasing temperatures and the large area over which temperatures higher than 150 °C occur. In the hottest area, predicted temperatures reach 200°C at 5 km in discrete locations, and conservatively reach 175°C over a large area (about 110 km by 170 km). Several wells have been drilled in the depth range of 4.5 to 5.5 km in western and central West Virginia. This information is helpful for calibrating the temperatures to these depths across the state, determining the extent of the significant geothermal resources, and determining drilling conditions at depths similar to those needed for geothermal development.
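Temperature-at-depth estimates like these follow from the steady-state conductive relation T(z) = T₀ + (q/k)·z, ignoring radiogenic heat production and conductivity changes with depth (which the study's layered model would capture). A sketch with illustrative values chosen here, under which an anomalous 75 mW/m² heat flow reproduces temperatures near 200 °C at 5 km:

```python
def temperature_at_depth(depth_m,
                         surface_temp_c=12.0,  # assumed mean surface temperature
                         heat_flow=0.075,      # W/m^2 (75 mW/m^2, anomalous area)
                         conductivity=2.0):    # W/(m K), assumed bulk value
    """Steady-state conductive temperature: constant gradient, no heat production."""
    return surface_temp_c + (heat_flow / conductivity) * depth_m

print(round(temperature_at_depth(5000), 1))  # ~199.5 deg C at 5 km
```

The same parameters give about 162 °C at 4 km and 237 °C at 6 km, illustrating why the depth of the 150 °C isotherm is so sensitive to the inferred heat flow.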


    Results 2, Location of High Temperatures in West Virginia Thermal Area

    The area of high temperatures lies in eastern West Virginia at the edge of the Appalachian fold belt and the Appalachian Sedimentary Basin, which formed during the Appalachian orogeny 300-350 million years ago. The basin is bounded on the east by the structures of the Allegheny Mountains. The basin is deepest along the western edge of the Appalachians. The basin depths in the area of high temperatures are 5 to 7 km. Thus, the lithology of the rocks at high temperature may be both sedimentary and basement (igneous and metamorphic). The uppermost basement lithology is dominantly granite or gneiss based on a few basement samples from West Virginia, including a granite sample outside the thermal area of interest from the Rome Trough at 5,971 m depth.


    Geologic details of the thermal area are shown…with contours of temperature overlaid on a geologic cross-section located at the position shown [above]…The primary basement structural features of West Virginia are identified…: the Rome Trough (or graben, i.e. a down-dropped block); and the Upland Horst (or up-thrown block) bounded on the east by a buried normal fault (Shumaker and Wilson, 1996). These structural features underlie the west edge of the thin-skinned deformation associated with the Appalachian Valley and Ridge province. The folds are related to near-horizontal faults that translate into near-vertical faults beneath the folds, as shown near the east (right) side of the section. Along the section, the thermal anomaly seems to coincide, at least in part, with the uplifted basement block (Upland Horst) geographically related to the area east of the Rome Trough. Note that the section is highly vertically exaggerated (about 20:1), so the thermal anomaly is much broader with respect to the relief than it appears on the section.

    The high heat flow values are primarily located in the West Virginia counties of Tucker, Randolph, Pocahontas, and Greenbrier. The trend of the highest heat flow area is almost N/S. To the west and north the temperature mapping is controlled by additional BHT data to depths of at least 2-3 km. To the east, in Virginia, thermal data are limited and come from relatively shallow wells. East of the area of high temperature, wells in western Virginia up to 600 m deep can have temperature gradients as low as 10-11°C/km. Curiously, the warm springs area of the Appalachians occurs just east of the high heat flow area, in the region where the measured heat flow in shallow wells is low. Whether these lower heat flows or the warm springs are characteristic of the deeper thermal regime is not known at this time.

    Results 3, Interpretation

    The collocation of the high temperature area and the area of folding and faulting suggests that the high temperatures are geologically related. However, the nature of the relationship is not proven and there are several possible associations that have different implications on the temperature-at-depth interpretations.

    Firstly, the high heat flow areas could be associated with an underlying change in the basement (igneous/metamorphic) rock type and thus in the radioactive heat production of the crust. In this case a deep, conduction-related heat source for the thermal area is implied, with wells drilled to basement in West Virginia providing some constraint. The granite and granite gneiss lithologies encountered are typical of a high basement heat flow area, so this model is definitely a possibility. In this situation the predicted temperatures shown at depths of 5 to 10 kilometers would be valid.

    Secondly, the correlation could be due to some large-scale inhomogeneity in the thermal properties of the rock at depth that disturbs the temperature in some complicated way. In this case, the projected temperatures on Figure 3 could be somewhat high below 4 to 6 km (below the depth of the data points), but the area still would have higher temperatures than the surrounding lower heat flow areas.

    A third possibility is that there is fluid flow along the thrust faults, the steep faults, or along dipping aquifers (permeable layers) in the deformed Appalachian Valley and Ridge province (Appalachian Mountain belt). It is interesting that the anomaly lies roughly along the line of the Allegheny Front…a fundamental geologic structure that seems to localize the warm springs locations in Virginia…On the basis of the very limited data available there are no large-area thermal anomalies associated with these hot springs (Blackwell and Richards, 2004). The geochemistry of these springs also is not suggestive of high-temperature, deep circulation. If a fluid flow phenomenon of this sort gives rise to the observed thermal anomaly, then the calculated temperatures at depths of 5+ km may be overestimates. However, the minimum temperature in the moving fluid would have to be above the measured temperatures, which reach 140 °C (284 °F), definitely within the temperature range for binary power generation. Therefore the final conclusion is that, regardless of the interpretation, significantly elevated temperatures occur in the depth range of 4 to 10 km.

    Areas for Further Investigation 1, Questions to Still Consider

    The individual BHTs and estimated thermal conductivity data have a relatively high uncertainty. Because drilling makes up a significant percentage of the cost of geothermal development, the most favorable areas will be those with the highest temperatures at the shallowest depths, making favorability somewhat site specific.

    Also site specific is the rock type within the depth of interest, somewhat below the depths of the current drilling. Much of the thermal energy resides in basement rocks below the sedimentary section. Since basement is usually defined as areas of metamorphic or igneous rocks, the composition and lithology of basement is actually extremely variable.
    The West Virginia basement lithology where present is as complicated as the surface exposures. Because the generic description “granite” is often used for basement, the lithology is not exactly specified.

    Areas for Further Investigation 2, Confidence Level in Current Analysis

    The accuracy of the current data and the temperature calculations is difficult to assess. Comparisons with detailed temperature logs and coinciding geologic sections indicate about a ±10% error associated with the temperature interpretations within the depths drilled. At depths below the drill holes the extrapolated temperatures are less certain. In the areas of hydrocarbon development, wells have been drilled to depths of 3 to 6 km, so the predicted temperatures can be checked against actual measurements. Where this has been done, the agreement is within ±20 °C in the 3 to 6 km depth range, although this does not include the thermal area of interest in eastern West Virginia. This information has been compared to the calculated values with results similar to the BHT comparison.

    Areas for Further Investigation 3, Gaps in the Current Analysis

    Major gaps include low quality thermal data, potential errors in matching the thermal conductivity to the well lithology, and areas with little or no data coverage. Addressing these limitations will require measurement of equilibrium temperatures in wells in the thermal anomaly regions and specific matching of the geologic sections to the wells. In areas with low thermal data density, holes drilled specifically for heat flow might be necessary as part of the exploration stage of development.

    Areas for Further Investigation 4, Favorability for Enhanced Geothermal Systems

    The calculated heat-in-place of 113,300 EJ is the starting point for the geothermal resource base and electrical generating capacity. The favorability for EGS development depends primarily on reservoir temperature and the suitability of target formations for the generation of reservoirs. The lithology, nature of the existing pore space, stress regime, and types and orientation of fractures are features that need to be determined in the depth range of interest for development. Geothermal exploitation in West Virginia could involve the development or enhancement of fracture systems to generate high temperature (>150 °C) reservoirs in the low permeability formations. There is also the potential for low temperature (65˚C-150˚C) geothermal development at shallower depths.

    Lastly, quantification of the most favorable rock composition and structure for EGS development remains to be accomplished. Most of the experimental EGS sites have been in granite (in a strict geologic sense) because of the expected homogeneity of the rock type. In fact there may be situations where layered sedimentary rocks might be equally or more favorable as the orientations of fractures might be easier to predict and the rock types may be more extensively fractured. Tight (low porosity) sandstones are often developed by fracturing into oil and gas reservoirs. As shown on the cross-section…there are thick sandstones at depth in the Appalachian Basin.

    Recommended Areas for Future Research

    Further research on the relationship of the Appalachian Mountain belt as a geothermal structure should be completed. The edge of the Appalachian Mountain belt extends either at the surface or as a subsurface feature from the Canadian border to the Mexican border in a serpentine course that crosses the states of Alabama, Mississippi, Arkansas, and Texas in addition to the states crossed by the Appalachian Mountains. The area of the same structure buried beneath Gulf Coast sediments is a significant thermal feature of Texas, Louisiana and Mississippi (Negraru et al., 2008) that also could be amenable to geothermal development.


    This reconnaissance investigation of the thermal regime of the eastern U.S. has defined a significant thermal anomaly along the Appalachian Mountain trend in West Virginia and demonstrated that temperatures high enough for electrical power generation occur at depths greater than 4 to 5 km in large areas of eastern West Virginia. This finding opens the possibility of geothermal energy production near the heavily populated Eastern Seaboard. Further research is needed to refine estimates of the magnitude and distribution of West Virginia’s geothermal resource and to understand the cause of the high heat flow values. The presence of a large, baseload, carbon-neutral, and sustainable energy resource in West Virginia could make an important contribution to enhancing U.S. energy security and decreasing CO2 emissions.


    Senators from Both Sides Agree: Time to Rev Up Wind Energy
    Jackie Savitz, October 27, 2010 (Huffington Post)

    "…Millions of Americans cannot find work. Manufacturing plants and factories are being closed and American jobs are being shipped overseas. The federal deficit just reached $1.3 trillion…We also have an unhealthy and embarrassing addiction to dirty fossil fuels which is making us sick, polluting our environment, and sending some of the money we do have overseas.

    "…[But there] is a bipartisan bill in the Senate that would stimulate development of clean energy to replace fossil fuels, and create lasting American jobs…Senators Carper, Snowe, Brown, and Collins, have introduced an important piece of legislation to provide critical financial incentives for the investment and production of offshore wind energy. This bill extends tax credits for offshore wind facilities through 2020, sending a clear signal to companies and investors that wind has continued federal support, just like the oil and gas industry has had…"

    Power where power is needed

    "Most of the problems facing America today can be solved, or at least lessened, if we make a real commitment to renewable, sustainable, clean energy, and specifically, to offshore wind…[That] would create hundreds of thousands of permanent jobs that cannot be shipped overseas… in the fields of research & development, manufacturing, construction, installation, and maintenance…Good paying, sustainable jobs right here in America…[This will] help to reduce our debt…regain our footing in the global economy and reclaim our position as makers instead of takers.

    "The world is moving on without us. As we sit paralyzed by debates about continued drilling for oil, Europe and China are leading the way in the global renewable energy market. The longer we wait, the tougher it will be to compete…Offshore wind and other renewable energy sources like land-based wind and solar can generate enough power and electricity to permanently end our oil [and fossil fuel] addiction. Offshore wind power is clean and abundant and it won't run out. It will also never pollute the environment, make us sick, or contribute to the disastrous effects of global climate change."

    The Google-backed offshore wind transmission backbone

    "…[W]e are making some strides…Secretary Salazar signed the first lease for commercial offshore wind energy, which would make Cape Wind the first wind farm on the Outer Continental Shelf. Another handful of offshore wind projects are being considered off the coasts of Delaware, Rhode Island, New Jersey, Maine, and even Ohio, in Lake Erie. These projects need to… undergo full environmental reviews…[and] be sited in appropriate areas to minimize disturbance to wildlife…

    "Google recently announced it would help to finance a sub-sea cable network off America's mid-Atlantic coast, which will help offshore wind developers connect to the U.S. power grid…along the eastern seaboard from New Jersey to Virginia and will help bring offshore wind power to the population centers that need it…But we need to do a lot more…We need to send a strong signal to companies and investors that this growing industry is stable and will have continued federal support…We urge Congress to pass the Carper-Snowe-Brown-Collins bill as soon as possible…"

    Road to Recovery: What's Working - Wind Turbines
    Maggie Kerkman, October 27, 2010 (Fox News)

    "…The group of [high tech] workers assembling [a housing for a wind turbine engine] was hand-picked and trained in Germany by a German company called Nordex. These workers will soon be doing some of the training as Nordex USA expands in Jonesboro, where the company plans to add up to 700 jobs in the area by 2014…Workers here couldn’t be happier. They’re getting paid an average of about $17 an hour and they’re working at a state-of-the-art facility."


    "…Before being hired at Nordex, [Brad Scott] was out of work for a year and a half, after a Chinese company closed its factory and moved his job overseas…Now Scott’s back on the job, in one of the teams doing precision work on the generator housing."


    "The Jonesboro plant has been in the works for about two years. Nordex is one of a handful of European companies leading the charge for wind energy expansion in the U.S. One of the reasons the company picked Jonesboro was for its central location...According to Joe Brenner, Nordex USA’s VP of Production…[the company has] projects in the east and in the north and potential projects all over the country…[P]artnerships with the local [Jonesboro] university and a community college also helped assure future cooperative research and a trained workforce."


    "Nordex USA’s President and CEO, Ralf Sigrist, says wind energy is the future. Now he just has to convince the rest of the U.S. that what Europe has been doing for years is how America can fill part of its expanding energy needs. Wind energy, says Sigrist, has high upfront costs but low costs over time. To be profitable, wind energy companies sign long-term fixed rate deals. They may not be the cheapest energy right now, says Sigrist, but the deals could be very competitive over time given the price volatility of fossil fuels."

    CSP and CHP: A good fit?
    Jason Deign, 21 October 2010 (CSP Today)

    "…[With] expertise in combined heat and power (CHP) generation…the thermal fluid of a CSP [solar power] plant, when not being heated by the sun [such as during temporary cloud cover], could be warmed by the jacket water and back-end exhaust heat from a CHP system set to run the turbines at part or full power…

    "…[W]hether it would be worthwhile in practice would depend on relative gas and electricity export prices of the site…[It is] traditionally cost prohibitive…Whether the payback from the electricity gained will justify the cost of integrating CHP into a CSP site is currently unknown…"

    schematic of CHP and how it is an advantage

    "Certainly, [CHP for solar power plants] is not a notion that seems to have occurred to many others. Industrial groups contacted…such as Alstom and Bechtel, were not aware of activity in this area…[and Acciona Energy, a leading builder of solar power plants, has] not carried out any research in this area…

    "… Spanish CHP industry body Cogen Spain…[has] no definite study on combining cogeneration and solar thermal…[and] has not considered starting these kinds of studies…[The idea is also new] for the Combined Heat and Power Association…[but a spokesman said it] could have benefits for both industries…"

    There is no theoretical reason CHP can't be incorporated into this solar power plant schematic

    "…[A] challenge that some analysts foresee is that in a CSP-CHP setup one of the main benefits of cogeneration—being able to provide low-temperature heat for other uses—would be lost…[because solar power plants are typically] in the middle of the desert, probably a long way from residential or commercial demand for heat…[Therefore,] some developers are now looking at industrial applications for waste solar thermal heat…

    "…[O]ne developer, at least, is excited about putting CSP and CHP together…[as] a micro-CSP-powered CHP unit to provide electricity and heating for single buildings…Initially producing 1-2 kWe and 4-8 kWth, rising to 3 kWe and 12 kWth in production, the Digespo units are “based on a vision of a society which will be more multi-layered”…a prototype operating at 65% to 70% efficiency [is expected to be] in place at the Hilton Hotel in Malta next June. While not quite [a] utility-scale application…it may at least go some way towards raising awareness of the potential for using CHP alongside CSP."
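The quoted Digespo figures imply the unit's energy input and the electrical share of its output. A back-of-envelope sketch, assuming (it is not stated explicitly) that the 65-70% figure is total efficiency, electric plus thermal, of the production-scale unit:

```python
def implied_input_kw(kwe, kwth, overall_eff):
    """Energy input implied by electric + thermal output at a given overall efficiency."""
    return (kwe + kwth) / overall_eff

kwe, kwth = 3.0, 12.0                   # production-unit outputs from the article
lo = implied_input_kw(kwe, kwth, 0.70)  # ~21.4 kW input at 70% overall efficiency
hi = implied_input_kw(kwe, kwth, 0.65)  # ~23.1 kW input at 65% overall efficiency
elec_share = kwe / (kwe + kwth)         # electricity is 20% of the useful output
print(round(lo, 1), round(hi, 1), elec_share)
```

The 1:4 electric-to-thermal ratio is typical of small CHP: most of the value is delivered as heat, which is why single-building applications with on-site heat demand fit the concept better than remote utility-scale plants.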

    Ocean energy sector on the cusp of growth
    Jijo Jacob, October 20, 2010 (International Business Times)

    "The global ocean energy sector is witnessing a sharp turnaround with more than 45 wave and tidal prototypes being ocean tested in 2010 and 2011, after only a dozen were installed in 2009…[M]ore than 1.8 GW of ocean projects in 16 countries are currently in the pipeline.

    "The global ocean energy project pipeline is poised to begin scaling if these initial projects are successful, according to a new study by IHS Emerging Energy Research. [The growth]… has attracted a slew of established energy companies with renewable growth ambitions, including leading European utilities and global technology suppliers — many with hydro and offshore wind experience…[Oceans] cover more than 70 percent of the Earth's surface [and] generate two types of energy: thermal energy from the sun's heat, and mechanical energy from the tides and waves."


    "All coastal areas consistently experience two high and two low tides over a period of slightly greater than 24 hours. For those tidal differences to be harnessed into electricity, the difference between high and low tides must be at least five meters, or more than 16 feet…[T]here are only about 40 sites on the Earth with [such] tidal ranges…Boosted by government and policy support, the UK is currently the world’s leading market for ocean energy, with 300 MW of projects in the pipeline…over the next five years…

    "The UK government hopes to add 1.3 GW by 2020, driven by its need to meet legally binding 2020 renewable targets. Ireland, France, Portugal, South Korea and Australia are also key ocean energy markets and will remain the industry’s primary focus for the next decade…[T]here are no tidal power plants in the United States currently. But [DOE] says conditions are good for tidal power generation in both the Pacific Northwest and the Atlantic Northeast regions…"
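The five-meter minimum range quoted above reflects how strongly barrage output scales with tidal range: the potential energy of a basin filled at high tide goes as E = ½ρgAh², so halving the range quarters the energy. A sketch with an assumed, hypothetical 10 km² basin (not any specific project):

```python
RHO_SEAWATER = 1025.0  # kg/m^3
G = 9.81               # m/s^2

def basin_energy_tj(area_m2, tidal_range_m):
    """Potential energy (TJ) released by emptying a basin filled to high tide."""
    return 0.5 * RHO_SEAWATER * G * area_m2 * tidal_range_m ** 2 / 1.0e12

AREA = 10e6  # hypothetical 10 km^2 basin
print(round(basin_energy_tj(AREA, 5.0), 2))  # ~1.26 TJ per emptying
# Quadratic scaling: a 2.5 m range yields only a quarter of that energy.
print(round(basin_energy_tj(AREA, 2.5), 2))
```

Spread over the roughly six hours between tides, 1.26 TJ corresponds to an average of about 56 MW before conversion losses, which shows why only large-range sites are commercially interesting.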


    "Wave power devices extract energy directly from surface waves or from pressure fluctuations below the surface. There is enough energy in the ocean waves to provide up to 2 terawatts of electricity (a terawatt is equal to a trillion watts)… Wave-power rich areas of the world include the western coasts of Scotland, northern Canada, southern Africa, Australia, and the northeastern and northwestern coasts of the United States…[T]he Pacific Northwest…could produce 40–70 kilowatts (kW) per meter (3.3 feet) of western coastline. The West Coast of the United States is more than 1,000 miles long...
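The per-meter figures above multiply directly into an aggregate resource estimate. A sketch using the article's 40-70 kW/m and a rough 1,000-mile coastline length; this is gross wave-energy flux, and installed capacity would be far lower after capacity factors, device spacing, and siting constraints:

```python
MILE_M = 1609.34  # meters per statute mile

def coastline_power_gw(kw_per_m, coastline_miles):
    """Gross wave-energy flux across a coastline length, in GW."""
    return kw_per_m * 1e3 * coastline_miles * MILE_M / 1e9

low = coastline_power_gw(40, 1000)   # ~64 GW at the low end of the range
high = coastline_power_gw(70, 1000)  # ~113 GW at the high end
print(round(low, 1), round(high, 1))
```

Even the low end is a substantial fraction of U.S. average electrical demand, which is the point the article is making about the scale of the resource.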

    "Of the various forms of ocean energy, tidal energy is poised to mature first, with the promise of providing predictable, lower-cost electricity and a standard design…Tidal is attracting major original equipment manufacturers (OEMs)…"

    Wednesday, October 27, 2010


    Smart Grid: Ten Trends to Watch in 2011 and Beyond
    Ben Gohn and Clint Wheelock, 4Q 2010 (Pike Research)

    From the Stuxnet worm to “the Bakersfield effect” to HAN, the Smart Grid has begun to spin off its own slanguage. The resultant opacity only makes the Smart Grid seem more threatening (which increases the Bakersfield effect of consumer rejection).

    The short report below sorts out some of the complexity (and reassures readers that there are ways to control the Stuxnet worm, which slipped past internet security on a memory stick and seemed to attack the U.S. power grid).

    The report is not easy because it is a primer and update about something that is not easy. Ultimately, it is reassuring to know the digital village has a security force working to protect HANs (home area networks).

    The report can be eased into, level by level. Start with the graphs showing how long until Smart Grid is everywhere. Pretty quickly, acronyms like SCADA (supervisory control and data acquisition), AMI (advanced metering infrastructure) and DG (distributed generation) will be as easy to recognize as LOL. 8-)


    During the last few months of this decade, the electric utility industry has experienced a momentous season. The business of electricity generation, transmission, distribution, and consumption has been thrust to the forefront of public discourse – as both a villain and savior – in the fight against climate change and the struggle for energy independence and security, among other global priorities.

    The “smart grid,” the integration of new embedded computing and communications technologies into the fabric of the power network, is widely seen as the means to adapting our electrical infrastructure to meet these global needs. Basic economic justifications for technical advancements have been enhanced (or distorted) by sweeping regulatory mandates and large national economic stimulus spending plans. The reality of the smart grid is coming into focus, too slowly for some, but at a faster pace than typically seen in this industry. Existing players are transforming, new players are entering, and consumers are awakening.

    The months ahead should witness the maturation of the smart grid as all the trials, mandates, and pilots move toward production deployment. There are dozens of trends that bear watching and scrutiny. In the following section, Pike Research focuses on ten such trends that will be most influential in the emerging smart grid sector.


    Ten Trends

    Security Will Become the Top Smart Grid Concern

    Grid security has always been an industry concern, though usually one that lingers in the background. The infamous smart meter hacking demonstration at the 2009 Black Hat conference may not have broken any new technical ground among metering vendors, but it did raise cyber security awareness within the smart grid community. However, once metering vendors demonstrated reasonable solutions, the sense of alarm quickly passed.

    If anyone in the smart grid community still has a sense of cyber security peace and serenity after the summer of 2010, they need to check their pulse. The Stuxnet worm, discovered in July 2010, awakened the industry to the tangible and very complex threats to the supervisory control and data acquisition (SCADA) systems that run today’s “semi-smart” grid and are poised to take a central position in a fully integrated and interconnected “really smart” grid.

    Stuxnet is a relatively silent worm that specifically targets and embeds itself into SCADA systems, providing a potential means to wreak havoc. It blasted through many of the axioms that allowed utility managers to sleep at night:

     “My SCADA system is safe because it is not connected to the Internet” – Stuxnet apparently entered via USB memory sticks, perhaps distributed at your favorite smart grid conference.

     “I keep my SCADA Windows Machine updated with the latest security patches and antivirus protection” – Stuxnet exploited zero-day vulnerabilities in Microsoft Windows and avoided detection by the best protection software. Stuxnet existed for months (years?) before detection. Moreover, many SCADA controllers are not managed as part of the normal enterprise IT network and are NOT kept up to date with almost daily security patches.

     “At least the threats are limited to my Windows-based management consoles” – Stuxnet not only infected Windows machines, but also aimed to infect the SCADA Programmable Logic Controllers in the field.

    The technical analysis on Stuxnet continues, and it appears to be a very sophisticated attack not aimed at the electrical infrastructure. But if nothing else, the threats security experts have been warning of for years have now moved from theory to reality. Since the industry is taking greater notice, especially regulators and government (including the U.S. Congress), utilities will need to determine what cyber security measures are required – even as standards and regulations are still evolving.

    On the standards front, the recently released (September 2010) National Institute of Standards and Technology (NIST) "Guidelines for Smart Grid Cyber Security," at three volumes and 537 pages, is a testament to both the unprecedented industry efforts to establish clear smart grid security guidelines and the incredible complexity and difficulty in doing so. The document has already become a bit of a lightning rod for criticism, which is in itself a productive outcome.

    The North American Electric Reliability Corporation (NERC) CIP (Critical Infrastructure Protection) specifications, which have thus far been the closest thing to a general security specification for utilities – much to the chagrin of serious security experts – are being extensively revised. More importantly, utilities that have treated these as a nuisance paperwork exercise – yielding such silliness as almost all assets being declared “non-critical” – will be increasingly pressured to use these imperfect tools to actually assess and correct their vulnerabilities, lest they risk a starring role in the cyber equivalent of the BP gulf oil spill.

    Serious investment will be required – even if the actual solutions and standards remain cloudy. Pike Research forecasts that worldwide smart grid cyber security spending will reach over $1.7 billion in 2013, from approximately $700 million in 2010 (see Chart 2.1). Regulators should be primed to welcome and approve such investments, even in the face of an increasingly restive consumer community.
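Those figures imply a steep trajectory; a quick back-of-the-envelope computation (a sketch using the rounded numbers cited above, not part of the Pike Research report itself) gives the implied compound annual growth rate:

```python
# Implied compound annual growth rate (CAGR) for the cited forecast:
# roughly $700 million in 2010 growing to about $1.7 billion in 2013.
spend_2010 = 700e6   # approximate 2010 worldwide spending, USD
spend_2013 = 1.7e9   # forecast 2013 worldwide spending, USD
years = 2013 - 2010  # forecast horizon

cagr = (spend_2013 / spend_2010) ** (1 / years) - 1
print(f"Implied CAGR: {cagr:.1%}")  # roughly 34% per year
```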


    Distribution Automation Will Rival AMI as the Most Visible Smart Grid Application

    For the small percentage of the general public that might recognize the term, “smart grid” often means “smart meters.” While understandable for the public at large, this conflation has also been adopted by many in the industry, since advanced metering infrastructure (AMI) has captured the most mindshare within the smart grid discussion. Yet this emphasis is changing as distribution automation (DA), spurred by the threat and opportunity of plug-in electric vehicles (PEVs) and distributed generation (DG), moves from the “toy” stage to the earliest phases of actual commercial adoption.

    The technological word association with the term “distribution automation” is likely to evolve as the industry confronts the realities of transforming the distribution network from a one-way to a multi-way power network. Once limited to simple remote control of field-based switches and sensors (which has never been simple), with the primary aim of increasing reliability, distribution automation technologies are beginning to encompass the demand response capacity of conservation voltage reduction (CVR) and the energy management possibilities of dynamic load distribution.

    Various demonstration projects and leading production deployments, such as at Progress Energy, will accelerate industry discussion. Duke Energy’s efforts to foster a smarter distribution network by encouraging the development of generalized computing and communications “platforms” deep within the distribution network, as embodied by Echelon’s recently launched ECoS and similar products, are likely to yield new innovations that will address the multi-way distribution system. The final outcomes may not be clear, but more industry conversation will arise even as traditional distribution automation technologies increase their market penetration and integration into a broader smart grid.

    Distribution automation projects, whether they target leading-edge applications or traditional reliability improvements, have another important attribute. They have the potential to deliver tangible benefits without requiring intensive consumer engagement or behavior change.


    The “Bakersfield Effect” Will Continue, but Some Consumers Will Actually LIKE the Smart Grid

    Smart grid history will likely remember 2010 as the year of the “Bakersfield Effect.” This expression refers to the birthplace of loud consumer pushback on smart meters, which were blamed for dramatically higher electricity bills experienced by Pacific Gas & Electric (PG&E) customers in the summer of 2009. Similar problems reported by Oncor’s customers in Texas added fuel to the fire. It has taken months for independent investigations and testing to confirm what industry insiders already knew: there were no major technology issues. However, there was a potentially huge disconnect regarding customer communications, relationships, and expectations.

    The Bakersfield Effect is having a larger impact on smart grid development than most imagined. Consider the following:

     For many consumers across the United States, their introduction to the smart grid has come from national news stories chronicling problems in California, whether substantiated or not. These stories feed an already well-cultivated antagonism toward their local utility.

     The uproar has spotlighted whether the benefits of smart meters (and the smart grid by association) are big enough, accessible enough, and near enough to warrant the extra costs being passed on to consumers. A white paper published in September 2010 by leading consumer groups (National Association of State Utility Consumer Advocates [NASUCA], the National Consumer Law Center, Public Citizen, Consumers Union, and AARP) best summarizes the concerns. The industry has been surprisingly ill prepared to answer questions, especially from the consumer’s perspective.

     Fringe groups, sometimes derisively called the “tin foil hat crowd,” are questioning the health effects of RF-based AMI systems. Normally, these claims would be easily dismissed, but the backlash atmosphere has spooked a few municipalities into banning smart meter deployments within their borders.

     Regulators and especially politicians, aware of the ruckus, are responding with a mix of rational questioning and irrational opportunism. Meanwhile, many utilities have reacted with a mix of surprise and cluelessness. The Maryland Public Service Commission’s (PSC) initial rejection of Baltimore Gas and Electric’s (BGE) smart meter plan, despite a $200 million American Recovery and Reinvestment Act (ARRA) federal stimulus commitment, rocked the industry. A revised plan was ultimately approved, but with significant strings attached.

    Utilities and their regulators, contemplating their own smart grid programs, are taking long, hard looks at the experiences of PG&E, Oncor, and BGE. Pike Research believes this scrutiny will have both positive and negative effects on current and future smart grid deployments, including the following:

     The industry is placing much greater emphasis on consumer communications, including benefits to be expected, as well as specific logistics around local deployment plans. This should be a positive development toward treating consumers as true customers rather than grid load points.

     Regulators are increasingly requiring specific articulation of consumer benefits within proposed smart grid programs, delivered earlier within the deployment timelines. This may be disruptive in the near term, but should create stronger programs in the long term with better consumer acceptance.

     Some regulators, faced with restive consumer groups, may be increasingly afraid of allowing utilities to expose true time-based electricity costs to consumers, potentially blunting the conservation and peak-shifting benefits sought by smart meter deployments. Such reticence could significantly delay the growth of home area network (HAN) and other consumer-facing technologies.

     Some of the messaging toward consumers around smart metering and the smart grid in general will go beyond energy efficiency and peak-shifting, which depends on consumer behavior changes, toward other benefits such as PEV support and distributed generation.

    These applications have clearer, if more futuristic, perceptions among consumers. Even as the Bakersfield Effect continues to reverberate, consumers in Texas and other leading deployments and large-scale pilots will begin to use the pricing programs and energy management technologies made available through smart metering. Although it probably will not be widely reported in the popular press, many of these consumers will actually like these offerings. Certified in-home devices are just now becoming available in Texas and are viewed as essential customer service tools by energy retailers there.
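The peak-shifting incentive behind such time-based pricing programs can be made concrete with a small sketch. The tariff below is entirely hypothetical; the rates and usage figures are invented for illustration and are not drawn from any actual Texas or California program:

```python
# Hypothetical time-of-use (TOU) tariff vs. a flat rate. All numbers
# are illustrative assumptions, not real tariff data.
PEAK_RATE = 0.30      # $/kWh during assumed peak hours
OFF_PEAK_RATE = 0.08  # $/kWh otherwise
FLAT_RATE = 0.14      # $/kWh flat-rate comparison

def tou_bill(peak_kwh, off_peak_kwh):
    """Monthly bill under the hypothetical TOU tariff."""
    return peak_kwh * PEAK_RATE + off_peak_kwh * OFF_PEAK_RATE

# A household using 900 kWh/month, before and after shifting 150 kWh
# of peak usage (e.g., laundry, EV charging) into off-peak hours.
before = tou_bill(peak_kwh=300, off_peak_kwh=600)  # $138.00
after = tou_bill(peak_kwh=150, off_peak_kwh=750)   # $105.00
flat = 900 * FLAT_RATE                             # $126.00
```

Shifting usage pushes the hypothetical TOU bill below the flat-rate baseline, which is precisely the behavior change such programs aim to reward.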

    Positive feedback from consumers, combined with quantifiable energy efficiencies by the utilities, should eventually help overcome the more irrational consumer fears related to the smart grid.


    Smart Meter and AMI Focus Will Shift Toward Europe and China

    Although Europe got an early start in deploying smart meters, most of the recent attention has been on North America, with Texas, California, and Ontario leading the way.

    In 2011, attention will begin to shift back toward Europe as major programs in the United Kingdom, France, and Spain, totaling up to 100 million meters in aggregate, move toward deployment.

    The U.K. program is notable for the desire to leverage a common communications infrastructure to support multiple retailers (which are responsible for the meters) across different distribution system operators, for both electricity and gas. There is also a stated desire on the part of regulators to mandate the provision of in-home displays for consumers. British Gas has already announced plans for a multi-million meter deployment in partnership with Vodafone using cellular infrastructure to the meters and ZigBee within the homes. Other communications infrastructure vendors are pushing for different to-the-meter mandates, including Arqiva’s effort to leverage its nationwide tower network, together with Sensus radio equipment. The regulatory confusion should clear up in 2011, paving the way for broader deployments across the United Kingdom.

    French regulators have given their blessing to the “Linky” smart meters being piloted by EDF distribution subsidiary ERDF. Three different pilots of 100,000 meters each, using equipment from Itron, Landis+Gyr, and Iskraemeco, are unique in that they all conform to an ERDF-specified interoperability standard. Thus, these projects have effectively created one of the only multi-vendor interoperable smart meter systems. The opportunity for vendors is large (35 million meters), and the multi-vendor collaboration has spawned the Interoperable Device Interface Specifications (IDIS) industry association that aims to enable a true, open, multi-vendor interoperability option.

    As large as these European deployments are, they pale in comparison to China’s smart metering deployment plans, which will come into focus in 2011. China’s State Grid Corporation has thus far focused on large-scale, high-voltage transmission system buildouts. However, it has also quietly been working on specific smart metering standards (mostly PLC-based), and initial tenders across a number of provincial utilities already total over 40 million meters. Ultimately, plans for over 700 million smart meters across China by 2020 are being discussed, dwarfing the plans of any other region. Whatever the timeline, the vast quantities involved will certainly focus the industry’s attention on smart meters. Moreover, with a standards regime seemingly favoring indigenous smart meter manufacturers, Chinese vendors are likely to become even stronger competitors across the globe.


    The “Year of the HAN” Will Not Arrive … Yet

    As the first advanced smart meters with built-in home area network (HAN) interfaces were installed 2 to 3 years ago, visions of a robust in-home device market, including displays, thermostats, and smart appliances, took hold. Consumer retailers such as Home Depot and Best Buy would stock their shelves with devices that would connect to the meter, and eager consumers would snap these up to save on their electricity costs through time-based rates. All that was required was certification of some HAN standards and a critical mass of installed smart meters.

    The first requirement seemed to be set in mid-2008 thanks to the initial certification of the ZigBee Smart Energy Profile (SEP) 1.0, an imperfect but workable HAN device standard.

    In 2010, the installed base of advanced smart meters reached multiple millions across Texas and California, setting up what home device startup companies hoped would be the “year of the HAN.” Alas, this has not come to pass, nor is it likely to in the near future.

    Several factors have combined to delay and perhaps even reroute HAN momentum. First, planned enhancements to the ZigBee Smart Energy Profile grew dramatically when U.S. standards efforts, accelerated by the ARRA stimulus funding, pressured the ZigBee community to adopt the Internet Protocol (IP). End-to-end IP networking is theoretically a good thing, but this effectively tossed out a mature, proven mesh networking standard for something still in development. Thus, at least a year was added to the approval of an updated profile, now expected in the first half of 2011. While deployed smart meters can be remotely updated with new ZigBee firmware, most in-home devices cannot, causing many utilities to wait until the new profile is proven before contemplating large HAN device deployments.

    More importantly, the Bakersfield Effect consumer backlash has made utilities cautious about launching programs aimed at “changing consumer behavior,” which is what HAN technology-enabled time-based pricing is all about. HAN vendors had hoped that the cycle of utility pilots aimed at proving HAN technology was nearing an end. However, at the behest of regulators, these pilots are being replaced by new pilots aimed at measuring consumer acceptance. Consumer groups are asking whether the relatively complex HAN technology, including individually authenticated secure in-home devices controlled across the utility AMI network, is justified. They are comparing the technology to lightweight price signaling methods, such as those that have been used for years via one-way pagers or other networks. For device makers, it is distressingly late in the game to be asking such fundamental questions.

    One bright spot is the deregulated state of Texas, where ZigBee SEP 1.0 devices are finally passing local test certifications. These devices are being made available to consumers through a number of energy retailers that are eager to provide differentiated services. Consumer response to these devices will be a leading indicator of the ultimate success of the original HAN model.

    In the meantime, impatient energy management device vendors are adapting to partners and channels that avoid the utility or the need for advanced smart meters. Broadband and wireless telecom companies, home security firms, and traditional home automation vendors are all including home energy management in their solution portfolios, with the potential of supporting time-based pricing and demand response programs as utilities make them available. These different takes on the “HAN” will supplement, and perhaps even supplant, the original vision of utility-managed, AMI-connected HANs. It will take some time to see what choices consumers will make. While 2011 will provide more data, the issue is unlikely to be definitively settled any time soon. Chart 2.5 depicts Pike Research’s worldwide forecast for meter-connected HAN devices.


    The Demand Response Business Transformation Will Accelerate

    The curtailment service provider (CSP) business model, as pioneered by leaders such as EnerNOC, has been an interesting and important path toward the more efficient use of grid resources. However, just as personal digital assistants (PDAs) and cell phones are no longer separate devices, Pike Research believes that demand response (DR) will become an application within a company’s broader energy management platform or service, transforming today’s pure-play CSP market.

    Some CSPs, including EnerNOC, are already moving to offer a broader suite of energy management services. Vendors in adjacent markets will target demand response directly or in partnership with local utilities using their own products or via acquisition of smaller high-tech CSPs. Companies in adjacent markets, including building and energy management systems (BEMS/EMS), IT, and communications, are already touching customers who demand energy efficiency products, solutions, and services. These companies can therefore capitalize on existing customer relationships while leveraging the smart grid technology investments being made by utilities (such as AMI).

    Most influential in spurring this shift from traditional load curtailment and peak load shifting will be the entrance of BEMS, EMS, and IT players that currently do not offer curtailment as a standalone service. BEMS from Johnson Controls, Honeywell, and Siemens will continue to provide the electro-mechanical building optimization services they have historically offered, with a more acute focus on energy management.

    Additionally, these large OEMs may partner with or acquire specialized energy efficiency technology companies with deep analytical abilities (e.g., Optimum Energy).

    The entrance of EMS players that do not currently provide clients with standalone DR will also shake up the industry. Firms like Minneapolis-based Verisae will provide both a software-as-a-service (SaaS) and a software play – thereby increasing visibility to the building (or a portfolio of buildings) and enhancing building optimization. For example, Verisae will provide end users with an EMS (of which DR is a component), asset management, and carbon management – all on the same backbone.

    Additionally, the major IT companies, notably Cisco and IBM, are currently analyzing the EMS space. They will significantly disrupt the traditional DR market – providing devices that will be deployed on an open network that is integrated with a BEMS. This will increase the amount of granular energy consumption data and thus expand the opportunity for DR services. The powerful combination of Cisco’s EnergyWise switching technology with IBM’s Tivoli software enables the management of the consumption of IP-compliant devices on the network. Such integration offers the ability to manage BEMS assets (HVAC, lighting, and security) and provide real-time-pricing (RTP) demand response.

    The benefits from this shift should be large, as commercial buildings represent a significant portion of overall electricity consumption but have yet to fully leverage available technology to manage that consumption. This transition will help market players access the low-hanging fruit that demand response represents.


    The ARRA Smart Grid “Stimulus” Will Finally Have a Positive Impact

    The American Recovery and Reinvestment Act of 2009 included two major components aimed at accelerating smart grid deployment: $3.4 billion for the Smart Grid Investment Grant (SGIG) program and $615 million for smart grid demonstration projects. As might have been expected, the road from idea to implementation has been full of unexpected detours, with many in the industry wondering how much this plan for massive spending actually helped.

    In 2008, before the world economy slipped toward “the great recession,” various smart grid projects were already underway. In the United States, spurred by the Energy Independence and Security Act (EISA) of 2007 and state regulations, utilities in Texas and California were moving from smart meter pilots toward full deployment, with many other utilities watching closely. Canada, specifically Ontario, was already in full deployment mode. The industry was positioning itself for an unprecedented period of growth. While the uncertainty caused by the financial collapse of late 2008 certainly clouded these growth prospects, concerns were generally limited to how much this might delay planned deployments.

    As the U.S. stimulus package took shape at the start of 2009, and numbers as high as $32 billion aimed at the smart grid were being discussed, the industry cheered. However, it also froze. Utilities on the verge of launching or deploying smart grid programs shifted all their energy toward securing a piece of the stimulus pie, preparing to submit proposals by the August deadline and then waiting for the winners to be announced in November.

    Thus, 2009 was a lost year for most grid vendors. It may be impossible to fully separate the impact of the SGIG freeze from the overall effect of the economic meltdown, but it is safe to say few in the industry felt their business was “stimulated.” Yet, all expected to see 2010 make up for the loss as the SGIG money began to flow.

    SGIG awards were announced in November 2009; however, the flow of money was held up as the strings attached to the grants were debated and negotiated. Tax questions, “Buy American” restrictions, reporting requirements, asset ownership issues, and security requirements had some grant winners quietly grumbling they might have to turn down their awards due to the associated restrictions. By summer, most of these issues had been resolved, contracts began to be awarded, and vendor selections were announced.

    As such, 2011 and 2012 are shaping up to finally realize the acceleration promised by the SGIG program. In addition, those projects that did not win SGIG awards are now stabilizing, and in many cases moving faster, after perhaps experiencing some extra scrutiny from their regulators.

    Long term, other related aspects of the stimulus are likely to show even greater positive impact. The acceleration of standards development led by NIST and supported by a technical community animated by the promise of the SGIG money continues to reshape segments of the industry. Moreover, the various demonstration projects that go beyond smart metering should de-risk more advanced smart grid deployments, including renewable generation integration, distribution automation, and microgrids.

    Starting in late 2011, Pike Research expects industry discussion to shift from “plans” toward “results,” and this transition will finally offer a tangible “stimulus” for the industry.


    The Standards “Horse” Will Begin to Catch the Deployment “Cart”

    One rational criticism of worldwide smart grid stimulus efforts has been that deployments are proceeding before good standards have been fully defined. This is especially true of AMI deployments, where smart meters with a divergent range of communications capabilities are being deployed without clarity on all the potential use cases. Others respond that the alternative is “analysis paralysis” and that existing well-defined use cases more than justify the current investments.

    In any case, smart grid standards are beginning to catch up to deployments. There may be too many parallel standards efforts underway around the world, but this still represents a dramatic improvement over the existing vendor-specific application silos. In the United States, NIST’s progress has been impressive, especially compared to the usual pace of standards development, and similar progress can be argued for Europe.

    There are many domains that are benefiting from standards acceleration, but perhaps none is more important than the widespread adoption of the Internet Protocol within the grid communications infrastructure. The use of IP may not be the magical panacea that proponents, mostly from the telecom industry, might imply. However, it is the key enabler of an integrated and layered networking infrastructure that will smooth the implementation of the advanced AMI, distribution automation, and substation automation applications at the heart of the smart grid.

    The coming year or so will see all the talk about IP adoption show up in real, production-ready smart grid products. Pike Research expects to see Itron’s IP-based upgrade to the OpenWay smart meter system, perhaps enhanced with the “special sauce” implied by its partnership with Cisco. The new-and-improved ZigBee Smart Energy Profile and IP-based stack will be finalized, and perhaps deployed within the millions of meters installed throughout California and elsewhere. IP-based implementations over PLC networks in Europe should also see some deployment action. And perhaps the distributed “platforms” promoted by Echelon, SmartSynch, and Ambient will start hosting third-party applications for enhanced distribution automation functions.

    Standards are nice on paper, but are not real until multi-vendor interoperability – or at least cooperation – is deployable in the field. Pike Research anticipates that in 2011, those that have invested in standards will be rewarded and the laggards will be punished.


    Data Management Will Be the Next Bottleneck to Smart Grid Benefits

    Much has been said about the “data tsunami” faced by utilities deploying AMI systems and other smart grid technologies as they move from monthly meter reads (or as infrequent as annually in parts of Europe) toward readings every 15 minutes. The focus has been on handling this vast amount of data, but the challenge of deriving useful information from this mountain of data may prove even more daunting.
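The scale of that jump is easy to quantify: moving from one read per month to a read every 15 minutes multiplies the annual reading count per meter by nearly 3,000. A quick sketch (the one-million-meter fleet size is an illustrative assumption, not a figure from the text):

```python
# Readings per meter per year: monthly reads vs. 15-minute intervals.
READS_MONTHLY = 12          # one read per month
READS_15MIN = 4 * 24 * 365  # four reads per hour, every day of the year

growth_factor = READS_15MIN / READS_MONTHLY
print(READS_15MIN)    # 35,040 readings per meter per year
print(growth_factor)  # 2,920x more readings than monthly billing reads

# Illustrative utility-scale volume for a hypothetical 1M-meter fleet.
meters = 1_000_000
annual_readings = meters * READS_15MIN  # ~35 billion readings per year
```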

    The smart grid is driving utility back-end IT systems and applications toward an even more radical transformation than the communications infrastructure. Anyone who has experienced the internal rollout of a new enterprise resource planning (ERP) system knows that far more than just learning new software interfaces is involved. If done correctly, such a rollout requires a substantial reengineering of most business processes, and even of the company structure itself. New smart grid applications will have the same effect within utilities of all sizes and types.

    Different utilities find themselves at dramatically different levels of preparedness for this transformation. Many are still patching together home-grown siloed applications, each with its own proprietary interfaces and databases for asset management, workforce management, outage management, meter management, billing, and customer relationships. Others may have already invested in some integrated applications, either through a single vendor or with the assistance of an IT systems integrator using service oriented architecture (SOA) or other standard architectures.

    However, few of these may be prepared for the business process changes that are inevitably required when smart metering systems (often “owned” by the customer/billing organization) start spewing real-time data to the outage management systems (“owned” by operations) and provide critical data for distribution network planning (“owned” by engineering). Even more challenging is linking to external third parties.

    As a case in point, some of the utilities leading the charge in California are wrestling with the provision of “real-time” metering data to third parties such as Google and Microsoft at customer request, as required by new California regulations. Such provision seems simple given the new AMI systems that are coming online, but ensuring the ability of the back-end systems to gather, process, clean, and distribute the information, modulated by customer permissions, in a timely way (hours) is challenging.
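A minimal sketch of the permission-gating step involved, with an invented data model (the field names, meter IDs, and third-party names are all hypothetical, not taken from any actual utility system):

```python
from dataclasses import dataclass

@dataclass
class Reading:
    meter_id: str  # identifies the customer's meter (hypothetical ID scheme)
    kwh: float     # interval consumption

# Consent registry: which third parties each customer has authorized.
consents = {
    "meter-001": {"ThirdPartyA"},
    "meter-002": set(),  # this customer authorized no sharing
}

def release(readings, third_party):
    """Return only readings whose customer authorized this third party."""
    return [r for r in readings if third_party in consents.get(r.meter_id, set())]

batch = [Reading("meter-001", 1.2), Reading("meter-002", 0.9)]
shared = release(batch, "ThirdPartyA")
print([r.meter_id for r in shared])  # ['meter-001']
```

The real back-end challenge described above sits around this filter: gathering, cleaning, and delivering the data within hours, at utility scale.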

    Pike Research anticipates that the ability of utilities that are currently deploying smart grid systems (particularly AMI) to successfully work through these challenges will vary.

    Most will find these issues to be the next significant bottleneck toward exploiting the full potential of the smart grid.

    Existing Data and Telecom Vendors Will Get Serious About the Smart Grid

    Recent months have witnessed many grand announcements from players outside the traditional utility industry touting the opportunities and high-level strategies surrounding the smart grid. Industry giants, including Cisco, Intel, Verizon, AT&T, Vodafone, and Microsoft, have all launched major smart grid initiatives. A few actual product announcements have followed, but the reality behind these broad visions has remained cloaked in mystery.

    In 2011, the rubber should meet the road for these vendors. Telecom services vendors will find out if their revamped pricing and services plans in the United States will be enough to entice utilities to abandon their “build private networks where I can, use public networks where I must” mantra. Vodafone will learn whether its smart meter partnership with British Gas will sway the U.K. regulators to include cellular technology within the envisioned centralized “Data Communications Company” (DCC) for AMI infrastructure. Cisco, while delivering some well-considered products for substations and home-area trials, has yet to demonstrate any game-changing technology, especially for the home, that some utilities hope for and other vendors fear.

    Pike Research expects additional partnerships, such as the one announced between Itron and Cisco, to emerge. However, these will be pressured to offer demonstrable results as the overall market matures, standards solidify, and technology and vendor choices are made.

    Pike Research forecasts an approaching period of peak smart grid investment. Utility outsiders need to stake out their territory now, or the market window will close.