Tag Archives: wind energy

Electron democracy

A long-belated companion to Steven Chu’s “Time to fix the wiring” essay I posted earlier, this is the white paper I co-authored for the same McKinsey & Company series. Given the roughly five-month delay in uploading this, I suppose “Time to post the writing” might be an apt subtitle… :)

Ever the stickler for citing sources (in university, while writing up a chemical engineering lab report, I once listed a colleague’s report I had drawn on in my bibliography of sources; yes, I was a wild one), I was pleased McKinsey kept the footnote crediting the work of John Robb and Jeff Vail.

Four years on, it’s encouraging to see how wrong the essay has turned out to be — because all the recent developments are for the better. It would be as if an investor bought a bunch of boring utility stocks for the safe, reliable dividends, only to discover at the end of the year that they got a bunch of capital appreciation as well.

Though on that note, I think fossil-fuel-burning utilities are already a risky investment, because renewables are eroding their business model in some countries… and since renewables will get dramatically cheaper as production scales up, the phenomenon will inevitably repeat itself around the world.  (Speaking of uploading delays, clearly I’ll have to get to part 2 of this series…)

When the essay was written (late 2008), grid energy storage seemed a long, long way from commercialization, so our assumption had been that large-scale hydro plants and smaller-scale fuel cell facilities would complement renewables’ intermittency.  (The EV / PHEV adoption rate is such that these are unlikely to offer any appreciable grid storage by 2030, either…)

With Germany’s announcement of a program to subsidize battery-based residential energy storage systems, enabling companies to ramp up production and get the economies of scale with which to drive aggressive cost reductions, it looks like fuel cells will face a lot of pressure at the residential scale.

As for the resiliency benefits of on-site power generation, that seems to have become a priority for many tech companies, in areas where subsidies for on-site generation are available.  (I could justify mild subsidies, because on-site generation minimizes the need to maintain or expand transmission infrastructure, which can be expensive.)

One wonders if some of these companies are worried that a renewables future will destabilize the grid: this is a “myth”-conception, as many utilities point out.  I read somewhere that when Germany began its Energiewende, its (renewable) energy transformation, the feeling was that the grid could only handle 5% intermittent renewables (i.e. wind + solar). Then it became 10%, and then 20%. Then it became 40%. The latest I’ve seen is 60%, with the possibility of 80% for continental Europe. As technology improves, that figure will only increase, especially if/when electricity-to-hydrogen or electricity-to-natural-gas technology matures, allowing for large-scale storage of excess, intermittent electricity.

On the fuel cell side, Bloom Energy seems to have become adept at acquiring subsidies (er, market share) in the on-site generation space, despite the fact that their technology is less efficient than combined-cycle gas turbines.  (That said, turbines are generally LOUD and therefore not well suited to on-site installation.)  As such, when it comes to larger-scale, on-site, 24/7 fuel cell power generation, since Ballard isn’t in that game anymore, I root for the folks at ClearEdge Power, whose use of cogeneration makes it possible to achieve overall energy efficiencies of 90%+, even if only a portion of that becomes electricity. :)

 

– – – – – – – –

Electron-Democracy

By Matthew Klippenstein and Noordin Nanji

3 March 2009

The way electric power is generated and distributed will change substantially over the next two decades. Power will be democratized, as small-scale production at the individual and community level moves from niche to normal. The resulting “electron-democracy” will still have centralized power plants, but power grid activity will increasingly be dominated by innumerable incremental energy flows between small producers and consumers. This is likely to happen whether or not public policy mandates a shift away from dependence on fossil fuels.

Most centralized plants (hydro excepted) cannot easily adjust to demand fluctuations, leading to steeply discounted off-peak rates and the need to acquire additional plants for high-demand periods. More broadly, an expansive transmission grid dominated by a few central power plants is vulnerable to disruption from both natural phenomena and human malevolence.

In contrast, smaller-scale power generation can respond more nimbly to market demand, in a shorter time frame, with lower capital costs. Filling supplemental power needs with niche supplies rather than primary power facilities creates new generation options that would otherwise be impractical. Finally, a grid fed by a broad, physically dispersed, heterogeneous mixture of power sources would provide robust protection against disruption.1

Putting these strands together and looking forward, the distributed grid might look like this: intermittent wind and solar power generation would be complemented by load-supplementing fuel cell plants, in much the same way that peak power and base load power plants interact today. Electric vehicles (EVs), plug-in hybrid electric vehicles (PHEVs), and batteries would serve as grid energy storage when excess energy is being produced. The latter is analogous to the role of pumped-storage hydroelectric in current utility systems, where water is pumped from a lower reservoir to a higher one for later use in generating hydroelectric power.

Considering the intensifying pace of climate change, governments should play an ambitious role in the transition from today’s grid to tomorrow’s electron democracy. Governments could coordinate with local business to develop centers of excellence for distributed power in targeted industries. Mechanisms such as feed-in tariffs—which grant favorable rates for those generating power from renewables and clean-tech sources—could facilitate the development of these regional technology clusters. They would bring ancillary economic benefits as well.

We are hopeful that by 2030, our energy system will be considerably less dependent on fossil fuels, particularly for electric power generation. Supported by a diverse array of renewables, our energy needs could be met with an overlapping set of complementary clean technologies. In doing so, we would strongly curb our global warming emissions. We would then be poised not only to stabilize the climate, but to transcend the Fossil Fuel Age entirely and open a new “Age of Sustainability” in our human story.

– – – – –

1 A closer examination of these topics is available from Jeff Vail (A Theory of Power) and John Robb (Brave New War) in their writings on “rhizome” at jeffvail.net and “resilient communities” at globalguerrillas.typepad.com, respectively.

Steven Chu’s “Time to Fix the Wiring” at four years

Former US Energy Secretary Steven Chu’s recent resignation (his farewell letter is here) is no doubt celebrated in fuel cell quarters as passionately as (or more so than) it is mourned in the rest of cleantech.  Early in his term, Chu infamously argued (infamously, at least, to fuel cell enthusiasts) that fuel cell electric vehicles (FCEVs) needed four miracles for commercial success, namely:

  1. most hydrogen comes from natural gas (so why not just use that as a fuel?)
  2. improvements in hydrogen storage were needed
  3. fuel cells needed to improve
  4. there was no distribution system in place

While many of my colleagues were hostile to Chu — some more than others (an inside joke) — I was largely unfazed, as Ballard had by then moved on to “everything except automotive fuel cells” in light of the commercialization timelines.  (Which reflected points 3 and 4 above.)  And Chu seemed open-minded towards stationary fuel cells.  From the MIT Technology Review article:

“I think that hydrogen could be effectively a “battery” in the sense that suppose you had a way of using excess electricity–let’s say a nuclear plant at night, or solar or wind excess capacity, and there was an efficient electrolysis way of turning that into hydrogen, and then we have stationary fuel cells. It could effectively be a battery of sorts. You take a certain form of energy and convert it to hydrogen, and then convert it back [into electricity]. You don’t have the distribution problem, you don’t have the weight problem. In certain applications, you don’t need as many miracles for it to happen.”

Chu, ARPA-E, and solar

Many people have already written panegyrics to Chu upon his departure, Climate Progress and Grist among them.  Even coming from the fuel cell industry, I think on balance he deserves a lot of praise for championing the US Department of Energy’s ARPA-E program to fund next-generation energy research, even if he did get a bunch of things wrong, among them the prediction that solar needed breakthroughs to achieve commercial viability.

“But Chu noted that solar power, for one, is still far too expensive to compete with conventional power plants (except on hot summer days in some places, and with subsidies). Making solar cheap will require “transformative technologies,” equivalent to the discovery of the transistor, he said.”

In the past four years, solar has gotten there in Germany, is on the cusp in Australia, and is probably already there in several sunnier climes.  The cost reductions in that industry have come almost exclusively from economies of scale and the nearly-universally-applicable learning curve (also known as the experience curve).

Mind you, given my political leanings, I’m generally supportive of government-driven industrial policy.  :)  Societies generally last a lot longer (centuries longer) than any individual business, so it makes sense that societies may want to fund projects with a payoff too far out for individual businesses to care about.  That said, I support the notion that “moonshot” projects should ideally have partial private-sector funding, so that business people have skin in the game and can search out ways to commercialize achievements made along the way.

An intro to “Time to Fix the Wiring”

The above provides good context with which to revisit the essay Chu (and one of his underlings?  :)  ) wrote for a McKinsey & Company series on the future of energy, exactly four years ago today.  This was part of their “What Matters” umbrella, which covered energy, biotech and other topics.

They’ve since taken the series offline — I suppose they need to keep things fresh — but I was able to get permission from a McKinsey representative to reprint the essay below.

Hindsight is 20/20, of course, and in this case renewable energy has progressed far beyond his Olympiad-ago assessment.  Solar’s costs have come way down, as noted above; renewables may now be viable for 40% of a grid instead of the 25% he cites; and some of the geothermal breakthroughs he discusses can probably be borrowed from the shale gas fracking industry.

All in all, the essay is a reminder to the environmentally- and stewardship-inclined alike that the clean energy sector has come astonishingly far in four years.  I’ll delve into further detail when I continue my series on our renewable destiny. :)

—————

Time to fix the wiring

By Steven Chu

26 February 2009

Imagine that your home suffers a small electrical fire. You call in a structural engineer, who tells you the wiring is shot; if you don’t replace it, there is a 50 percent chance that the house will burn down in the next few years. You get a second opinion, which agrees with the first. So does the third. You can go on until you find the one engineer in a thousand who is willing to give you the answer you want—“your family is not in danger”—or you can fix the wiring.

That is the situation we face today with global warming. We can either fix the wiring by accelerating our progress away from dependence on fossil fuels, such as coal, oil, and natural gas, or we can face a considerable risk of the planet heating up intolerably.

The need to act is urgent. As a start, governments, businesses, and individuals should harvest the lowest-hanging fruit: maximizing energy efficiency and minimizing energy use. We cannot conserve our way out of this crisis, but conservation has to be a part of any solution. Ultimately, though, we need sustainable, carbon-neutral sources of energy.

It’s important to understand where we are now. Existing energy technologies won’t provide the scale or cost efficiency required to meet the world’s energy and climate challenges. Corn ethanol is not a sustainable or scalable solution. Solar energy generated from existing technologies remains much more expensive than energy from fossil fuels. While wind energy is becoming economically competitive and could account for 10 to 15 percent of the electricity generated in the United States by the year 2030 (up from less than 1 percent now, according to the US Energy Information Administration), it is an intermittent energy source. Better long-distance electricity transmission systems and cost-effective energy storage methods are needed before we can rely on such a source to supply roughly 25 percent or more of base-load electricity generation (the minimum amount of electrical power that must be made available). Geothermal energy, however, can be produced on demand. A recent Massachusetts Institute of Technology (MIT) report suggests that with the right R&D investments, it could supply 10 percent of US power needs by 2050 (up from about 0.5 percent now).

Coal has become a dirty word in many circles, but its abundance and economics will nonetheless make it a part of the energy future. The United States produces more than half of its power from coal; what’s more, it has 27 percent of the world’s known reserves and, together with China, India, and Russia, accounts for two-thirds of the global supply. The world is therefore unlikely to turn its back on coal, but we urgently need to develop cost-effective technologies to capture and store billions of tons of coal-related carbon emissions a year.

Looking ahead, aggressive support of energy science and technology, coupled with incentives to accelerate the development and deployment of innovative solutions, can transform energy demand and supply. What do I mean by such a transformation? In the 1920s and 1930s, AT&T Bell Laboratories focused on extending the life of vacuum tubes, which made transcontinental and transatlantic communications possible. A much smaller research program aimed to invent a completely new device based on breakthroughs in quantum physics. The result was the transistor, which transformed communications. We should be seeking similar quantum leaps for energy.

That will require sustained government support for research at universities and national labs. The development of the transistor, like virtually all 20th-century transformative technologies in electronics, medicine, and biotechnology, was led by people trained, nurtured, and embedded in a culture of fundamental research. At the Lawrence Berkeley National Laboratory—part of the US Department of Energy and home to 11 Nobel Laureates—scientists using synthetic biology are genetically engineering yeast and bacteria into organisms that can produce liquid transportation fuels from cellulosic biomass. In another project, scientists are trying to develop a new generation of nanotechnology-based polymer photovoltaic cells to reduce the cost of generating solar electricity by more than a factor of five, making it competitive with coal and natural gas. In collaboration with scientists from MIT and the California Institute of Technology, yet another Berkeley Lab research program is experimenting with artificial photosynthesis, which uses solar-generated electricity to produce economically competitive transportation fuels from water and carbon dioxide. If this approach works, it would address two major energy challenges: climate change and dependence on foreign oil producers.

In the next ten years, given proper funding, such research projects could significantly improve our ability to convert solar energy into power and store it and to convert cellulosic biomass or algae into advanced transportation fuels efficiently. Combined, this would mean a genuine transformation of the energy sector.

The world can and will meet its energy challenges. But the transformation must start with a simple thought: it’s time to fix the wiring.

This article was originally published in McKinsey’s What Matters. Copyright (c) McKinsey & Company. All rights reserved. Reprinted with permission.

Of whales and wind turbines

(originally written June 20, 2012 — part of my Great Upload of 2013)


So I was reading up on my Ray Kurzweil last night, because it’s good to read people you disagree with once in a while — but preferably no more often than that.  ;)

– – – – –

Kurzweil

Ray Kurzweil is a futurist who believes we’re heading toward a singularity: sometime this century, life will transcend biology and we’ll reach some sort of higher condition of life.  His ideas could probably be summed up as follows:

– it took billions of years to go from single-celled creatures to multi-celled ones

– then hundreds of millions of years to get to human-like creatures

– then hundreds of thousands of years for Homo sapiens to create cities

– then thousands of years for us to start making upgrades (artificial hips, pacemakers and the like)

– and in a short span of time, we’ll transition from a biological-molecule-based form of consciousness to a silicon-based one


His ideas are pretty much distilled in this decade-old article/manifesto, which he wrote during the heady days of the dot-com bubble.  As such, though the trends in computing power have probably continued, economic progress… has not.

He made surprising choices in some graphs (e.g. patents issued over time) in that he didn’t factor in the huge effect of a rising population.  It looks like the number of patents issued per year went up tenfold from 1900 to 2000, but the US population also increased four-fold from roughly 70 million to 280 million.  So patents per person “only” went up 2.5x in a century.

There’s a big wrinkle, though: the US urban population share went from about 40% to 80% between 1900 and 2000, so the urban population probably rose about 8x (from roughly 28 million to 224 million) in that century.  So in the past century, patents-per-urban-person might only have gone up… twenty-five percent?  A bigger wrinkle is the fact that half of US patents nowadays go to foreigners, and the biggest wrinkle is probably that patents aren’t a great way of measuring innovation.  They might be the best available measuring-stick, but that doesn’t mean they’re all that accurate…
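For the curious, here is that back-of-the-envelope arithmetic as a tiny Python sketch, using only the rough figures quoted above (not exact USPTO or census data):

```python
# Rough per-capita patent arithmetic, using the ballpark figures from the text.
patent_growth = 10.0                      # patents issued per year, 1900 -> 2000 (~10x)

pop_1900, pop_2000 = 70e6, 280e6          # total US population, ~4x growth
urban_1900 = 0.40 * pop_1900              # ~28 million urban residents in 1900
urban_2000 = 0.80 * pop_2000              # ~224 million urban residents in 2000

per_person = patent_growth / (pop_2000 / pop_1900)            # ~2.5x
per_urban_person = patent_growth / (urban_2000 / urban_1900)  # ~1.25x

print(f"patents per person:       {per_person:.2f}x over the century")
print(f"patents per urban person: {per_urban_person:.2f}x over the century")
```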


Cities and Whales

The urban-population factor is important because recent research purports to show that as metropolitan areas get bigger, they tend to “speed up” — since Metro Vancouver has twice the population of Metro Calgary, one would expect Vancouver to have 15% higher per-capita mean income and patenting rates.  Of course, local factors like the tar sands mean that these general trends come with massive, massive margins of error.  :)

The reason for this trend might be that as cities get bigger, people can become more and more specialized, and nudge the boundaries of human knowledge just a bit further in one tiny area.  And with so many people around them, there’s a better chance they’ll run into someone who can make use of that knowledge.  And there is a symmetric downside: apparently per-capita crime and other social ills also tend to increase about 15% with each doubling in city size.
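If you want to play with the numbers, here is a minimal sketch of that scaling claim, taking the roughly-15%-per-doubling figure at face value (the metro populations below are ballpark values for illustration, not census data):

```python
import math

def per_capita_ratio(pop_big: float, pop_small: float,
                     boost_per_doubling: float = 0.15) -> float:
    """Expected ratio of per-capita output (income, patents, ...) between a
    bigger and a smaller city, assuming each doubling of population adds
    `boost_per_doubling` (roughly 15% in the urban-scaling research)."""
    doublings = math.log2(pop_big / pop_small)
    return (1.0 + boost_per_doubling) ** doublings

# Metro Vancouver vs Metro Calgary: roughly a 2x population difference.
ratio = per_capita_ratio(2.4e6, 1.2e6)
print(f"expected per-capita boost: {(ratio - 1) * 100:.0f}%")   # ~15%

# A negative boost (i.e. a sub-linear exponent) gives the corporate
# "slowing down" described in the next paragraph.
```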


This “speeding up” with bigger size is the opposite of what happens in publicly-traded companies, which tend to “slow down” as they get bigger: fewer patents per person, lower per-person revenues, and so on.  (The trend surely holds true for privately-held companies too, but since public companies release quarterly financial statements, it’s waay easier to crunch public-company data than private companies’.)  This phenomenon could elegantly, if only partially, explain why public-sector bureaucracies often seem worse than private-sector ones: few private companies ever reach the size of governments!

A similar “slowing down” with size occurs in biology, a phenomenon known as Kleiber’s Law.  (Not to be confused with George Clooney’s girlfriend Stacy “Keibler”, or the cookie-making “Keebler” elves.)


In the critter world, when animals double in size, their metabolic (food) requirements tend to increase by only 70-ish percent.  To use math terms, the exponent describing the relationship between metabolic rate and mass is between 2/3 and 3/4.  And before you ask, yes indeed, there is the usual academic bun fight over what exactly that exponent is!  :)  To use a better example than the one offered in Wikipedia, if we were to compare a 200-tonne blue whale with a 20-gram mouse, the whale weighs 10,000,000x as much, but only requires about 10,000,000^0.7 = 80,000x as much food.
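A quick sanity check on that whale-versus-mouse figure, with the 0.7 exponent simply picked from the middle of the disputed 2/3-3/4 range:

```python
# Kleiber's law: metabolic (food) requirements scale roughly as mass**exponent.
whale_mass_g = 200e6    # 200-tonne blue whale, in grams
mouse_mass_g = 20.0     # 20-gram mouse
exponent = 0.7          # middle of the disputed 2/3 - 3/4 range

mass_ratio = whale_mass_g / mouse_mass_g     # 10,000,000x
food_ratio = mass_ratio ** exponent          # ~80,000x

print(f"mass ratio: {mass_ratio:,.0f}x")
print(f"food ratio: {food_ratio:,.0f}x")
```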

Another example of how life seems to “slow down” for big creatures is the reasonably-accurate factoid that many mammals, big and small, have a lifespan of about one to one-and-a-half billion heartbeats.  And indeed, whales live a lot longer than mice — in the absence of whalers.  And cats.  :)  I could imagine that for our earliest mammalian ancestors, this might have represented a good balance between “durable enough to have offspring” and “not so resource-intensive as to starve other important bodily functions of nutrients”, but then I imagine a lot of plausible-sounding, completely-inaccurate explanations.  :)

Rambling aside, as animals get bigger, they get more efficient with their food inputs.  Which brings us to wind turbines!


…and wind turbines

One of the few things I remember from my chemical engineering economics course is that the cost of components in a chemical plant increases more slowly than size, with the exponents generally in the 0.5-0.8 range.  We could think of this as a rough industrial analogue, or maybe even an extension, of Kleiber’s Law.

This trend applies to wind turbines, because if you scaled up a turbine so its blades and everything else were twice as big, you’d need more than twice the material, but you could probably extract quadruple the energy.  (Taller turbines can access stronger winds, and the blades would rotate through 4x the cross-sectional area, but various losses would eat away at that.)  The net effect is that bigger wind turbines are more efficient per-tonne-of-construction-material.  Not unlike that whale.  :)
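Here is a rough sketch combining both rules of thumb; the 0.6 cost exponent and the 2x scale-up are illustrative assumptions, not data for any particular plant or turbine:

```python
# Rule of thumb 1: process-equipment cost scales roughly as size**k, k ~ 0.5-0.8.
# Rule of thumb 2: doubling a turbine's blade length quadruples the swept area
# (area ~ radius**2), so capturable energy grows faster than linear dimensions.

def scaled_cost(base_cost: float, size_ratio: float, k: float = 0.6) -> float:
    """Approximate cost of a component `size_ratio` times bigger than the base."""
    return base_cost * size_ratio ** k

print(f"2x-size component: ~{scaled_cost(1.0, 2.0):.2f}x the cost")   # ~1.52x

blade_scale = 2.0
swept_area_ratio = blade_scale ** 2
print(f"{blade_scale:.0f}x blade length sweeps {swept_area_ratio:.0f}x the area")
```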


And back to Kurzweil

Before setting sail on that cetacean tangent (i.e. talking about whales), we were examining how a lot of the technological progress feeding Ray Kurzweil’s optimism might have come not from exponentially-improving calculation power, but from a one-time migration of people from the countryside to the cities.

If population growth and urbanization were big drivers for the extraordinary progress we made in the 20th century, it stands to reason that we might see a slowing-down of things in the 21st century, as world population (and world urban population) level off and start falling.  This would be a bit of a downer for techno-optimists’ utopian visions, but would fit the more pessimistic notion that the human condition is a cycle between harsher and milder dystopias.

As an admirer of the great Greek tragedies, I’m in the latter camp.  And while I’m as overconfident in my opinions as most men, I have an ace up my sleeve: as per page 3 of the TIME magazine article, people with mild depression are more accurate at predicting future events!  Nice of the universe to finally throw us folks a bone…!  ;)