Category Archives: technology

Pondering a palatable pipeline…

I guest-hosted TWiE podcast episode 137 a few days ago, an episode devoted to the Alberta oil sands / tar sands. If you ask me (and I realize none of you have :) ) it’s well worth a listen!

The week’s guest was US energy analyst Robert Rapier, who had visited Fort McMurray on a Canadian government junket for journalists. He came back with a five-part essay on his experience, and some valuable, contextualizing factoids.

Shockingly, he showed data suggesting that the Alberta tar sands are now only slightly more greenhouse gas-intensive than “average” petroleum. (In other words, the emissions associated with turning the bitumen into usable oil are only slightly higher than average.) Heavy oil extracted from California is actually worse!

This means that – for once – the Harper Government™ hadn’t drifted into fiction in its years-long lobbying effort to prevent Europeans from labeling tar sands oil as a high-carbon fuel. I never saw that one coming.

Rapier spent time with the Pembina Institute as well, to get the other side of the story. For instance, though industry touts that it uses only one percent of the annual flow of the Athabasca River, seasonal variations are extreme; one percent of annual flow is equivalent to one-third of daily flow at certain times of year. And while he wanted to visit nearby First Nations communities, that part of the visit got cancelled at the last minute. (Now, there’s the Harper Government™ I’ve come to know and love… to loathe.  :)  )
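The annual-vs-daily flow point is just arithmetic, and worth making explicit. Here is a quick sketch with invented numbers (illustrative only, not actual Athabasca figures): if a 1%-of-annual-volume withdrawal is spread evenly over the year, it equals one-third of daily flow only on days when the river runs at about 3% of its annual average.

```python
# Illustrative only -- the flow figures are invented, not Athabasca data.
SECONDS_PER_YEAR = 365 * 86400

annual_mean_flow = 600.0                             # m^3/s, hypothetical average
annual_volume = annual_mean_flow * SECONDS_PER_YEAR  # m^3/year

# A withdrawal of 1% of annual volume, spread evenly, is a steady draw of
# 1% of the *average* flow rate:
withdrawal_rate = 0.01 * annual_volume / SECONDS_PER_YEAR  # m^3/s

# For that steady draw to equal one-third of a given day's flow, the river
# that day must be running at three times the withdrawal rate, i.e. at
# just 3% of its annual average:
low_flow = 3 * withdrawal_rate
print(round(low_flow / annual_mean_flow, 4))  # -> 0.03
```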



APSC150 speech

My August Canadian EV sales stats update went up recently. Which was cool.

Cooler still, I had a chance to wax poetic about sustainability, and my new-found optimism that we’ll avoid the worst of our dystopian horrors. I was invited to be a guest lecturer for an engineering course at UBC (APSC 150) where I had the privilege to slightly shape the minds of about four hundred first-year students. And show them how, here in the first world, #WeAreWhales. (The cryptic comment is described in the slide deck, here.)

Coolest of all, I’ve achieved a Wiki-immortality of sorts! I’m a Wikipedia footnote in the Tesla Model S article! Or, rather, one of my older GreenCarReports columns is. The one describing the vehicle’s Canadian sales figures for the first half of 2013. :)

Wiki Klippenstein

Of course, wikis being the infinitely editable sites that they are, my fame may well be fleeting. Which brings to mind the Hindu parable of Indra and the ants, whose punchline was once majestically translated as “former Indras, all“. :) For all our works and purpose, pride and presence, in time’s great fulness we are all returned into the Void from whence we came.

The Innovator’s Dilemma, Toyota edition


This car — yes, this car — has impeded Toyota’s electric efforts

My post on how The Innovator’s Dilemma explains why Toyota lags in electric vehicles (and how Kleiber’s Law explains why there’s nothing for them to worry about, yet) is now up on GreenCarReports.

While the Tesla stats were cooler to have dug up, and will probably enjoy a broader readership, this particular piece was more gratifying to write; the Innovator’s Dilemma is a fairly well-known concept in business circles, but there’s a tendency to incorrectly think that all industries get changed and disrupted quickly. To adapt from yesterday’s screed, the world of software changes a lot more quickly than the world of stuff.

And Kleiber’s Law probably (partially) explains why.
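For readers who haven’t run into it, Kleiber’s Law says metabolic rate scales as roughly the 3/4 power of body mass, so per-kilogram metabolism slows as creatures get bigger; the analogy is that bigger, heavier industries change more slowly too. A minimal sketch (the constant and the masses are round illustrative numbers, not measured values):

```python
# Kleiber's Law: metabolic rate B ~ b0 * M**(3/4).  The constant b0 and
# the masses below are illustrative round numbers, not measured values.
def metabolic_rate(mass_kg, b0=3.4):
    return b0 * mass_kg ** 0.75

for name, mass_kg in [("mouse", 0.02), ("human", 70.0), ("elephant", 4000.0)]:
    per_kg = metabolic_rate(mass_kg) / mass_kg
    print(f"{name}: {per_kg:.2f} per kg")  # per-kg metabolism falls with size
```

The point of the 3/4 exponent is sublinearity: double the mass, and total metabolism rises by less than double, so each kilogram runs “slower”.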

The GCR article had to be edited down, and the rejected detritus included the little comparison of hybrid and EV adoption rates below. Think of it as rounding out the “complete and unabridged” version of the article.

Note: I thought electric vehicles would have roughly the same adoption rate as early hybrids, figuring that greater sales due to a broader product selection from various manufacturers would be offset by lower sales due to the higher sticker price. Boy, was I wrong. :)

Though I might claim that gov’t rebates “distorted the market” (in a very positive way, mind you) I’m not so egotistical as to be unable to admit to mistakes, so I’ll file that for future learnings… after taking this quick religious diversion. :)

A quick religious diversion

On the topic of “complete and unabridged” versions, people who peruse the Christian scriptures (the “New Testament”) will notice that the Gospel of Mark is a lot shorter than the Gospels of Matthew and of Luke. This is most likely due to the fact that back in the day, there were two standard scroll lengths: a short one, and a long one. Kind of like how we have letter paper (8.5″ x 11″) and legal paper (8.5″ x 14″) today.

Mark, chronologically the first of the three to be written, was written on a short scroll, and Matthew and Luke wrote on the longer ones.

A more interesting case is that of the book Acts of the Apostles, commonly credited to Luke — whose name almost certainly wasn’t Luke, because people tended to assign famous works to more famous people. The same tends to happen in our modern era — for instance, this British revocation of the American Declaration of Independence is commonly attributed to John Cleese, though he didn’t write it.

Acts exists in two commonly-circulated versions, one about 10% longer than the other. This is less impressive “genetic variation” than one finds in other texts — the Buddhist Dhammapada has more variants, possibly because it was translated into multiple languages early on, before anyone with overarching authority tried to establish a “canonical” version. That did happen in Christianity, where someone identified by scholars as “The Ecclesiastical Redactor” (possibly Polycarp of Smyrna) created a standard edition fairly early on. There are many reasons for hypothesizing this, not the least of which is that essentially all manuscripts available to us share the same abbreviations of key terms (from memory, Theos is abbreviated Ts and Iesous is abbreviated Is).

All of which is a phenomenally long-winded, trivia-filled way of saying that the text appended below would form the “10% longer” version of my GreenCarReports article.  It was originally included before the paragraph “The Innovator’s Dilemma – why Toyota’s tepid on electrics”.

Hybrid history and the plug-in path

Plug-in electric vehicle enthusiasts have exchanged many a high-five over the fact that in the United States and probably elsewhere, plug-in adoption rates have thus far surpassed hybrid adoption rates. Here again, context is valuable.

In the first four years of hybrid availability in the United States (2000-2003) oil was cheap, and consumers could choose between three hybrid vehicles — two small (the Prius and the Civic Hybrid) and one even smaller (the Insight). These were sold by Toyota and Honda, who shared about 17 percent of the automotive market between them.

In retrospect, it’s unsurprising that electric vehicles are being adopted faster, given the greater awareness of our environmental challenges, higher oil prices improving the cost/benefit equation, government incentives, and — perhaps most crucially of all — widespread automaker participation.  

By the four-year anniversary of the Nissan Leaf and Chevy Volt’s December 2010 retail debut, ten carmakers will be offering production plug-in electrics stateside: BMW, Daimler (Smart), Ford, GM, Honda, Mitsubishi, Nissan, Tesla, Toyota and VW.  (Fiat is excluded from the preceding list, as the 500e is a compliance car available only in California.)

These automakers control about 75 percent of the US auto market, and by December 2014 their product offerings will range all the way from subcompact commuter cars to SUVs. To adapt Alfred Sloan’s old phrase, there’s now a plug-in “for every purse and purpose”. Fierce competition has already resulted in lower prices, which will only accelerate sales volume, which will itself improve economies of scale.

A patent app mishap

FC patent apps

I hope that if/when this patent application gets granted, they update the title…  otherwise, someone at Samsung will have some ‘splaining to do!

Software updates are flu shots


A nice metaphor for software updates.

Flu shots attempt to immunize people against the influenza virus by exposing their immune systems to small doses of weakened or inactivated virus particles. The idea is that this gives the immune system a “practice run” with a less-dangerous version of the influenza virus the patient might run into later that year.

The situation is complicated by the fact that there are countless strains of flu, because the virus isn’t actually very good at copying itself accurately. When people with the flu sneeze, the viruses they expel will have already mutated from when they contracted it. To use the scientific jargon,

“…each daughter virus has an average of 1.34 to 1.52 mutations! The superfast mutation rate of influenza is what fuels its ability to evolve and adapt, overcoming the immune system’s method of recognizing old pathogens.”
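To make the quoted rate concrete, here’s a toy simulation of my own (a sketch, not from the quoted source): a 200-base genome is copied repeatedly, each copy picking up about 1.4 point mutations on average, and the lineage drifts steadily away from the original strain.

```python
import random

random.seed(42)
BASES = "ACGU"
genome = "".join(random.choice(BASES) for _ in range(200))  # toy RNA genome

def replicate(parent, mutations_per_copy=1.4):
    """Copy a genome; each base mutates with probability rate/length,
    giving ~1.4 point mutations per copy on average."""
    p = mutations_per_copy / len(parent)
    copy = []
    for base in parent:
        if random.random() < p:
            copy.append(random.choice([b for b in BASES if b != base]))
        else:
            copy.append(base)
    return "".join(copy)

def distance(a, b):
    """Number of positions where two equal-length genomes differ."""
    return sum(x != y for x, y in zip(a, b))

strain = genome
for generation in range(10):
    strain = replicate(strain)
print(distance(genome, strain))  # roughly 10 * 1.4, minus any back-mutations
```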

Software updates — anti-virus updates in particular — serve much the same function: they create a moving target, which hopefully stays a step ahead of software viruses, which can take control of a computer in much the same way that parasites can “hack” their hosts, and change their behaviour.

Parasites “hack” human behaviour too — Toxoplasma gondii is a well-known example — with all the collateral implications for the concept of free will. In addition to a host of terrible diseases, the parasite may make people more likely to take risks, not unlike what it does to rats. (Infected rats seem to behave more recklessly, among other things losing their fear of cat odours: getting the rat eaten is a way for the protozoan to get back into a cat, its favoured host species.)

On a personal level, given the “don’t-try-this-at-home-kids” characteristics of my investment strategies since we adopted our cat, I wonder if I’m one of the roughly one-third of people who host this “friend”-with-benefits…  :)

And just as a flu shot gives our immune system a chance to develop antibodies which will detect and defeat the flu strains to which we’re exposed — as well as close mutations — updates give our OS or software the ability to neutralize the viruses / trojans / malware which they target, as well as any closely-related variants.  (If my understanding is correct.)
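As a rough sketch of what an anti-virus “definition update” amounts to (the signatures, names and sample “files” below are invented, and real scanners are far more sophisticated than plain substring matching): each update ships new byte patterns, and the scanner flags anything containing one, which also catches variants that leave the signature region untouched.

```python
# Toy signature-based scanner.  The signatures and sample "files" are
# invented; real AV engines go far beyond plain substring matching.
SIGNATURES = {
    "hypothetical-worm-A": b"\xde\xad\xbe\xef\x42",
    "hypothetical-trojan-B": b"EVIL_PAYLOAD_v1",
}

def scan(blob, signatures):
    """Return the names of all signatures found in a chunk of bytes."""
    return [name for name, sig in signatures.items() if sig in blob]

clean = b"just an ordinary document"
variant = b"header" + b"EVIL_PAYLOAD_v1" + b"slightly tweaked tail"

print(scan(clean, SIGNATURES))    # -> []
print(scan(variant, SIGNATURES))  # -> ['hypothetical-trojan-B']
```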

This is probably a good time to remind readers that the historical/mythical Trojan Horse was not a big wooden horse, but more likely a wooden battering Ram. Switch mammals and you get a wooden battering Horse, then add a few hundred years for legends to accrue in a Greek society which lost all its siege-warfare skills — seriously, they were hopeless besiegers — and you can see how tales of a wooden Horse ending a siege could be re-imagined as some sort of horse made out of wood.

The software update / flu shot analogy is imperfect — among other things, genetic variation among life-forms has no software analog — but is probably good enough to be functional. Or at a minimum, drive a bit of thought about the ways biology reflects and refracts itself, in our innumerable human endeavours.

I do wonder if people would be more likely to install and/or update anti-virus software, if it were marketed with a flu shot analogy.

Admittedly, given the nonsensical superstitions about vaccinations some people hold — which seem to be a case of distrust of Big Pharma companies metastasizing into a distrust of evidence-based medicine; in first-world Japan, preventable flu deaths rose by the equivalent of twelve 9/11 terrorist attacks per year after mandatory immunization programs fell prey to fearmongering — software firms might think twice about such a marketing strategy.

The surveillance state is an autoimmune disorder

Reaching into American history, we encounter the saying, “the price of freedom is eternal vigilance”… in our modern era, we might need to add a corollary, namely that “the price of infinite vigilance, is freedom”.

First, a short medical analogy.

Autoimmune disorders

Autoimmune disorders (Wikipedia prefers autoimmune diseases) occur when the body’s defenses — antibodies and immune cells — no longer distinguish between healthy tissue and harmful cells. Instead of attacking the dangerous antigens, they attack the body itself.

Type 1 diabetes is an example, where the patient’s immune system attacks the insulin-producing cells of the pancreas. Blood insulin levels drop, making it more difficult for cells to absorb glucose from the bloodstream, leaving elevated blood glucose levels and all the associated problems of diabetes.

Multiple sclerosis is another example, where the patient’s immune system attacks their nervous system. Localized physical inflammation occurs, which causes nerve damage, which impairs sufferers’ quality-of-life.

These autoimmune disorders occur on the individual level.

The hygiene hypothesis

In medicine, the hygiene hypothesis suggests that allergies could be thought of as a sort of autoimmune disorder, brought about by excessive cleanliness (!).

The general idea is that our immune systems developed over millions of years in the, um, virulent and filthy conditions that characterized most of human existence until the arrival of modern sanitation. Given this, our immune systems have evolved to be hyper-vigilant.  After all, until recently, even minor flesh wounds could be fatal, if they got infected.

One theory posits that if our immune systems aren’t kept busy fending off microbial, bacterial and viral attacks when we’re young, they overreact when they encounter benign intruders (e.g. pollen), or even healthy human cells, mistaking these for existential threats. It’s the medical profession’s equivalent of the “idle hands are the devil’s playthings” argument…!

The generally agreed-upon workaround is to make sure kids wash their hands before eating and after using the washroom, but to otherwise roll around in dirt, play with animals and so forth. The former steps help keep children safe from more dangerous microbes, while the latter keeps their immune systems busy.

Stranger still, medical researchers are exploring the treatment of autoimmune diseases by deliberately infecting patients with parasites!  Reasonably benign parasitic worms which co-evolved with humans and co-existed with us until the advent of modern hygiene, are introduced to the body.  Improvement comes when the immune system stops attacking healthy body tissue, to focus on beating back the parasites.

Unfortunately, the immune system sometimes resumes attacking the body after it beats back the parasites, meaning that periodic reinfection may be necessary. In helminthic therapy, you don’t take vitamin supplements, you take parasite supplements!

Societal-level autoimmune disorders

I think the recently-revealed excesses of NSA / PRISM / surveillance state can be best thought of as a societal-level autoimmune disorder. Human society has almost certainly become dramatically less violent over time, and that’s a very good thing. Especially for those of us who’re members of ethnic groups who’ve historically been the victims!

Meanwhile, for a roughly fifty-year period in the 20th century, an enormous American security apparatus evolved, to address the perceived existential threat from Soviet Communism — and, uh, stamp out any less-than-solidly-pro-American governments in Latin America and other strategic parts of the world. (We should try to be objective, eh?)

With the threat of Communism now gone, the vast resources of American society’s “immune system” have become focused on terrorists — individuals who can cause damage and suffering, but who cannot and could never pose the kind of existential threat of an invading army. This is a good thing: we want to be safe from those who cause us harm.

But like a white blood cell which indiscriminately attacks other cells in the body — not just the harmful antigens — the NSA has effectively wiretapped everyone in the world. Given that in the normal course of law, courts must be convinced of reasonably-probable wrongdoing for wiretaps to be granted, the NSA is essentially treating everyone as a suspect.

In this sense, its behaviour maps to that of an immune system that has been hijacked by an autoimmune disorder, and is treating the body’s own cells as invaders. The main difference is that the surveillance state exists at the societal level, while autoimmune disorders exist at the individual level.

The price of freedom…

If autoimmune disorders can be prevented and/or treated by allowing the body to be exposed to lesser pathogens, this might hint at the path out of the surveillance state. If citizens accept that to maintain their major freedoms, they must accept that minor acts of violence might succeed, the recently-revealed excesses of the NSA could be curbed. A government which tracks its people’s communications, by the very act of doing so, subtly impairs their freedoms of conscience and expression.

The above is a bit abstract, so I’ll close with a couple closer-to-home examples.

1) Shortly after the 9/11 attacks, George W. Bush made a comment to the effect that “[terrorists] hate us for our freedoms”. In doing so, he both oversimplified and mischaracterized the motives of such attackers, which relate more to the various humiliations of Western colonization, and the despair of resolving or overcoming the injustices they perceive.

But if Osama bin Laden hated us for our freedoms, then restricting those freedoms through the surveillance state gives him exactly what he would have wanted! (Political violence — such as terrorism — is successful when it causes the victimized government to bend its policies in the desired direction.) If for no other reason than thwarting bin Laden, it will be important for us to rein in the surveillance state.

2) This brings us back to the “pull quote” at the start of the post.

Reaching deeply into American history, we encounter the saying, “the price of freedom is eternal vigilance”. It’s been misattributed to many Great Men in American history, among them Abraham Lincoln, Thomas Jefferson, and Thomas Paine, though it seems to’ve been a British actor who first formulated the phrase.

In our modern era, we might need to add a corollary, namely that “the price of infinite vigilance, is freedom”.

[July 25 – light editing to summarize the conclusion at the outset. – Thx for the tip, Bob!]

Steven Chu’s “Time to Fix the Wiring” at four years

Former US Energy Secretary Steven Chu’s recent resignation — his farewell letter is here — is no doubt celebrated in fuel cell quarters as passionately as (or more so than) it is mourned in the rest of cleantech.  Early in his term, Chu infamously argued (infamously, at least, to fuel cell enthusiasts) that fuel cell electric vehicles (FCEVs) needed four miracles for commercial success, namely:

  1. most hydrogen comes from natural gas (so why not just use that as a fuel?)
  2. improvements in hydrogen storage were needed
  3. fuel cells needed to improve
  4. there was no distribution system in place

While many of my colleagues were hostile to Chu — some more than others (an inside joke) — I was largely unfazed, as Ballard had by then moved on to “everything except automotive fuel cells” in light of the commercialization timelines.  (Which reflected points 3 and 4 above.)  And Chu seemed open-minded towards stationary fuel cells.  From the MIT Technology Review article:

“I think that hydrogen could be effectively a “battery” in the sense that suppose you had a way of using excess electricity–let’s say a nuclear plant at night, or solar or wind excess capacity, and there was an efficient electrolysis way of turning that into hydrogen, and then we have stationary fuel cells. It could effectively be a battery of sorts. You take a certain form of energy and convert it to hydrogen, and then convert it back [into electricity]. You don’t have the distribution problem, you don’t have the weight problem. In certain applications, you don’t need as many miracles for it to happen.”

Chu, ARPA-E, and solar

Many people have already written panegyrics to Chu on his departure, Climate Progress and Grist among them.  Even coming from the fuel cell industry, I think on balance he deserves a lot of praise for carrying out the US Department of Energy’s ARPA-E program to fund next-generation energy research.  Even if he did get a bunch of things wrong, among them the prediction that solar needed breakthroughs to achieve commercial viability.

“But Chu noted that solar power, for one, is still far too expensive to compete with conventional power plants (except on hot summer days in some places, and with subsidies). Making solar cheap will require “transformative technologies,” equivalent to the discovery of the transistor, he said.”

In the past four years, it’s gotten there in Germany, is on the cusp in Australia, and is probably already there in several sunnier climes.  The cost-reductions in that industry have come almost exclusively from economies of scale and the nearly-universally-applicable cost-learning, or experience curve.
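The experience curve mentioned above can be sketched numerically: under Wright’s law, unit cost falls by a fixed fraction with every doubling of cumulative production. The 20% learning rate and the $4 starting cost below are illustrative assumptions (20% per doubling is in the ballpark often quoted for solar modules), not measured data.

```python
import math

def unit_cost(cumulative, c0=4.0, learning_rate=0.20, q0=1.0):
    """Wright's law: cost falls by `learning_rate` with every doubling
    of cumulative production.  c0 is the cost at cumulative output q0."""
    doublings = math.log2(cumulative / q0)
    return c0 * (1 - learning_rate) ** doublings

for q in (1, 2, 4, 8, 16):
    print(f"{q:>2} units: ${unit_cost(q):.2f}/unit")  # each doubling cuts cost 20%
```

The crucial feature is that the driver is cumulative volume, not time: scale up deployment and the cost declines follow, no breakthrough required.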

Mind you, given my political leanings, I’m generally supportive of government-driven industrial policy.  :)  Societies generally last a lot longer — centuries longer — than any individual businesses, so it makes sense that societies may want to fund projects with a payoff too far out for individual businesses to care about.  That said, I support the notion that “moonshot” projects should ideally have partial private-sector funding, so that business people have skin in the game, and can search out ways to commercialize achievements made on the way.

An intro to “Time to Fix the Wiring”

The above provides good context with which to revisit the essay Chu (and one of his underlings?  :)  ) wrote for a McKinsey & Company series on the future of energy, exactly four years ago today.  This was part of their “What Matters” umbrella, which covered energy, biotech and other topics.

They’ve since taken the series offline — I suppose they need to keep things fresh — but I was able to get permission from a McKinsey representative to reprint the essay below.

Hindsight is 20/20, of course, and in this case renewable energy has progressed far beyond his Olympiad-ago assessment.  Solar’s costs have come way down, as noted above; renewables may now be viable for 40% of a grid instead of the 25% he cites, and some of the geothermal breakthroughs he discusses can probably be borrowed from the shale gas fracking industry.

All in all, the essay is a reminder to the environmentally minded and stewardship-inclined alike that the clean energy sector has come astonishingly far in four years.  I’ll delve into further detail when I continue my series on our renewable destiny. :)


Time to fix the wiring

By Steven Chu

26 February 2009

Imagine that your home suffers a small electrical fire. You call in a structural engineer, who tells you the wiring is shot; if you don’t replace it, there is a 50 percent chance that the house will burn down in the next few years. You get a second opinion, which agrees with the first. So does the third. You can go on until you find the one engineer in a thousand who is willing to give you the answer you want—“your family is not in danger”—or you can fix the wiring.

That is the situation we face today with global warming. We can either fix the wiring by accelerating our progress away from dependence on fossil fuels, such as coal, oil, and natural gas, or we can face a considerable risk of the planet heating up intolerably.

The need to act is urgent. As a start, governments, businesses, and individuals should harvest the lowest-hanging fruit: maximizing energy efficiency and minimizing energy use. We cannot conserve our way out of this crisis, but conservation has to be a part of any solution. Ultimately, though, we need sustainable, carbon-neutral sources of energy.

It’s important to understand where we are now. Existing energy technologies won’t provide the scale or cost efficiency required to meet the world’s energy and climate challenges. Corn ethanol is not a sustainable or scalable solution. Solar energy generated from existing technologies remains much more expensive than energy from fossil fuels. While wind energy is becoming economically competitive and could account for 10 to 15 percent of the electricity generated in the United States by the year 2030 (up from less than 1 percent now, according to the US Energy Information Administration), it is an intermittent energy source. Better long-distance electricity transmission systems and cost-effective energy storage methods are needed before we can rely on such a source to supply roughly 25 percent or more of base-load electricity generation (the minimum amount of electrical power that must be made available). Geothermal energy, however, can be produced on demand. A recent Massachusetts Institute of Technology (MIT) report suggests that with the right R&D investments, it could supply 10 percent of US power needs by 2050 (up from about 0.5 percent now).

Coal has become a dirty word in many circles, but its abundance and economics will nonetheless make it a part of the energy future. The United States produces more than half of its power from coal; what’s more, it has 27 percent of the world’s known reserves and, together with China, India, and Russia, accounts for two-thirds of the global supply. The world is therefore unlikely to turn its back on coal, but we urgently need to develop cost-effective technologies to capture and store billions of tons of coal-related carbon emissions a year.

Looking ahead, aggressive support of energy science and technology, coupled with incentives to accelerate the development and deployment of innovative solutions, can transform energy demand and supply. What do I mean by such a transformation? In the 1920s and 1930s, AT&T Bell Laboratories focused on extending the life of vacuum tubes, which made transcontinental and transatlantic communications possible. A much smaller research program aimed to invent a completely new device based on breakthroughs in quantum physics. The result was the transistor, which transformed communications. We should be seeking similar quantum leaps for energy.

That will require sustained government support for research at universities and national labs. The development of the transistor, like virtually all 20th-century transformative technologies in electronics, medicine, and biotechnology, was led by people trained, nurtured, and embedded in a culture of fundamental research. At the Lawrence Berkeley National Laboratory—part of the US Department of Energy and home to 11 Nobel Laureates—scientists using synthetic biology are genetically engineering yeast and bacteria into organisms that can produce liquid transportation fuels from cellulosic biomass. In another project, scientists are trying to develop a new generation of nanotechnology-based polymer photovoltaic cells to reduce the cost of generating solar electricity by more than a factor of five, making it competitive with coal and natural gas. In collaboration with scientists from MIT and the California Institute of Technology, yet another Berkeley Lab research program is experimenting with artificial photosynthesis, which uses solar-generated electricity to produce economically competitive transportation fuels from water and carbon dioxide. If this approach works, it would address two major energy challenges: climate change and dependence on foreign oil producers.

In the next ten years, given proper funding, such research projects could significantly improve our ability to convert solar energy into power and store it and to convert cellulosic biomass or algae into advanced transportation fuels efficiently. Combined, this would mean a genuine transformation of the energy sector.

The world can and will meet its energy challenges. But the transformation must start with a simple thought: it’s time to fix the wiring.

This article was originally published in McKinsey’s What Matters. Copyright (c) McKinsey & Company. All rights reserved. Reprinted with permission.

Our Renewable Future part 1: clearing “myth”conceptions

With Obama talking the talk on climate action in his State of the Union address yesterday, now seems a good time to start compiling a planned set of blog entries about renewable energy. Many, many others have done so online already (as evidenced by the fact that I’m linking to them!) but I’d like to communicate my cautiously nascent optimism in my own words.

I’m increasingly confident that I’ll live to see renewables dominate global electricity production as thoroughly as oil dominates global transport today, with immense and commensurate environmental benefits.

That moment won’t come a moment too soon, either, given the calamities that we’ve “locked in” for our children — the last time CO2 levels were this high (about 396 ppm in Jan 2013), sea levels were 25 metres higher than they are today.  The only reason sea levels remain near pre-industrial levels is that the earth’s systems haven’t had time to equilibrate yet.  To use a baseball analogy, we’re still in the first inning of seeing the effects of our emissions.

Now, when I talk about renewables, I mainly mean wind and solar, which tower over their cleantech cousins like redwoods over a meadow.  (While hydroelectric is renewable and dwarfs these two for now, it doesn’t get the sexy “cleantech” label, being a mature technology.)

But before explaining my new-found confidence — certainty, even — in “Our Renewable Future”, I wanted to address a few major myths, objections and misconceptions about renewable energy — the blogging equivalent of clearing the underbrush, I suppose.  :)

I’ll do so using a Q & A format based on the way John Cook at Skeptical Science addresses common myths about climate change.


The Black Swan’s Thanksgiving Turkey

(originally written Nov 24, 2011.  Part of Great Upload of 2013.)

It came to my attention that Nassim Nicholas Taleb, who authored The Black Swan (surprisingly, not about a ballet dancer, but about financial crises) discussed other avians in his book, among them the Thanksgiving turkey.  Per the Wikipedia page, he seems to’ve co-opted the idea from a turkey anecdote by philosopher Bertrand Russell, whose atheism doubtless led antagonists to brand him cuckoo.  ;)

The abrupt change in the turkey’s situation is part of an argument that it’s ridiculous to project present trends very far into the future, because, well, things change.  Hockey-wise, the Gretzky-led Edmonton Oilers of the 1980s inspired a high-scoring decade for the NHL.  This was followed by a low-scoring decade inflicted on fans by the New Jersey Devils’ success with the neutral-zone trap in 1994-1995.  (As per the viral video most of you’ve doubtless seen, the Tampa Bay Lightning are going retro with their 1-3-1 system.  Lightning GM Steve Yzerman was part of the Red Wings team the Devils upset in the 1995 Stanley Cup Finals.)
