Modeling the costs of greenhouse gas reduction

Taking measures to reduce greenhouse gas emissions over the next decades raises fears among some that our economy would be adversely impacted. Of course, not taking measures, as the UK’s Stern Review argues, could be worse.

Nonetheless, some nice researchers at Yale conducted a meta-analysis of the current models for estimating the economic impacts of GHG reduction measures, identified the seven major assumptions that control 80% of the differences in estimates, and created a tool that allows anyone to plug in their own version of these assumptions.

You, too, can play economic advisor: http://www.climate.yale.edu/seeforyourself

Innovation, Invention, and Edison

A very nice article in the New York Times today about Edison and the ways in which technological history gets (re)written by the victors. Matt Richtel describes a recently discovered voice recording that predates Edison’s invention of the phonograph by nearly two decades:

Édouard-Léon Scott de Martinville has certainly been obscure, at least until now. Researchers say that in April 1860, the Parisian tinkerer used a device called a phonautograph to make visual recordings of a woman singing “Au Clair de la Lune.” That was 17 years before Thomas Edison received a patent for the phonograph, and 28 years before his technology was used to capture and play back a piece of a section of a Handel oratorio.

Of course, this follows a similar pattern with Edison’s light bulb, the patent for which was turned down as too similar to one filed some 35 years earlier (in 1845) by J.W. Starr.

Insightfully, Richtel recognizes that "Whom we credit with an invention often has less to do with who came up with an idea, and more to do with who translated it into something usable, accessible, commercial."  This is, after all, the definition of innovation: the exploitation of a novel idea. 

The danger in pursuing "inventors" is that, while historians might be interested solely in understanding the facts of what happened when, too many others are interested in replicating the feats of these "inventors." If all we care about is who came up with the idea first then we miss invaluable lessons about what it takes to translate that idea "into something usable, accessible, commercial," which is the bigger challenge.

The future of organizations

A friend recently asked me where I thought the future of organizations was heading.  It was one of those deceptively simple questions that few professors can resist.  And, like any good speculative sociologist, I think not in outcomes but in the opposing forces shaping those outcomes (letting me make forecasts without making predictions). My quick response hinged on how technology and organizations interact over time, and so seemed worth exploring further.

Two of the major technological forces shaping the future of organizations today are (1) the centralization of information (enabling the flow of decision-making, coordination, and control to the core) and (2) the decentralization of capability (enabling more potent actions at the periphery).

Organizations are increasingly able to channel information to the
very top levels. Those movie scenes in which White House politicians
watch, in real time, the actions of soldiers in the field are not
unrealistic.  Through the miracle of enterprise software, top
executives at retail chains, for example, can watch and respond to the
daily revenue numbers of individual stores; sales executives can track
daily progress of their salesforce.  Large organizations like Walmart
can now respond with a speed and flexibility unheard of for their size,
or for any size 10 years ago. 

At the same time, individuals and small teams at the periphery of these organizations can now take actions that control the fates of those organizations. Two traders, Michael Swenson and Josh Birnbaum, were able to save Goldman Sachs from the subprime meltdown (Goldman’s traders, not bosses, deserve credit). And Andrew Hall, a trader at Citigroup, led a small group (Phibro Corp.) that made big bets on energy prices and, by being right, generated 10% of Citigroup’s net income for 2007 ("Trader hits jackpot in oil"). Citigroup’s response reflects the dilemma posed by these opposing forces:

"Questions about the future of Phibro could add to the problems facing
Citigroup and its new CEO, Vikram Pandit. A sprawling company with
300,000 employees, Citigroup is trying to nurture entrepreneurial
talents like Mr. Hall, while curbing risk-taking elsewhere. The bank
can ill afford to lose top performers after a tough 2007, in which it
wrote off billions of dollars in failed mortgage bets." [by rogue traders, no doubt -ed]

And this is not always a good thing. Individuals and small groups at the periphery are also becoming capable of toppling those same firms. The Cavalese disaster, in which a single pilot flew recklessly low, severing a gondola cable and killing 20 civilians, triggered an international incident.  These actions, while tragic, seem to pale in comparison to business, where we’ve seen "rogue" (but nevertheless junior) traders like Nick Leeson and Jérôme Kerviel take down longstanding and well-respected financial institutions (Barings and Société Générale).  Increasingly, we’re seeing these demonstrations of the power and capability that has been created at the periphery of organizations today.

The Red Queen within

Between these opposing forces of centralization and decentralization we have, in essence, a red queen effect.  The Red Queen is named after a character in Lewis Carroll’s Through the Looking-Glass, who explains:

"It takes all the running you can do, to keep in the same place."

The Red Queen describes positive feedback loops in competitive
systems–where each side advances (sometimes very rapidly) in response
to each other’s advances. Or, as the poet Robinson Jeffers wrote:

"What but the wolf’s tooth whittled so fine
The fleet limbs of the antelope?
What but fear winged the birds, and hunger
Jewelled with such eyes the great goshawk’s head?"
                (Robinson Jeffers in "The Bloody Sire", 1941)

The flow of information and decision-making to the core and of capability to the periphery in many ways reflects a Red Queen effect–a set of interdependent evolutionary forces that, by interacting with each other, create rapid change in organizations.

The more capability flows to the periphery, the more effort will go into providing the core with control over those frighteningly large peripheral capabilities. Conversely, the more coordination and control at the core, the more the organization creates capability on the periphery. Control without the capacity for action is wasted, and action, by definition, lives on the periphery.

This Red Queen effect has forced large organizations to simultaneously centralize decision-making and decentralize raw power. And so with all the running they must do, they are in the same place–no more secure for all of their centralized control and distributed capabilities.

And so it seems that one of the most important lessons we can teach our next generation of leaders is to understand and manage what will become an increasingly unstable aspect of organizational life: the illusion of control over resources that can, on any given day, bring the entire organization to its knees.

2500 to 1

That’s the ratio: the number of scientists equal to one politician in the struggle to define and affect climate change. At least according to the Nobel Peace Prize committee.

To be clear:

2,500 scientists = 1 politician

On October 12th, the Nobel Peace Prize was awarded to both Gore and the Intergovernmental Panel on Climate Change (IPCC), half going to each.

Al Gore on one side of the scale and, on the other, the 2,500 researchers from 130 nations who worked for two decades to create,
in the words of
the committee, “an ever-broader informed consensus
about the connection between human activities and global warming.”

To be fair, Gore was not alone.  Anyone who’s seen An Inconvenient Truth will know he had his trusty Apple PowerBook running a very cool set of Keynote slides (and among the green, this may have done more for Apple’s sales than the iPod.  But that’s not the point).

It’s not fair to blame the Nobel Peace Prize Committee for this ratio.  The more frightening prospect is that they may be right.

America has made a decided tilt towards acknowledging climate change, exploring solutions, and taking action.  Not all of it good, but getting off the pot should in itself count for something. Did we reach our tipping point because the scientists finally discovered climate change?  No.  That was 20 years ago.  Instead, it was because the scientists and others (I’m talking to you, Al) finally convinced the rest of the world of this reality.

The more important question is why, in the eyes of the committee, could 2,500 of the leading scientists only get halfway towards making their ideas make a difference?

This is the more frightening prospect.  Science as it’s currently practiced has its faults, but I know of no better way to advance our understanding of the world than the American university system.  The problem is what we do with that understanding.

If a scientist discovers something startling in the rainforest and no one is there to hear it, does it make a contribution? If he or she publishes a paper in the Journal of Startling Rainforest Discoveries, and the other 20 scientists working in the same field read it, but no others, does it make a contribution?

America spends approximately $48B on university research, and yet so little of it makes a difference outside of the individual fields in which the work is guided, conducted, peer-reviewed, and published. In other words, we have created a market (actually many small, disconnected markets) where scientists answer only to other scientists in their fields and nobody is held accountable for whether that research reaches a broader audience.

Our attempts to understand and affect climate change are at a scale that we, as a society, are not used to. Global cooperation and coordination at this scale has arguably only happened during the great wars.  And then, the challenge of developing a shared understanding of what was happening, what was needed, and by whom, was easier. Not easy, but easier.  Mobilizing the resources to mitigate climate change will require big bets.  Bets that need to be well-informed.  Recent investments in hydrogen and ethanol are good examples of what happens when science gets taken for a ride (albeit a nicely funded ride) by special interests and politicians alike.

So what can we do?  I am putting my money (or at least a lot of my time) on what I think is one of the few crucial levers we have: increasing the ability of scientists to make their research make a difference.

Frederick Terman, grandfather of Silicon Valley, once remarked (before transforming Stanford’s School of Engineering into the powerhouse it is today) that the field of radio was dominated by businessmen who knew a little about science.  He asked what would happen to the field if it were led by scientists who knew a little bit about business.  Hewlett-Packard, Varian, and many other businesses that emerged from the School of Engineering are examples of the impact that scientists can have when they assume the mantle of leadership in business.

Until scientists understand that they have an obligation to see their research through to its impact on society, and have the skills and connections to do so, they will always equal 1/2500th of a politician. With or without a PowerBook.

Hey, I’ve got an idea…

Great advice by Casual Fridays on the micro-details, boots-on-the-ground, life-in-the-trenches reality of innovation–the act of actually mentioning your idea to others for the first time: 7 Reasons No One Likes Your Ideas. For example:

1. You took a leap, but didn’t build a bridge.
Our minds wander down paths and make leaps from one idea to the next very quickly. Your idea makes perfect sense to you because of the path you followed internally. If you don’t take everyone else down that path, it probably won’t make sense to them.

and

5. You tossed an egg instead of a bird.
You tossed it out there too early. Given time, it would have flown…

Should be required reading for anyone involved in anything creative.

Racing down the hydrogen highway…

Today’s WSJ charts the recent decline of ethanol’s prospects and suggests the business press, and mass media, have removed from the bio-fuel its most-favored-panacea status ("Ethanol Craze Cools As Doubts Multiply").

Gone are the in-depth articles charting the return of the family farm and the fall of the house of Saud.  The nay-sayers (who have been saying nay all along) now get the attention:

A recent study by the Organization for Economic Cooperation and Development concluded that biofuels "offer a cure [for oil dependence] that is worse than the disease." A National Academy of Sciences study said corn-based ethanol could strain water supplies. The American Lung Association expressed concern about a form of air pollution from burning ethanol in gasoline. Political cartoonists have taken to skewering the fuel for raising the price of food to the world’s poor.

From over here on the science side of the debate, there has been little doubt that corn-based ethanol was not ready for prime time: its energy balance (the energy needed to produce ethanol relative to the energy gained from its production) was within debating distance of zero. The only advantages of corn-based ethanol were a $0.51 tax credit for every gallon used and a $0.54 tariff on every gallon imported.

The hope for scientists, though, was that enough investments in corn-based solutions would spill over and advance the (more promising but still immature) cellulosic ethanol. While this has been true recently, it comes at a potentially serious cost in the long run.

Should corn-based ethanol lose its status as the technological cure for our energy and climate change woes, it could fall pretty hard.  Heard much from hydrogen lately?  In 2003, Bush proposed spending $1.2 billion to fund research in hydrogen. In 2004, California’s Governor Schwarzenegger announced:

I am going to encourage the building of a hydrogen highway to take us to the environmental future… I intend to show the world that economic growth and the environment can coexist. And if you want to see it, then come to California….

And Senate Bill 1505, signed in early 2007, turned this vision into a statute.  Hydrogen has since lost much of its luster, along with much of its research funding…perhaps when politicians realized that ethanol promised to cure the same woes while also appealing to Iowa primary voters. But that’s another story.

What interests me is the question of what happens when good technologies go bad–when promising technologies are brought to market prematurely, with too many promises made and too few kept. It happens in countless start-ups, when emerging technologies turn out to need twice (or more) the development time their business plans promised, and in large organizations, when the demands of Wall Street make it too tempting to accelerate the next-generation technology.

When the inevitable disappointment comes, the technology becomes a pariah–outcast and shunned. Unfortunately, the scientists and engineers who worked their tails off trying to deliver on the unrealistic promises usually get hit the hardest: "There goes ol’ Burt–he worked on the Newton project. Hasn’t been the same since." And another promising technology is set back decades (and the generation who pioneered it is lost) for no other reason than that very promise.

Perhaps the biggest tragedies happen on the national stage–when new technologies move from the spotlight to the scrap heap because they failed to live up to the unrealistic promises of a few scientists, investors, or politicians. Worse when so many others, urging caution, were ignored.   

The energy revolution, in perspective

Few would argue anymore that we need a revolution in the ways we produce and consume energy–both for global security and climate change. And there have been plenty of calls for a Moonshot or Manhattan Project that would solve the problem (e.g., the ethanol mousetrap).  But two very critical and very sobering facts of life must be faced when talking about innovations in energy.

First, that no single solution will save the day. Princeton scientists Pacala and Socolow crystallized this discussion with their framework of "Stabilization Wedges" and their calculations of how much impact could be expected from changes in existing energy technologies. In short: no one technological innovation will account for the complete solution. Indeed, the authors identify 15 independent technological regimes that could and should be addressed.

Second, and more sobering, is that any one solution faces astounding resistance.  Recent news brought another example of just how difficult change can be in established systems.  On Nov. 14th, Con Ed cut the last line of Edison’s original Pearl Street Station network, opened Sept. 4th, 1882.

The last snip of Con Ed’s direct current system will take place at 10 East 40th Street, near the Mid-Manhattan Library. That building, like the thousands of other direct current users that have been transitioned over the last several years, now has a converter installed on the premises that can take alternating electricity from the Con Ed power grid and adapt it on premises. Until now, Con Edison had been converting alternating to direct current for the customers who needed it — old buildings on the Upper East Side and Upper West Side that used direct current for their elevators for example. The subway, which has its own converters, also provides direct current through its third rail, in large part because direct current electricity was the dominant system in New York City when the subway first developed out of the early trolley cars. 

Edison’s Direct Current (DC) system was dethroned within a decade of its introduction by Westinghouse’s (and Tesla’s) Alternating Current (AC) system and yet, here we are, 125 years later, finally and literally pulling the plug on that original system. Granted, it’s for a small area of New York City.  But if it was such a small area–why did it take so long to make the change?

The entrepreneurs and venture capitalists of the Silicon Valley are turning their attention to energy and climate change with the full intent of revolutionizing those sectors with the same modus operandi that enabled them to lead the information revolution.   But the circumstances are quite different.  Energy is a brownfield–the installed systems are as difficult to resect from existing physical infrastructure (buildings, homes, and automobiles) as they are from the political infrastructure (from municipalities, states, and Washington). 

We may need revolutionary new technologies to save us from our old ones, but we also need revolutionary new ways of changing. The revolution, if it comes, will come by changing the way we change.

Creativity versus efficiency, part 2

A recent United non-flight, which left me stranded and scrambling in O’Hare, made it clear to me how dangerous the distracting debate between managing for creativity and managing for efficiency can be. I have seen the enemy of efficiency, and it is efficiency.

I wrote earlier about the illusory, often unnecessary, tension between managing for creativity and managing for efficiency. As I said there, "our obsession with the tension between the wild and crazy side of innovation and the button-downed nature of ongoing operations is distracting us from one of the more real problems in managing innovation."

The experience: I arrived in Chicago with plenty of time to catch my final leg home to Sacramento. But in the time between deplaning and getting to the gate, UAL had canceled the flight. Not for bad weather or for lack of planes, but because the flight crew had been stranded in Louisville by bad weather six hours earlier, in another part of the country. The only remaining direct flight home had over 140 standbys. I re-routed and got the very last seat to LAX, connecting to Sac, and home by 1:30am.  Only 4 hours late.  I was lucky–the next available seat out of O’Hare on United was 24 hours later.

It’s easy to share UAL horror stories, but in this case, the point is actually that creativity is not the enemy of efficiency.  Efficiency is the enemy of efficiency. 

By that, I mean the pursuit of efficiency.  The tiny but relentless accumulation of little improvements in efficiency (each one a creative act on someone’s part) creates an organization that, while efficient, is no longer safe from even small disruptions in its operating environment (whether externally or internally generated). The pursuit of efficiency can move unnoticed right past effective and into something you might call hyperefficient–which sounds good but in medical parlance is a pathology. 

United is not alone, though I would argue they are leading in this category. From a 7/5/07 NYT article:

As anyone who has flown recently can probably tell you, delays are getting worse this year. The on-time performance of airlines has reached an all-time low, but even the official numbers do not begin to capture the severity of the problem.

That is because these statistics track how late airplanes are, not how late passengers are. The longest delays — those resulting from missed connections and canceled flights — involve sitting around for hours or even days in airports and hotels and do not officially get counted.

Efficiency leads, ultimately, to a system in which the output of every step is tightly coupled to the inputs of the next steps.  There is no wasted time or material–no slack.  Tightly coupled systems tend to fail catastrophically, and there is a long and very good literature on this topic (see, for example, Herbert Simon’s The Sciences of the Artificial and Charles Perrow’s Normal Accidents).

Henry Ford learned this first hand.  He spent roughly 7 years developing mass production and then another 10 perfecting it.  In the process, Ford built such a tightly coupled factory that, when GM and others demonstrated the market for variability in car makes and models, Ford could not respond.  Changing the design of even a single bumper created ripples all up and down the line. When Ford finally did change, 5 years later, all manufacturing operations were shut down for six months–laying off 75,000 men–while Ford engineers worked on a new production line. The Ford Motor Company never regained its dominance in the market.

United, like so many other carriers, has built the aerial equivalent of Ford’s River Rouge plant–systems so tightly coupled that even the smallest flight delays in one corner of the country are felt by travelers everywhere else.

But sitting in O’Hare, I was also reminded of a manufacturing systems lecture that explains why I really hate United. It’s not simply because they have become too tightly coupled to respond effectively to small disruptions. It’s because they have, in the pursuit of efficiency, consciously and completely forsaken the customer. 

The theory is known as Little’s Law, and states:

inventory = throughput x flow time

That is, for a given throughput, the average flow time in a production system is proportional to the average inventory (work-in-process) in the system.

Little’s law is one of the handful of relationship laws that every manager should know. It’s also very useful in understanding the pitfalls of hyperefficiency.

One of the key insights from Little’s law is that, for an organization to achieve maximum utilization of its equipment, it must make sure there is never a shortage of work-in-process queued up and ready to go. I took manufacturing systems from Mike Harrison at Stanford’s GSB, and so I am short-changing everyone on what was a very enlightening and dramatic lecture. It climaxed with the epiphany (for me, at least) that as you approach 100% utilization of your equipment, your inventory levels must approach infinity (conversely, if you reduce your inventory to zero, your capacity utilization also drops to zero).

To put this in terms a United passenger would care about:  the more United tries to cut costs by increasing the utilization of its processing equipment (planes and flight crews), the more it must allow its work-in-process inventory (passengers waiting for planes) to approach infinity. As UAL maximizes the efficiency of its capital, it willingly sacrifices the passenger experience.
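To make the blow-up concrete, here is a minimal sketch (my own illustration, not from Harrison’s lecture) using the textbook M/M/1 queueing formula, in which average inventory equals rho / (1 - rho) at utilization rho:

```python
# Little's law: inventory = throughput x flow time.
# For the textbook M/M/1 queue (random arrivals, a single server),
# average inventory in the system is L = rho / (1 - rho), where
# rho is utilization. Watch what happens as rho approaches 100%.

def avg_inventory(utilization: float) -> float:
    """Average work-in-process in an M/M/1 system at a given utilization."""
    if not 0.0 <= utilization < 1.0:
        raise ValueError("utilization must be in [0, 1)")
    return utilization / (1.0 - utilization)

for rho in (0.50, 0.90, 0.99, 0.999):
    print(f"utilization {rho:.1%} -> average inventory {avg_inventory(rho):.0f}")
```

At 50% utilization there is, on average, one passenger-equivalent in the system; at 99.9% the average queue is roughly a thousand. The specific formula assumes Poisson arrivals and exponential service times, but the direction of the blow-up holds far more generally.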

For United, like Ford and so many others, investments in increased efficiency become, in the end, more threatening to current efficiencies than any tolerance for creativity.

On the virtues of qualitative research

I once heard Carly Fiorina speak, back in 2001, when she was CEO of Hewlett-Packard.  She said something that made it immediately clear to me she was not the right person for the job.  She was touting HP’s new strategy of innovation and she boasted of her early impact on the organization:

"We will continue to invent. We are now the number three generator of patents in the world. We generate five patents a day.”

In 2001, the number of patents filed at HP more than doubled, reaching 5,000 (HP Press Release). Patents are an easy measure of a firm’s innovative capability.  And by easy, I do not mean accurate, useful, or even safe.  Just easy.  And anyone who would pick that measure has little understanding of the process of innovation. 

I was in Philadelphia last week to give a (brief) talk on the virtues of qualitative research to junior faculty in the field of management. In preparing, I rediscovered a variety of passages from various renowned researchers and thought I would share them. Apologies to all who couldn’t care less about how theories of business are made, but Carly, for one, might have benefited from a better understanding.

To begin with, qualitative research is typically juxtaposed to quantitative research and so a brief comparison is in order.  (Over)simply put, quantitative research measures things and uses statistics to find relationships between those measures. The methods for doing so allow scientists to predict relationships between changes in inputs and changes in outputs, and are widely applicable and often very useful (e.g., more carbon in the atmosphere correlates with increased global average temperature; more casual touching by the waiter correlates with more tipping by the diner). More importantly, quantitative research allows scientists to test theories about relationships between inputs and outputs across a range of similar situations.  If you had a theory about casual touching and tipping, for example, how would you test it? Across diners and/or across the waitstaff.
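As a toy illustration of that quantitative logic, one could measure both variables across diners and compute a simple correlation (the data below are entirely invented for the example):

```python
# A toy version of the quantitative approach described above:
# measure two variables across diners and look for a relationship.
# All numbers here are made up purely for illustration.

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length samples."""
    n = len(xs)
    mean_x, mean_y = sum(xs) / n, sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    var_x = sum((x - mean_x) ** 2 for x in xs)
    var_y = sum((y - mean_y) ** 2 for y in ys)
    return cov / (var_x * var_y) ** 0.5

touches = [0, 0, 1, 1, 2, 2, 3, 3]          # casual touches per meal (made up)
tips    = [12, 14, 14, 15, 16, 15, 18, 17]  # tip percentage (made up)
print(f"r = {pearson_r(touches, tips):.2f}")
```

A high r across many diners would support the theory; the point is that the same two measures are taken in every situation, which is exactly what qualitative research, below, does not do.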

Qualitative research, on the other hand, usually does not attempt to measure the same variables across a range of situations; rather, it looks for how new variables or new relationships can be found within a single situation–variables and relationships that nobody has yet identified and studied.

As Einstein and so many others have been credited with saying:

"Not all that can be measured should be measured and not all that should be measured can be measured"

There is a great deal of value to be had in measuring our world–and a great deal of value in continually questioning the methods and results obtained by our current measures. 

Clifford Geertz, one of my particular heroes in this field, wrote a wonderful piece entitled "Thick Description" which compared the thin descriptions of measurements with the thick description of context and meaning that qualitative research can provide in any given situation.

Geertz’s example remains one of the best.  From a purely physiological perspective, a wink is the contraction of the muscles of a single eye that cause the eyelid to close.  So, of course, is a twitch.  And so is a slow-motion, exaggerated parody of a wink; a fast-motion parody of a twitch; and any number of parodies of parodies of twitches and winks that a group of 3rd-grade boys sitting in the back row might engage in to amuse one another on a spring afternoon.

As Geertz says, "the difference between a twitch and a wink is vast." And any measure of the interactions that include and are driven by these twitches and winks is bound to measure the wrong things and fail to measure the right ones.    

And so the roles of quantitative and qualitative research are complementary. If you are studying biology or chemistry, there might be a diminished role for qualitative research–there is little in the meaning and context of plant interaction that cannot be measured relatively easily.  Or maybe not.  John Steinbeck would argue even here it is critical to continually question the value of what we’re measuring and why:

“The Mexican sierra has “XVII-15-IX” spines in the dorsal fin.  These can easily be counted. But if the sierra strikes hard on the line so that our hands are burned, if the fish sounds and nearly escapes and finally comes in over the rail, his colors pulsing and his tail beating the air, a whole new relational externality has come into being—an entity which is more than the sum of the fish plus the fisherman. The only way to count the spines of the sierra unaffected by this second relational reality is to sit in a laboratory, open an evil-smelling jar, remove the stiff colorless fish from formalin solution, count the spines, write the truth “D. XVII-15-IX.” There you have recorded a reality which cannot be assailed—probably the least important reality concerning either the fish or yourself.”

And when you’re studying people–and people in interactions–the role of qualitative research is more critical. 

Indeed, one of the first quantitative studies of people in organizations attempted to measure the effects of lighting conditions on people’s productivity.  The researchers isolated a control group of workers and turned the lights up–and productivity went up.  Then they turned the lights down–and productivity went up again. They were clearly affecting, but not measuring, the right something. So the researchers talked to the workers and uncovered a relationship much more important than lighting levels: the Hawthorne Effect.

People were responding to the experiment–the attention, the excitement, the changes–in ways that made them more willing to work hard.  This finding led to a great many measurable variables–previously ignored–about worker morale and motivation.

This is how qualitative research can make a contribution: by identifying and describing the meanings that people have of themselves and of the situation.  These are the meanings that drive their responses to changes in their environment.  To quote Geertz again:

Man is an animal suspended in webs of significance he himself has spun, I take culture to be those webs, and the analysis of it to be therefore not an experimental science in search of law but an interpretive one in search of meaning.

Qualitative research is, at its heart, an attempt to understand how people (or fish) interpret their reality and as a result make it. Anyone who has both looked at manufacturing statistics and wandered the factory floor knows that you can learn a lot by watching and talking to the workers about their work and their lives.

And so, when you decide you want your company to be more innovative–and you decide to reward those who are "innovative"–you need to be very careful how you are measuring innovation. As the WSJ describes:

What [Carly] Fiorina doesn’t mention is why the number of patents skyrocketed. Much of it had to do with a program put in place in 1999 to get HP into the top 10 patent producers. It relied on paying engineers for each new possible filing. At the time, it was $175 for a basic "invention disclosure," $1,750 if it became a patent application, and another chunk of cash and a plaque for an actual patent.  One engineer, Shell Simpson, nearly tripled his salary by working weekends in the first year by filing 120 disclosures and 70 patent applications–at one point taking two weeks off to work on patents full-time.

So when Carly Fiorina boasted in 2001 about the number of new patent filings at HP, she recorded a reality which cannot be assailed–probably the least important reality concerning either the engineers at HP or herself.