To be clear:
On October 12th, the Nobel Peace Prize was awarded jointly to Al Gore and the Intergovernmental Panel on Climate Change (IPCC), half going to each. On one side of the scale sat Gore; on the other, the 2,500 researchers from 130 nations who worked for two decades to create, in the words of the committee, "an ever-broader informed consensus about the connection between human activities and global warming."
To be fair, Gore was not alone. Anyone who's seen An Inconvenient Truth will know he had his trusty Apple PowerBook running a very cool set of Keynote slides (and among the green, this may have done more for Apple's sales than the iPod. But that's not the point).
It’s not fair to blame the Nobel Peace Prize Committee for this ratio. The more frightening prospect is that they may be right.
America has decidedly tipped towards acknowledging climate change, exploring solutions, and taking action. Not all of it good, but getting off the pot should in itself count for something. Did we reach our tipping point because the scientists finally discovered climate change? No. That was 20 years ago. Instead, it was because the scientists and others (I'm talking to you, Al) finally convinced the rest of the world of this reality.
The more important question is why, in the eyes of the committee, 2,500 of the world's leading scientists could get only halfway towards making their ideas make a difference.
This is the more frightening prospect. Science as it’s currently practiced has its faults, but I know of no better way to advance our understanding of the world than the American university system. The problem is what we do with that understanding.
If a scientist discovers something startling in the rainforest and no one is there to hear it, does it make a contribution? If he or she publishes a paper in the Journal of Startling Rainforest Discoveries, and the 20 other scientists working in the same field read it, but no one else does, does it make a contribution?
America spends approximately $48B on university research, and yet so little of it makes a difference outside the individual fields in which the work is guided, conducted, peer-reviewed, and published. In other words, we have created a market (actually, many small, disconnected markets) where scientists answer only to other scientists in their fields, and nobody is held accountable for whether that research reaches a broader audience.
Our attempts to understand and affect climate change are at a scale that we, as a society, are not used to. Global cooperation and coordination on this scale has arguably only happened during the great wars. And then, the challenge of developing a shared understanding of what was happening, what was needed, and by whom, was easier. Not easy, but easier. Mobilizing the resources to mitigate climate change will require big bets. Bets that need to be well informed. Recent investments in hydrogen and ethanol are good examples of what happens when science gets taken for a ride (albeit a nicely funded ride) by special interests and politicians alike.
So what can we do? I am putting my money (or at least a lot of my time) on what I think is one of the few crucial levers we have: increasing the ability of scientists to make their research make a difference.
Frederick Terman, grandfather of Silicon Valley, once remarked (before transforming Stanford's School of Engineering into the powerhouse it is today) that the field of radio was dominated by businessmen who knew a little about science. He asked what would happen to the field if it were led by scientists who knew a little about business. Hewlett-Packard, Varian, and the many other businesses that emerged from the School of Engineering are examples of the impact scientists can have when they assume the mantle of leadership in business.
Until scientists understand that they have an obligation to see their research through to its impact on society, and have the skills and connections to do so, they will always equal 1/2,500th of a politician. With or without a PowerBook.