Solar’s development – stalled by Reagan in the US and by underinvestment in Europe – offers lessons for today’s policymakers in other technologies
If there’s a good-news story on climate change right now, one to counteract the grim daily scroll of reports about wildfires, droughts, dying coral reefs, and deadly heatwaves, it’s the stunning rise of solar power.
With cheap panels now springing up on Texas ranches, Chinese lakes, and German balconies, graphs showing solar’s takeoff resemble the “hockey stick” charts used by Al Gore to warn about global warming in the first place. Under the right conditions, solar is now “the cheapest source of electricity in history”, the International Energy Agency (IEA) has declared.
Trend lines point to an astonishing shift in the world’s energy mix. A recent report by the Economist – a solar sceptic only a decade ago – predicts solar will be the biggest global source of electricity by the mid-2030s, and possibly the biggest source of energy overall a decade later.
On one level, this is a success story – a technology riding to the rescue as the world smoulders. But delve deeper into the history of solar, and a much more unsettling picture emerges.
Even in the 1970s, some policymakers knew solar was on track to become competitive with fossil fuels, given enough investment.