Paris, climate and surrealism

27 07 2017

Speaker: Prof. Kevin Anderson, Professor of Energy and Climate Change

Title: Paris, climate and surrealism: how numbers reveal an alternate reality

The Paris Agreement’s inclusion of “well below 2°C” and “pursue … 1.5°C” has catalysed fervent activity amongst many within the scientific community keen to understand what this more ambitious objective implies for mitigation. However, this activity has demonstrated little in the way of plurality of responses. Instead there remains an almost exclusive focus on how future ‘negative emissions technologies’ (NETs) may offer a beguiling and almost free “get out of jail free” card.
This presentation argues that such a dominant focus reveals an endemic bias across much of the academic climate change community determined to voice a politically palatable framing of the mitigation landscape – almost regardless of scientific credibility. The inclusion of carbon budgets within the IPCC’s latest report reveals just how few years remain within which to meet even the “well below 2°C” objective.

Even making optimistic assumptions about the rapid cessation of deforestation and the uptake of carbon capture technologies in cement and steel production, there is an urgent need to accelerate the transformation of the energy system away from fossil fuels, completing it by the mid-2030s in the wealthier nations and by 2050 globally. To put this in context, the national mitigation pledges submitted to Paris see an ongoing rise in emissions until 2030 and are not scheduled to undergo major review until 2023 – eight years, or 300 billion tonnes of CO2, after the Paris Agreement.
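A quick back-of-envelope check of that last figure, assuming global emissions hold near roughly 37.5 GtCO2 per year (an assumed rate, not one given in the abstract):

```python
# Back-of-envelope check of the "300 billion tonnes" figure, assuming
# global emissions hold near recent levels (~37.5 GtCO2/yr is an
# assumption here, not a number from the talk).
annual_emissions_gtco2 = 37.5   # assumed global emissions per year
years_to_review = 2023 - 2015   # Paris Agreement to first major review

cumulative_gtco2 = annual_emissions_gtco2 * years_to_review
print(f"~{cumulative_gtco2:.0f} GtCO2 over {years_to_review} years")  # ~300
```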

Despite the enormity and urgency of the 1.5°C and “well below 2°C” mitigation challenges, the academic community has barely considered delivering deep and early reductions in emissions through the rapid penetration of existing end-use technologies and profound social change. At best it dismisses such options as too expensive compared to the discounted future costs of a technology that does not yet exist. At worst, it has simply been unprepared to countenance approaches that risk destabilising the political hegemony.

Ignoring such sensibilities, the presentation concludes with a draft vision of what an alternative mitigation agenda may comprise.





Our Aversion to Doom and Gloom Is Dooming Us

20 07 2017

Reproduced from Common Dreams.

I worked for over 35 years in the environmental field, and one of the central debates I encountered was whether to “tell it like it is,” and risk spreading doom and gloom, or to focus on a more optimistic message, even when optimism wasn’t necessarily warranted.

The optimists nearly always won this debate. For the record, I was—and am—a doom and gloomer.  Actually, I like to think I’m a realist. I believe that understating the problems we face leads to understated—and inadequate responses.  I also believe that people, when dealt with honestly, have responded magnificently, and will do so again, if and when called. Witness World War II, for example, when Churchill told the Brits, “I have nothing to offer but blood, toil, tears, and sweat.” In those words, he helped ignite one of the most noble and dedicated periods of unity and resistance in all the annals of human endeavor.

Finally, I believe that the principles of risk management dictate that when the consequences of our actions —or our inactions—are pervasive, long lasting, irreversible and potentially devastating, we should assume worst-case outcomes.  That’s why people get health insurance; it’s why they purchase insurance for their homes; it’s why they get life insurance. No one assumes they’ll get sick, that their house will burn down, or that they’re about to die, but it makes sense to hedge against these events.  It’s why we build in huge margins of safety when we design bridges or airplanes. You can’t undo an airplane crash, or reverse a bridge failure.

And you can’t restore a livable climate once it’s been compromised.  Not in anything other than geologic timeframes.

Yet we routinely understate the threat that climate change poses, and reject attempts to characterize the full extent of the potential catastrophe. And it’s killing us.

David Wallace-Wells’ recent article in New York magazine, The Uninhabitable Earth, is a case in point. It was an attempt to describe the worst-case scenario for climate change. Here are the opening sentences, to give you an idea of what Mr. Wallace-Wells had to say:

It is, I promise, worse than you think. If your anxiety about global warming is dominated by fears of sea-level rise, you are barely scratching the surface of what terrors are possible, even within the lifetime of a teenager today. 

Predictably, a large part of the scientific community reacted with hostility, and environmentalists were essentially silent. For example, Climate Feedback published a critique of Wallace-Wells’s article by sixteen climate scientists, led by Michael Mann, originator of the famous hockey stick graph, which showed how rapidly the Earth has been warming. Here’s part of what Dr. Mann had to say:

The evidence that climate change is a serious problem that we must contend with now, is overwhelming on its own. There is no need to overstate the evidence, particularly when it feeds a paralyzing narrative of doom and hopelessness.

The last part of Dr. Mann’s statement may explain the real reason the environmental and scientific communities reacted so hostilely to Wallace-Wells’s article, and why they generally avoid gloom and doom even when the news is gloomy: the notion that presenting information detailing just how bad climate change could be leads to “paralysis.”

This, together with scientists’ tendency to stick to the most defensible positions and the scenarios accepted by the mainstream—what climate scientist James Hansen calls dangerous scientific reticence—probably explains why the scientific community has tended to understate the threat of climate change, although few would describe Dr. Mann as reticent.

And it should be noted that Mr. Wallace-Wells did overstate some of the science. For example, given our current understanding of methane and carbon releases from permafrost, it appears this feedback would take much longer to play out than Wallace-Wells suggested, although it likely would add as much as 2°C to projected warming by 2100. But for the most part, he simply took worst-case forecasts and used them. As Dr. Benjamin Horton—one of the scientists commenting on the Wallace-Wells article—put it, “Most statements in the article are based on peer-reviewed literature.”

One of the reasons worst-case projections seem so dire is that the scientific community—and especially the IPCC—has been loath to use them. For the record, ex-post comparisons of previous forecasts with actual changes show trends that are nearer to—or worse than—the worst-case forecasts than to the mid-range.

The article also forecast some of the social, demographic, and security consequences of climate change that can’t be scientifically verified, but which comport with projections made by our own national security experts.

For example, in this year’s Worldwide Threat Assessment of the US Intelligence Community, climate change was identified as a “threat multiplier,” and Dan Coats, Director of National Intelligence, said in testimony presented to the Senate Select Committee on Intelligence in May of this year:

Climate change influences the entire geostrategic landscape. In that sense, one could walk through the entire threat assessment report and identify ways in which climate change will intersect with nearly every risk identified, and in most cases, make them worse.

Director Coats specifically highlighted health security, terrorism and nuclear proliferation as threats that climate change would exacerbate. This is coming from the Trump administration, which has been censoring climate-related information coming out of NOAA and EPA.  It’s a measure of how seriously the national security community takes the threat of climate change that they fought to keep the issue above the political fray.

Yet here again, the scientific community took issue with these claims, because they were conjecture.  Never mind that those whose job it is to assess these kinds of risks found the forecasts likely and actionable. Scientists want data and the certainty it brings, not extrapolation.

So what’s the gap between worst-case projections and the mid-range projections the media and scientists typically favor? It’s huge, and consequential. I’ve pointed out some of the risky—if not absurd—assumptions underlying the Paris Agreement in the past, but let’s briefly outline some numbers that highlight the difference between what’s typically discussed in the media and projections based on worst-case—but entirely plausible—forecasts.

After Paris, there was a lot of attention paid to two targets: a limit of less than 2°C warming, and a more aggressive limit of no more than 1.5°C warming. What was less well known and discussed was the fact that the Agreement would have limited warming only to about 3.5°C by 2100, and that is using the IPCC’s somewhat optimistic assumptions.

What is virtually unknown by most of the public and undiscussed by scientists and the media is that even before the US dropped out of the Treaty, the worst-case temperature increase under the Treaty could have been nearly twice that.

Here’s why.

As noted, the 3.5°C figure had a number of optimistic assumptions built into it, including a 34 percent chance that warming will exceed that figure, and the idea that we could pass the problem on to our children and their children by assuming that they would create an as yet unknown technology that would extract massive amounts of carbon from the atmosphere in a cost-effective way, and safely and permanently sequester it, thus allowing us to exceed the targets for a limited amount of time.

But the fact is, some projections found that the temperature increase resulting from meeting the Paris targets would exceed 4°C by 2100, even if we continued to make modest progress after meeting them – something the Treaty doesn’t require. The IPCC forecasts also ignored feedbacks, and research shows that just three of these could add another 2.5°C of warming by 2100, bringing the total to more than 6.5°C (or nearly 12°F). At this point, we’re talking about trying to live on an essentially alien planet.

Finally, there’s evidence that the Earth’s natural sinks are being compromised by the warming that’s happened so far, which means that more of what we emit will remain in the atmosphere, causing it to warm much more than the IPCC models have forecast. This could (not would) make Wallace-Wells’s thesis not only plausible, but likely.

But rather than discussing these entirely plausible forecasts, the media, environmentalists, and too many scientists would rather focus on a more optimistic message and avoid “doom and gloom.”

What they’re actually doing is tantamount to playing Russian roulette with our children’s future, with two bullets in the chamber. Yes, the odds are that the gun won’t go off, but is this the kind of risk we should be taking?

There is something paternalistic and elitist about this desire to spare the poor ignorant masses the gory details. It is condescending at best, self-defeating at worst. After all, if the full nature of the challenge we face is not known, we cannot expect people to take the measures needed to meet it.

I believe now, and I have always believed, that humans are possessed of an inherent wisdom, and that, given the right information, they will make the right choices.

As an aside, Trump is now President because the Democrats followed the elitist and paternalistic path of not trusting the people – that and their decision to put corporate interests above the interests of citizens.

Watching Sanders stump against the Republicans’ immoral tax cut for the rich disguised as a health care bill shows the power of a little honest doom and gloom.

We could use a lot more of it across the political spectrum.

John Atcheson

John Atcheson is author of the novel, A Being Darkly Wise, and he has just completed a book on the 2016 elections titled, WTF, America? How the US Went Off the Rails and How to Get It Back on Track. It is available in hardcover now, and the ebook will be available shortly. Follow him on Twitter: @john_atcheson





Another silver bullet bites the dust….

10 10 2016

A recent article in the Guardian explains why scientists now believe that soil’s potential to soak up climate-changing carbon dioxide has been overestimated by as much as 40%….

Hopes that large amounts of planet-warming carbon dioxide could be buried in soils appear to be grossly misplaced, with new research finding that the ground will soak up far less carbon over the coming century than previously thought.

Radiocarbon dating of soils, when combined with previous models of carbon uptake, has shown the widely assumed potential for carbon sequestration to combat climate change has been overestimated by as much as 40%.

Scientists from the University of California, Irvine (UCI) found that models used by the UN’s Intergovernmental Panel on Climate Change (IPCC) assume a much faster cycling of carbon through soils than is actually the case. Data from 157 soil samples taken from around the world show the average age of soil carbon is more than six times older than previously thought.


Mark Cochrane

Mark Cochrane, our resident climate scientist, recently picked up on this at Chris Martenson’s Peak Prosperity blog and wrote the following:

The article points again to the problems with global models of climate change. Those who complain about ‘models’ usually do so to imply that the models are wrong and are therefore overstating climate change. The fact of the matter is that although they are ‘wrong’, the errors, in principle, are just as likely to understate as overstate the situation. In reality, the science tends to be conservative: scientists are usually constrained to using what is statistically defensible for many of the parameters within their models, so the likelihood of understating known issues (e.g. ice sheet collapses) is greater than that of substantially overstating them. This is why the vast majority of new findings point out that climate change is progressing faster than we have been estimating.

The famous quote by George Box, “All models are wrong, but some are useful”, nicely sums up the state of things. Much of what we do in this world is based on our internal modeling, some of which is of high accuracy (“the sun comes up every morning”) and some of which is rather less so (“I’m a safe driver, so driving is not risky”). Weather models are notoriously inaccurate but we find quite a lot of utility in consulting them anyway. They may not be absolutely ‘right’ but they are usually reasonably close to the ultimate conditions. Climate models have multitudinous components, but their ultimate function basically boils down to calculating the balance between sources and sinks of carbon in the atmosphere, then estimating the ramifications of the net changes in types and amounts of the so-called greenhouse gases.

Sources are emissions from things like burning fossil fuels, plus positive feedbacks like melting permafrost, which releases to the atmosphere a portion of a carbon stock that has literally been frozen in place for millennia. Sinks are things like ocean uptake of carbon as higher atmospheric carbon dioxide concentrations force the gas into the water, as occurs in your soda bottle or beer can. Negative feedbacks are those that ultimately bring the system back into balance after excessive emissions; they include plants soaking up carbon and ultimately depositing some of it for long-term storage in soils, as well as the transformation of silicate rocks to carbonate rocks as mountains erode and deposit sediments into the sea, soaking up atmospheric carbon in the process.

As I have mentioned before, the existence of a positive or negative feedback is only part of the story; we also need to know the rate at which it proceeds and ultimately how long it might continue. If you put a match to a high concentration of an explosive gas (say hydrogen), the positive feedback of a few molecules transferring their released energy to neighbouring molecules proceeds very rapidly, but not for very long before the process runs its course in the explosion. On the other hand, eroding the Himalayan mountains down to sea level will soak up immense amounts of carbon dioxide from the atmosphere but will take millions of years to accomplish.

All of which provides context for what the He et al. (2016) paper is saying. ‘Soil carbon’ is a catch-all term for the many chemical compounds in soils that contain carbon; it makes up the ‘organic’ component of soils. If you are modeling the rate at which carbon can get soaked up by soils, you need to know the processes involved and calibrate them using parameters that balance the rates at which carbon enters and leaves the soil. What the new research shows is that current Earth System Models (ESMs – components of Global Climate Models, GCMs) underestimate the age of the organic materials (carbon) in existing soils, which effectively means they overestimate the rate at which carbon is likely to be sequestered through plant growth and soil formation in the future. The upshot is that the models currently estimate that soils will soak up potentially twice as much carbon between now and 2100 as seems likely. If the carbon isn’t getting soaked up, it could pile up in the atmosphere for longer than presently estimated and act to warm the planet more than currently projected.
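To see why older carbon implies less future uptake, here is a minimal one-pool sketch of the calibration logic described above. The stock, turnover times, and input boost are illustrative assumptions of mine, not values from He et al.; only the direction of the effect is the point.

```python
import math

def extra_soil_storage(stock_gtc, turnover_yr, input_boost, horizon_yr):
    """One-pool soil carbon sketch. The input flux is calibrated so the
    observed stock is at steady state (input = stock / turnover); a
    fractional boost to that input then accumulates as
        delta_C(t) = boost * stock * (1 - exp(-t / turnover)).
    Slower cycling (older carbon) therefore means less uptake by 2100.
    The exact ratio depends on the chosen numbers; the direction doesn't."""
    return input_boost * stock_gtc * (1 - math.exp(-horizon_yr / turnover_yr))

STOCK = 1500.0  # GtC, rough global soil carbon stock
BOOST = 0.1     # hypothetical 10% rise in carbon input from plant growth

fast = extra_soil_storage(STOCK, turnover_yr=50, input_boost=BOOST, horizon_yr=85)
slow = extra_soil_storage(STOCK, turnover_yr=300, input_boost=BOOST, horizon_yr=85)
print(f"fast cycling (model assumption): ~{fast:.0f} GtC stored by 2100")
print(f"slow cycling (radiocarbon ages): ~{slow:.0f} GtC stored by 2100")
```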

As in all scientific matters, these results will be tested by other scientists and be verified, refuted, or refined. So what does it signify if this is correct? The soil component is only one pool among many, but the net fluxes are what matter in the climate situation. For example:

With a Net Terrestrial Uptake of 3.0 (GtC per year), the findings could indicate that this should be better described as 1.5-2.0. This could conceivably move the net atmospheric increase from 4.0 to 5.0 or so, a 25% increase. Non-trivial. That said, what it probably means is that existing errors in other components of the modeling are either partially overstating emissions or global photosynthesis, or understating net oceanic uptake. Therefore, instead of a 25% increase in atmospheric carbon there would be a smaller compounding increase between now and 2100; how small is the question. Future studies will be aimed at teasing these interacting components apart.
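A sketch of that budget arithmetic. The emissions and ocean-uptake values are assumptions I have chosen so the baseline matches the 4.0 GtC/yr net atmospheric increase above; only the terrestrial-uptake revision comes from the post.

```python
# Simple flux budget, all in GtC per year. Emissions and ocean uptake are
# assumed here; only the terrestrial-uptake revision is from the post.
emissions       = 10.0  # fossil fuel + land-use emissions (assumed)
ocean_uptake    = 3.0   # net oceanic uptake (assumed)
land_uptake_old = 3.0   # current model estimate of net terrestrial uptake
land_uptake_new = 2.0   # top of the revised 1.5-2.0 range

old = emissions - ocean_uptake - land_uptake_old  # 4.0 stays in atmosphere
new = emissions - ocean_uptake - land_uptake_new  # 5.0 stays in atmosphere
print(f"net atmospheric increase: {old} -> {new} GtC/yr, "
      f"+{(new - old) / old:.0%}")  # +25%, non-trivial
```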





Global Warming may proceed faster than expected

1 05 2015

http://www.theguardian.com/environment/climate-consensus-97-per-cent/201…

There certainly is some evidence that climate sensitivity may be below 2°C. But if you look at all of the evidence, it’s hard to reconcile with such a low climate sensitivity. I think our best estimate is still around 3°C for doubled CO2.

Mark Cochrane has this to say about the above:

The above video and the article at the link do a good job laying out the range and likelihood of various modelled climate sensitivities to a doubling of pre-industrial atmospheric carbon levels. The extension of the published IPCC range of possible sensitivities to values as low as 1.5°C is more an exercise in political correctness than anything else. To realistically have values below about 2.5°C you would need to have both reduced positive feedbacks (e.g. increasing water vapour in the atmosphere and melting ice cover on the ground) and a large negative feedback from clouds (e.g. more low-level clouds at low latitudes).

The problem for this scenario is that we have already had decades of measurements showing that the feedbacks we have already gotten, in terms of increased water vapour and decreased ice cover, won’t support a low climate sensitivity. Similarly, the clouds haven’t shown up as hoped. I spent about 15 years assuming and hoping that they would; reality and various research studies beat that idea out of my head. If anything, the clouds are yielding a small positive feedback (warming). Depending on who you believe, that could be due to wispy high-level clouds that trap heat or to diminished low-level clouds at lower latitudes that reflect less sunlight. It could be a bit of both.

So, barring a sudden and unexpected change in all of the trends to our advantage, a climate sensitivity below 2.5°C is a pipe dream. Something more like 3-3.5°C is probable, with higher values around that range more likely than lower ones. Note: I’d really like to be proven wrong about this… (to the low side).

In practical terms, the higher the climate sensitivity, the faster and more extreme the emissions cuts we will have to make in order to avoid compromising the resilience of our society and the rest of global ecosystems to climate changes as we progress through this century.
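To make those numbers concrete, here is a minimal sketch using the standard logarithmic relation between CO2 concentration and equilibrium warming; the concentrations are round illustrative numbers, not projections.

```python
import math

def equilibrium_warming(co2_ppm, sensitivity_c, preindustrial_ppm=280.0):
    """Equilibrium warming via the standard log relation:
    delta_T = S * log2(C / C0), where S is the climate sensitivity
    in degrees C per doubling of CO2."""
    return sensitivity_c * math.log2(co2_ppm / preindustrial_ppm)

for s in (2.0, 2.5, 3.0, 3.5):  # candidate sensitivities, deg C per doubling
    print(f"S = {s}: {equilibrium_warming(560, s):.1f} C at doubled CO2, "
          f"{equilibrium_warming(400, s):.1f} C at ~400 ppm")
```

Note how each extra half-degree of sensitivity buys less time at any given concentration, which is why the choice between 2.5°C and 3.5°C matters so much for the pace of emissions cuts.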

Wishful thinking is not a viable strategy for managing our future.





Climate models in perspective

8 02 2015

Mark Cochrane

Another guest post from Mark Cochrane.

One of the favourite refrains of climate change ‘skeptics’ is that models are wrong, which is a bit like saying that water is wet. Models are simplifications of more complex systems, and as such they are always going to be ‘wrong’; the question is whether or not they are useful. Weather models are wrong too, but we use them all of the time. Sometimes they over-predict precipitation or temperature, while at other times they under-predict. The Weather Channel actually uses ‘proprietary methods’ to over-predict the chances of high precipitation (link), presumably because they’d rather be wrong in one direction than the other.

The ‘skeptic’ claim is that climate scientists do something similar with their projection of future climates for a range of probable emissions scenarios. Somehow hundreds of scientists across multiple groups all manage to collude in this mass delusion to make us worry more than we should about the future impacts of our fossil fuel dependent culture.

Favorite climate skeptic Christopher Monckton and colleagues recently published an article (link) saying as much in the somewhat obscure Chinese journal Science Bulletin. I’ve never heard of this journal before, but it is making an attempt to be in the peer-reviewed literature, so I won’t quibble. They make the case that the IPCC model scenarios are overstated by a factor of 2 or 3. If you want an in-depth summary and critique of this paper, see this blog post. Relative ranking of journals is often done using their respective Impact Factors: their likelihood to yield citations to any given work. Scientists aim to get their work into the best location for it to be read and used by their peers. Science Bulletin has an impact factor of 1.4; Nature’s is 42.4.

Now an analysis has been done to test whether model misses of the so-called warming ‘hiatus’ are due to incorrect model forcing (sensitivity to greenhouse gases, e.g. CO2) or to other quasi-random factors (mostly volcanoes and ocean modes, e.g. El Niño). The paper in Nature by Marotzke and Forster (2015) does this by comparing the model simulations used by the IPCC against observations of global mean surface temperature, as functions of all possible 15-year (and 62-year) trends from 1900 to the present.

An in-depth explanation of the paper can be found here. The use of 15 years is somewhat arbitrary, but it provides a test of using short time periods to get whatever trend is desired. If the nefarious climate scientists really were making their models over-predict the rate of warming, then when you plot all of the differences between the model outputs and the actual observations they would tend to be greater than zero. This is the pattern you see from 1998-2012 (middle panel below), the period of the so-called hiatus or pause. However, you see the exact opposite pattern from 1927-1941, when the models would have been accused of seriously under-predicting temperature increases (left panel below). When you plot all of the data (right panel), the results are not significantly biased either way.

Comparing model-simulated (brown bars) and observed global surface temperature (vertical black line) for the 15-year periods covering 1927-1941 (left) and 1998-2012 (middle), and for all 15-year periods between 1900 and 2012 (right). Source: Marotzke and Forster (2015)
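For the curious, a minimal sketch of the trend-comparison method. Synthetic series stand in for the model ensemble and the observational record (both are assumptions here); the point is only that a shared forced signal plus independent short-term variability yields trend differences centred near zero, even though any single 15-year window can differ a lot.

```python
import numpy as np

def rolling_trends(temps, window=15):
    """Least-squares slope (deg C / yr) of every 'window'-year span."""
    x = np.arange(window)
    return np.array([np.polyfit(x, temps[i:i + window], 1)[0]
                     for i in range(len(temps) - window + 1)])

rng = np.random.default_rng(0)
years = np.arange(1900, 2013)
forced = 0.008 * (years - 1900)  # shared long-term warming signal (assumed)
observed = forced + rng.normal(0, 0.1, years.size)  # 'real-world' variability
modelled = forced + rng.normal(0, 0.1, years.size)  # the model's own weather

diff = rolling_trends(modelled) - rolling_trends(observed)
print(f"mean 15-yr trend difference: {diff.mean():+.4f} C/yr (near zero),")
print(f"single windows range from {diff.min():+.3f} to {diff.max():+.3f}")
```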

The models are not systematically biased either way; they just cannot account for chaotic or random factors within the atmosphere and ocean. Therefore they will seem to over-predict at times and under-predict at other times. Where are we now?

Still cruising along the lower portion of the 95% range of the model simulations. If it were a blood test, you still wouldn’t have a little asterisk on your results, but it would be close.

It is noteworthy that this apparent inaccuracy pertains only to ‘surface temperatures’ of the Earth. There are no indications that the rate of warming has actually slowed down in any way. The only change has been in the amount of energy being stored in the world’s oceans. Despite our terrestrial-surface bias in measuring the climate, over 90% of the energy is piling up in ocean waters, with much of the heat being transported into the deep reaches (>700 m) as chaotic processes lead to periods of ocean water turnover.

The wiggles in the ocean uptake of heat (blue above) end up as periods of rapid warming or, these days, a ‘pause’ in surface temperature. Note that before 1970 those downturns would cross beneath the land component and yield actual global cooling for short periods. Ocean processes call the tune of energy transfer, and the atmospheric and surface temperatures dance to it on short time scales. None of this changes the steady ramp of ongoing energy storage in the long-term trend, driven by greenhouse gas accumulations, that underlies global climate change.





Compounding problems for sea level rise…

28 01 2015

Another guest post by Mark Cochrane…..

One of the larger concerns in recent years has been the question of just how fast sea levels might rise due to the collapsing ice sheets of Antarctica and Greenland. In the IPCC AR4 report (2007) there was considerable furor because the 2005 cut-off for literature and the naturally conservative nature of the ‘consensus’ interpretation resulted in estimates of sea level rise that were known to be too low at the time of publication: specifically, from 0.18 to 0.59 m by 2100, depending on which scenario you chose and the low-to-high extremes. In the more recent AR5 report (2013), they conclude that in the best of emissions cases, if we start immediate and extensive carbon emission reductions (RCP 2.6), sea level is expected to rise by 28-61 cm by 2100, while in the worst of cases (RCP 8.5) the rise is expected to be 52-98 cm. This is still conservative, but much better than the AR4 estimates.

The real question is whether sea level rise occurs at close to a linear rate (a fixed amount per year), which is slow and easily projected, or at a nonlinear rate (a fixed percentage per year) that could yield unpleasant surprises in future years. Dr. Richard Alley (2010) compared projections of sea level rise going forward and basically found that most included 1 m within their error range, with the exception of one serious outlier at 5 m made by Dr. James Hansen (2005, 2007, 2012). Hansen’s predictions have not been well received by the community of experts on ice sheet dynamics. They point out that, so far, there has been nothing like the amount of sea level rise observed that would be necessary to reach 5 m in a linear fashion. Hansen, however, premises his ideas of rapid ice sheet collapse on nonlinear phenomena caused by things like glacial meltwater being transported to the base of the ice sheets and acting as a lubricant to speed their movement dramatically.
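To make the linear-versus-nonlinear distinction concrete, here is a toy sketch; the rates and growth factors are illustrative assumptions, not projections.

```python
# Toy comparison: the same present-day rate of rise gives very different
# 2100 totals if the rate itself grows each year. Illustrative only.
def linear_rise_m(rate_mm_per_yr, years):
    return rate_mm_per_yr * years / 1000.0

def compounding_rise_m(rate_mm_per_yr, growth_per_yr, years):
    total_mm = sum(rate_mm_per_yr * (1 + growth_per_yr) ** t
                   for t in range(years))
    return total_mm / 1000.0

YEARS = 85  # roughly 2015 to 2100
print(f"linear at 3 mm/yr:     {linear_rise_m(3.0, YEARS):.2f} m")
print(f"3 mm/yr growing 3%/yr: {compounding_rise_m(3.0, 0.03, YEARS):.2f} m")
print(f"3 mm/yr growing 5%/yr: {compounding_rise_m(3.0, 0.05, YEARS):.2f} m")
```

Even modest percentage growth in the rate puts metre-scale rise in play by 2100, which is why the shape of the curve matters far more than today’s rate.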

In the meantime, more traditional approaches to looking at glacial melting rates have had values centered more on 1 meter, to maybe 2 meters under worst conditions, of sea level rise by 2100 (NOAA 2012). Hansen has been intransigent in his estimations, and the rates of sea level rise keep exceeding the best estimates of the ‘experts’. Glacial melt within dynamic ice sheet models has typically been modelled as simple top-down melting, with unchanging processes explaining the ongoing flow of ice sheets. However, the accelerating rates of observed ice sheet flow and disintegrating ice shelves have led to reappraisals of what is going on. As I recently detailed (post #2340), warmer ocean waters have been melting ice sheets from underneath in some regions, removing the stable grounding lines, and now the West Antarctic Ice Sheet (WAIS) is in irreversible collapse (Rignot et al. 2014, Joughin et al. 2014).

Similarly, Pollard et al. (2014) have recently tried to improve continental ice sheet models by adding the processes of oceanic melting and hydrofracturing (meltwater from the surface pouring into cracks and forcing them further apart), and also accounting for ice cliff failures (when cliffs get so large that the ice face crumbles). Both added processes are based on observations made in the field in recent years. They looked at the effects on both the WAIS and the East Antarctic Ice Sheet (EAIS). The interesting thing (to me at least) is that cliff failure and hydrofracturing combine to cause very large changes in expected sea level rise that neither process alone creates. By itself, cliff failure does little to accelerate collapse over the standard model representation. Conversely, hydrofracturing by itself causes expected sea level rise to roughly double, from 2 to 4 m, over thousands of years. When both processes are included, though, sea level rises by 17 m, with about 4 m happening in the first 100 years! Clearly the two processes interact to strongly enhance the collapse rate. The EAIS collapses slowly over thousands of years, but the WAIS collapses in decades.

The Pollard et al. (2014) paper does not expressly address our future, as it was aimed at explaining formerly unexplainable sea level rises during some previous interglacial periods – which their results ended up matching fairly closely. They forced their model using 400 ppm CO2, though, so it isn’t wildly different from what we currently have. In the model, roughly 3 m of sea level rise comes from the WAIS alone, within 100 years. If you add the much slower response of the EAIS and the undiscussed but very similar ice sheet collapse from Greenland, suddenly Hansen’s 5 m sea level rise call doesn’t look so outlandish after all. Interestingly, the senior author on the Pollard et al. paper is none other than Richard Alley, who previously did not see how such rapid ice sheet collapse could occur.

This still doesn’t mean that we will definitely get 5 m of sea level rise this century (let’s pray that we don’t!), but it certainly increases the perceived risk of much larger sea level rises than the IPCC AR5 report states (again). It also helps explain the increase in the rate of sea level rise from 1.0 mm/yr to 3.0 mm/yr in recent decades. Things seem to be proceeding in a decidedly nonlinear way.





Still on Track for the Collapse of Modern Civilization

15 10 2014

Originally posted on Collapse of Industrial Civilization:


Two recent pieces of scientific evidence really hammer home the predicament of modern industrial civilization, and they have to do with the fact that our globalized, just-in-time economic model is hopelessly wed to carbon-based energy. Once one understands this, there can be no delusions about why we are on such a catastrophic trajectory of greenhouse gas emissions. As was explained in a previous post, GDP is fundamentally and directly linked to CO2 emissions. Below, two graphs (click to go to source) illustrate this fact:

CO2 emissions since 1850 (red); exponential growth (blue); cuts to hit climate target (dashed).

[Second graph: coal use increased.]

It’s not really about evil fossil fuel companies, although they do certainly exert enormous political clout and do conspire to protect their business model by doing such things as spreading doubt on climate change science, but as with all corporations, externalizing social and environmental costs is endemic to the profit system and the coercive forces of competition in capitalist markets.

Firstly, there is the graph submitted by…
