Ben Matthews

  • New here on lemmy, will add more info later …
  • Also on Mastodon: @benjhm@scicomm.xyz
  • Try my interactive climate / futures model: SWIM
  • 0 Posts
  • 7 Comments
Joined 2 years ago
Cake day: September 15th, 2023


  • This study is indeed disturbing, drawing on multiple lines of evidence suggesting melting may happen faster than previously assumed; I’ll study it further.

    However, there never was any magic safe (global-average-surface) temperature level that would save the polar ice sheets. Melting, and the penetration of heat, is cumulative, so to a first approximation it is the integral of the warming that counts (maybe we could talk about a heating budget, similar to the concept of a carbon budget, to avoid fixating on a specific temperature).

    Although diplomats may stress that the concept of a safe level is baked into Article 2 of the Climate Convention, that originally applied to “concentrations”, not temperature. Back in the day (early 2000s) I, among others, pushed (this wasn’t easy) to adopt temperature as a goal closer to real impacts, pointing out that this required peak-and-decline concentration pathways.
    Nevertheless we always knew that a stable (higher) temperature does not bring a stable sea level (on a multi-century timescale). For some other types of impacts - e.g. ecosystem adaptation - it may be the rate (derivative) rather than the integral that matters more. The ‘level’ concept was a compromise to coalesce policy (within which round numbers like 2.0 or 1.5 °C are also arbitrary).

    Maybe it could help motivate the global debate to specifically (dis)agree on goals for the sea-level rise we try to avoid? That’s a more tangible level (at least until we get into regional sea-level-rise variations…), but due to the double integral, it’s harder to implement.
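
    The heating-budget and double-integral ideas above can be sketched numerically. This is a toy illustration with made-up numbers and a hypothetical linear melt response - not my model, and not a calculation from the paper:

    ```python
    # Toy sketch (illustrative only): if ice-sheet heat uptake responds to the
    # time-integral of warming, and sea level to the integral of the melt rate,
    # then sea-level commitment is roughly a double integral of warming.

    def heating_budget(temps):
        """Cumulative integral of warming (degC * years), 1-year steps."""
        total, budgets = 0.0, []
        for t in temps:
            total += t
            budgets.append(total)
        return budgets

    def sea_level_proxy(temps, k=1.0):
        """Double integral of warming: a crude stand-in for sea-level rise."""
        level, rate, levels = 0.0, 0.0, []
        for t in temps:
            rate += k * t     # melt rate accumulates with integrated warming
            level += rate     # sea level accumulates the melt rate
            levels.append(level)
        return levels

    # Path A peaks at 2.0 degC then slowly declines; Path B holds at 1.5 degC.
    path_a = [2.0 - 0.01 * i for i in range(100)]
    path_b = [1.5] * 100

    # Over a century the two paths integrate to almost the same heating budget,
    # despite the different peak temperatures - the point about 'safe levels'.
    print(heating_budget(path_a)[-1], heating_budget(path_b)[-1])
    ```

    The point of the toy is that the peak temperature alone tells you little about the committed melt - the whole trajectory matters.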



  • I’m not convinced this plot bears much relation to the Hansen et al paper. The plot combines SSPs (socioeconomic scenarios), RCPs (levels of ‘radiative forcing’ from all gases together - the numbers 1.9 to 8.5 are RF in W/m2), and temperature.
    Hansen et al (OP) are suggesting that the climate sensitivity may be higher than the IPCC range, which would imply a higher temperature for any given RCP level. This suggestion is derived from the recent history of aerosols and temperature, as I discuss in another comment (above/below?).
    However this doesn’t change which curve we are on (in reality none of them), as the big difference between curves is from projections of future emissions depending on technology, population, policies etc. Also the near-term projections have little relation to the long term, due to inertia but also the range of different models and assumptions used to make them (some years ago). So you can’t look at very recent data to derive which curve we are on.

    I was (years ago) part of the process that designed the IPCC scenario structure of splitting the SSPs from RCPs, to allow the two modelling groups to work in parallel rather than one waiting for the other, so they could get on with connecting feedbacks (not sure this happened…).
    But the IPCC ends up re-blending too many varying factors in one plot, because of page limits, which can be somewhat confusing. The work of 10,000 scientists over six years condensed into one summary report - is that the most effective method of communication? That’s why I prefer to make an interactive model.

    Fundamentally, the useful question is not ‘where are we going’ but ‘where do we want to go’ - taking into account all the feedbacks and inertia in the systems, but making choices about our common future. To get to the lowest (1.9) curve we’d need a global green revolution tomorrow, but it’s not physically impossible. The 8.5 curve is very unlikely; it’s in the set because it’s similar to the earlier high-end scenarios the IPCC has run since the 1990s, which might have happened if we’d stayed in the coal age (it’s still useful for comparing the high end of the physical models). The others are still in play. Extrapolating current trends and policies (especially technology and population in China), I’d say the most likely outcome is somewhere between the 2.6 and 4.5 curves, but we could do better. If you are building sea walls you should anticipate worse. But don’t be fatalistic - we still have many options.

    [ Side remark in response to a comment: as for Jan 2025, it’s really not ‘climate’ to conclude anything from one month. Even the El Niño cycle depends on heat transfer between the surface and deep ocean, which is a slow process - you need to integrate over a much longer time. There may be a recent warm anomaly in the tropical oceans, and also the high Arctic, but it’s actually quite cold where most northern people live (due to Rossby waves whose amplitude is increasing, as we expected in a hand-waving kind of way). ]


  • My model - no, it doesn’t yet include clathrates or a specific permafrost feedback, although I’m concerned about these and would like to add such feedbacks (with wide adjustable parameter ranges to reflect the high uncertainty). My model is interactive - you can play with it in a browser - so it’s hardly typical.
    However, in relation to the OP and the Hansen paper, it’s important to understand that the usual definition of climate sensitivity does not include such “slow” feedbacks - it includes fast atmospheric feedbacks, e.g. the physics of clouds, but not slower biogeochemistry. CS has been used for decades for comparing models, so it’s useful that the definition remains the same: simply the equilibrium (multi-century) response to a CO2 doubling - it’s not any kind of prediction. That’s why it’s surprising that he would draw strong conclusions from a number (4.5) that’s well beyond the normal range.
    Complex 3-D physics models derive CS, while integrated assessment models use CS as a calibration parameter for one component of a complex system, including socioeconomic drivers, emissions policies, land-use change, etc. Most models (including mine) do have some climate-biogeochemistry feedbacks (for example, faster soil respiration at higher temperature), which are included in such ‘real’ scenario projections, but these wouldn’t change the CS.
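
    To make the definitional point concrete: CS is just the scale factor in the standard logarithmic relation between CO2 concentration and equilibrium warming. A minimal sketch - the 280 ppm preindustrial baseline and the log2 form are the conventional choices; the other concentrations below are only examples:

    ```python
    import math

    def equilibrium_warming(c_ppm, cs=3.0, c0_ppm=280.0):
        """Equilibrium warming (degC) for CO2 at c_ppm, given sensitivity cs.

        Standard logarithmic relation: warming scales with log2(C/C0),
        so a doubling returns exactly cs, by definition. This is an
        equilibrium (multi-century) quantity, not a transient prediction.
        """
        return cs * math.log(c_ppm / c0_ppm) / math.log(2.0)

    # A doubling (560 ppm) returns the sensitivity itself:
    print(equilibrium_warming(560.0, cs=3.0))   # 3.0
    # A higher CS implies more equilibrium warming at any given concentration:
    print(equilibrium_warming(420.0, cs=4.5))
    ```

    This is also why changing CS doesn’t change which scenario curve we are on - it rescales the temperature response to a given forcing pathway.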


  • I think the Guardian article may be somewhat exaggerating what the Hansen et al paper says. I’ve been studying this kind of problem for 30 years. It’s indeed true that there are many ways to explain the historical temperature rise by adjusting the balance of positive (mainly greenhouse gases) and negative (sulphate and white-carbon aerosols, volcanoes etc.) forcings. So if you think the aerosol cooling effect was greater, you also have to assume the greenhouse warming was greater to balance it - hence deriving a higher climate sensitivity (CS). In this case, they are arguing that we underestimated the (former) cooling effect of shipping sulphate - I’d agree this is not a trivial factor (and similarly for the warming effect of aviation-induced cirrus, which we could also change quickly with global transport policy). However, I really doubt this change is sufficient to justify such a big shift from the long-developed consensus range of CS.
    In general, the recent historical data series has never been a sufficient constraint on CS (I know from having tried a similar approach for probabilistic analysis with earlier versions of my own model). So we have to use other methods too, and the IPCC consensus for the likely range of 2.5-4ºC for CS is derived from a wide range of methods and sources, particularly but not only big physics-based models (GCMs). I’d be very cautious about overturning this based on any one study, despite my respect for Hansen and colleagues. Of course, this contributes an interesting new view on this important topic, but it does not justify the headline of the article.