Suggested readings, #3

Here are some interesting articles I’ve come across recently, for your consideration:

Why is simpler better? Ockham’s Razor says that simplicity is a scientific virtue, but a philosophical justification for this is strangely elusive, says my colleague Elliott Sober. (Aeon)

Sabine Hossenfelder discusses the current chaotic state of fundamental physics, showing why “beauty” gets in the way of science. To be read in tandem with the above-linked article by Sober. (Nautilus)

I had no idea what “ethical interilimity” was. Now that I’ve found out, from this article by Sam Ben-Meir, I doubt it’s a particularly useful or coherent concept. But I could be wrong. (Blitz)

I’ve explained before why Jordan Peterson ain’t no Stoic (he doesn’t claim to be, but some people think he is). This article by Jennifer Baker actually argues (correctly) that he is an anti-Stoic. (Psychology Today)

One more on Peterson, this time a commentary on his recent inane debate with the equally embarrassing Slavoj Žižek. “Enjoy.”

Published by Massimo

Massimo is the K.D. Irani Professor of Philosophy at the City College of New York. He blogs at platofootnote.org and howtobeastoic.org. He is the author of How to Be a Stoic: Using Ancient Philosophy to Live a Modern Life.

6 thoughts on “Suggested readings, #3”

  1. I read multiple articles about the debate and how it was a quasi-lovefest, with Žižek joining Peterson in attacking political correctness. Oy. I also have Hossenfelder on my blogroll.

  2. Why is simplicity better?
    As it happens, I am in the midst of an argument with the climate science community over simplicity as applied to statistical inference. A couple of days ago I bought Probability, Confirmation and Simplicity: Readings in the Philosophy of Inductive Logic (Foster and Martin, 1966), which contains six essays on simplicity. Not as simple as it’s cracked up to be – exactly the ammunition I require.
    Accordingly, I disagree with Sober. He refers to the Akaike Information Criterion, which measures simplicity but, as he notes, presupposes that the models being compared refer to the same underlying reality. Yet we see it repeatedly used across different underlying realities by people who don’t read the small print. They are being simplistic (#OccamsRazor). By mixing probabilities with theory, Sober is making a fundamental mistake. I can apply probabilities to an experiment or a test, but I cannot apply them to a theory. At best I can severely test (Mayo) a hypothesis: by attaching it to probative criteria in such a way that the alternatives are as unlikely as the hypothesis is likely, I have a chance of confirming that theory.
    In climate science, simplicity is represented by trend-like change. Under increasing greenhouse gases, forcing grows as the logarithm of the increasing concentration, and warming follows the forcing plus feedbacks. In the Earth system, this leads to monotonic warming, linear in the forcing. The trouble is that most of this heat is absorbed by the ocean, while it is the atmosphere that needs to respond. The atmosphere–ocean relationship is a dissipative system driven by thermodynamics, and it is decidedly nonlinear. So, if I assume the atmosphere warms according to the linear radiative-forcing concept, I have a simple model that is predictive over half-century timescales. If I assume instead that warming follows the dissipative pathway, then it proceeds via enhanced climate variability, as a series of step-like regime changes. Along both pathways warming reaches much the same destination, but its mode of getting there is very different, and one carries more inherent risk than the other.
    So, I can represent both pathways statistically. They yield similar sum-of-squares residuals (trend-like change fails a heteroscedasticity test, but almost no one tests for this), yet because the pathway of step-like change carries more adjustable parameters it is penalised (actually that isn’t even true, because the detection method is completely different) – see the sketch at the end of this comment. But the two pathways represent different realities. Sober does mention this, but few have taken heed of it before, so why should they now?
    Where simplicity works with this example is that the natural greenhouse effect (about 155 W m⁻² on average) is distributed through climate variability. The net anthropogenic greenhouse effect is about 0.7 W m⁻², and roughly 1% of that is assumed to be stored within the atmosphere (about 0.007 W m⁻²), producing a trend. So here we have created a very complex physical situation, where most of the energy flux is controlled by climate variability and where climate perturbations of more than 1 W m⁻² can be brought back to the mean within months, yet somehow a tiny amount of heat is supposed to remain in the atmosphere in preference to an ocean with 24 times the heat conductivity and 3,200 times the heat capacity.
    The alternative is to accept that the simplest thermodynamic solution is for all heat to follow the same pathway: for climate change to behave like enhanced climate variability, and for warming to proceed through a series of regime changes producing a long-term, complex trend.
    Theoretically and thermodynamically simple, statistically more complex. The problem with the simplicity argument is that it has to be applied with great care: confusing methodological simplicity with theoretical parsimony is a real danger. In economics, climatology, and a number of other disciplines, simplicity is being misapplied to methods rather than to theory, and that is a problem, because it means we apply simple solutions to complex, real-world problems.
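    To make the parameter-penalty point concrete, here is a minimal sketch – my own illustration, not taken from Sober or from any climate dataset; the series, breakpoints, and parameter counts are all assumed for the sake of the example – comparing a linear trend against a step-like regime model using the least-squares form of the AIC, AIC = n·ln(RSS/n) + 2k:

        import numpy as np

        rng = np.random.default_rng(0)
        n = 60
        t = np.arange(n)

        # Synthetic series: three step-like regimes plus noise (values illustrative).
        y = np.select([t < 20, t < 40], [0.0, 0.4], 0.8) + rng.normal(0, 0.1, n)

        def aic(rss, k, n):
            # Least-squares AIC: n*ln(RSS/n) + 2k; the 2k term penalises
            # extra adjustable parameters.
            return n * np.log(rss / n) + 2 * k

        # Model 1: linear trend, k = 2 (slope and intercept).
        trend_fit = np.polyval(np.polyfit(t, y, 1), t)
        rss_trend = np.sum((y - trend_fit) ** 2)

        # Model 2: three-regime step model, k = 5 (three levels plus two
        # breakpoints; the breakpoints are treated as known here for brevity).
        levels = [y[t < 20].mean(), y[(t >= 20) & (t < 40)].mean(), y[t >= 40].mean()]
        step_fit = np.select([t < 20, t < 40], levels[:2], levels[2])
        rss_step = np.sum((y - step_fit) ** 2)

        print(f"trend: RSS={rss_trend:.3f}, AIC={aic(rss_trend, 2, n):.1f}")
        print(f"steps: RSS={rss_step:.3f}, AIC={aic(rss_step, 5, n):.1f}")

    The step model wins on raw residuals but hands some of that advantage back through the 2k penalty – which is exactly the trade-off at issue, and one that only makes sense when both models are aimed at the same underlying reality.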

  3. Massimo … on Sober and simplicity/Ockham’s razor: is this possibly connected, to some degree, with the quest for scientific beauty, which has been put down by … Hossenfelder?

    1. Socratic,

      interesting question, but Sabine actually explains at the beginning of another piece she wrote, for Aeon, that her beef isn’t with Ockham. The latter is a useful heuristic, a guide to discovery. Her charge is that some of her colleagues assume that Nature *must* be simple, which is a metaphysical, not an epistemological, statement.
