
Probability/Possibility

If you think about the distribution of events, some will have very low probabilities. So low, in fact, that on average they occur less than once. But of course events either happen or they don’t, so an average of less than once tends to get approximated to not-at-all if you think in probabilistic terms. Yet if you look at the class of all those things with average frequency less than one, there’s a good chance of one, or some, of them happening. And when they do, they happen at a frequency far greater than the average predicts (by necessity: the average is between zero and one, but if an event occurs at all its count is an integer of at least one).

I was thinking these thoughts while watching this demo of ‘The Galton Machine’, which illustrates the central limit theorem and, peripherally, provides an example of a system which can be approximately described by one ‘rule’ (a Gaussian distribution) but follows quite different mechanistic rules (you see, it’s always about minds and brains round here). Look at the extremes of the distribution. Soon enough a single ball will fall somewhere there, and when it does it will far exceed its predicted (average) frequency of occurring.
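You can see the same effect by simulating the board. Here is a minimal sketch in Python (the ball and row counts are arbitrary choices, not taken from the demo): the expected count in an extreme bin is a fraction of a ball, yet any ball that actually lands there registers as a whole one.

```python
import random
from collections import Counter

def galton(n_balls=1000, n_rows=12):
    """Drop n_balls through n_rows of pegs; each peg deflects the
    ball left or right with equal probability. A ball's final bin
    is its number of rightward bounces, so the bin counts follow a
    binomial (roughly Gaussian) distribution."""
    return Counter(
        sum(random.random() < 0.5 for _ in range(n_rows))
        for _ in range(n_balls)
    )

bins = galton()

# Expected count in an extreme bin (all 12 bounces the same way):
# 1000 * (1/2)**12 ~= 0.24 balls, i.e. "less than once" on average.
# But whenever a ball does land there, the observed count is at
# least 1: roughly four times the average's prediction.
for k in range(13):
    print(k, bins.get(k, 0))
```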

It occurred to me that all this was analogous to an interesting piece in the Guardian, an extract from Worst Cases: Terror and Catastrophe in the Popular Imagination by Lee Clarke, published by the University of Chicago Press. Clarke says that thinking about probabilities lets you get away with considering only what is most likely to happen, whereas thinking about possibilities lets you plan for rare events of serious magnitude.

The trouble is that when it comes to real worst cases – actual disasters – there are no “average events”. How could we talk about a normal distribution of extreme events? If we imagine the future in terms of probabilities, then risks look safe. That’s because almost any future big event is unlikely.

counterfactuals …help us see how power and interest mould what is considered legitimate to worry about. One lesson is that we cannot necessarily trust high-level decision-makers to learn from their mistakes. They could. But they often have an interest in not learning.

Our governments and institutions almost always have a vested interest in not engaging in possibilistic thinking, and in offering us instead the reassuring palliative of probabilistic scenarios.

I wonder if this is part of the story of why modern government seems to be about the management of existing trends rather than the envisioning of future, alternative (possibly radically alternative) states that society could exist in.

9 replies on “Probability/Possibility”

Here’s a simple suggestion I am sure economists have thought about. Suppose that in a unit of time only one problem event will occur, drawn from a distribution of events. Then:

proportion of time and money spent on problem event = probability of problem occurring * relative cost of problem

For more complex distributions of events (say multiple events, correlations, etc.) it gets more complicated, but I think the gist of this idea could still hold.
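For what it’s worth, here is a minimal sketch of the simple version in Python (the event names and figures are invented for illustration): each event’s share of the budget is its share of the total expected cost.

```python
# Allocate a budget in proportion to expected cost:
# share(event) = p(event) * cost(event) / total expected cost.
# The events and numbers below are made up for illustration.
events = {
    # name: (probability of occurring, relative cost if it does)
    "flood":     (0.10, 50),
    "power cut": (0.60, 5),
    "fire":      (0.01, 200),
}

total_expected_cost = sum(p * c for p, c in events.values())

for name, (p, c) in events.items():
    share = p * c / total_expected_cost
    print(f"{name}: {share:.0%} of time and money")
# flood: 50%, power cut: 30%, fire: 20%
```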

“Possibilistic thinking” seems like a perfect recipe for inaction. You’d have to spread (finite…) resources so widely that they wouldn’t serve any meaningful purpose.

I envisage a problem when you have low-probability events with disproportionately high costs (climate change, nuclear war). You’d spend all your resources on avoiding these and none on near-certainties with low but non-negligible costs (discomforts, basically).

Although arguably avoiding the death of life on earth would be worth a bit of short-term discomfort…hmmm

Beck’s ‘Risk Society’ is likely to throw light on this. I haven’t read it myself (though I’ve had it recommended).

‘Risk and Technological Culture’ (Van Loon) is a book I *am* reading, and he runs over the theory of risk as a force that shapes our society. All very good.

I suppose what you really mean is “avoiding a negligible probability that life on earth might die might be worth living a shit life”? 😉

I recently read this about doomsday scenarios: http://esr.ibiblio.org/?p=220#more-220

I wonder what you’ll think of it !

BTW, I can’t get to the comments simply by clicking on the items under “recent comments”.

Hubert – nearly. What I mean is “avoiding a probability that life on earth might die might be worth living a shit life”. The question to consider is whether it is negligible – so back to thinking about risk.

My first thought about Nicol’s scheme was that it would lead you to do exactly what you imply – sacrifice all comforts to avoid negligible risks. But then I thought of some risks which might be improbable but certainly aren’t negligible (like the end of the world) and thought that if the scheme helped us avoid these it could be a good thing.

Thanks for the doomsday scenarios link

I fixed the comments link

Hubert’s right about the problem of possibilistic thinking – though if it’s just a heuristic to broaden the mind, it’s very useful.

When we actually have to decide on action we’re back to probability. But including the probability that our actions will be effective helps:

E.g. if the probability of a catastrophic meteorite strike were the same as that of oil-consumption-induced environmental breakdown, then we could imagine ways that we might realistically solve the latter but not the former.
So, imagining these were the only two possible catastrophes: by acting on the oil problem, haven’t we halved the probability that any catastrophic event will occur?
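For small probabilities that intuition checks out: if each catastrophe independently has probability p per period, the chance of at least one is 1 - (1 - p)^2, which is roughly 2p, and removing one brings it down to p. A back-of-the-envelope check (p is an invented figure):

```python
p = 0.01  # assumed per-period probability of each catastrophe

p_any_before = 1 - (1 - p) ** 2  # meteorite or oil-driven breakdown
p_any_after = p                  # oil risk solved; meteorite remains

print(round(p_any_before, 4))  # 0.0199
print(p_any_after)             # 0.01 -- roughly half, for small p
```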

Our resources are finite; the possible problems to solve are not. I’m not even sure we have enough resources to solve the problems that already exist. Anyway, when you deal with scarcity, you have to prioritize, and then you’re back to “probabilistic thinking”. The probability that civilisation would be destroyed because of insufficient oil reserves is close to nil, and we can’t do anything about it either (except hasten our petroleum-free demise, if you happen to believe that). So the right course of action is to forget about it and trust people to behave as they always have done, i.e. find a way around the problem when it happens.

More fundamentally, think what civilisation would be if nobody ever took risks, including with dear old Gaia. People weren’t exactly sure what was going to happen when they first flew, for instance. The precautionary principle, or whatever you call it, is IMHO deeply inimical to the dynamics of human life.

Hubert’s comments take us some way from the original post, but interesting nonetheless:
“trust people to behave as they always have done, i.e. find a way around the problem when it happens.”
In fact, since the agricultural revolution what (most) people have always done is to plan for future scarcity. In order to reduce risk they have generally spent some proportion of their resources in a way that makes them unavailable for immediate enjoyment.
That’s not the same as the precautionary principle, and it’s not the same as taking zero risks. But it does highlight that what is fundamental to human nature is the ability to imagine a number of possible worlds and then take purposive action in order to influence outcomes.
