How code (modelling) eats the world — and why it’s important for us all.

https://www.linkedin.com/posts/antlerboy_how-code-modelling-eats-the-world-and-activity-7414224107195494400-rtYd

Campbell/Goodhart/Strathern’s law: when a metric becomes a basis for reward, punishment or control, it stops reliably representing the underlying reality and instead shapes behaviour to optimise the metric itself, often at the expense of the actual goal.

Amazon had a brilliant metric: CpX

They measured the number of customer contacts per order, then went after the ‘root causes’ of customer contact, with the goal of making the experience so seamless that contact was never needed. A lot of thinking, internal work, and change.

Huge progress… until, guess what, the iron law of hitting the target but missing the point took over.
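The dynamic can be sketched as a toy simulation (purely hypothetical numbers, nothing to do with Amazon's actual figures): an optimiser can raise a rewarded metric either by genuine improvement, which is costly, or by gaming, which is cheap. Under pressure it pulls the cheaper lever, so the metric climbs while the underlying reality degrades.

```python
# Toy illustration of Goodhart's law. All parameters are invented:
# gaming moves the metric 3x more per unit of effort than real work,
# and each unit of gaming slightly damages underlying quality
# (e.g. making it harder to get in touch at all).

def optimise(steps: int, budget: float = 1.0) -> tuple[float, float]:
    quality = 0.0   # the underlying reality we actually care about
    gaming = 0.0    # effort spent making the metric look good
    for _ in range(steps):
        gaming += budget          # the cheap way to move the metric
        quality -= 0.2 * budget   # side effect: reality gets worse
    metric = quality + 3 * gaming  # what gets measured and rewarded
    return metric, quality

metric, quality = optimise(10)
print(f"metric: {metric:.1f}, true quality: {quality:.1f}")
```

The point of the sketch is only that the two lines diverge: the measured number improves monotonically while the thing it was meant to stand for does not.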

__

Boyd would call it freezing orientation. Deming warned against managing by visible figures alone. Beer and Ashby remind us that low-variety controls cannot regulate high-variety reality, so the easier move is to reduce the variety of reality: fewer routes, fewer exceptions, fewer humans. Ackoff called it sub-optimisation. Mintzberg would call it the death of strategic learning.

__

The ethical position is that this is wrong.

And the hopefully naïve prescription is that locking your orientation is a failure mode for an organisation: you focus on your own modelling, lose touch with the world, and crash.

__


But then comes Zuboff’s uncomfortable lesson. Information in the modern business environment often presents first as ‘waste’ (she talks of ‘data exhaust’ in Facebook: traces of likes, activity, views, ‘linger time’, emotional response, an uncanny echo of Beer’s 1975 diagram). But then you can start to predict people’s behaviour from the data.

Prediction allows you to intervene more effectively to make money.

But the most effective — and profitable — form of prediction? Control.

If you can steer behaviour at scale, you do not need to understand messy reality. You can standardise it.

__

In systems terms this is reversing science. Instead of updating the model to fit the world, we force the world to fit the model.

So Amazon has now found ‘better’ ways to reduce contact: they make it impossible to contact them, and they don’t care what you have to say.

__

If a service becomes hard to reach, is that failure — or is it success, because the system’s real purpose is behaviour-shaping rather than problem-solving?

It’s insidious — and the insights of second-order cybernetics make it both worse and better: we can’t act without carving the world into categories. The question is whether our categories stay adaptable to lived experience, or whether they become channels that people must squeeze through.

__

In a nutshell: once you categorise and measure the world, you move towards making the world conform to the model rather than the other way around.

Where can you see systems that are imposing structure and control on reality, rather than testing and learning from reality?
