In last week’s post on entropy and information, I started off complaining about the most common definition of entropy as disorder or disorganization. One of the nice things about blogging is you often learn something in the subsequent discussion.
My chief complaint about the disorder definition was that it’s value-laden. I asked: disordered according to whom? That, as it turns out, was the wrong question. Often asking the wrong question can leave us stuck, and what we really need to do to move forward is figure out the right question. The one I was asking treated the issue like some kind of ethical or aesthetic value. Instead I should have focused on a more instrumental value.
The question I should have asked then is, disordered or disorganized for what? Asked that way, the answer becomes obvious when we remember that Rudolf Clausius coined the word “entropy” from the Greek word for transformation. What we’re talking about is disordered or disorganized for transformation. Or in more engineering terms, disorganized for work.
So a low entropy system is organized for transformation, for change. Transformation requires energy gradients. In a low entropy system, the components are arranged such that the gradients are maximized, or that they support, reinforce, and/or magnify each other. But the act of transformation, of causal action, inevitably reduces that organization. If you think about it, it can’t be any other way, and the second law of thermodynamics is a natural consequence.
In a high entropy system, the gradients have become separated, fragmented, and no longer able to reinforce each other. That fragmentation effectively minimizes any possible transformations. The first law of thermodynamics still applies. The system still has the same amount of energy, but much of it is now unavailable for transformation, for work.
The classic example of this is mixing cream into coffee. Prior to the mixing, the two are in separated states. When we initially pour the cream into the black coffee, the states are still relatively low entropy, in the sense that they’re organized for transformation. Specifically, the cream is primed to mix with the coffee. As the mixing happens, as the transformation ensues, the entropy increases, and the mixture becomes less organized for change, at least without new energy from outside.
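The cream-and-coffee picture can be sketched as a toy simulation. In the Python snippet below (all names are illustrative, not from any standard library), "cream" particles start on the left half of a row of cells, "coffee" on the right, and random neighbor swaps stand in for thermal motion. A coarse-grained Shannon entropy of the cream fraction per bin starts at zero in the separated state and rises as mixing proceeds:

```python
import random
import math

random.seed(42)

# Toy "cream into coffee" model: 1 = cream, 0 = coffee.
# Cream starts fully separated on the left half.
N = 200
state = [1] * (N // 2) + [0] * (N // 2)

def mixing_entropy(state, bins=10):
    """Coarse-grained Shannon entropy (bits) of the cream fraction per bin,
    summed over bins. Fully separated -> 0; well mixed -> near bins * 1 bit."""
    size = len(state) // bins
    total = 0.0
    for b in range(bins):
        chunk = state[b * size:(b + 1) * size]
        p = sum(chunk) / len(chunk)   # fraction of cream in this bin
        for q in (p, 1 - p):
            if q > 0:
                total += -q * math.log2(q)
    return total

before = mixing_entropy(state)

# Random adjacent swaps play the role of molecular motion.
for _ in range(100000):
    i = random.randrange(N - 1)
    state[i], state[i + 1] = state[i + 1], state[i]

after = mixing_entropy(state)
print(f"entropy before mixing: {before:.3f} bits")
print(f"entropy after mixing:  {after:.3f} bits")
```

The swaps are perfectly reversible individually, yet the coarse-grained entropy climbs anyway, which is the second law showing up as a statistical tendency rather than a force.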
It just so happens that this disordered, disorganized, fragmented state corresponds to one with a large number of microstates, a system with a high degree of uncertainty, one that requires a larger amount of information to describe than a lower entropy system. We can use various information compression techniques to efficiently describe an organized system, but that becomes increasingly infeasible for systems that are effectively random and disorganized.
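This compression point is easy to see directly. The sketch below, using Python's standard `zlib` library, compresses 100 kB of a highly ordered byte string and 100 kB of effectively random bytes; the ordered data shrinks to a tiny fraction of its size, while the random data barely compresses at all:

```python
import os
import zlib

# An "organized" low-entropy string: a short pattern repeated many times.
ordered = b"AB" * 50000            # 100,000 bytes

# An effectively random, high-entropy string of the same length.
noise = os.urandom(100000)         # 100,000 bytes

ordered_size = len(zlib.compress(ordered, level=9))
noise_size = len(zlib.compress(noise, level=9))

print(f"ordered:  100000 -> {ordered_size} bytes")
print(f"random:   100000 -> {noise_size} bytes")
```

The short compressed description of the ordered string is a concrete stand-in for "few microstates consistent with the macrostate"; the random string has no shorter description than itself.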
Maybe what I’m saying here is obvious to most of you, but for me, it’s a conceptual breakthrough. Now I won’t have to be annoyed when a popular science article talks about entropy as disorder, except possibly about the ambiguity.
There is one important point worth mentioning here that may be less obvious. It’s important not to assign too much general goodness or badness to order and disorder. I’m currently reading Anil Seth’s new book, Being You: A New Science of Consciousness, and one of his points is that brains aren’t really maximally ordered or maximally disordered systems. While they contain a large amount of information, and therefore entropy, the need to integrate that information, to process it holistically, requires that they maintain an ongoing balance between order and disorder.
Of course, brains can work with large amounts of entropy because they’re not closed systems. They have a constant stream of energy coming in, and constant metabolic processing to remove waste. Information processing is thermodynamically costly, a reminder of how physical it is.
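The thermodynamic cost of information processing even has a hard floor: Landauer's principle says that erasing one bit of information must dissipate at least k_B · T · ln 2 of heat. A quick back-of-the-envelope calculation at room temperature:

```python
import math

# Landauer's bound: minimum heat dissipated when erasing one bit
# is k_B * T * ln(2).
k_B = 1.380649e-23      # Boltzmann constant, J/K
T = 300                 # roughly room temperature, K

e_bit = k_B * T * math.log(2)
print(f"minimum energy to erase one bit at 300 K: {e_bit:.3e} J")
```

That works out to about 3 × 10⁻²¹ joules per bit, which is tiny per operation but, multiplied across the enormous number of operations a brain or computer performs, underlines just how physical information really is.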
Anyway, I feel like I have a much better handle on common conceptions of entropy now. Unless of course I’m still missing something?