In a recent LinkedIn discussion about complexity, someone objected to another participant’s use of the term *entropy*: allegedly too “smoky”.

I thought that was fun, because I often feel the same way. In pop discussions, entropy is a deus ex machina that has saved many souls: it gets invoked a lot by people who could not tell Boltzmann from Dobermann.

Yet entropy is a useful concept in systems science. For example, in Kolmogorov’s treatment, the entropy K behaves as follows:

- K = 0 for regular deterministic systems;
- K = ∞ for stochastically chaotic (i.e. random) systems;
- K = λ for complex systems (“deterministically chaotic”, organized disorder).

The value λ is the “Lyapunov exponent”: it measures how fast nearby trajectories diverge, and it appears in series expansions attempting a mathematical description of the “complex” system at hand (as an alternative to mere numerical simulation).
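As an illustration, here is a minimal sketch in Python, assuming the logistic map x → r·x·(1−x) as a stand-in for a generic system: it estimates λ by averaging ln|f′(x)| along a long orbit. A negative λ signals a regular orbit (so K = 0); for r = 4 the map is chaotic and the exact value is λ = ln 2 ≈ 0.693, a positive entropy rate.

```python
import math

def lyapunov_logistic(r, x0=0.1, n_transient=1_000, n_steps=100_000):
    """Estimate the Lyapunov exponent of the logistic map
    x -> r * x * (1 - x) by averaging ln|f'(x)| = ln|r * (1 - 2x)|
    along a long trajectory."""
    x = x0
    for _ in range(n_transient):   # let the orbit settle onto its attractor
        x = r * x * (1 - x)
    acc = 0.0
    for _ in range(n_steps):
        acc += math.log(abs(r * (1 - 2 * x)))
        x = r * x * (1 - x)
    return acc / n_steps

# Regular regime: lambda < 0, so the Kolmogorov entropy K is zero.
print(lyapunov_logistic(3.2))
# Chaotic regime: lambda > 0; for r = 4 the exact value is ln 2 ~ 0.693.
print(lyapunov_logistic(4.0))
```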

K is a precisely defined property in a system’s **state space**.

Picture the latter as an N-dimensional Cartesian space and divide it into tiny hypercubes: K relates to the probability of finding, or not finding, the “ghost” of the system in any specific cell, i.e. one point of the N-dimensional figure that traces the system’s successive states over time.
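To make the hypercube picture concrete, here is a sketch along the same lines (again the logistic map; the cell count of 100 is an arbitrary choice): coarse-grain the interval into cells, count how often the ghost visits each one, and take the Shannon entropy of those visit frequencies. This is the entropy of one fixed partition, not K itself, which is defined as the rate at which such entropy grows as the partition is refined over time, but it shows the counting at work.

```python
import math
from collections import Counter

def logistic_orbit(r, x0=0.1, n_steps=100_000, n_transient=1_000):
    """Generate a long orbit of the logistic map x -> r * x * (1 - x)."""
    x = x0
    for _ in range(n_transient):
        x = r * x * (1 - x)
    orbit = []
    for _ in range(n_steps):
        x = r * x * (1 - x)
        orbit.append(x)
    return orbit

def partition_entropy(orbit, n_cells=100):
    """Shannon entropy (in nats) of the orbit's visit frequencies
    over a partition of [0, 1] into n_cells equal cells."""
    counts = Counter(min(int(x * n_cells), n_cells - 1) for x in orbit)
    n = len(orbit)
    return -sum((c / n) * math.log(c / n) for c in counts.values())

# A periodic orbit haunts only a couple of cells: entropy near ln 2.
print(partition_entropy(logistic_orbit(3.2)))
# A chaotic orbit spreads its "ghost" over many cells: entropy much higher.
print(partition_entropy(logistic_orbit(4.0)))
```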

Entropy only becomes “smoky” under two circumstances. One is pop interpretations of complexity and the related blather (“order”, “disorder”, “order from chaos”, etc.): in this sense the LinkedIn polemicist was right.

But a **thicker smoke** develops when one ponders the connections between thermodynamic entropy and information-theoretic entropy. In that distant and (for me) mysterious territory lies, perhaps, the connection between static and dynamic complexity…
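For what it is worth, the formal bridge is at least easy to write down, even if its meaning is the smoky part: the Gibbs entropy of statistical thermodynamics and the Shannon entropy of information theory have the same functional form, differing only by a constant factor.

```latex
S = -k_B \sum_i p_i \ln p_i     % Gibbs entropy (thermodynamics)
H = -\sum_i p_i \log_2 p_i      % Shannon entropy (information theory)
S = (k_B \ln 2)\, H             % same distribution p_i: identical up to a constant
```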
