I am most grateful to physicist Ijaz Durrani for referring me to a paper recently published by Liu, Slotine and Barabasi in the Proceedings of the US National Academy of Sciences, addressing the observability of nonlinear dynamical systems.

The authors believe they have shown that their graphical approach (GA) leads to the isolation of a minimum set of sensors (i.e., a subset of the system’s internal variables) that is necessary and sufficient to describe the dynamics of a complex system.

For linear dynamical systems, the minimum sensor set derived from the GA would be only necessary, not sufficient. For nonlinear dynamical systems, however, the GA sensor set is also sufficient. According to the authors, this stems from the fact that, unlike linear systems, nonlinear systems exhibit few or no symmetries in the state variables.

Any symmetries in the state variables that leave the inputs, outputs, and all their derivatives invariant make the system unobservable (i.e., you cannot look at the outputs and infer the system’s internal state): a dynamical system with internal symmetries can have an infinite number of temporal trajectories that cannot be distinguished from one another by monitoring the outputs.

A complex system, on the other hand, is more essential: it has a personality (no symmetries), and this is why its behavior can be captured by a subset of the internal variables, i.e., by monitoring only some outputs.

The paper does not offer a rigorous proof of the sufficiency of the GA-selected sensors. The authors have simply run circa 1,000 numerical simulations in several complex domains (such as Michaelis–Menten kinetics, Lotka–Volterra dynamics, and Hindmarsh–Rose neuronal models) and found the GA-selected subset to be a sufficient descriptor.

The graphical approach reduces observability (a dynamical problem) to a property of the static map of an inference diagram: and such maps are available for an increasing number of complex problems, like the three mentioned above.

The graph is obtained as follows.

As in the life-sciences example offered in the paper, consider a number of chemical substances

A, B, C, D, …

some of which react with each other. The reactions will be of the kind

A + B + C -> D + F + J

D <-> E

[Figure: the graphical approach (GA). Source: Liu, Slotine, Barabasi, “Observability of complex systems”, PNAS 2013]

and so on. Using mass-action kinetics, you may therefore write balance equations representing all the reactions: the equations will contain the substances’ concentrations as variables (xA, xB, xC, xD, …) and a number of rate constants k1, k2, …, as many as there are reactions.
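To make this concrete, here is a minimal sketch in Python of what the balance equations for the two example reactions above might look like, assuming unit stoichiometry; the rate constants and their values are illustrative, not taken from the paper:

```python
# Sketch of mass-action balance equations for the example reactions
# A + B + C -> D + F + J (rate constant k1) and D <-> E (k2 forward, k3 backward).
# Rate constants are hypothetical placeholders.

def balance(x, k1=1.0, k2=0.5, k3=0.2):
    """Return dx/dt for the concentrations x = (xA, xB, xC, xD, xE, xF, xJ)."""
    xA, xB, xC, xD, xE, xF, xJ = x
    v1 = k1 * xA * xB * xC      # rate of A + B + C -> D + F + J
    v2 = k2 * xD - k3 * xE      # net rate of D <-> E
    return (-v1, -v1, -v1,      # A, B, C are consumed by the first reaction
            v1 - v2,            # D is produced by the first, drained by the second
            v2,                 # E is fed by the reversible reaction
            v1, v1)             # F, J are pure products
```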

From there, an inference diagram is built by drawing a directed link

xi -> xj

if xj appears in the right-hand side of xi’s balance equation.

Then, strongly connected components (SCCs) are identified: the largest subgraphs such that there is a directed path from each node to every other node in the subgraph. Among these, “root” SCCs are those with no incoming edges. At least one node is chosen from each root SCC, to ensure observability of the whole system.
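Continuing the toy example, here is a minimal Python sketch of that procedure, using the networkx library; the edge list is transcribed by hand from the toy balance equations above, and the choice of node within each root SCC is arbitrary in this sketch:

```python
# Sketch: build the inference diagram for the example reactions and pick
# one sensor node per root SCC, in the spirit of the graphical approach (GA).
# Edges follow the rule x_i -> x_j whenever x_j appears in the right-hand
# side of x_i's balance equation (self-loops omitted).
import networkx as nx

G = nx.DiGraph()
G.add_edges_from([
    ("xA", "xB"), ("xA", "xC"),   # dxA/dt depends on xB and xC
    ("xB", "xA"), ("xB", "xC"),
    ("xC", "xA"), ("xC", "xB"),
    ("xD", "xA"), ("xD", "xB"), ("xD", "xC"), ("xD", "xE"),
    ("xE", "xD"),
    ("xF", "xA"), ("xF", "xB"), ("xF", "xC"),
    ("xJ", "xA"), ("xJ", "xB"), ("xJ", "xC"),
])

# Condense the graph: each node of C is a strongly connected component of G.
C = nx.condensation(G)

# Root SCCs are the components with no incoming edges in the condensation.
roots = [C.nodes[n]["members"] for n in C if C.in_degree(n) == 0]

# Pick (arbitrarily) one node from each root SCC as a sensor.
sensors = [sorted(scc)[0] for scc in roots]
print(sensors)  # e.g. ['xD', 'xF', 'xJ']
```

In this sketch the root SCCs turn out to be {xD, xE}, {xF} and {xJ}, so the GA would suggest monitoring xF, xJ, and either xD or xE.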

These findings are likely to benefit various domains of public interest, such as medicine, economics, and other social sciences.

There are also several lessons here for pop-complexity fans to learn: e.g., complexity can be managed, and this can be done with a scientific rather than a fideistic or animistic approach.

Paolo Magrassi 2013 Creative Commons Attribution-Non-Commercial-Share Alike


A friend pointed me to a new book that promises to be interesting.

I am a bit surprised, though, that in the book presentation Microsoft Research labels data-intensive science as «emerging».

Bioinformatics emerged 20 years ago, and its underrated logical foundation, agent-based modelling, came up in the early Seventies.

Even large chunks of mathematical research have been data-intensive and computer-driven for decades (e.g., the Great Internet Mersenne Prime Search project is 17 years old).

More reasons to read the book then…

The idea that failure to manage increasing complexity could lead to public disasters was born, as far as I remember, at the Club of Rome in the Seventies. Much water has gone under the bridge since, and tools to measure complexity are beginning to emerge: an essential prerequisite for “managing” anything.

Ontonix is leading this field. While I do not necessarily espouse all their views or agree 100% with their methodology, it is easy to recognize them as being miles ahead of the clumsy talk surrounding “complexity” in its various forms. At Ontonix, complexity is not bla-bla: it is a measurable feature of any system.

Ever since I first reviewed them years ago, their tools have progressed enormously and many are available online for free trial. Just read founder Jacek Marczyk’s commentary on the Dreamliner woes here and get a grasp.

LinkedIn Groups about Edgar Morin are starting to appear… About time, at least for those among us who saw La Méthode appear 25+ years ago.

Morin has a tiny but fanatical following, thanks to his personal likeability and the popularity he acquired during the 1968 student revolts; but also, after La Méthode, thanks to his constructivist position, according to which knowledge (epistème) does not exist per se and is built by all of us, inside us, every day, with no need for superstructures like researchers, etc.

The latter idea was obviously immediately popular, and forever will be, among those who, not having spent time acquiring a scientific background, like to think that it is irrelevant. (By science I mean results guided by theoretical models, obtained via organized experimentation, and replicable by other researchers.)

Morin is an admirable writer, a decent philosopher, an intriguing filmmaker, a dear man and a respected public figure. However, when I read (in French) the most critical parts of La Méthode, I could not help noticing that he had not fully understood the “scientific” aspects of complexity.

As to inter- and trans-disciplinarity, discoveries that fans do not hesitate to attribute to him, these are actually not Morin’s creations: they had been established long before by the cyberneticians and, even earlier, between the two Wars, by the Bogdanovs and Schrödingers of the world. Even though, admittedly, Morin was effective in opposing the barbarism of specialization.

In a LinkedIn discussion on medical robotics, I noted the following remark by a neurosurgeon: «A robot is an information system, and an information system is as good as the person who designs it and the person who uses it».

This is a myth.

For a couple of decades now, artificial intelligence and robots have been building on characteristics that have little to do with those of humans, such as pattern-recognition-based induction, knowledge bases hugely larger than humans’, vastly superior precision, ultra-fast Bayesian-network navigation (the sole remote resemblance to humans, perhaps), and more.

Unlike structured and logically-based software, a robot no longer resembles its programmer. Don’t think humanoid robot here: think robot surgeon. Or think chess-playing robots: their programmers ain’t Grand Masters and lose systematically in matches against their own software. See the point?

Today surgery is merely “robot assisted” (a practice whose widespread use I predicted in a 2004 book), but watch out for 50% of surgeons being unemployed in 15 years or less.

Ideologically, I am a supporter of econophysics and non-orthodox economic research in general.

However, I am wary of scientific works that are too swift and radical just for the sake of sounding cool. And there are plenty of such papers, blogs, magazine articles, etc. Take, for example, the talk of the economic paradigm being broken. It is often naïve:

A) Things like non-linearity, non-ergodicity, spontaneous symmetry breaking, or “black swans” are common in physical systems too, not just in finance. So why don’t the bloggers and marginal thinkers of the world write posts saying that physics is broken? I think it’s because we all deal with economics and finance every day, so as soon as we get a PhD (and sometimes even one not in economics) we feel entitled to speak up on the fundamentals. And the hoi polloi (like me) will soon be excited.
We rarely do that with math or physics…

B) The world of economics is dominated by [dynamical systems] complexity, politics, etc. etc., and the paradigm that gained a half-dozen Nobels in Chicago (Rational Expectations Hypothesis, Efficient Market Hypothesis) is of course a bunch of bullshit and will be rewritten soon. 🙂 Yet, markets seemed to show tremendous efficiency when they anticipated, in 2008, a deep recession coming… Dear God, never let the facts get in the way of my opinions!

Books and blogs and articles are flourishing about the need for a new paradigm in economic research, one that goes beyond Rational Expectations, Efficient Markets, etc.

Non-economists are usually the fastest in criticizing the flaws and holes of the mainstream scientific paradigm in economics; what they do not know, however, is that those flaws and holes have been known for a long time: see, e.g., Scarf 1960, Mandelbrot 1963, Sonnenschein 1972, Stiglitz 1975, Kahneman/Tversky 1979, Anderson 1988. (The references are here.)

Notice that those names are/were established and recognized scientists, not just eccentric mavericks or casual best-seller authors. (Also notice, as a telling detail, that Eugene Fama was working in Chicago on his doctorate with Mandelbrot when the latter developed his view of infinite-variance distributions of financial events…)

In other words: economists know more about economics than non-economists think they do. The reason why a new economic paradigm has not emerged is that nobody has yet proposed a valid alternative.

(For that matter, no theoretical model, in any scientific discipline, is a perfect description of “reality”, because modeling always implies some simplification. What economics lacks, however, is a less axiomatic approach. This will not be easy, as controlled experiments in macroeconomics don’t seem easy ^_^).