
You are unlikely ever to have met a perfect circle: one with absolutely no irregularities along its circumference, no matter how closely you look. Even the most technologically advanced and accurately crafted circle is far from ideal: inspect it with a sufficiently precise optical tool, and you will notice the imperfections.

Today we could build a very smooth circle, such as a bearing for nanomachines, using a scanning force microscope, a device that allows individual atoms to be moved. We could patiently shape the external surface of the bearing for hours with our high-tech tool. But in the end, what we would have is still a relatively rough contour: optical microscopes would show a perfect circle, but another scanning force microscope would reveal the trick.

The fact of the matter is that perfect figures, such as circles, triangles or squares, only exist in geometry books. They do not belong in this world: they are idealizations of reality, archetypes, models.

Does that mean that geometry is a foolish thing, or a useless pastime like sudoku?

Hmm, not really. Until they had learned basic geometry, such as the Pythagorean theorem and trigonometry, humans could not reliably compute distances, build calendars, divide up land, sail unknown seas or develop firearms. The first prosperous, technology-based societies of Mesopotamia, India or China were born after geometrical and mathematical knowledge had been accumulated and formalized.

If you want to measure a field of rectangular shape, you take its length, then its width, and multiply them. A center-pivot irrigation field has the shape of a circle: its area equals π times the square of the radius. If a grain silo is conical, its volume is one-third the height times the base area. Neither fields nor silos are perfect geometric figures: no exact circles or right angles. But the formulae still hold. If you have a square and multiply the length of the side by itself, you get the area of the surface, irrespective of whether the square at hand is a wheat field or an idealized figure in a Euclidean school book.
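The three rules above can be sketched in a few lines of Python (the dimensions fed in are invented purely for illustration):

```python
import math

def rectangle_area(length, width):
    # length times width
    return length * width

def circle_area(radius):
    # pi times the square of the radius
    return math.pi * radius ** 2

def cone_volume(base_radius, height):
    # one-third the height times the base area
    return circle_area(base_radius) * height / 3

print(rectangle_area(120.0, 80.0))  # a rectangular field
print(circle_area(400.0))           # a center-pivot irrigation field
print(cone_volume(5.0, 12.0))       # a conical grain silo
```

The code does not care whether the inputs describe a wheat field or a figure in a textbook: the method is the same.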

The only thing that really matters is the precision you require. If what you have is a real-world wheat field and you take ever more precise measures of its side, thanks to improving measuring technology, then you will get ever more accurate measures of the area; but the process is never-ending unless you decide that a certain precision is enough (and crop fields are measured in acres, not square inches, anyway).
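That never-ending refinement can be mimicked numerically (the field's "true" side length below is made up for the sake of the sketch):

```python
true_side = 123.456789  # metres: an invented "real" side length

# Each round of "better metering technology" adds one decimal of precision.
for decimals in range(7):
    measured = round(true_side, decimals)
    area = measured ** 2
    print(f"{decimals} decimals: side {measured} m -> area {area:.6f} m^2")

# The computed area creeps toward the "true" one, but the loop could go on
# forever: we simply decide at some point that the precision is good enough.
```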

And, more importantly, the process, the method, the algorithm is the same: side times side equals area, or π times diameter equals circumference, irrespective of whether the object is ideal or real.

That is why geometry and mathematics have developed so much. They seem to deal with idealized, esoteric things and Platonic ideas, but the methods we learn and devise within them are good in reality as well: they do have real-life applications.

(In the past two centuries, mankind has developed a whole lot of mathematical concepts that, unlike idealized circles or triangles, seem not to bear any connection whatsoever with the physical world. But that is not a problem. To begin with, every now and then we realize that this or that abstruse piece of math has a useful application; secondly, even “useless” mathematics teaches scholars new tricks of the trade. And, last but not least, mathematics is beautiful!)

When applied in the real world, as we saw, the rules of mathematics may give rise to some errors. Enter applied mathematicians and engineers. These people play a slightly different role from pure mathematicians: they know the tricks of applying mathematical rules to real objects. Engineers never deal with perfect circles, ideal straight lines or truly square angles.

Sometimes imperfections do not bother them; sometimes they do. The former case corresponds to situations where the differences between ideal triangles or circles and their real-world counterparts do not matter, as when measuring fields in acres.

Or take quantum mechanics: we know that Nature is quantum. However, in many meso-scale situations (i.e., neither extremely small nor extremely large), we ignore quantum mechanics and keep designing applications according to good old Newton-Laplace mechanics, even though the numerical results we get may be slightly imperfect.

Same story with Einstein’s Relativity. We know that Nature is relativistic. However, as long as things move at speeds much smaller than light’s and do not travel intergalactic distances, engineers ignore relativistic effects because they are negligible for most practical purposes. Formula One cars, aircraft and supertall skyscrapers are built with this approach. GPS systems in our cars and smartphones, on the other hand, are equipped with hardware and software that take Relativity into account; otherwise they would come up with significantly wrong distances. Here, then, is one example of a situation where engineers are not happy with the classical, non-relativistic approximation: a case where, in a sense, the difference between a real and an ideal triangle is important.
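A rough back-of-the-envelope sketch gives a feel for the size of the GPS effect (the two drift figures are standard textbook approximations for GPS orbits, not taken from the text above): orbital speed slows the satellite clocks by about 7 μs/day, while the weaker gravity at altitude speeds them up by about 46 μs/day; multiplying the net drift by the speed of light gives the ranging error that would accumulate if the effects were left uncorrected.

```python
# Approximate daily clock drifts for a GPS satellite (~20,200 km altitude).
special_us_per_day = -7.2   # time dilation from orbital speed: clock runs slow
general_us_per_day = 45.9   # weaker gravity in orbit: clock runs fast

net_us_per_day = special_us_per_day + general_us_per_day  # ~38.7 microseconds
c_m_per_s = 299_792_458  # speed of light

error_km_per_day = net_us_per_day * 1e-6 * c_m_per_s / 1000
print(f"net clock drift: {net_us_per_day:.1f} us/day")
print(f"uncorrected ranging error: ~{error_km_per_day:.0f} km/day")
```

A drift on the order of ten kilometres per day is why GPS is one of the rare everyday technologies that cannot afford the classical approximation.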

Geometry itself, the world of idealized shapes, is more complicated than we assumed above. Circles, triangles and squares only approximately obey the ordinary rules that we learned at school, because they actually belong in a non-Euclidean world, where straight parallels cross, the shortest path is not a straight line, and what you see is not what you get.

Whenever we say that area equals base times height, we are approximating the truth somewhat, assuming a right angle where there really is none and a straight line where there is a curve. The geometry that 99.9% of people know and use, and with which the Great Pyramid of Giza or the Burj Khalifa tower in Dubai were built, is slightly wrong, both in principle and in practice!

The point I believe I have made is simple: technology is grounded on approximations. It is never exact, never infinitely precise. Real-world scientific applications all live within the limits of their tolerable precision, which is deemed sufficient until proven otherwise. They are sufficiently precise: good enough. (In fact, this occurs even in theoretical science, where being as precise as possible matters more: we do not know with infinite precision the value of π, or the electric charge of the electron, or its mass.)

Examples of situations where theory and practice (i.e. applications, technology) diverge include:

  • Euclidean Geometry, as discussed above. In principle, straight lines do not exist, parallels cross, and so on. This has had, and is having, a profound influence on mathematics as well as on physical models of Nature; but it does not disturb the sleep of any engineer, and 99.999% of the advanced technology we have, from DNA to supercomputers to exotic financial products, would be exactly the same even if we knew nothing about hyperbolic and elliptic geometries;
  • Dimensions of the Universe. Well into the 19th century it was believed that the “real world” had three dimensions. This grew to four with the added time dimension of Relativity. It is now up to eleven or so in string theory, and the discussion is very much alive: the number changes with every major physics congress and jumps up and down weekly on arXiv.org. This is an extremely important fundamental scientific discussion: but it has zero impact on daily life and technology;
  • Theory of Relativity. In the vast majority of everyday situations, including those involving sophisticated technologies, we do not worry about the consequences of Relativity, because the objects we deal with do not move at speeds approaching that of light or travel intergalactic distances. The effects of Relativity, such as the contraction of moving objects or the dilation of time experienced by a traveler, only become significant under those circumstances, and are therefore negligible in most mesoscale situations. On the other hand, as we have already discussed, there also exist common situations where we had better take Relativity into account: for example, GPS devices;
  • Quantum Mechanics. One of the basic principles of physics, Heisenberg’s uncertainty principle, implies that it is impossible to measure the present position of an object while simultaneously determining its future motion: if we know exactly where it is, then we do not know where it is going, while if we know where it is going then we cannot tell exactly where it is. The object in question, however, has to be so small (for example, an electron or a proton) as to require especially accurate measuring apparatus. The same problem does still occur, but is totally negligible and uninteresting, when the action of the objects being observed is far larger than Planck’s constant, a small quantity equal roughly to a millionth of a billionth of a billionth of a billionth of a Joule times a second. (One Joule is the energy released by dropping a small apple from one metre onto the ground, as in Isaac Newton’s famous anecdote.) With ordinary objects in the mesoscale, which is where humans belong, the uncertainty principle is irrelevant. Its importance is significant philosophically, because it means that Nature is fundamentally uncertain and indeterministic: but technology-wise, in most cases, it is but a curiosity.
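The scale argument in the last bullet can be made concrete with a quick sketch (the masses and localization lengths below are my own illustrative choices): the uncertainty principle, Δx·Δp ≥ ħ/2, implies a minimum velocity uncertainty Δv ≥ ħ/(2·m·Δx).

```python
HBAR = 1.054571817e-34  # reduced Planck constant, in Joule * second

def min_velocity_uncertainty(mass_kg, delta_x_m):
    # From Heisenberg: delta_x * (mass * delta_v) >= hbar / 2
    return HBAR / (2 * mass_kg * delta_x_m)

# An electron confined to atomic size (~1e-10 m): a huge velocity spread,
# on the order of hundreds of kilometres per second.
print(min_velocity_uncertainty(9.109e-31, 1e-10))

# A 145 g baseball located to within a micrometre: utterly negligible,
# on the order of 1e-28 m/s.
print(min_velocity_uncertainty(0.145, 1e-6))
```

Twenty-nine orders of magnitude separate the two cases: that is why engineers in the mesoscale can safely ignore the principle.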

As happened with Relativity and GPS, there may well be cases where the intimate non-linearity of Nature pops up and ceases to be merely a sophisticated epistemological matter. It could, for example, become relevant to the life of professionals and managers: witness the discussion in econophysics. However, this has not happened yet. Or, if it has, we do not know: no one has yet presented a credible account of how non-linearity is affecting businesses and professions.

Pop complexity pundits confuse the epistemological issues with the practical consequences.

Do they hear that someone has proved that a butterfly’s flapping could cause a tornado ten thousand miles away? Well, then they will assume that to be a dominant phenomenon in meteorology.

Equally, they assume that getting “surprises” from complexity (such as weird emergent behavior) is standard, and forget that most surprises come simply from plain ignorance.

PAOLO MAGRASSI, 2011 CREATIVE COMMONS NON-COMMERCIAL – NO DERIVATIVE WORKS

FYI, the discussion of Gaussian vs. other forms of stable statistical distributions was not born in a recent best-selling book.

It was brought up by Vilfredo Pareto a century ago, and then again, in specific relation to finance, by Benoît Mandelbrot in the 1960s, when he suggested that markets do not follow a normal (Gaussian) distribution but rather a leptokurtic, heavy-tailed Pareto distribution.

This is what happens when the constraint of finite variance on the participating stochastic variables (e.g., securities prices, individual risks, et cetera) is relaxed: from the Central Limit Theorem we no longer derive a Gaussian but rather other stable distributions (although not necessarily a “power law”, as is believed in popular versions of the theory).
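The practical difference between thin and heavy tails is easy to see numerically. This toy sketch (not a market model; the shape parameter α = 1.5 is chosen arbitrarily, and gives a Pareto distribution with infinite variance) compares extreme-event counts in the two cases:

```python
import random

random.seed(42)
N = 100_000

# Thin tails: standard normal samples.
normal_draws = [random.gauss(0.0, 1.0) for _ in range(N)]

# Heavy tails: Pareto samples with shape alpha = 1.5 (infinite variance).
pareto_draws = [random.paretovariate(1.5) for _ in range(N)]

# Count draws beyond 10. For the normal, a 10-sigma event is essentially
# impossible; for the heavy-tailed distribution, it is routine.
print(sum(abs(x) > 10 for x in normal_draws))  # 0
print(sum(x > 10 for x in pareto_draws))       # thousands
```

A risk model calibrated on the first sample would declare the extreme events in the second sample impossible: that is, in miniature, Mandelbrot's point.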

If Mandelbrot is right, the current forecasting models (e.g., those for risk assessment, which failed in 2008) are wrong, and it should be little surprise that they are not very good at anticipating even highly impactful events.

John Lennon: We're going straight to the top, boys!
Beatles: Oh yeah? Where's that?
John Lennon: The toppermost of the poppermost!

 

Not surprisingly, the top of pop complexity is to be found in Wikipedia.

Don’t get me wrong. I am an admirer and an avid user of Wikipedia, a beautiful tool for quick reference and often the best place to start from if you want to investigate a topic completely unknown to you. It can save you enormous amounts of time in web browsing, and this simple fact is worth a regular donation to Jimbo’s organization.

(In my experience, en.wikipedia is best, followed by es.wikipedia and fr.wikipedia. The others I just can’t judge, except it.wikipedia which, for some reason still mysterious to me, is dismal, currently a fiasco).

However, it has its flaws. And one of these concerns complexity, a concept still entirely in the hands of pop-complexity fans over there.

As of today, December 18, 2011, <en.wikipedia.org/wiki/Complexity> misses out completely on the notion of [non]linearity, hence it does not even come close to explaining the inner workings of complex systems and complex behaviour.

The adjective linear first appears only 1,582 words into the article, or 60% of the way through it. (There are even distinct articles for “complex system” and “complex systems”!)

It will improve. But that’s where we stand today, 50 years after P.W. Anderson and 40 years after E. Lorenz.

Thought of the day

Posted: September 24, 2011 in Uncategorized

Most of the ongoing “complexity talk”, which originated after a blossoming of popular literature in the mid-Nineties, is really an anti-scientific outburst.

It is a movement that resonates with the creationism that has taken America by storm over the past decade or so and, like it, is part of a larger cultural megatrend which can be seen at work in most of today’s world: people frightened by a greedy technocracy (in finance, pharma and other fields) and repelled by rationality because they do not understand it and tend to associate it with said technocracy.

Cheap books and a lack of scientific mindedness have fooled “pop” complexity fanatics into thinking that all is chaotic, no predictions whatsoever can be made and Nature escapes human insight entirely. They believe that science is a mechanistic system aimed at turning the world into a final exact equation, a clockwork. Most of them exhibit signs of dogmatism and bigotry, for the very reason that they do not have a scientific mind.

We have already discussed, here and elsewhere, the essential a-scientificity of the “pop complexity” phenomenon.

Part of the pop complex literature is genuinely motivated by scientific curiosity. But a larger part is, at its core, a conscious or unconscious attempt to liquidate the scientific approach.

If nothing is predictable, everything is at the edge of chaos, and behaviors always emerge unexpectedly in Nature, then it follows that no rigorous, methodical, controlled approach is possible. That is how the pop-complexity enthusiast reasons (and often explicitly argues, e.g. in some essays on emergent behavior and Darwinian evolution).

In her mind, all we are left with are animistic beliefs or at most some organized religious scheme. In the [few] more sophisticated cases, the pop-complexity zealot accepts the notion of numerical simulations: scientific investigation is reduced to studying the behaviour of complex adaptive systems.

[Nota bene: We are not talking about hard-science researchers here. We are referring to approaches in the popular literature about complexity, such as the one on complexity and management, complexity and human organizations, complexity and psychology, complexity and medicine, etc.].

Now, why is this happening? Because many people, including intellectually gifted people such as some pop-complexity exponents, are afraid of mathematics. Math repelled many, if not most, of us in middle and high school. And those who did not venture into “hard” scientific studies afterwards have remained forever prey to that repulsion.

To these folks, mathematics is the nightmarish, monstrous sequence of untameable formulae that we remember from when we were fifteen: a deterministic mechanism (two of the most recurring, obsessive words in pop complexity) that must be followed pedantically in order to reach some predefined result.

The learned person knows that mathematics is quite another thing, and that our teenage recollections are nothing but the exercises we were given to make sure we were learning the concepts and acquiring an inclination for precision. (Similarly, Latin literature is about reading Virgil, Ovid and Horace, not inflecting nouns and conjugating verbs; yet we need the latter in order to progress to the reading.)

Like music, mathematics is about creativity, abstraction and beauty, not merely exactitude. Vast parts of mathematics and formal logic are not exact at all, as they involve estimating, approximating and guessing. Math is not deterministic, either: one of its quintessential activities, giving proofs, is profoundly non-deterministic.

The fortunate Italian reader who questions our words is welcome to read Discorso sulla matematica by Gabriele Lolli (Bollati Boringhieri, 2011), where the author relates the fundamental mathematical methods to the literary ones discussed by Italo Calvino in his Lezioni americane (1985–1988), namely

Lightness

Quickness

Exactitude

Visibility

Multiplicity

Consistency

(A book for which we anxiously wish an English edition.)

If pop complexity authors really studied mathematics as opposed to just some of its grammar, they would learn how to position non-linear phenomena and complex approaches under the light they deserve, instead of drawing caricatures.

Paolo Magrassi Creative Commons Non-Commercial – Share Alike

In November 2008, I was writing a book in Italian on pop complexity, and I had to undergo a (painful) review of the scientific literature on “complexity theory” for management.

The most cited paper at the time (no idea what might have changed ever since) was “The Art of Continuous Change: Linking Complexity Theory and Time-Paced Evolution in Relentlessly Shifting Organizations”, by Shona Brown and Kathleen Eisenhardt, which had been published in 1997 by Administrative Science Quarterly (Vol. 42, No. 1).

It had already been cited over 1100 times by other authors as a sort of management-science complexity Bible.

In actuality, rather than «extending thinking about complexity theory», as bombastically announced in the abstract and implicitly in the title itself, all the paper accomplishes is to offer a bunch of suggestive references to a sloppy popular literature. It also explicitly admits, in the very last paragraph, that it has not empirically proved anything about the relationship between complexity and organization: «If these inductive insights survive empirical test, then they will extend our theories […]».

As is typical of unsuccessful scientific accounts, the bibliography is very long and includes citations of works totally unrelated to the paper’s content, as well as of works which the authors have obviously not understood, if indeed read at all. One example is the renowned popular book by physicist Murray Gell-Mann, The Quark and the Jaguar: an obligatory citation for authors who dwell upon complexity but, lacking a scientific background, feel the need to put together a credible-looking bibliography.

The 35-page Brown and Eisenhardt paper starts talking about complexity only on page 30 (beginning with «Perhaps closest to our research is work on complexity theory […]»). It does the job by merely quoting four books (no page numbers), and concludes with these words:

«Although speculative, our underlying argument is that change readily occurs because semistructures are sufficiently rigid so that change can be organized to happen, but not so rigid that it cannot occur. Too little structure makes it difficult to coordinate change. Too much structure makes it hard to move. Finally, sustaining this semistructured state is challenging because it is a dissipative equilibrium and so requires constant managerial vigilance to avoid slipping into pure chaos or pure structure. If future research validates these observations, the existence of semistructures could be an essential insight into frequently changing organizations».

The words «[it] is challenging because it is a dissipative equilibrium and so […]» are an annoying example of the unnecessary abuse of pseudoscientific language to state something that could have been said in a clear and simple fashion.

What the authors meant to say is that if an organizational structure is too rigid it will tend to oppose any change, while if it is not structured at all it inclines toward chaos; the intermediate organizational condition is more flexible, but its equilibrium is unstable, since the state can turn rigid or chaotic unless it is persistently controlled.

This may not sound like a tremendously innovative concept to you, yet if you go and read the paper you will concur that it could be stated as I just did. The analogical and imprecise resort to “dissipative structures” serves the purpose of leading the reader to believe that the authors are referring to a scientific context which they know well and which presumably attests the veracity of their statements, adding credibility to them.

However, the reader with a minimal scientific culture is annoyed by the paucity of the content and by the unfounded allusions (the expression unstable equilibrium would have been clearer and more correct, with no need to call into question the entropic mutation and environmental exchange issues implied by the term “dissipative”, to which the remainder of the paper makes no reference whatsoever).

In the Conclusion section the paper states that

«At a more fundamental level, the paper suggests a paradigm that combines field insights with complexity theory and time-paced evolution […]. Continuously changing organizations are likely to be complex adaptive systems […]»

The «paradigm that combines» is what I illustrated previously, i.e. ten lines of rhetoric, and complex adaptive systems is an obligatory slogan that you must utter if you want to make others believe that you know what you are talking about when daydreaming about complexity.

The truth is that this paper is pervaded by a fundamental confusion between complexity and dynamism (which is what it is really about) and that when the authors make reference to complexity (that is on pages 30 and 33 only) they reveal their incompetence in the field.

Enough said about the most successful (by 2008 at least) scientific paper on complexity in organizational management. And if this is the scientific state of the art, you can imagine what follows in the food chain down below…

 

PS: There are good papers too! One example, again taken from my 2008 review: “Complexity Theory and Organization Science”, by Philip Anderson (Organization Science, May–June 1999, Vol. 10, No. 3). An excellent overview of complexity concepts that may or may not turn out to be useful in management theory.