The world can work better

Scalable and non-scalable systems

The modeling of reality that takes place in our minds, whether consciously (System 2) or unconsciously (System 1), is the basis of the action heuristics we choose. A heuristic here is simply a procedure which, although imperfect, allows us to act in recognizable circumstances – ones similar to those encountered before – without laborious and time-consuming analysis. But to better imagine the potential effects of acting on imperfect, simple and automatic procedures, we should introduce the concepts of scalable and non-scalable systems.

Taleb, in his bestseller The Black Swan, describes these concepts as follows: we are dealing with a non-scalable system when the level of results achieved depends on continuous effort(41). In massage parlors masseurs massage by the sweat of their brows, in kindergartens teachers teach small groups of children they can manage, bakers bake bread in bakeries, musicians play at weddings, organists in churches, doctors see patients. There are, of course, individual differences in the range of activity within these professions, but in principle physical constraints will limit the daily, monthly or annual amount of work performed and the benefits it brings. Just as importantly, the associated risks will also be limited.

We are dealing with a scalable system, on the other hand, when instead of continuous effort we can rely on the accuracy of a decision, which increases efficiency without additional work(42). The basis for propagating accurate decisions in a complex system is the way information is stored and reproduced. Inventions such as speech, writing, numbers, printing, sound recording, image recording, word processors, accounting software, data carriers from the floppy disk to the flash drive, the Internet and the exchange of information on social media all allow the range of actions taken, ideas disseminated, action heuristics, data processing and decision-making algorithms to grow to the scale of the entire system. Such activities can bring sky-high incomes – of course for those few whose books, songs, software, shoes or Coca-Cola spread thanks to scalability. They are able to disseminate fashions, attitudes and ways of modeling or algorithmizing reality throughout the world. But above all, their unforeseen negative consequences may be equally sky-high.

The combination of ever-growing technical possibilities for spreading and storing information with the mobility of people, goods and ideas keeps increasing this scalability potential. The algorithms offered by social networking software such as Facebook, data access services such as Google, computer systems like Windows and mobile operating systems like Android; the polling algorithms prepared by lobbyists (politicians and all others wanting to influence public opinion) and published afterwards; promoted dietary heuristics and norms of health behavior; strategies for taking antibiotics by people, but also for fattening animals; strategies for selecting crop varieties, fertilizing, applying herbicides, growing monocultures, and so on – all these, by exploiting economies of scale, change our perception of reality, our customs and our relationships in a way whose unexpected effects can spread on an unprecedented scale.

In a way not quite grasped by our perceptual system – accustomed as it is to locality, to the Gaussian probability distribution of independent events and to relative simplicity – we imperceptibly take on the risk associated with the unexpected effects of how complex systems operate. And an inherent feature of complex systems is their vulnerability to excessive consumption of their vital resources and to catastrophic collapse.

Nassim Nicholas Taleb, in his bestselling book Antifragile, writes about errors in perceiving the consequences of business activities. For example, the returns to scale sought by companies that merge into corporations are an illusion. In pursuit of merger advantages that are small relative to the size achieved, corporations expanding their size escalate the risk that their managers' estimates of reality are wrong. And after reaching a certain size they press political authorities through lobbying, or exploit the "too big/important to fail" phenomenon and parasitize others.

The desire for excessive control and concentration of power consumes the vital resources of the system and/or increases the likelihood of its uncontrolled simplification

The consequences of errors on an even greater scale, writes James Rickards, may be imposed on us by governments. The willingness to regulate everything, to finance as many wishes as possible with an eye to future elections, to manipulate markets and paper money at the expense of future generations, to let banks speculating with depositors' money grow unchecked, or to attempt to steer the behavior of taxpayers and consumers with behavioral techniques is exactly the opposite of what the theory of complexity would advise. Anything that results in accumulating imbalances also carries the risk of collapse. The desire for excessive control and concentration of power consumes the vital resources of the system and/or increases the likelihood of its uncontrolled simplification. A complex system can and should be simpler and less regulated, obstructing rather than facilitating the spread of imbalances that threaten its functioning. So it can and should be more local than global. Otherwise, as Rickards points out, what remains is either conquest (and parasitizing the conquered, nowadays economically) or collapse(43).

A textbook example of such unnoticed risk was, as Rickards describes it, the fastest-spreading virus in the global financial epidemic: the method of measuring the value at risk of loss, so-called VaR(44) (Value at Risk – extensive commentary can be found in Ashlag's TOC Thinking, Taleb's Antifragile, as well as in Rickards' The Death of Money and The Road to Ruin). The efficient markets theory (in which investors maximize profits and react rationally to price signals and new information), combined with a normal distribution of risk (the Gaussian curve of the probability of independent events), served to create a method of measuring risk across an entire asset portfolio. On the assumption that future price relations would in principle be shaped much as they were in the past, and that price fluctuations are random in a way consistent with the Gaussian curve – that is, both very favorable and extremely unfavorable events are unlikely and cancel each other out – complicated mathematical calculations were used to construct combinations of investment positions that seemed immune to all kinds of disturbances. They seemed to generate safe and unlimited profits to meet investment demand while producing practically no risk. The number of complex combinations of mutually offsetting risks, as Rickards writes, is virtually unlimited(45).
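To make the mechanism concrete, here is a minimal sketch of parametric (variance-covariance) VaR computed under exactly the assumptions described above: past returns are taken as representative of the future, the portfolio return is assumed normally distributed, and the loss threshold is read off a normal quantile. The two-asset portfolio, its weights and all the figures are illustrative assumptions of mine, not numbers from Rickards or Taleb.

```python
import numpy as np
from scipy.stats import norm

# Illustrative parametric VaR under Gaussian assumptions (all numbers made up).
np.random.seed(0)
past_returns = np.random.multivariate_normal(
    mean=[0.0005, 0.0003],                    # assumed daily mean returns of two assets
    cov=[[1e-4, 5e-5], [5e-5, 2e-4]],         # assumed covariance of their returns
    size=1000,                                # the "history" the model extrapolates from
)

weights = np.array([0.6, 0.4])                # hypothetical portfolio weights
mu = past_returns.mean(axis=0) @ weights      # expected daily portfolio return
sigma = np.sqrt(weights @ np.cov(past_returns, rowvar=False) @ weights)

confidence = 0.99
# Under the normality assumption, the 99% one-day VaR is the loss exceeded on
# only 1% of days: the 1st percentile of the assumed normal return distribution.
var_99 = -(mu + norm.ppf(1 - confidence) * sigma)
print(f"99% one-day VaR: {var_99:.2%} of portfolio value")
```

The sketch also shows where the fragility hides: everything hinges on the estimated covariances and the normal quantile, so correlated, fat-tailed losses – the very events the model treats as negligible – are exactly what it fails to price.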

The latent toxic gene – the imposition of the Gaussian probability distribution of independent events onto events that are in fact dependent, the imposition of simplicity on complexity, the invisibility of fragility, somehow imperceptible to the constructor of the financial risk model – always manifests itself when conditions favor it. The stock market crash of 1987, the collapse of the Long-Term Capital Management fund in 1998 and, finally, the global financial collapse of 2008 are some of the most spectacular traces left in the history of finance by activities that use mathematical algorithms to force complexity into a corset of simplicity that sooner or later disintegrates.
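A small numerical illustration of this mismatch (my own choice of a Student-t distribution with 3 degrees of freedom as the fat-tailed alternative; it is not a calibration taken from any of the cited books): the Gaussian model treats a 5-sigma daily loss as a once-in-millennia curiosity, while a modestly fat-tailed distribution expects it every few months.

```python
from scipy.stats import norm, t

# How often does each model expect a 5-sigma daily loss?
# Purely illustrative; the distributions and parameters are assumptions.
sigmas = 5
p_gauss = norm.sf(sigmas)       # tail probability under the Gaussian assumption
p_fat = t.sf(sigmas, df=3)      # tail probability under a fat-tailed Student-t (3 d.o.f.)

print(f"Gaussian:  about once every {1 / p_gauss:,.0f} trading days")
print(f"Student-t: about once every {1 / p_fat:,.0f} trading days")
```

The gap between the two outputs – millions of trading days versus roughly a hundred – is the invisible fragility the paragraph above describes.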