Tuesday, December 02, 2008

Some notes on recursion, complexity & emergence

Recursion


Recursive patterns are structures that structure themselves. An example of recursion is the system of ‘bases’ in mathematics, where integers (numbers) are organised into complex patterns based on properties of the numbers themselves.

For example, each larger grouping of numbers in base ten contains, nested within it, all the smaller base-ten groupings – vastly complex patterns generated by just ten digits (as the short sketch after the list below illustrates).

1
10
100
1000
10000
100000
1000000
10000000
100000000
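
To make the nesting concrete, here is a minimal sketch (in Python, assuming nothing beyond the standard library): the numeral for a number is defined in terms of the numeral for a smaller number, so the same rule is applied inside itself again and again.

```python
def digits(n, base=10):
    """Return the digits of n in the given base, most significant first.

    The definition is recursive: the representation of n is the
    representation of n // base with one more digit appended, so the
    whole pattern is built from repeated applications of the same rule.
    """
    if n < base:
        return [n]
    return digits(n // base, base) + [n % base]

print(digits(100000))      # [1, 0, 0, 0, 0, 0]
print(digits(100000, 2))   # the same number nested in base two
```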

In recursive systems, repeating a pattern like this creates a syntax (a grammar) which organises further iterations (repeats) of that pattern.

Fibonacci
So far we have dealt with purely abstract number patterns. In the early thirteenth century, an Italian mathematician called Leonardo of Pisa (now better known as Fibonacci) came up with a famous example of recursion which seemed to describe natural patterns. This sequence is known today as the Fibonacci sequence. It starts like this:

0, 1, 1, 2, 3, 5, 8, 13, 21, 34

To make a Fibonacci sequence all you do is add the two previous numbers together to get the next number in the sequence. So, for example, if you start with 0 and 1, the next number is 1 (0 + 1), the next is 2 (1 + 1), then 3 (2 + 1), then 5 (3 + 2), then 8 (5 + 3), and so on…
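
Written as a few lines of Python (a minimal sketch of the rule just described), the whole sequence falls out of one repeated addition:

```python
def fibonacci(count):
    """Return the first `count` Fibonacci numbers, starting from 0 and 1."""
    sequence = []
    a, b = 0, 1
    for _ in range(count):
        sequence.append(a)
        a, b = b, a + b  # the next number is the sum of the previous two
    return sequence

print(fibonacci(10))  # [0, 1, 1, 2, 3, 5, 8, 13, 21, 34]
```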

Graphically represented, the Fibonacci numbers can be arranged as squares, resulting in a pattern like this.

Fibonacci numbers form a spiral pattern which occurs everywhere in nature:

They are in the branching structure of trees…


the fronds of ferns…

in the spiral shells of molluscs…

and in the structure of the human hand



Beyond noting their ubiquity, I am not making any claims for these spiral patterns as some kind of key to existence. Fibonacci numbers only describe, they do not explain. Those who have searched through them for the answers to life, the universe and everything (and many have) discover instead that these numbers do not reveal truth; all they reveal is more iterations of the same structure. Behind the patterns there are only more patterns, stretching into infinity.

Complexity

The notion that behind a structure you find more structure is known as complexity. This is also, in essence, the message of another concept called self similarity. Self similarity was examined in a famous 1967 paper by Benoît Mandelbrot, “How Long Is the Coast of Britain?” What Mandelbrot showed in that paper was that the measured length of a coastline behaves in a similar way over a whole range of measurement scales: the shorter the ruler, the longer the coast. He later coined the term fractal for this kind of behaviour.

Fractals are curves that are irregular all over. Moreover, they have exactly the same degree of irregularity at all scales of measurement. So it doesn't matter whether you look at a fractal from far away or up close with a microscope: in either case you'll see exactly the same picture. If you start looking from a distance (i.e., with a "long" ruler), then as you get closer and closer (with shorter rulers) small pieces of the curve that looked like formless blobs earlier turn into recognizable objects, the shapes of which are the same as that of the overall object itself (Casti 1994, 232).
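
One way to see Mandelbrot's point numerically is to measure a self-similar curve with shorter and shorter rulers. The sketch below uses the Koch curve as a stand-in for a coastline (my own choice, purely for illustration): each refinement replaces every segment with four segments a third as long, so the measured length keeps growing while the shape stays the same at every scale.

```python
def koch_length(ruler_steps, base_length=1.0):
    """Measured length of a Koch curve after `ruler_steps` refinements.

    At each refinement every straight segment is replaced by four
    segments each one third as long, so the total length grows by a
    factor of 4/3 per step and never converges.
    """
    length = base_length
    for _ in range(ruler_steps):
        length *= 4 / 3
    return length

for step in range(6):
    print(f"ruler step {step}: measured length = {koch_length(step):.3f}")
# The finer the ruler, the longer the 'coast' - the length has no limit.
```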
James Gleick, in the book Chaos, writes:

Although Mandelbrot made the most comprehensive geometric use of it… [s]caling also became part of a movement in physics that led, more directly than Mandelbrot's own work, to the discipline known as chaos. Even in distant fields, scientists were beginning to think in terms of theories that used hierarchies of scales, as in evolutionary biology, where it became clear that a full theory would have to recognize patterns of development in genes, in individual organisms, in species, and in families of species, all at once (Gleick 1988, 115-116).

The principle of self similarity means that repeating patterns result in similar configurations nested over different scales.

Here is the pattern that retreating tidal water makes in the sand...

Which is repeated in the configuration of a river delta...

Extraordinarily complex phenomena can be broken down and modelled on the basis of this concept:

The mathematician and brilliant computer scientist Alan Turing's last published papers, before his death in 1954, had studied the riddle of "morphogenesis": the capacity of all life-forms to develop ever more baroque bodies out of impossibly simple beginnings. Turing's paper had focused on the recurring numerical patterns of flowers, but it demonstrated using mathematical tools how a complex organism could assemble itself without any master planner calling the shots (Johnson 2001, 14).

Turing's paper on morphogenesis found its application in Lindenmayer systems. These are sets of rules which can be used to generate self similar fractals that model the morphology of a variety of organisms.
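
As a rough sketch of how such a system works (using Lindenmayer's own 'algae' rules, A → AB and B → A, rather than any particular plant model), the rules are applied to every symbol of the current string to produce the next generation:

```python
def l_system(axiom, rules, generations):
    """Repeatedly rewrite a string according to Lindenmayer-style rules."""
    state = axiom
    for _ in range(generations):
        state = "".join(rules.get(symbol, symbol) for symbol in state)
    return state

# Lindenmayer's original 'algae' rules: A -> AB, B -> A
algae_rules = {"A": "AB", "B": "A"}
for generation in range(6):
    print(generation, l_system("A", algae_rules, generation))
# Successive strings: A, AB, ABA, ABAAB, ABAABABA, ...
# Their lengths (1, 2, 3, 5, 8, 13, ...) are the Fibonacci numbers,
# tying the rewriting rules back to the sequence discussed above.
```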

The principles of fractal geometry are used in virtual reality modelling. The self similarity of fractals creates extraordinarily life-like computer-generated images....



These examples illustrate how recursion can generate extremely complex patterns by repeating simple configurations nested inside one another. The outer system repeats the patterns found in the inner systems, and those patterns act as structuring elements across the whole system.

Descartes’ legacy
In the 17th century, the truth of René Descartes’ theory of being (ontology) was seemingly confirmed by the usefulness of his analytical geometry. Descartes argued that the true being of a substance was contained in the simple fact that it could be extended in space along x, y and z axes. Today it can be argued that the truth of recursion, as a theory of being, is demonstrated in a fractal geometry that creates stunningly life-like simulations of the real world. (The fact that these are incompatible truths says more about the contingent and context-bound nature of truth than it does about the weakness of either theory. The point being that a theory is true so long as it helps advance human understanding - as Descartes’ analytical geometry undoubtedly did in the field of mathematics.)

Emergence
In August of 2000, a Japanese scientist named Toshiyuki Nakagaki announced that he had trained an amoebalike organism called slime mold to find the shortest route through a maze. Nakagaki had placed the mold in a small maze comprising four possible routes and planted pieces of food at two of the exits. Despite its being an incredibly primitive organism (a close relative of ordinary fungi) with no centralized brain whatsoever, the slime mold managed to plot the most efficient route to the food, stretching its body through the maze so that it connected directly to the two food sources. Without any apparent cognitive resources, the slime mold had "solved" the maze puzzle.


How did such a lowly organism come to play such an important scientific role? Slime mold spends much of its life as thousands of distinct single-celled units, each moving separately from its other comrades. Under the right conditions, those myriad cells will coalesce again into a single, larger organism, which then begins its leisurely crawl across the garden floor, consuming rotting leaves and wood as it moves about. In the simplest terms, [systems like slime mold] solve problems by drawing on masses of relatively stupid elements, rather than a single, intelligent "executive branch." They are bottom-up systems, not top-down. …In a more technical language, they are complex adaptive systems that display emergent behavior. In these systems, agents residing on one scale start producing behavior that lies one scale above them… The movement from low-level rules to higher-level sophistication is what we call emergence. (Johnson 2001, 11-18)
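
A toy version of this idea, nothing to do with slime mold itself but built on the same principle, is an elementary cellular automaton: every cell follows one purely local rule about its immediate neighbours, yet structure appears at the scale of the whole row. The sketch below (a minimal illustration, using Wolfram's rule 30 as the local rule) shows agents on one scale producing behaviour one scale above them.

```python
def step(cells, rule=30):
    """Advance a row of 0/1 cells one step under an elementary CA rule.

    Each cell looks only at itself and its two neighbours - a purely
    local, 'bottom-up' rule with no executive branch in charge.
    """
    n = len(cells)
    new_cells = []
    for i in range(n):
        left, centre, right = cells[(i - 1) % n], cells[i], cells[(i + 1) % n]
        neighbourhood = (left << 2) | (centre << 1) | right
        new_cells.append((rule >> neighbourhood) & 1)
    return new_cells

row = [0] * 60
row[30] = 1                       # start from a single 'on' cell
for _ in range(25):
    print("".join("#" if cell else "." for cell in row))
    row = step(row)               # large-scale pattern emerges from local rules
```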



Emergence is easier to spot in primitive systems like slime mold because, paradoxically, their complexity exists at the microscopic level and their individual elements are invisible to the naked eye. It is only when you move away from microscopic scales and view the phenomenon at a macro level that its emergent properties become apparent. The human species, for instance, is easier to represent as an emergent system when viewed at cosmic distances, while at the everyday human scale of understanding the individual entity is more salient and emergent properties are correspondingly much harder to demonstrate.

Experiencing the world ultimately comes down to the recognition of boundaries: self/non-self, before/after, inside/outside, subject/object and so forth (Casti 1994, 230).

From the individual’s point of view existence is diagrammed in binary oppositions: black/white, good/bad, wrong/right and so forth. As Mandelbrot showed mathematically, the complexity of life is always going to escape the confines of our diagrams, because "Mountains are not cones, clouds are not spheres, and rivers are not straight lines." Furthermore, as Nakagaki found out with slime mold, collectively self-organizing systems are smarter than their individual components. If we accept that self similarity works across all scales, we could presume that the human species is more intelligent at the species level than it is at the level of individual human beings. This is a notion intimated by Jung’s collective unconscious and by the idea of God.

References

Casti, J. L. (1994). Complexification. London: Abacus.

Gleick, J. (1988). Chaos. London: Sphere.

Johnson, S. (2001). Emergence. Harmondsworth: Penguin.