...matter, life, mind and society are simply different levels or aspects of the same thing

Steve Grand

Watts and Strogatz found that the insects were able to manage the synchronisation almost as readily as if everyone were talking to everyone else. By itself, the small world architecture allowed a reduction in the required number of links by a factor of thousands. There is a profound message lurking here - a message not about biology but about computation.
From an abstract point of view, the group of fireflies trying to synchronise themselves is engaged in an effort of computation. As a whole, the group attempts to process and manage myriad signals, counter-signals and counter-counter-signals travelling between individual fireflies, all in an effort to maintain the global order. This computational task is every bit as real as those taking place in a desktop computer or in the neural network of the human brain. Whatever the setting, computation requires information to be moved between different places. And since the number of degrees of separation reflects the typical time needed to shuttle information from place to place, the small world architecture makes for computational power and speed.
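The idea of a swarm computing its way to global order can be sketched with a toy model (an illustration only, not the model Watts and Strogatz actually studied): each firefly repeatedly nudges its flash phase toward the average of its neighbours' phases, and the spread of phases shrinks until the group flashes together.

```python
# Toy firefly synchronisation: each insect averages its phase with its
# neighbours'. Phases are plain numbers here (wrap-around is ignored),
# so this is simple consensus dynamics, not a full pulse-coupled model.

def sync_step(phases, neighbours):
    """One parallel update: move each phase halfway to the local mean."""
    new = []
    for i, p in enumerate(phases):
        local_mean = sum(phases[j] for j in neighbours[i]) / len(neighbours[i])
        new.append(p + 0.5 * (local_mean - p))
    return new

n = 20
# Ring of fireflies: each sees only its two nearest neighbours.
neighbours = [[(i - 1) % n, (i + 1) % n] for i in range(n)]
phases = [(i * 7) % 10 / 10 for i in range(n)]   # scattered starting phases

spread0 = max(phases) - min(phases)
for _ in range(200):
    phases = sync_step(phases, neighbours)
spread = max(phases) - min(phases)
# The spread of phases shrinks as the swarm computes its way to agreement.
```

Because each insect talks only to its immediate neighbours, agreement spreads slowly here; the point of the small world architecture is that a few long-range links would shuttle this information across the swarm far faster.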
Of course, no one knows how fireflies are really connected within a swarm. Indeed, only a few species manage the synchronisation trick. Watts and Strogatz had not answered all the questions about fireflies, and many remain unanswered still.
Nevertheless, they had learned that in terms of computational design, the small world architecture is an especially important "good trick". When it comes to computation, though, nothing is so wondrous as the human brain. And so it was natural to wonder: might the brain also exploit the small world trick?... Does it point to some deeper design principle of nature? In their three-page paper in the June 4, 1998, issue of "Nature", Duncan Watts and Steve Strogatz unleashed all these findings on an unsuspecting scientific community. Their paper touched off a storm of further work across many fields of science. As we will see in upcoming chapters, a version of their small world geometry appears to lie behind the structure of crucial proteins in our bodies, the food webs of our ecosystems, and even the grammar and structure of the language we use. It is the architectural secret of the Internet, and despite its apparent simplicity it is in every way a new geometrical and architectural idea of immense importance. pg 48
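The small world trick itself - a few random shortcuts collapsing the degrees of separation of a large ring lattice - can be sketched in a few lines (a minimal illustration; the node count and rewiring probability are arbitrary choices, not values from the Watts-Strogatz paper):

```python
import random
from collections import deque

def avg_path_length(n, edges):
    """Mean shortest-path length over reachable node pairs, via BFS."""
    adj = [[] for _ in range(n)]
    for a, b in edges:
        adj[a].append(b)
        adj[b].append(a)
    total = pairs = 0
    for start in range(n):
        dist = {start: 0}
        queue = deque([start])
        while queue:
            u = queue.popleft()
            for v in adj[u]:
                if v not in dist:
                    dist[v] = dist[u] + 1
                    queue.append(v)
        total += sum(dist.values())
        pairs += len(dist) - 1
    return total / pairs

n = 200
# Ring lattice: each node linked to its two nearest neighbours on each side.
lattice = [(i, (i + k) % n) for i in range(n) for k in (1, 2)]

# Rewire a small fraction of the links to random targets - the "shortcuts".
random.seed(1)
small_world = [((a, random.randrange(n)) if random.random() < 0.1 else (a, b))
               for a, b in lattice]

# A handful of shortcuts slashes the degrees of separation,
# even though most links remain local.
l_lattice = avg_path_length(n, lattice)
l_small = avg_path_length(n, small_world)
```

On the pure lattice, information must hop neighbour to neighbour, so the average separation grows with the size of the ring; with roughly ten percent of the links rewired, it drops to a handful of steps - the reduction in shuttling time that the passage above ties to computational speed.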

Mark Buchanan
Nexus: Small Worlds and the Groundbreaking Science of Networks
W. W. Norton, 2002
...The natural world is composed of a hierarchy of "persistent phenomena", in which matter, life, mind and society are simply different levels or aspects of the same thing. I propose that this natural hierarchy can be mirrored by an equivalent one that exists inside a parallel universe called cyberspace. I want to sketch an outline for a common descriptive language which can be used at all levels of the hierarchy. In this language we shall find the basic operators of which life and mind are constructed. To create artificial life we have to understand the nature of this hierarchy, implement simulations of these basic operators using a computer (or other device) and build upon that foundation the higher levels of persistent phenomena that we seek. A computer cannot be intelligent, but it can create a parallel universe in which natural forms of intelligence can be replicated.
pg 20 ...the problem is that the digital computer was modelled on the outward appearance of mental processes, rather than the structures that give rise to them. Even though we know our brains consist of vast numbers of neurones operating in parallel, we each appear to have only one mind.
This mind seems to operate in a stepwise way, thinking about or carrying out sequences of actions one at a time. We also get a sense that our conscious thoughts are at the top of a chain of command - we take the big decisions consciously, but then delegate the task of carrying them out to some lower, subconscious parts of our brains. The mind therefore gives us the impression that it is top-down (it employs a chain of command), serial (only one mind per brain, operating one step at a time) and procedural (it works in terms of logical procedures to be followed, as in a recipe).

The digital computer is similarly a serial machine, because it only carries out one operation at a time. It is procedural, because the basic units of a programme are actions to be carried out (such as "add these two numbers and store the result here"). It is also top-down, since computer programmers tend to design their programmes as control hierarchies.
The computer was designed as a model of how the mind seems to work, and the operation of a computer program was assumed to be very similar to thinking.
.... it is really only philosophers and mathematical logicians who would believe that thinking amounts to the formal manipulation of symbols according to set rules. Most of the time the rest of us don't think in neat syllogisms or conduct formal arguments in our heads. More often than not the answer just occurs to us in some mysterious way, and we use logic only in retrospect, as a means of justifying our conclusions to others or to ourselves.
So the digital computer was in many ways the wrong tool, applied to the wrong job. Ironically, though, this most organised of machines is such a powerful concept that it can actually get round its own limitations, but only if one thinks about it in the right way. This book is very much about how to turn the prim, tightly organised digital computer into a disorganised, self-organising machine. We shall use the serial, procedural, top-down computer as a tool to create new machines that are parallel, relational and bottom-up.
pg 25
intelligent systems must be designed to emerge from the bottom-up.

A system will not be intelligent unless it is also alive.
Intelligence is a property of populations.

Grand pg 12
Grand pg 130
Keywords : FEEDBACK - topological approach to studying complex systems - PHASE LANDSCAPES - complexity theory - hyperspace - an ever-changing, restless adaptive system - PREDICTION - adaptive behaviour - INTELLIGENCE - ability to learn the relationships between cause and effect and to use them to predict the future is what is called intelligence - theory of mind - mental models of the world - INFORMATION - Signals are therefore not stuff. They are non-physical, persistent patterns with a coherence and existence of their own - compressibility - 'meaning' - an observer effect -
Grand pg 73
Keywords : emergence - interactions - When populations of interacting structures become arranged in certain configurations, and something new and surprising comes into existence, we call this an emergent phenomenon - Conway's Game of Life: glider - logical reasoning - prediction - we could, in principle, predict the existence of a glider from the rules of Life, but only by actually carrying these rules out, simulating the system - Matter is just one link in the chain of being. Atoms are no more real than societies or minds. Hardware is a subset of software - self-sustaining patterns in space and time - The interconnectedness of all things - self-organizing chains of cause and effect - 'problem' of how mind has influence over matter is spurious. Since the two are not distinct, the idea that one can affect the other should not be at all difficult to accept - purposeful action - cause and effect act in webs, not chains - think of the information acting upon the brain at least as much as we think of the brain acting on the information - gestalt: form, the whole that is greater than the sum of its parts - Life, we discover, is a loose coalition of self-maintaining eddies in a flowing stream. When the whole becomes more than the sum of its parts something new and perfectly real comes into existence. This gestalt is not mysterious, but neither is it a figment of our imagination - the basic building blocks of cause and effect
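The glider in these keywords - a pattern that the rules of Life keep recreating a little further along - is easy to reproduce, and it illustrates Grand's point that the only way to "predict" it is to carry the rules out. A sketch in plain Python, with live cells stored as a set of coordinates:

```python
from collections import Counter

def step(live):
    """One generation of Conway's Life on an unbounded grid."""
    # Count how many live neighbours every candidate cell has.
    counts = Counter((x + dx, y + dy)
                     for x, y in live
                     for dx in (-1, 0, 1) for dy in (-1, 0, 1)
                     if (dx, dy) != (0, 0))
    # Birth on exactly 3 neighbours; survival on 2 or 3.
    return {cell for cell, n in counts.items()
            if n == 3 or (n == 2 and cell in live)}

# The classic glider: a self-sustaining pattern in space and time.
glider = {(1, 0), (2, 1), (0, 2), (1, 2), (2, 2)}

cells = set(glider)
for _ in range(4):
    cells = step(cells)
# After four generations the same shape reappears, shifted one cell
# diagonally - persistence of form, not of any particular "matter".
```

No individual cell travels anywhere; what moves is the pattern itself, which is exactly the sense in which Grand calls hardware a subset of software.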

Steve Grand
Life and How to Make it
Phoenix 2001
pg 29
Virtual machines are real universes
Universes are made according to laws or rules, and a computer program is a set of rules. When a program is run, and the rules are followed by the computer, a digital universe is created. In computer science this is not a new idea. The concept of the digital universe has been known and understood for many years. It has simply had a different name. Traditionally, digital universes have been called virtual machines.
The computer program defines a digital universe. A computer executing that program creates that digital universe. This is the essence of software, computer programs and digital universes. Fundamentally, they are all a collection of rules that define the behaviour of the computer. These rules are written down using a high-level language, compiled into a much greater number of low-level instructions, and executed by the rules embedded in the electronic circuits of the computer.
Every one of our computers is capable of emulating the behaviour of every other computer (UTM - Universal Turing Machine). When a computer is asked to run such emulation software, it is transformed. It is no longer able to run the software designed for it. The action of the keys, the output to the screen, the operation of the mouse - everything has changed. The computer can now run only software designed for the computer it is emulating. Because the emulation software alters the behaviour of the computer in this way, we call it a virtual machine.
In general, a virtual machine is a piece of software that defines an environment. That environment may duplicate the behaviour of another physical computer, or it may be a distinct environment in its own right - an environment that can be created by different models of computer and that provides a consistent interface for other pieces of software. And, with a little stretch of the terminology, we can regard every piece of software as a type of virtual machine. An operating system defines a clearly identifiable environment, a familiar look and feel, with consistent behaviours in response to our input.
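Bentley's point - that a program is a set of rules, and running the rules brings a little universe into being - can be made concrete with a toy stack machine (an illustrative sketch; the instruction names are invented here, not drawn from the book):

```python
def run(program):
    """A toy virtual machine: a handful of rules defining a tiny universe.

    Instructions (invented for this sketch):
      ("push", n) - put a number on the stack
      ("add",)    - replace the top two numbers with their sum
      ("mul",)    - replace the top two numbers with their product
    """
    stack = []
    for instr in program:
        op = instr[0]
        if op == "push":
            stack.append(instr[1])
        elif op == "add":
            b, a = stack.pop(), stack.pop()
            stack.append(a + b)
        elif op == "mul":
            b, a = stack.pop(), stack.pop()
            stack.append(a * b)
        else:
            raise ValueError(f"unknown rule: {op}")
    return stack

# "Add these two numbers and store the result here", twice over:
program = [("push", 2), ("push", 3), ("add",), ("push", 4), ("mul",)]
result = run(program)   # the universe this program defines evaluates (2+3)*4
```

The host computer executing `run` can no longer be given ordinary Python arithmetic to perform directly inside that loop; it now responds only to the five-instruction language of the machine it is emulating, which is exactly the transformation the passage describes.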

Peter Bentley
Digital Biology
Headline Book Publishing 2001

evolutionary algorithms: Bentley 47

HOME      BOE     SAL     TEXTE