
The Impact of Science

On Simplicity & Complexity

Phillip Hoffmann gives a simple introduction to a complex subject.

Some concepts in philosophy and science are surprisingly simple while others are surprisingly complex; and exploring the meaning of such concepts can reveal some interesting surprises. Take the concepts of simplicity and complexity, for example. What is it for a thing or a system to be simple, and what is it to be complex? These are simple enough questions, but the answers are anything but, and the search for answers takes us to the heart of computer science, physics and philosophy.

On an intuitive, everyday level, we tend to associate simplicity with things that are uncomplicated and easy to understand; in fact, we assume that even a child can appreciate something if it is simple enough. Complexity, on the other hand, brings to mind the idea of difficult, complicated or sophisticated systems, problems or things. In modern science, these concepts have taken on technical meanings and definitions, which accord with our intuitions only up to a point.

The pioneers of artificial intelligence research at MIT were struck by the fact that tasks that appear to be difficult and complex, such as solving calculus problems, turned out to be relatively easy to do from a computational point of view, whereas things we would regard as simple and requiring little intelligence, such as tying shoelaces, still thwart even the most advanced robots. And fairly simple systems, like the handful of rules and axioms that constitute arithmetic, can generate very difficult problems like Goldbach’s Conjecture (which states that every even number greater than two is the sum of two primes).
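Goldbach’s Conjecture remains unproven in general, but its claim is easy to test mechanically for small even numbers, which is itself a nice illustration of simple rules generating hard questions. Here is a minimal Python sketch (the function names are my own, chosen for illustration):

```python
def is_prime(n):
    """Trial-division primality test; fine for small numbers."""
    if n < 2:
        return False
    for d in range(2, int(n ** 0.5) + 1):
        if n % d == 0:
            return False
    return True

def goldbach_pair(n):
    """Return a pair of primes summing to even n > 2, or None if none exists.
    Goldbach's Conjecture says None is never returned -- but no proof is known."""
    for p in range(2, n // 2 + 1):
        if is_prime(p) and is_prime(n - p):
            return (p, n - p)
    return None

# Check the conjecture for every even number up to 100.
for n in range(4, 101, 2):
    assert goldbach_pair(n) is not None

print(goldbach_pair(28))  # (5, 23)
```

Checking any finite range like this is trivial for a computer; what no one knows is whether the pattern holds for *every* even number, which is exactly the gap between verification and proof.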

Given all this, how do we even begin to get a handle on the notions of simplicity and complexity? One way is through the concept of algorithmic complexity. For any piece of information, programmers are interested in finding the shortest program that could print out that piece of information (or string) and then stop. The length of this program represents the algorithmic information content (AIC) of that string. In this sense the number 10^100, although extremely large, is actually quite simple, because the program required to generate it just has to instruct a computer to print out 1 followed by 100 zeroes. On the other hand, a random number consisting of 100 digits has a much higher AIC level, since the program required to print it would not be much shorter, if at all, than the original string itself. So in this sense, random numbers are complex, but are neither ordered nor particularly interesting. The number 10^100 is highly ordered and definitely stands out in a crowd, so to speak, but it is very simple from a computational standpoint.

So wherein does true complexity lie? The human body, which comprises roughly a trillion cells, is a paradigm example of an extremely complex system. But if we scramble those trillion cells, as the villains did near the end of the movie Fargo when they fed their hapless victim into a wood chipper, we end up with an unholy mess. The poor guy ended up with his cells highly randomized, and the resulting system wasn’t very interesting or capable of doing much other than testing the strength of a good stain remover. When his cells were organized in such a way that all his liver cells were in one place, his brain cells were all together, and so on, he was technically far less random and in that sense a simpler system, but he was also clearly much more interesting. By ‘interesting’, I mean effectively complex, a concept I owe (along with AIC) to the physicist Murray Gell-Mann in his wonderful book The Quark and the Jaguar. Effective complexity involves a trade-off between the computational notions of simplicity and complexity discussed above, and points to a kind of complexity more in tune with our intuitive understanding of the concept.

Systems with high levels of effective complexity are associated with intermediate levels of AIC, and are delicately situated between order and disorder, regularity and randomness. Effective complexity involves a fine and delicate balance, much like life itself, which experience tells us is nothing if not a balancing act between all kinds of physical, environmental, and even emotional extremes. To indulge in a little metaphysical speculation about what all this might mean, optimizing such balances makes us more effectively complex systems, and may be some kind of organizing principle of evolution itself. Interestingly enough, Aristotle thought that various virtues consist in achieving what he called a mean between extremes, but that is the subject of another article.

© Phillip Hoffmann 2002

Phillip Hoffmann is an Australian philosopher who lives and works in Canada, and has been known to disorder his brain cells simply as a result of ordering drinks.

• For more on the topics of simplicity, complexity, computation and information, check out Gell-Mann’s book as well as Rudy Rucker’s equally stimulating Mind Tools.
