
Tuesday, August 26, 2003

Simon's Hierarchy of Consciousness



Ok, so this was what my second-previous entry was supposed to be about but I got sidetracked.

Along with the happy hoopla about the impending fall of mankind at the clinking claws of robots, there is a lot of active speculation going around about the nature of consciousness. This makes me happy. It's been politically incorrect to even speak the word "consciousness" in AI academia for the last decade or two, on the basic assumption that anything you said would certainly be wrong and therefore you'd have been arrogant for saying it.

Now academia is finally realizing that consciousness is going to spring from commercial research (where they are paid to be arrogant) if academia doesn't get its act together first, and so everybody's out in droves saying lots of wrong things. But really I think this is a vast improvement over saying nothing, because there's more likely to be something right amidst a bunch of wrong than there is amidst nothing.

Here's my contribution to the fray, which seems to disagree as usual with most everyone else:

Level 0 - Perception
The first thing to grok about consciousness is so implicit in the way people talk about it that people just miss it entirely. The whole subjective experience of consciousness, which is really what most people are most concerned with when talking about it, is exactly that -- experience, aka perception. People want it to be more; they say "oh, but I have a sense of place, and a sense of self, and a sense of god, and bla bla bla", but again they betray the simple truth in their own words: Sense as in senses. The five we know well by name are all external, but the same act of perception goes on inside as well, and all our subjective experience of being conscious is, once again, just that: experience.

To drive the point home, I'll say more about what it is not: You do not need memory of the past in order to be conscious, nor the ability to plan into the future, nor even the ability to act or choose. You do not even need control over your own attention. Examples exist for each of these cases: humans lacking these elements and yet clearly still conscious. There was the famous patient HM, who could form no new memories and yet in the moment was as conscious as you or I. There are people who have damaged a part of their brain, leaving them without volition, but who have later recovered to tell of what it is like to be conscious without will--they were aware of their surroundings, but simply had no desire to act, and no desire to think [1]. To understand this better, just imagine a sort of meditation where you sit with your eyes open but exert no mental effort whatsoever, passively absorbing your surroundings without thinking about them. In contrast to HM, you would not appear very conscious from the outside, but also in contrast to HM you do have a memory and can later remember what it was like. And unless you fell asleep, the introspective experience of being conscious would still be there--the act of perceiving the moment.

Likewise, despite common claims to the contrary, consciousness does not require a sense of location or space. Put yourself in a sensory deprivation chamber while listening to, and focusing on, a monophonic stream of simple tones. In the short time before you go bonkers, you will probably lose all sense of space--the part of your brain that deals with space just isn't needed there, and even though it's usually engaged in the course of everyday life, you don't cease to be conscious when it goes quiet.

In sum, the subjective experience of consciousness is synonymous with perception.

This may seem to shift the question onto some homunculus that implements perception, but my claim is more radical than that: Perception in turn is nothing more than the processing and abstraction of sensory inputs (including internally generated senses, not just the five+ externals). This processing may employ history-encoding state which in effect would be a form of short-term memory, but that is not a prerequisite to conscious perception.
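
To make this concrete, here's a toy sketch in Python (the sensor names, numbers, and smoothing rule are all made up for illustration, not a model of anything real): "perception" is just a function from raw senses to a handful of abstract readings, with an optional bit of history-encoding state.

```python
# Sketch of Level 0: perception as the processing and abstraction of sensory
# inputs, with optional history-encoding state.  All names here are invented.

class Perception:
    def __init__(self, smoothing=0.5):
        self.smoothing = smoothing
        self.state = {}             # optional short-term, history-encoding state

    def perceive(self, raw):
        """Abstract raw sensory inputs (internal or external) into features."""
        features = {
            "brightness": sum(raw["light"]) / len(raw["light"]),
            "movement":   max(raw["light_change"]),
            "food_smell": raw["smell"],
        }
        # With smoothing > 0 the percept carries a trace of recent history;
        # with smoothing = 0 it is purely the processed present moment.
        for key, value in features.items():
            prev = self.state.get(key, value)
            self.state[key] = self.smoothing * prev + (1 - self.smoothing) * value
        return dict(self.state)

p = Perception()
print(p.perceive({"light": [0.2, 0.8], "light_change": [0.0, 0.1], "smell": 0.3}))
```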

So, yeah, that's it. I claim that is all that our introspective sense of consciousness is--just some information processing of our sensory inputs.

But this says little about why we sense what we sense. That is, why does our own consciousness which we perceive as such do what it does--why do we get this particular set of (internal) inputs to our senses?

Before we go on, though, I must re-emphasize that the whole of the internal experience of consciousness is addressed in the above. There's just nothing more to it, nor can there be, because experience is perception. (Obviously I haven't explained the math of perception, but I claim it doesn't matter--so long as the results can be linked up with the non-experiential aspects of consciousness which I will address next...)

Level 1 - Choice and Action (Will)
While perception may provide the whole of the introspective experience of consciousness, it sure isn't very useful by itself. To be conscious in the moment, but without memory and without will, is such an impotent and fleeting existence that few would credit it with consciousness (and yet I claim that if it were you in there, you'd still feel conscious). Level 0 gets a 0 because it is truly a moot state--all experience with no purpose. Evolution would never create it (though a bathtub chemistry lab or playing with guns might).

So for our next module, we layer on choice. Choice and action are inseparable from a cognitive standpoint--choice is always amongst possible actions (even if some, many, or all of those possible actions are internal, cognitive actions--more later). In its simplest implementation, choice is just a formula which maps the current perceptual state into an action. In a slightly more complex formulation, the choice machinery itself may maintain internal state apart from the perceptual state. This choice state may or may not then be accessible as input to the perceptual system. (In humans, much of it clearly is not.)

To concretize this a little, consider a bug that hunts for food by smell until it sees something moving and then it runs for cover and waits a while. The perceptual system processes the smells and visual clues and provides an abstract set of readings which the choice system then churns into simple action decisions: If we're in hunt-for-food state, turn toward the food smell and walk, but if we see something moving, switch to the run-for-cover state. If we're in the run-for-cover state, turn toward the darkest object and run. And so on.
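
Here's roughly that bug as a toy sketch in Python (the readings, thresholds, and timer are all invented): the choice "formula" maps the abstracted percepts, plus one internal mode variable, to an action.

```python
# Sketch of Level 1: choice as a formula from percepts (plus a bit of internal
# state) to an action.  All readings and thresholds are invented.

def choose(percepts, state):
    """percepts: abstracted sensor readings; state: the bug's internal mode."""
    if state["mode"] == "hunt":
        if percepts["movement"] > 0.5:              # something moved: hide
            state["mode"] = "hide"
            state["hide_timer"] = 10
            return ("run_toward", percepts["darkest_direction"])
        return ("walk_toward", percepts["food_smell_direction"])
    else:                                           # mode == "hide": wait it out
        state["hide_timer"] -= 1
        if state["hide_timer"] <= 0:
            state["mode"] = "hunt"
        return ("stay_put", None)

state = {"mode": "hunt", "hide_timer": 0}
percepts = {"movement": 0.8, "food_smell_direction": 90, "darkest_direction": 270}
print(choose(percepts, state))    # -> ('run_toward', 270), and mode flips to 'hide'
```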

Level 1 makes for a decent bug, but it falls short when it comes to complex analysis.

Level 2 - Focus and Attention
Attention comes after choice because attention is an action. There are practical limitations to what a perceptual system is capable of with limited hardware. One of the ways we get around these limitations is by multiplexing that hardware. Our eyes do not see the whole world before us in our highest possible detail--they would have to be as big as our head to do that, so instead they only see the part we are looking at. And so, where to look becomes a choice, and looking an action. And this literal truth apparent in our eyes becomes a metaphor for what goes on inside as well: of the multitude of aspects of our internal perceptual experience, we must choose which to attend to and which to ignore (if just for a moment). As our eyes provide a mechanical outlet for our choice to act on our visual attention, so our internal perceptual hierarchy also provides hooks for our choice to act on our internal attention.

And once again, this choice is but a formula, combining some internal state with the perceptual state and generating an output which causes an action. I.e., Level 2 really adds very little beyond Level 1, since the ability to choose was already there--it simply adds a particular outlet of choice, one which happens to reach back in and guide our perceptual system (Level 0).
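
To show how little machinery that really is, here's another toy sketch (invented names and numbers again): one of the "actions" the choice formula can emit is internal--it points the perceptual system at a region rather than moving anything in the world.

```python
# Sketch of Level 2: attention is just another action the choice formula can
# take, one that reaches back and reconfigures perception.  Names are invented.

def perceive(world, focus):
    """Only the attended region is processed in detail (multiplexed hardware)."""
    return {"detail": world[focus], "gist": sum(world) / len(world)}

def choose(percepts, focus, n_regions):
    if percepts["detail"] > percepts["gist"]:       # attended spot is interesting
        return ("keep_looking", focus)              # internal action: hold attention
    return ("shift_gaze", (focus + 1) % n_regions)  # internal action: move attention

world = [0.1, 0.9, 0.2, 0.1]        # say, brightness of four regions
focus = 0
for _ in range(4):
    action, focus = choose(perceive(world, focus), focus, len(world))
    print(action, focus)
```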

Level 3 - Planning
It is natural for Levels 0 through 2, when well evolved, to implement a working model of the perceptual space. That is, a natural side effect of being able to (actively, in this case) abstract the perceptual inputs is being able to "fill in the blanks"--whether those blanks are because something is obscured, simply unobserved, or because it hasn't happened yet. So--rather by accident in some sense--all the information and most of the circuitry is probably already in place to do something quite remarkable: look forward in time. Further, because our perceptual representation itself is (implicitly by now) a hierarchy of abstractions, we can look forward by tiny, concrete steps, by giant, abstract steps, or by anything in between.

The method and means of controlling this ability are once again just an extension of Level 1, probably via hooks mostly already in place for Level 2. Shifting of focus forward and backward in time is essentially the same as shifting it to and fro in space. The largest addition of Level 3 may be simply in the choice formula, which now must be extended to encompass a planning algorithm. While in practice this algorithm, and those for Levels 1 and 2 as well, are partly hard-coded and partly learned, the end result by whatever path is simply that there is a choice algorithm, and it decides what actions to take when, what to attend to at what moment, when and where to project into the future, and how to integrate the results of all of that into the next moment's choices. This algorithm, rooted back in Level 1 where we called it merely a formula, is the seat of the will, and determines what you want at each moment of time. In fact, the very notion of "wanting" merely expresses the outcome of this algorithm, and it is only through our rich perceptual module that we eventually come to abstractly represent not only our self, but our internal states and actions, in as much as we are able to directly perceive or indirectly infer them. [This, by the way, has just summed up "emotions" too, since emotions are essentially state variables in our choice algorithm to which we have some--often quite indirect--perceptual access. Our emotional state affects the choices we make; and rather separately from that, we have some ability to perceive--mostly through inference from indirect cues--our own emotional state.]
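
One toy way to cash out "looking forward in time" with the same choice machinery is a short rollout over an internal model, scoring the imagined outcomes (the model, the actions, and the scoring below are placeholders, not anything claimed above):

```python
# Sketch of Level 3: planning as using an internal model to "fill in" future
# states and choosing by imagined outcome.  Model and scoring are placeholders.

def model(state, action):
    """Predict the next perceptual state (here just a 1-D position)."""
    return state + {"left": -1, "stay": 0, "right": +1}[action]

def score(state, goal):
    return -abs(goal - state)           # closer to the goal is better

def plan(state, goal, depth=3):
    """Return (first action, total imagined score) of the best look-ahead path."""
    if depth == 0:
        return None, 0.0
    best_action, best_value = None, float("-inf")
    for action in ("left", "stay", "right"):
        nxt = model(state, action)
        _, future = plan(nxt, goal, depth - 1)
        value = score(nxt, goal) + future
        if value > best_value:
            best_action, best_value = action, value
    return best_action, best_value

print(plan(state=0, goal=2))    # -> ('right', -1): step toward the imagined goal
```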

Level 4 - Memory (but not Learning?)
The short-term memory we have come to rely on, our scratch pads of iconic and echoic memory (your mind's eye and ear) as well as our general sense for recent events, object locations, and such, are best treated simply as robust aspects of our perceptual system--which our focus choices make deft use of in order to form more useful perceptual abstractions than could be done with less residual state. So short-term memory is effectively already accounted for in Level 0.

Conceptual learning is paramount to the development of consciousness, at least in the higher animals, in that a great deal if not the vast majority of the choices we make, including choices of focus, are made based on integrated past experience. But if you turn off conceptual learning, an organism does not cease to be conscious, it simply ceases to improve or change with time. Conceptual learning, by nature of being integrated experience, leads to gradual change, and so is unimportant here where we are concerned with the nature of the immediate experience of consciousness.

So really the only addition for Level 4 is long-term instance memory, our non-integrated snapshots of past experience which we are able to recall through acts of will. This is a fairly distinct add-on module that records from and imposes recalled states upon our perceptual system in response to choice actions.
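
A toy sketch of that add-on (the interface and matching rule are invented): a store that snapshots the current perceptual state in response to one choice action, and hands a best-matching snapshot back to perception in response to another.

```python
# Sketch of Level 4: long-term instance memory as an add-on module that records
# perceptual snapshots and, on request, recalls the one best matching a cue.
# The interface and the matching rule are invented for illustration.

class InstanceMemory:
    def __init__(self):
        self.episodes = []          # non-integrated snapshots of past percepts

    def record(self, percepts):
        """Choice action: store a copy of the current perceptual state."""
        self.episodes.append(dict(percepts))

    def recall(self, cue):
        """Choice action: return the stored snapshot best matching the cue,
        to be re-imposed on the perceptual system."""
        def overlap(episode):
            return sum(1 for k, v in cue.items() if episode.get(k) == v)
        return max(self.episodes, key=overlap, default=None)

memory = InstanceMemory()
memory.record({"place": "kitchen", "smell": "coffee"})
memory.record({"place": "garden", "smell": "rain"})
print(memory.recall({"smell": "rain"}))   # -> {'place': 'garden', 'smell': 'rain'}
```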

So, in sum, our sense of being conscious is the act of us perceiving both our outer and inner worlds. Our inner world includes a number of modules which are wired into our perceptual tree in various ways, allowing our choice algorithm to control our perceptual focus, to recall events from the past, to project into the future, and so on with other things I haven't touched on here ("imagination" is an extension of planning...). We perceive the actions we take as being our "choice" or "free will" because they truly are--the choice algorithm is our will, and it is wired right into our perceptual tree, which is our awareness.


[1] Search on akinetic mutism and/or anterior cingulate damage; also page 253 of Phantoms in the Brain; also mentioned here.




Simon Funk / simonfunk@gmail.com