Sunday, August 7, 2016

Gosper's hierarchy of needs

In yesterday's post I tried to point out that our intuitions about whether a machine implementation of our minds was really conscious (etc.) seemed to depend on how much its internal mechanism resembled our own. In particular, a Chinese Room implemented as a lookup table seemed particularly resistant to the notion that there's "somebody home."

But that left unexamined the question of how the lookup table got filled in. In the case of HashLife, the answer is straightforward: take the patch of cellular-automaton space you are trying to skip forward, run the Life algorithm on every possible starting configuration, and fill in your lookup table. But equally obviously, you don't actually have to do this ahead of time: you run your Life simulation as usual, checking your table for speedups, and every time you hit a situation that isn't listed, you run it normally, a step at a time, and then insert the result in the table. That's why it's a hash table, sparsely populated in the address space of starting patches.
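
To make the lazy-filling idea concrete, here is a minimal Python sketch of memoizing a Life step over small patches. This is not real HashLife, which hashes quadtree nodes and jumps many generations at once; the names (step_patch, memo_step) are mine. But it shows the "compute on a miss, remember forever" pattern described above.

    def step_patch(patch):
        """Advance a small Life patch one generation.

        `patch` is a tuple of tuples of 0/1 cells; cells outside the
        patch are treated as dead, so this is only exact for the
        patch interior (real HashLife handles this by computing the
        center of a larger patch).
        """
        h, w = len(patch), len(patch[0])
        def alive(r, c):
            return patch[r][c] if 0 <= r < h and 0 <= c < w else 0
        nxt = []
        for r in range(h):
            row = []
            for c in range(w):
                n = sum(alive(r + dr, c + dc)
                        for dr in (-1, 0, 1) for dc in (-1, 0, 1)
                        if (dr, dc) != (0, 0))
                # Standard Life rule: birth on 3, survival on 2 or 3.
                row.append(1 if n == 3 or (n == 2 and patch[r][c]) else 0)
            nxt.append(tuple(row))
        return tuple(nxt)

    _table = {}  # sparsely populated: only patches actually encountered

    def memo_step(patch):
        """Look the patch up; on a miss, compute it slowly and remember."""
        result = _table.get(patch)
        if result is None:
            result = _table[patch] = step_patch(patch)
        return result

After the first encounter with a given patch, every later occurrence is a single table lookup rather than a recomputation.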

In practice, the big systems in Life that experimenters were trying to run were highly stylized, built from glider guns and sinks and mirrors and similar gadgetry to construct circuitry and Turing machines, or even to emulate (!) more complex cellular automata. In such a case, HashLife essentially creates a direct table-driven implementation of the higher-level machine.

How would we apply this scheme to running a human mind? We don't have hash tables in our heads, and what's more, the address space a human experiences is so vast and finely divided that we never experience exactly the same input or situation twice.

We don't have hash tables in our heads, but we do have circuitry that looks suspiciously like associative memory, a point I first ran across in Pentti Kanerva's thesis. What's more, as we know from our experience with neural networks, it is reasonably straightforward to arrange such circuitry so that it will find the stored memory nearest to a requested address. With a bit more work, you can make a memory that will interpolate between, or extrapolate from, two or more stored memories near a requested address.
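
As a rough illustration, and emphatically not Kanerva's actual sparse distributed memory design (which writes each item into many "hard locations" at once), here is a toy Python associative memory. All names in it are hypothetical. It returns the value stored at the nearest key by Hamming distance, and with k > 1 it interpolates by averaging the k nearest stored values.

    import numpy as np

    class NearestMemory:
        def __init__(self):
            self.keys = []    # stored addresses (binary vectors)
            self.values = []  # stored contents

        def write(self, key, value):
            self.keys.append(np.asarray(key))
            self.values.append(np.asarray(value, dtype=float))

        def read(self, query, k=1):
            """Return the value at the nearest stored key; with k > 1,
            interpolate by averaging the values of the k nearest keys."""
            query = np.asarray(query)
            # Hamming distance from the query to every stored address.
            dists = [np.count_nonzero(key != query) for key in self.keys]
            nearest = np.argsort(dists)[:k]
            return np.mean([self.values[i] for i in nearest], axis=0)

    mem = NearestMemory()
    mem.write([1, 0, 1, 1, 0], [1.0, 0.0])
    mem.write([0, 1, 0, 0, 1], [0.0, 1.0])
    print(mem.read([1, 0, 1, 0, 0]))        # nearest match -> [1.0, 0.0]
    print(mem.read([1, 0, 1, 0, 0], k=2))   # interpolation -> [0.5, 0.5]

The point of the sketch is only that nearest-match retrieval turns a sparse table into something that still answers for addresses it has never seen, which is exactly what a human-scale address space demands.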

Do you remember learning to walk, or tie your shoes, or tell time from an analog clock dial, or read and write? These were all significant cognitive challenges, and at one time you were heavily concerned with the low-level details of them.  But now you do them unconsciously, having essentially hashed them out to operate at a higher level.

Thus it seems not unreasonable to claim that the authentic human experience includes the HashLife-like phenomenon of losing direct consciousness of lower-level details in the process of becoming concerned with higher ones. Indeed I would claim that you cannot have the authentic human experience without it.

The remaining question is, how high up can we go?
