I’m saying that your mind is literally a signal processing
system. Not just metaphorically; literally. That is, while minds have a
great many features, a powerful theory, in fact our standard theory, to
explain the mix of features we see associated with minds, is that minds
fundamentally function to process signals, and that brains are the
physical devices that achieve that function. And our standard theories
of how physical devices achieve signal processing functions predicts
that we can replicate, or “emulate”, the same signal processing
functions in quite different physical devices. In fact, such theories
tell us how to replicate such functions in other devices…

[G]iven how rich and well developed are our standard theories of
minds as signal processors, signal processors in general, and the
implementation of signal processors in physical hardware, it hardly
seems fair to reject my conclusion based on a mere “metaphor.”
Unfortunately, the best possible response isn’t good enough. The “standard theories of minds as signal processors” that Robin refers to aren’t theories at all. They’re just eccentric tautologies.
As Robin has frankly admitted to me several times, he uses the term “signal processors” so broadly that everything whatsoever is a signal processor. On Robin’s terms, a rock is a signal processor. What “signals” do rocks “process”? By moving or not moving, rocks process signals about the mass and distance of other objects in the universe. By warming or cooling, rocks process signals about the energy and distance of other objects in the universe. Etc. Whoa.
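To see how empty this is, consider a toy sketch, with entirely arbitrary numbers, in which a rock dutifully “processes” the ambient temperature “signal” simply by obeying Newton’s law of cooling:

```python
# Toy illustration: under a definition broad enough to cover everything,
# a rock "processes" the ambient-temperature "signal" merely by obeying
# Newton's law of cooling. All numbers are arbitrary.

def rock_process_signal(rock_temp, ambient_temp, k=0.1, dt=1.0):
    """One step of the rock 'computing' its response to the ambient signal."""
    return rock_temp + k * (ambient_temp - rock_temp) * dt

temp = 20.0
for ambient in [25.0, 25.0, 15.0, 15.0]:  # the incoming "signal"
    temp = rock_process_signal(temp, ambient)
    print(f"rock 'output': {temp:.2f}")
```

On this definition, the rock is “computing” as surely as any brain is. That is precisely the problem.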
The only way Robin can avoid the Metaphorical Fallacy, then, is to commit what I’ll call the Tautological Fallacy. Here’s how it works:
1. Define a concept Y such that all conceivable things are Y. (It’s rhetorically most effective to begin with a somewhat familiar semi-technical term like “signal processing,” then eccentrically redefine it).
2. Trivially infer that X is Y.
3. Claim this implies something substantive about X.
This is intellectual sleight of hand. Once you define every conceivable thing to be a signal processor, being a signal processor does not and cannot have any substantive implications. You can “upload a human mind to a computer” in the same trivial sense that you can “upload a rock to a computer” or “upload a rock to a human mind.” Anyone who thinks such sophistry is a path to personal immortality is sadly mistaken.
READER COMMENTS
Megapolisomancy
Sep 8 2012 at 6:58pm
Unlike Robin, a number of high-profile cryonicists have been quite critical of the idea of mind uploading too, and have made arguments similar to yours.
See:
Can You Build a Locomotive out of Helium? Robert Ettinger on Substrate-Independent Minds:
http://www.alcor.org/magazine/2011/11/11/can-you-build-a-locomotive-out-of-helium
Eric
Sep 8 2012 at 7:05pm
In the last sentence, “personal immorality” should say “personal immortality” :-).
Kevin Dick
Sep 8 2012 at 8:11pm
It seems that you’ve become wedded to the “easy out” of constructing a fallacy and then claiming Robin is committing it.
How about we call this approach “The Fallacy Fallacy”?
There is a highly developed theory of computation. It is not a tautology. Claiming that the human mind is a computer according to the rules of this theory passes the smell test.
You can choose to disprove the theory, show that the human mind does not in fact conform to the rules described in the theory, or make an argument on practical grounds.
But if you actually want to convince anyone familiar with computational theory, you’ll have to do more work than claiming there’s a fallacy involved.
Andy
Sep 8 2012 at 8:30pm
Don’t economists commit this fallacy all the time with utility maximization and “revealed preferences?”
vlad
Sep 8 2012 at 8:47pm
The argument is this:
1. Everything that the brain does is information processing.
2. The “mind” is the information stored in your brain at a given moment in time.
3. Information is substrate neutral.
Therefore, the mind can be copied to a different information processing device that replicates the processes that now occur in your brain. After this happens your mind (the information part of you) will continue by virtue of the functioning of this other device.
Those who are not happy with this theory usually claim that 1 is false – i.e. there are SOME things done by the brain (such as consciousness) which are not information processing. Consequently, they hold that when your mind would be copied all this other stuff (the important stuff) is left behind. Hanson referred to the fact that the rough consensus among psychologists and neuroscientists is that 1 is in fact true.
The bottom line is that the arguments in this debate are not merely part of a language game – they rest on opposing empirical claims about what the brain actually does.
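Premise 3, at least, is easy to illustrate; here is a minimal sketch (an illustration of the premise only, with a toy machine of my own choosing): one abstract machine, a two-state parity counter, implemented on two different “substrates,” with identical behavior on both.

```python
# Sketch of substrate neutrality: the same abstract machine -- a
# two-state parity counter -- implemented on two different "substrates".
# Identical inputs yield identical outputs on both.

def run_dict_machine(bits):
    # Substrate 1: transition table stored in a dict
    table = {(0, 0): 0, (0, 1): 1, (1, 0): 1, (1, 1): 0}
    state = 0
    for b in bits:
        state = table[(state, b)]
    return state

def run_list_machine(bits):
    # Substrate 2: the same transition table stored as nested lists
    table = [[0, 1], [1, 0]]
    state = 0
    for b in bits:
        state = table[state][b]
    return state

bits = [1, 0, 1, 1, 0, 1]
assert run_dict_machine(bits) == run_list_machine(bits)  # same machine, two substrates
print(run_dict_machine(bits))  # parity of the input stream
```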
Greg G
Sep 8 2012 at 9:00pm
Just because it is a tautology doesn’t make it a fallacy. When Newton theorized that f=ma, that was a tautology (another way of saying that ma=f), but that did not mean it was a fallacy, and it did not mean the idea was trivial.
I’m not sure whether or not it is right to think of everything as a signal processor. But whether or not it is right to think of minds and rocks as having this much in common, enough people find the idea objectionable to make it more than trivial.
Maybe Robin is wrong but I have to agree with Kevin. If you want to show he is wrong you will need to do more than invent a new fallacy and label his argument with it.
Vipul Naik
Sep 8 2012 at 10:23pm
You mis-spelled “immortality” as “immorality” in the last sentence.
F. Lynx Pardinus
Sep 8 2012 at 10:23pm
A “computer” and what is “computable” have precise mathematical meanings to computer scientists. Unless you’re prepared to talk in that language (previous posters have mentioned the Church-Turing Thesis and Turing-computable functions), people in the field are not going to take you seriously when you start pointing out their supposed fallacies.
J Storrs Hall
Sep 8 2012 at 10:57pm
As yet another commenter with a PhD in computer science, I offer my relatively well-informed opinion that Robin is right and you are wrong.
Urstoff
Sep 8 2012 at 11:43pm
Lots of ad hominem responses in the comments. Does Robin use a notion of “signal processing system” that is that inclusive or not?
This is not an idle question, and just because cognitive scientists use terms like “computer”, “information”, “signal processing”, etc. doesn’t mean that they have precise concepts in mind. Indeed, trying to provide a non-circular (read: one that doesn’t resort to semantic/representational language) definition of “information” or “computing” is notoriously difficult (and no, aside from a few papers in the ’50s, cognitive scientists are not using Shannon’s mathematical definition of information).
Here’s one attempt to define computing mechanistically: http://www.umsl.edu/~piccininig/Computing_Mechanisms.pdf
Whether it’s successful is another matter.
The point that Bryan may or may not be making is that if your definition of “signal processing” (“information”, “computer”, etc.) is so broad as to count any causal system as a signal processor, then it tells you nothing about the actual workings of the mind/brain, for to say that the brain is a signal processor is simply to say that it is a causal system, which is true but unilluminating.
John
Sep 9 2012 at 5:51am
Urstoff,
Any reasonable definition of signal processing must include all causal systems, because that’s what computers are made of. An AND gate is just a series circuit; an OR gate is a parallel circuit. The fact that we can build up simple circuits into incredibly complex arrangements that seem like something new and different to our human minds doesn’t change the fact that we are simply arranging the simple causal system of electric current to yield a desired result.
In other words, computers aren’t really new. A computer is just a particular physical arrangement that turns a large number of tiny causal systems into a single much larger one. The fact that hitting the “Enter” key on my keyboard adds a newline character and places my cursor on the next line seems like something totally different from the fact that a rock falls if nothing is under it–but at the lowest level, the former process is entirely made up of equally mundane physical events. A circuit is closed, another opens, etc. There’s just a much, much larger number of them.
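A toy sketch of the composition point: each gate below is nothing but a trivial causal rule, a series or parallel circuit, yet wiring a few of them together already yields a device that adds.

```python
# Sketch of the composition point: gates are trivial causal rules, yet
# composing a few of them yields something that looks "new" -- a half adder.

def AND(a, b):  # a series circuit: current flows only if both switches close
    return a & b

def OR(a, b):   # a parallel circuit: current flows if either switch closes
    return a | b

def XOR(a, b):
    return OR(a, b) & (1 - AND(a, b))

def half_adder(a, b):
    """Two one-bit inputs in, a sum bit and a carry bit out."""
    return XOR(a, b), AND(a, b)

for a in (0, 1):
    for b in (0, 1):
        print(a, b, half_adder(a, b))
```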
While we can’t upload a physical computer chip–or even a simple parallel circuit–it’s simple to use software to emulate that chip’s behavior. We can upload the relevant information about its behavior, and about the rules it follows. While we can’t upload a physical rock, we can certainly upload the relevant information about its behavior and the rules the rock follows, then emulate a rock to whatever precision we’d like. So it’s certainly possible to emulate the functions of a brain, to whatever degree of detail our processing power permits.
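A minimal sketch of that last claim, with free fall standing in for the rock’s rules and the step size dt standing in for whatever precision we choose:

```python
# Sketch: "uploading" a rock means capturing the rules it follows and
# replaying them at whatever precision we like. Here the rule is free
# fall; dt controls the emulation's fidelity.

def emulate_falling_rock(height_m, dt=0.01, g=9.81):
    t, v, h = 0.0, 0.0, height_m
    while h > 0:
        v += g * dt          # the rock's "rule": constant acceleration
        h -= v * dt
        t += dt
    return t

print(emulate_falling_rock(10.0, dt=0.1))     # coarse emulation
print(emulate_falling_rock(10.0, dt=0.0001))  # finer emulation, nears sqrt(2*10/9.81)
```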
Urstoff
Sep 9 2012 at 8:11am
That makes no sense; that’s like saying any reasonable definition of a dishwasher must include all causal systems because that’s what dishwashers are made of, which is, of course, obviously false. Dishwashers are causal systems composed of smaller causal systems, but that doesn’t make a rock a dishwasher. Likewise, for “signal processor” to have any meaning or theoretical import, it needs to restrict the class of causal systems. Any definition that includes all causal systems is a vacuous and unilluminating definition.
So there needs to be a more specific definition. Piccinini (to whom I linked) argues that a computer (a related notion, but not, perhaps, identical to a signal processor) is a manipulator of strings. Clearly this excludes lots of causal systems (e.g., rocks).
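A rough sketch of the idea, not Piccinini’s own formalism: a string manipulator takes a string of digits and rewrites it by rule, something no rock does.

```python
# Sketch of the "manipulator of strings" idea (a gloss, not Piccinini's
# formalism): a device that takes a string of digits and rewrites it by
# rule. A rock accepts no strings and rewrites nothing, so it falls
# outside the class.

def increment_binary(s: str) -> str:
    """Rewrite a binary string into the string denoting its successor."""
    digits = list(s)
    i = len(digits) - 1
    while i >= 0 and digits[i] == '1':
        digits[i] = '0'      # carry propagates leftward
        i -= 1
    if i < 0:
        return '1' + ''.join(digits)
    digits[i] = '1'
    return ''.join(digits)

print(increment_binary('1011'))  # '1100'
```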
Brandon Berg
Sep 9 2012 at 8:18am
The basic problem, it seems to me, is that we simply have no idea how subjective experience works. Until we figure that out, there’s no way of knowing whether it’s something that can be duplicated in computer hardware.
David R. Henderson
Sep 9 2012 at 9:07am
@Bryan,
I think you mean “personal immortality,” not “personal immorality.”
Steve Z
Sep 9 2012 at 9:29am
This seems like a rehash of Searle vs. Dennett, particularly the bit about the rock. One response is to ask people in Bryan’s camp whether they are committing a god of the gaps fallacy. Let’s say you can’t upload a person. Can you upload declarative memories? Sure, why not. Then, can you upload or simulate facial recognition features? Again, sure, why not? By repeated iterations, this should clear the way to the real issue, which is that many have the sense that there are features of humans that cannot be uploaded to a neutral substrate.
Take, for example, qualia. There’s a well-known thought experiment, Frank Jackson’s “Mary’s Room,” which posits that the qualitative _feeling_ of seeing red is separate from all the data about red that exists. It doesn’t necessarily follow that it can’t be uploaded, but it could be that the feeling of seeing red needs a substrate similar to a brain to emerge. There are those who deny qualia (the Churchlands, Dennett), and there are others who minimize it. Of course, we can’t really tell if anybody besides us has qualia, even if there are qualia (the p-zombie thing). So it’s kind of a lost cause interpreting whether AI possesses qualia. Pragmatically, it doesn’t seem to matter much.
My objection to mind uploading is quite different: it wouldn’t be me. I like kids, have kids, and selfishly want more kids. Importantly, my kids are similar to me, but different enough that they can “make their own mistakes.” To have an exact copy running around (or residing in a vat or whatever) would constitute a narcissistic injury. I would probably pounce on every weakness and imperfection of theirs–the more so the more it mirrored my own–all the while ascribing these weaknesses and imperfections to “glitches with the transcription process.”
Lawrence D'Anna
Sep 9 2012 at 12:35pm
You are grossly mistaken. Rocks do not have high input impedance and low output impedance. Therefore, rocks are not signal processors in the sense that Robin uses the term. You must have totally failed to understand what Robin wrote in order to make this “tautology” argument.
Lawrence D'Anna
Sep 9 2012 at 12:49pm
A rock is not a signal processor. A capacitor is not a signal processor either. But if you hook a (special kind of) rock and a capacitor up to an *amplifier* in just the right way, you can make a crystal oscillator circuit, which is a signal generator and an oft-used component of signal processors.
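For concreteness, here is a minimal sketch of a signal processor in this narrow engineering sense, a device with a designated input, a designated output, and a transformation in between (here a simple moving-average filter):

```python
# Sketch of a signal processor in the narrow engineering sense: a
# designated input signal goes in, a transformed signal comes out.
# Nothing about a rock fits this input/output description.

def moving_average(signal, window=3):
    """A simple low-pass filter: each output sample averages the
    last `window` input samples."""
    out = []
    for i in range(len(signal)):
        lo = max(0, i - window + 1)
        out.append(sum(signal[lo:i + 1]) / (i + 1 - lo))
    return out

noisy = [0, 10, 0, 10, 0, 10]
print(moving_average(noisy))  # smoothed: the high-frequency wiggle is attenuated
```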
Daublin
Sep 9 2012 at 1:23pm
Bryan, would you care to point out where Robin defines signal processing so broadly? This is twice in a row now that you have attributed an argument to him that is at odds with everything he actually posts.
For what it’s worth, I think Robin’s description of brains as signal processors is a fairly standard model of how thinking works. Brains are made up of neurons that send electrical signals to each other. When they receive signals, they emit other signals, depending on some sort of rules that are stored within the neurons. They also go through two kinds of learning processes: neurons will connect to different neurons over time, and their response patterns when confronted with incoming signals change.
In short, I don’t see how to weasel out of this on tautological grounds. Robin is talking about the real world, and you are replying with angels on a pin.
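A toy sketch of the picture Daublin describes, with made-up weights and no pretense of biological realism: a unit fires when its weighted input crosses a threshold, and its stored “rules” change with experience via a Hebbian update.

```python
# Toy model of the standard picture (not a serious neuron simulation):
# a unit fires when weighted input crosses a threshold, and its "rules"
# change with experience via a Hebbian update.

def fire(weights, inputs, threshold=1.0):
    return sum(w * x for w, x in zip(weights, inputs)) >= threshold

def hebbian_update(weights, inputs, fired, rate=0.1):
    # "Neurons that fire together wire together": strengthen active inputs
    if fired:
        weights = [w + rate * x for w, x in zip(weights, inputs)]
    return weights

weights = [0.6, 0.6, 0.2]
pattern = [1, 1, 0]
for step in range(5):
    fired = fire(weights, pattern)
    weights = hebbian_update(weights, pattern, fired)
    print(step, fired, [round(w, 2) for w in weights])
```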
Stephen R. Diamond
Sep 9 2012 at 2:25pm
But Robin’s claim rests on fallacious reasoning. (However, I don’t think Bryan’s “fallacies” do much to clarify the error.)
I think the root of Robin’s error is mistaken adaptationist dogma in evolutionary theory. Essentially, Robin’s argument is:
Robin’s error occurs in the bolded point. The “designer” (evolution, or, as Dennett likes to say, “Mother Nature”) can only take incremental steps, so it can’t reach most points in design space, making it improbable that it could reach the state of purifying its signal processing so that it is medium-independent.
MingoV
Sep 9 2012 at 5:30pm
Perhaps Robin Hanson’s brain is that limited, but I know that mine can do far more than process signals. I have no idea what “Standard Theory” of brain functioning he refers to. It’s not any theory I learned during or since medical school. Hanson regressed from a metaphorical fallacy to an erroneous initial argument.
roystgnr
Sep 9 2012 at 7:18pm
Sure, a rock is a signal processor in Robin’s sense. The signals we typically care about in rocks are things like “forces transmitted between underlying rock and overlying structures” or “fluids transmitted through fractures and pores”, and we simulate these things on computers all the time.
You’re just digging yourself in deeper here.
Mr. Econotarian
Sep 9 2012 at 9:09pm
We are already simulating small parts of neural circuitry today on computers, for example see: “A world survey of artificial brain projects, Part I: Large-scale brain simulations”.
That is about 10 petaflops. Sequoia, the IBM BlueGene/Q system installed at the Department of Energy’s Lawrence Livermore National Laboratory, achieves 16.32 petaflops. So we likely have the raw computational power, but the trick now is to link the artificial neurons together.
Fortunately, we are getting better brain connectivity research as well, for example see the Connectome Project.
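The 10-petaflop figure is roughly what the usual back-of-envelope arithmetic yields, though every number below is an assumption and published estimates vary by orders of magnitude:

```python
# Back-of-envelope sketch with rough, commonly cited numbers -- all of
# these are assumptions, and published estimates differ by several
# orders of magnitude.

neurons = 1e11             # ~100 billion neurons
synapses_per_neuron = 1e4  # ~10,000 synapses each
firing_rate_hz = 10        # average spikes per second
ops_per_synapse_event = 1  # one multiply-accumulate per spike per synapse

flops = neurons * synapses_per_neuron * firing_rate_hz * ops_per_synapse_event
print(f"{flops:.0e} FLOPS = {flops / 1e15:.0f} petaflops")  # ~1e16 = 10 petaflops
```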
RPLong
Sep 10 2012 at 9:48am
I think Urstoff wins this discussion.
To his posts, I will merely add that so many aspects of human experience defy verbal description in any way that is meaningful to everyone that it seems silly and vain to believe that we will one day be able to model it.
Scott Scheule
Sep 12 2012 at 2:21pm
I think Chalmers’s gradual-replacement argument (the “fading qualia” thought experiment) is illuminating here, and of course I recommend his The Conscious Mind for those with an interest in these issues. Imagine that we replace one neuron of Bryan’s brain with a robot that simulates perfectly what the prior neuron was doing. Is it likely that this minor replacement would make Bryan cease having consciousness or threaten his notion of selfhood? Surely not–a neuron’s components are being replaced piecemeal all the time by ordinary cellular repair.
So let’s replace one more of those neurons. Run same argument. It’s still Bryan!
Repeat however many billion times and we’ve got Bryan with a robotic brain.
Now take whatever platform you want to emulate Bryan to. Take the same tiny steps to get there and the same argument applies: it seems unlikely that any one step will result in the winking out of Bryan’s consciousness.
And note that this is independent of whether or not consciousness is spooky or if dualism is true.
The result is, I suggest, that the medium on which consciousness is implemented is irrelevant, so long as it is sufficient to replicate the original pattern. And if that’s true, Robin’s claim that the mind is a computer (in the sense of being uploadable) is correct–so no metaphor is being made.
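A toy sketch of the replacement argument, with trivial stand-in “neurons”: swap the units of a small network one at a time for behaviorally identical replacements, checking at every step that the system’s input-output behavior never changes.

```python
# Sketch of the gradual-replacement argument: swap the units of a tiny
# network one at a time for behaviorally identical "robotic"
# replacements, verifying at every step that the whole system's
# input-output behavior never changes.

def biological_unit(x):
    return 1 if x >= 0.5 else 0

def robotic_unit(x):
    # A different "substrate" computing the same function
    return int(x >= 0.5)

def network(units, inputs):
    return [u(x) for u, x in zip(units, inputs)]

units = [biological_unit] * 8
inputs = [0.1, 0.9, 0.4, 0.6, 0.0, 1.0, 0.5, 0.3]
baseline = network(units, inputs)

for i in range(len(units)):                 # replace one unit per step
    units[i] = robotic_unit
    assert network(units, inputs) == baseline  # behavior preserved at every step
print("full replacement, identical behavior:", network(units, inputs))
```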