Saturday, October 24, 2009

H+ - Ghost in the Shell: Why Our Brains Will Never Live in the Matrix

I am not a fan of transhumanist ideas about uploading consciousness into cyber networks - to me they represent a monumental failure to grasp the embedded nature of consciousness. Consciousness does not exist in a vacuum; it arises as the sum of our physical, psychological, cultural, and social context and experience. To think you can upload consciousness into a computer is to miss that, removed from its context, it would cease to exist.

It's cool to see someone else, writing for a transhumanist magazine, also argue that it is not possible to upload consciousness into "the matrix."

Ghost in the Shell: Why Our Brains Will Never Live in the Matrix

Written By: Athena Andreadis
Date Published: October 19, 2009

When surveying the goals of transhumanists, I found it striking how heavily many of them favor conventional engineering. This seems inefficient and inelegant, since such engineering reproduces slowly, clumsily, and imperfectly what biological systems have fine-tuned over eons, from nanobots (enzymes and miRNAs) to virtual reality (lucid dreaming). Recently, I was reading an article about memory chips. (See Resources) In it, the primary researcher makes two statements that fall in the "not even wrong" category: "Brain cells are nothing but leaky bags of salt solution," and "I don't need a grand theory of the mind to fix what is essentially a signal-processing problem."

And it came to me in a flash that many transhumanists are uncomfortable with biology and would rather bypass it altogether for two reasons, each exemplified by these sentences. The first is that biological systems are squishy — they exude blood, sweat and tears, which are deemed proper only for women and weaklings. The second is that, unlike silicon systems, biological software is inseparable from hardware. And therein lies the major stumbling block to personal immortality.

[Photo caption: The quest to restore damaged cognitive functions with electronic parts begins with a small dish of living rat brains, located inside a lab at the University of Southern California. Photo credit: John B. Carnett]
The analogy du siècle equates the human brain with a computer -- a vast, complex one performing dizzying feats of parallel processing, but still a computer. However, that is incorrect for several crucial reasons that bear directly upon mind portability. A human is not born as a tabula rasa, but with a brain that’s already wired and functioning as a mind. Furthermore, the brain forms as the embryo develops. It cannot be inserted after the fact, like an engine in a car chassis or software programs in an empty computer box.

Theoretically speaking, how could we manage to live forever while remaining recognizably ourselves to us? One way is to ensure that the brain remains fully functional indefinitely. Another is to move the brain into a new and/or indestructible "container," whether carbon, silicon, metal or a combination thereof. Not surprisingly, these notions have received extensive play in science fiction, from the messianic angst of The Matrix to Richard Morgan's Takeshi Kovacs trilogy.

To give you the punch line up front, the first alternative may eventually become feasible, but the second one is intrinsically impossible. Recall that a particular mind is an emergent property (an artifact, if you prefer the term) of its specific brain -- nothing more, but also nothing less. Unless the transfer of a mind retains the brain, there will be no continuity of consciousness. Regardless of what the post-transfer identity may think, the original mind with its associated brain and body will still die -- and be aware of the death process. Furthermore, the newly minted person/ality will start diverging from the original the moment it gains consciousness. This is an excellent way to leave a detailed memorial or a clone-like descendant, but not to become immortal.

What I just mentioned essentially takes care of all versions of mind uploading, if by uploading we mean recreation of an individual brain by physical transfer rather than a simulation that passes Searle’s Chinese room test. However, even if we ever attain the infinite technical and financial resources required to scan a brain/mind 1) non-destructively and 2) at a resolution that will indeed recreate the original, several additional obstacles still loom.

Mary Shelley's Frankenstein - Photo courtesy of: horrorstew.com
The act of placing a brain into another biological body, à la Mary Shelley’s Frankenstein, could arise as the endpoint extension of appropriating blood, sperm, ova, wombs or other organs in a heavily stratified society. Besides being de facto murder of the original occupant, it would also require that the incoming brain be completely intact, and be able to rewire for all physical and mental functions. After electrochemical activity ceases in the brain, neuronal integrity deteriorates in a matter of seconds. The slightest delay in preserving the tissue seriously skews in vitro research results, which tells you how well this method would work in maintaining details of the original’s personality.

To recreate a brain/mind in silico, whether a cyborg body or a computer frame, is equally problematic. Large portions of the brain process and interpret signals from the body and the environment. Without a body, these functions will flail around and can result in the brain... well, losing its mind. Without corrective “pingbacks” from the environment that are filtered by the body, the brain can easily misjudge to the point of hallucination, as seen in phenomena like phantom limb pain or fibromyalgia. Additionally, processing at light speed will probably result in madness, as everything will appear to happen simultaneously or will change order arbitrarily.

Finally, without context we may lose the ability for empathy, as is shown in Bacigalupi’s disturbing story People of Sand and Slag. Empathy is as instrumental to high-order intelligence as it is to survival: without it, we are at best idiot savants, at worst psychotic killers. Of course, someone can argue that the entire universe can be recreated in VR. At that point, we’re in god territory… except that even if some of us manage to live the perfect Second Life, there’s still the danger of someone unplugging the computer or deleting the noomorphs. So there go the Star Trek transporters, there go the Battlestar Galactica Cylon resurrection tanks.

Let’s now discuss the possible: in situ replacement. Many people argue that replacing brain cells is not a threat to identity because we change cells rapidly and routinely during our lives -- and that, in fact, this is imperative if we're to remain capable of learning throughout our lifespan.

It's true that our somatic cells recycle, each type on a slightly different timetable, but there are two prominent exceptions. The germ cells are one, which is why both genders — not just women — are progressively likelier to have children with congenital problems as they age. Our neurons are another. We're born with as many of these as we're ever going to have and we lose them steadily during our life. There is a tiny bit of novel neurogenesis in the olfactory system, but the rest of our 100 billion microprocessors neither multiply nor divide. What changes are the neuronal processes (axons and dendrites) and their contacts with each other and with other cells (synapses).

These tiny processes make and unmake us as individuals.

Read the rest of this long and interesting article.


2 comments:

raw by default said...

While I don't think we'll ever be able to stick a consciousness in a computer, I'm not sold on the idea that "a particular mind is an emergent property (an artifact, if you prefer the term) of its specific brain –- nothing more, but also nothing less". Individual minds may be linked to (and influenced by) their corresponding brains, but I don't agree that the mind is nothing more than an artifact of the brain.

Where does the mind go when we die? Just because we don't know doesn't necessarily mean it ceases to exist.

william harryman said...

I agree: a mind is much more than an emergent property of the brain - it also includes the body, the culture, the society - a mind is part of a cultural-social context, as well as its unique body and brain

Peace,
Bill