Archive for the Ridley Scott Category

The Promethean Urge

Posted in Entertainment, Ethics, Film, Forbidden Fruit, God, Literature, Morality, Mythology, Philosophical and Religious Reflections, Pop Cultural Musings, Prometheus, Religion, Ridley Scott, Science, Science fiction, Technology on June 10, 2012 by Uroboros

When Prometheus steals fire from Zeus and gives it to man, his punishment is to be bound to a mountaintop for all eternity. Each day, an eagle eats his liver. Each night, the wound heals, and the next day the torture begins again. The idea of breaking through boundaries is the key to the enduring power of the Promethean myth. Humans can’t help being curious about what’s really ‘out there’ beyond the veil of appearance—can’t help being tempted by the fruit of knowledge that grows there and the power it bestows.

When we part the veil and peer into the other side, though, are we gazing at something we were meant to see, or at a realm that is beyond human capacities and thus dangerous to behold? Quite often, people think ‘God’ is on the other side—that ‘He’ has drawn the line, and it is out of pride that we want to trespass and set up camp in ‘His’ space. As sinful, broken creatures, we simply don’t know when to quit. A human is, by definition, the kind of being who won’t, or possibly can’t, accept limitations on its nature. Since we were made in God’s image, we are invariably tempted to become what we behold in that mirror.

A survey of human history reveals that, despite our reservations, we have been playing ‘God’ right from the beginning. Restless creatures that we are, humans have always been asking questions, testing possibilities, and putting answers into practice. If we didn’t continually test the bounds and explore ‘God’s territory,’ we’d still be hunting and gathering—we’d still be following and praying to animals. Because we’ve indulged our Promethean urge, however, most humans don’t worship animals anymore. We keep them as pets. We clone them. With the power of genetics, we’re remaking life itself in our image. Modern civilization has re-framed the boundaries of its looking glass and is both enamored and terrified by what it sees. Undeniably, science and technology have enriched our lives—enhancing our ability to alleviate suffering, to travel previously unthinkable distances, and to communicate with each other on a global scale—but we can also annihilate ourselves with the push of a button. We can create tools of mass salvation and destruction.

The sci-fi authors and filmmakers I admire most tend to imply that the sin isn’t necessarily in wanting to explore unknown territories. Hubris isn’t an epistemological issue. It’s a moral one. The sin is in running away from the implications of what you find. It’s in disowning what you create in the process. Humanity does have to accept at least one limitation: we can’t have the fruit of forbidden knowledge and eat it too.


More Human Than Human: Blade Runner and the Radical Ethics of A.I.

Posted in A.I., artificial intelligence, Blade Runner, Brain Science, Christianity, Consciousness, Descartes, Entertainment, Ethics, Film, Jesus, Morality, Neurology, Philip K. Dick, Philosophical and Religious Reflections, Philosophy of Mind, Pop Cultural Musings, Prometheus, Psychology, Religion, Ridley Scott, Science, Science fiction on April 27, 2012 by Uroboros

Blade Runner: What makes us human?

Self-consciousness is a secret, or at least its existence is predicated upon one. The privacy of subjective experience has mystified philosophers for centuries and dogged neuroscientists for decades. Science can, in principle, unravel every enigma in the universe, except perhaps for the one that’s happening in your head right now as you see and understand these words. Neurologists can give rich accounts of the visual processing happening in your occipital lobes and locate the cortical regions responsible for parsing the grammar and grasping the concepts. But they can’t objectively identify the ‘you’ part. There’s no neuron for ‘the self.’ No specific neural network which is essentially causing ‘you’—with all your unique memories, interpretive quirks, and behavioral habits—to read these words and have the particular experience you are having.

This problem is illustrated in debates about artificial intelligence. The goal is to create non-biological sentience with a subjective point-of-view, personal memories, and the ability to make choices. The Turing Test is a method for determining whether a machine is truly intelligent, as opposed to just blindly following a program and reacting algorithmically to stimuli. Basically, if a computer or a robot can convince enough people in a blind test that it is intelligent, then it is. That’s the test. The question is, what kind of behaviors and signs would a machine have to have in order to convince you that it’s self-aware?

Voight-Kampff Test

The 1982 film Blade Runner, based on Philip K. Dick’s novel Do Androids Dream of Electric Sheep?, has a version of this called the Voight-Kampff test. The androids in the story, Nexus-6 Replicants, are so close to humans in appearance and behavior that it takes an intense psychological questionnaire coupled with a scan of retinal and other involuntary responses to determine the difference. An anomalous emotional reaction is symptomatic of artificial, as opposed to natural, intelligence. Rachel, the Tyrell corporation’s most state-of-the-art Replicant, can’t even tell she’s artificial. “How can it not know what it is?” asks Deckard, the bounty hunter charged with ‘retiring’ rogue Replicants. Tyrell says memory implants have given her a sense of self, a personal narrative context through which she views the world. The line between real and artificial humans, therefore, is far from clear. Rachel asks Deckard if he’s ever ‘retired’ a human by mistake. He says he hasn’t, but the fact that Rachel had to ask is telling. Would you want to take this test?

If you think about it, what makes your own inner subjectivity provable to others—and their subjectivity provable to you—are the weird quirks, the idiosyncrasies which are unique to you and would be exceedingly difficult for a program to imitate convincingly. This is what philosophers call the problem of other minds. Self-consciousness is the kind of thing which, by its very nature, cannot be turned inside out and objectively verified. This is what Descartes meant by ‘I think, therefore I am.’ Your own mental experience is the only thing in the world you can be sure of. You could, in principle, be deluded about the appearance of the outer world. You think you’re looking at this computer screen, but how do you know you’re not dreaming or hallucinating or part of a Matrix-like simulation? According to Descartes’ premise, even the consciousness of others could be faked, but you cannot doubt the fact that you are thinking right now, because to doubt this proposition is to actually prove it. All we’re left with is our sense of self. We are thinking things.

Fembot Fatale

The Turing Test, however, rips the rug away from this certainty. If the only proof for intelligence is behavior which implies a mindful agent as its source, are you sure you could prove you’re a mindful, intelligent being to others? Can you really prove it to yourself? Who’s testing who? Who’s fooling who?

The uncanny proposition hinted at in Blade Runner is that you, the protagonist of your own inner narrative, may actually be artificial, too. Like Rachel and the not-so-human-after-all Deckard, you may be an android and not know it. Your neural circuitry may have evolved by pure accident. The physical substrate supporting your ‘sense of self’ may be the random by-product of natural selection, something that just blooms from the brain, like an oak grows out of an acorn—but ‘the you part’ has to be programmed in. The circuitry is hijacked by a cultural virus called language, and the hardware is transformed in order to house a being that may be from this planet, but now lives in its own world. Seen this way, the thick walls of the Cartesian self thin out and become permeable—perforated by motivations and powers not your own, but ‘Society’s.’ In this light, it’s not so hard to view yourself as a kind of robot programmed to behave in particular ways in order to serve purposes which are systematically hidden.

This perspective has interesting moral implications. The typical question prompted by A.I. debates is, if we can make a machine that feels and thinks, does it deserve to be treated with the same dignity as flesh-and-blood human beings? Can a Replicant have rights? I ask my students this question when we read Frankenstein, the first science fiction story. Two hundred years ago, Mary Shelley was already pondering the moral dilemma posed by A.I. Victor Frankenstein’s artificially-intelligent creation becomes a serial-killing monster precisely because his arrogant and myopic creator (the literary critic Harold Bloom famously called Victor a ‘moral idiot’) refuses to treat him with any dignity or respect. He sees his artificial son as a demon, a fiend, a wretch—never as a human being. That’s the tragedy of Shelley’s novel.

Robot, but doesn’t know it

In Blade Runner, the ‘real’ characters come off as cold and loveless, while the artificial ones turn out to be the most passionate and sympathetic. It’s an interesting inversion which suggests that what really makes us human isn’t something reducible to neural wiring or genetic coding—it isn’t something that can be measured or tested through retinal scans. Maybe the secret of ‘human nature’ is that it can produce the kind of self-awareness which empowers one to make moral decisions and treat other creatures, human and non-human, with dignity and respect. The radical uncertainty which surrounds selfhood, neurologically speaking, only heightens the ethical imperative: you don’t know the degree of consciousness in others, so why not assume other creatures are as sensitive as you are, and do unto others as you would have them do unto you?

In other words, how would Jesus treat a Replicant?
