Archive for the anxiety Category

Re-imagining Dragons: Gojira, Kami, and the Kaiju of Unintended Consequences

Posted in anxiety, Apocalypse, archetypes, armageddon, collective unconscious, emotion, Film, filmmaking, Horror, Monster, Monsters, Myth, Mythology, Pop culture, Religion, Science, Science fiction, social psychology, Speculative fiction, Technology, terror, war on May 16, 2014 by Uroboros
Godzilla (2014)

It must have been an eerie moment when, half an hour after the sky lit up over Bikini Atoll, the flakes began to fall. The crew of a Japanese fishing boat called the Lucky Dragon had no idea the ashes swirling down around them were from Castle Bravo. The 15-megaton hydrogen bomb detonated on March 1, 1954 was the most powerful weapon ever tested by the US military. The blast exceeded its expected radius, and the dust the crew brushed off their heads and shoulders that day was contaminated. Upon returning to Japan, the whole crew was sick, and, seven months later, Aikichi Kuboyama, the boat’s chief radioman, died from the radiation. Less than a decade after the end of WWII, the Lucky Dragon incident reignited the post-Hiroshima traumatic stress still festering in the Japanese psyche, sparking an idea in the mind of filmmaker Ishiro Honda: an iconic character he described as “the A-bomb made flesh”: Gojira (a mash-up of the Japanese words for gorilla and whale), which in English, of course, became ‘Godzilla.’

Operation Bravo: H-Bomb Test

In its groundbreaking 1954 cinematic debut, this scaly leviathan, roused and swollen by nuclear contamination, emerges from the sea and smashes its way through Tokyo, whipping its gargantuan tail and spitting radioactive fire. The original black and white movie alternates between scenes of fatalistic dread and apocalyptic devastation that are so downbeat and dire one wonders why it was such a hit, spawning not only an ever-growing brood of sequels but a whole new film genre. What does it say that, so soon after WWII, Japanese moviegoers paid to witness the simulated destruction of Tokyo again and again? One wonders if it was just a mindless diversion—an escapist fantasy where, in a state of titillation and sublime awe, they could feast their eyes on images of mass destruction—or if something deeper, more cathartic was happening. Were audiences subconsciously cleansing the psychic stain of national traumas and tragedies?

Gojira (1954)

During the postwar occupation, the United States prohibited Japanese filmmakers from depicting anything overtly militaristic, so the Kaiju (“strange creature”) genre became an indirect way for Japanese culture to cope with its collective A-bomb PTSD and critique the accelerating Cold War arms race. Behemoths like Gojira, Mothra, and King Ghidorah were perfect symbols not only for contemporary anxieties but also for a repressed, pre-industrial past that must have still haunted Japan. Kaiju can be seen as cinematic incarnations of kami, powerful spirits who often represent natural forces. According to Shinto beliefs, countless kami permeate reality, emerging from a hidden, parallel dimension to intervene in human affairs when our polluting ways upset the natural order and flow of energy. The central concern in Shintoism is purity. Ritual cleanliness pleases the kami and thus increases the chances of a successful, fruitful life, hence Japanese culture’s preoccupation with it. Pollution, impurity, and contamination, however, incur the wrath of the kami.

Ryujin

From this perspective, Gojira is reminiscent of powerful sea kami like Ryūjin (or Ryōjin a.k.a. Ōwatatsumi), a wingless dragon with massive claws who symbolizes the power of the Pacific Ocean. Fishermen performed cleansing rituals to Ryūjin in hopes of bolstering their catch. In East Asian mythologies, dragons tend to represent the vitality and potency of nature, so here we see a clear psychosocial link between radioactive pollution and a scaly, fire-breathing beast bent on utter destruction. Gojira symbolizes natural forces that, once contaminated by modern humanity’s technological hubris and careless disregard for the environment, return in misanthropic forms to lay waste to the source of the pollution. The King of All Monsters represents modernization run amok—the law of unintended consequences writ large.

It will be interesting to see how Japanese audiences respond to post-Fukushima versions of Godzilla. How do you think Gareth Edwards’ reboot will do in Japan three years removed from another nuclear disaster? Will moviegoers turn out in droves to see the latest incarnation of this wrathful kami? Will Hollywood be able to help cleanse the psychic stain of this national tragedy?

Japanese Godzilla (2014) poster

For more discussion see:

http://web.archive.org/web/20050203181104/http://www.pennyblood.com/godzilla2.html

http://www.wnyc.org/story/the-making-of-godzilla-japans-favorite-mon-star/

http://www.npr.org/2014/05/02/308955584/the-making-of-godzilla-japans-favorite-mon-star

The Marvel Dilemma: Genetic Enhancement and the Ethics of Supersizing

Posted in anxiety, archetypes, Avengers, comic books, Dystopia, emotion, graphic literature, Morality, Philosophical and Religious Reflections, Philosophy, Pop Cultural Musings, Pop culture, reason, Science, Science fiction, Technology, Uncategorized on May 5, 2014 by Uroboros
Cap: Supersized

Some recent superhero movies have looked at the subject of genetic research and the implications of transhumanism. Thanks to the science behind Operation Rebirth’s serum, Steve Rogers is a super-soldier with physical strength and skills far beyond ordinary human capacities. Peter Parker’s superhuman powers are the result of a genetically-engineered spider’s bite, and many of his nemeses—Lizard, Carrion, Jackal, and Kaine, for example—are products of bad genetic science. Hell, at OSCORP, it’s standard operating procedure. And, as mutants, the X-Men are transhuman outcasts whose powers put them in a precarious position in terms of how they view and relate to ‘normal’ humans.

These stories can be seen as Frankenstein-like morality tales meant to warn us about the dangers lurking ahead if we lunge blindly into the brave new world of liberal eugenics. Setting aside the use of genetic technologies for repairing injuries and treating diseases, which is of course less controversial, these stories raise an interesting ethical issue, the Marvel Dilemma: is it morally permissible to improve an otherwise healthy human body so one could run as fast as Cap or react as quickly as Peter Parker?

Among the moral philosophers weighing in on the ethics of biotech, Peter Singer represents one side of the Marvel Dilemma. He believes that, while we should be concerned with possible negative side-effects of enhancement, we must accept its inevitability and find ways of minimizing the downside while maximizing the ways improved bodies and minds can benefit society overall. Michael Sandel, on the other hand, questions not only the inevitability of a genetically-enhanced human race, but, more importantly, the motives behind the desire to improve, a drive he finds morally suspect. Sandel’s argument praises the X-Men factor, the virtue of valuing life’s unexpected gifts.

Welcome to OSCORP

In “Shopping at the Genetic Supermarket,” Singer considers whether a genetically-enhanced life could be happier, more pleasurable as well as the kinds of policies governments could adopt in order to ensure the positive effects outweigh the negative ones. As a utilitarian philosopher, he dismisses arguments based on prohibitions against ‘playing God’ or duties to moral law, focusing instead on measuring and evaluating likely consequences. “I do not think we have grounds for concluding,” he says, “that a genetic supermarket would harm either those who choose to shop there, or those who are created from the materials they purchase.”

While many are repulsed by, and even terrified of, the idea of designer babies, we must not forget that parents are constantly trying to design their children through what they feed them, what they teach them, what and who they allow their kids to play with, and so on. It is a parent’s job to design his or her kid. The difference is pushing the techniques deeper into the prenatal phase, all the way down to the genetic level, which we are becoming better and better at manipulating. Who wouldn’t want a child who is more likely to become a fit, smart, and emotionally-stable person? If you think it is wrong to tinker with ‘Mother Nature’ and decide to leave things to chance, wouldn’t you be doing your kid a disservice? After all, they will one day have to compete in the classroom, on the playing field, and in the boardroom with people whose parents chose to enhance. In deciding not to, you would be putting your child at a considerable disadvantage. Couldn’t that be seen as, to some degree, a form of abuse?

Singer doesn’t see anything intrinsically wrong with buying and selling gametes. A society of genetically-enhanced children could be a happier, healthier one, if properly regulated in terms of safety and equal access. The big fear, of course, is of the 1% who can afford the enhancements becoming a super-race who will lord it over the 99%, thus ensuring a dystopic nightmare for the rest of us. Singer’s solution is this:

“Assuming that the objective is to avoid a society divided in two along genetic lines, genetic enhancement services could be subsidized…the state should run a lottery in which the prize is the same package of genetic services that the rich commonly buy for themselves. Tickets in the lottery would not be sold; instead every adult citizen would be given one. The number of prizes would relate to how many of these packages society could afford to pay for, and thus would vary with the costs of the genetic services, as well as with the resources available to provide them. To avoid placing a financial burden on the state…the state should be directly involved in promoting genetic enhancement. The justification for this conclusion is simply that it is preferable to the most probable alternative – leaving genetic enhancement to the marketplace.”

Cap gets enhanced

So while Singer believes in a kind of genetic affirmative action, Michael Sandel takes a step back from the issue and asks a more fundamental question: why enhance at all? In “The Case Against Perfection,” Sandel explores what is at the heart of our ambivalence towards these technologies. “The question is,” he says, “whether we are right to be troubled, and if so, on what grounds.” He concludes that:

“[T]he main problem with enhancement and genetic engineering is…that they represent a kind of hyperagency—a Promethean aspiration to remake nature, including human nature, to serve our purposes and satisfy our desires…what the drive to mastery misses and may even destroy is an appreciation of the gifted character of human powers and achievements.”

As a virtue ethicist, Sandel judges the permissibility of an act, first and foremost, in terms of the desire motivating it, and what Sandel sees here is hubris and anxiety—terror masked as transhumanist optimism—a “one-sided triumph of willfulness over giftedness, of dominion over reverence, of molding over beholding.” What we fear, what we want to master, is the unknown, the unbidden, the contingent. We once called this aspect of life ‘Fate’ or ‘God’s plan’: the mysterious unfolding of events whose causes are so complex that we can’t anticipate them and so fear having to endure them. So why endure them at all? Why be open to randomness? Why not master and eliminate the unbidden? Why not deny nature’s strange ‘gifts’ and order what we want ahead of time, so there are no surprises, no unfathomable errors?

Sandel says it is because the motivation is a sign of weakness, not strength. The desire to completely remake the world and ourselves in an image of our choosing actually closes life off, enframing the human experience in a hall of mirrors. It shows a lack of courage. “[O]penness,” he says, “is a disposition worth affirming, not only within families but in the wider world as well. It invites us to abide the unexpected, to live with dissonance, to rein in the impulse to control.” Furthermore, Sandel argues, this disposition will promote humility, solidarity, and responsibility—invaluable virtues in protecting the integrity of our moral landscape.

Sandel’s approach sheds light on the psychology behind the escalation dilemma. Enhancement, the added value of a genetic alteration, needs a baseline in order to measure the degree of improvement. We won’t be able to make rationally-based value judgments unless we have a standard against which to measure them. Suppose, for example, potential parents decide they want to have a girl who will grow up to be ‘tall’ because they read an article claiming that, in a workplace environment, taller women are perceived to be more powerful and competent, and therefore tend to be more successful. Let’s say five feet, eight inches is the current standard for a ‘tall woman,’ so they get the doctor to alter the gametes to code for five feet, nine inches.

Now, how many other parents have read this article, too? How many other parents want to give their little Jenny the best chance for success? How many girls will be born with the five-feet-nine code? Pretty soon, five feet nine won’t be ‘tall’ anymore. It will be ‘the new normal.’ We’ve shifted the baseline, the drive to enhance has to up the ante, and, within a few years, the new mark is six feet, and so on. If the motivation is improvement for the sake of improvement, or the fear that, since other parents are enhancing, you are putting your child at a disadvantage, then the benchmark that defines enhancement will keep ratcheting up exponentially until the positive feedback loop unhinges and spins out of control.
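The ratchet described in the last two paragraphs can be sketched as a toy simulation. This is purely illustrative: the function name, the one-inch increment, and the linear update rule are assumptions for the sketch (a fuller model might let the overshoot grow each round), but the core logic, that each round’s overshoot becomes the next round’s baseline, is the point.

```python
def escalate(baseline, generations, increment=1.0):
    """Toy model of the enhancement ratchet: each generation of parents
    targets just above the current norm for 'tall', and widespread
    adoption then makes that target the new norm."""
    norms = [baseline]
    for _ in range(generations):
        # The overshoot everyone chooses becomes the new baseline.
        norms.append(norms[-1] + increment)
    return norms

# Starting at 5'8" (68 inches), ten one-inch rounds push 'tall' to 78 inches:
print(escalate(68.0, 10)[-1])
```

The takeaway is that ‘tall’ is a moving target: any fixed definition of enhancement dissolves as soon as the enhancement is widely adopted.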

This isn’t about the fear of meddling in ‘Mother Nature’s business.’ You don’t have to posit an essential ‘human nature’ or appeal to God’s laws in order to make sense of an argument against this kind of enhancement. ‘Human nature’ is and has always been a dynamic product of technological improvement from the mastery of fire right up to Lasik surgery and Google Glasses. Human nature isn’t a thing, a substance with a fixed set of properties to be meddled with. It is a dynamic, evolutionary process of integrating our genetically-based bodies with whatever ecological contingencies history brings to the equation. Culture is the part of our nature we invent in order to better ensure our survival. So we change ‘human nature’ each time we adapt to a new set of factors. The question is, what is pushing us to change the rhythm of the process in such a deep and radical way?

Some say it is already happening and is going to continue to happen. Pandora’s Jar is already open, and you can’t stop the genetic arms race now. Singer says you might as well learn how to manage the process so we maximize happiness and do the greatest good for the greatest number. But where is the autonomy, the free will, in that forecast? Are genetically-enhanced superhumans as inevitable as entropy and the heat death of the universe? Or can we make choices that impact the future? If so, individuals will collectively have to decide whether to enhance or not. We will have to take a position and express an attitude that will influence the way these technologies are viewed and used. The virtue ethics approach in Sandel’s argument says we shouldn’t encourage it. If what motivates the desire for mastery is mere vanity and pure anxiety, we should condemn or strongly discourage the use of genomic technology for personal ‘improvement’ and look down on those who do. The question is, are we willing to confront the lack of courage that often drives our perfectionist fantasies and that, thanks to the laws of technological acceleration and unintended consequences, could become the source of our damnation instead of our salvation?

X-Men: Mutant and proud!

So the answer to the Marvel Dilemma isn’t to escalate enhancement, as in the world of OSCORP, but to embrace the X-Men ethic: be more accepting of the unbidden and biologically given, and learn to tolerate and have faith in each other. Granted that there is a clear distinction between treatment and enhancement (and there are limit cases where this isn’t cut and dried), we should strive to use genetic technology to prevent disease and suffering, but not to enhance an otherwise healthy human body, especially when the motivation behind the changes isn’t a virtuous one. Perhaps we could prevent the self-fulfilling nightmare of a genetic arms race if we owned up to the negative emotions inspiring it in the first place. That race would not lead to human ‘enhancement,’ after all, but to a tragic dehumanization cosmetically masked as ‘progress.’ Why not channel the time and money into genetic solutions to over-population, food and energy shortages, and global warming instead? We often think of using this tech to supersize ourselves, but, as Singer points out, we could just as well use it to downsize ourselves, lowering the amount of food and energy we need to consume. Wouldn’t that be better for the planet and the future of humankind?

The Philosophy of Decomposition: Poe and the Perversity of the Gothic Mind

Posted in Ancient Greek, anxiety, Aristotle, barriers to critical thinking, Christianity, Consciousness, ecology, emotion, Enlightenment, Ethics, fiction, French Revolution, Freud, God, Goth, Gothic, Horror, horror fiction, irrational, Jesus, Literature, Morality, Philosophy, psychoanalysis, Psychology, rational animal, Religion, religious, Repression, resistance to critical thinking, Romanticism, Science, Speculative fiction, terror, tragedy, Uroboros, Writing on October 27, 2013 by Uroboros

Whether you think Edgar Allan Poe’s stories are expertly-crafted explorations of the dark side of human nature or morbid, overwrought melodramas, there is no doubt his work has had a tremendous impact on Western culture. Probably his most important contribution, apart from establishing the contemporary short story format and inventing the detective genre, is revitalizing the Gothic genre and pushing horror fiction in a more philosophically interesting direction. His stories are so enduring and influential because of the conceptual depth he added to generic tropes, redefining literature in the process. He accomplished this feat by perverting the Gothic.

Edgar Allan Poe (1809-49), Master of Gothic literature

By the time Poe arrived on the scene, Gothic fiction had already fossilized and become fodder for self-parody. What started with the fantastic absurdities of Horace Walpole’s The Castle of Otranto (1764) and culminated in the speculative complexity of Anne Radcliffe’s Mysteries of Udolpho (1794) eventually led to Northanger Abbey (1817), Jane Austen’s metafictional send-up of what had become pretty stale conventions by then: crumbling castles, tormented heroines, supernatural entities, and family curses. Although the external trappings of Gothic plots may have fallen into ruin, its themes remained relevant. According to Joyce Carol Oates, a master of the genre in her own right, Gothic fiction explores the fragmentation of the alienated mind by inscrutable historical and biological forces that can overwhelm one’s ability to rationally understand the world and make intelligent choices. This makes it a critical antidote to naïve utopian visions of the future inspired by the Enlightenment, and of particular interest to American culture, whose intellectual basis is rooted in the rational pursuit of happiness. ‘Gothic’ suggests the fear of something primal and regressive that threatens to undermine mental and social stability. In order to be culturally relevant again, though, Gothic literature needed a writer who could reanimate its tropes. It needed a morbid, hypersensitive, and arrogant genius named Edgar Allan Poe.

Poe’s key twist is turning the tropes inward and starting with the macabre landscape within—“the terror of the soul,” he calls it. By the 1830s, Poe is focused on composing short fiction, crafting tightly-constructed tales, rendered in dense, pompous prose, spewing from the cracked psyches of unreliable narrators. This is the dark heart of many of his best stories: “Ligeia” (1838), “William Wilson” (1839), “The Black Cat” (1843), “The Tell-Tale Heart” (1843), and “The Cask of Amontillado” (1846), just to name a few (of course, his most accomplished story, “The Fall of the House of Usher” (1839), flips this dynamic: an unnamed and relatively reasonable narrator details the psychic disintegration of Roderick Usher). Poe’s disturbed, epistemologically-challenged protagonists aren’t the true innovation. Marlowe and Shakespeare pioneered that literary territory centuries before. The element that Poe adds—the novelty that both revitalizes and Americanizes the Gothic—is what Poe himself calls “the spirit of perverseness.”

The narrator in “The Black Cat” puts forth this concept to explain his violent deeds. He says perversity is “one of the primitive impulses of the human heart—one of the indivisible primary faculties…which give direction to the character of Man.” What is its function? It is the “unfathomable longing of the soul to vex itself,” the narrator says, “a perpetual inclination, in the teeth of our best judgment” to commit a “vile or a silly action” precisely because we believe it to be ‘vile’ or ‘silly.’ In “The Imp of the Perverse” (1845), the narrator claims that perversity is “a radical, primitive, irreducible sentiment,” so deep and pervasive that it is ultimately immune to the prescriptions of the analytical mind. In other words, Poe identified the disruptive and neurotic effects of ‘the Unconscious’ half a century before Freud burst onto the scene.

While these narrators claim that philosophers have ignored man’s irrational inclinations, we shouldn’t assume Poe, himself a well-read scholar, wasn’t influenced by obvious precursors to ‘the spirit of perverseness,’ namely Aristotle and St. Augustine. In the Nicomachean Ethics, Aristotle posits his theory of akrasia, the vice of incontinence, i.e. the inability to control oneself and do the virtuous thing even when one knows it is the right choice. This is his corrective to the Socratic-Platonic dictum that to know the good is to do the good: no one willingly does evil. To Aristotle, this is a distorted view of the human condition. We can know theoretically what the virtuous choice is—the wisdom Aristotle calls sophia—but that doesn’t automatically compel us to have phronesis, or practical wisdom, which is the ability to do the good. In other words, there is a gap between knowledge and action, a notion that surfaces again in Aristotle’s Poetics. In his analysis of drama, Aristotle identifies hamartia as a key characteristic of the tragic hero, referring to the flaws in judgment that lead to a character’s ultimate downfall. An archery metaphor that means “to miss the mark,” hamartia becomes the main word New Testament writers use to translate the Jewish concept of sin into Greek (they weren’t the first to do this: writers of the Septuagint, the 2C BCE Greek translation of Hebrew scripture, had already made this move). By the fifth century CE, St. Augustine, the most influential Christian theologian of late antiquity, formulates his doctrine of original sin, describing humanity’s lack of self-control as innate, embodied depravity. For Augustine, when Adam and Eve disobeyed God, they condemned their progeny to bondage, chaining the human spirit to this corrupt, uncontrollable, and ultimately decaying flesh. Only Christ’s sacrifice and God’s loving grace, Augustine assures us, can liberate the spirit from this prison.

This is part of the philosophical lineage behind perverseness, despite his narrators’ claims to the contrary. There is, however, some truth to the critique if seen from a mid-19C perspective. From Descartes right through to Locke, ‘Reason’ is heralded as humanity’s salvation (of course, Hume and Rousseau poke skeptical holes in 18C Europeans’ over-inflated, self-aggrandizing mythology; Kant manages to salvage some of the optimism, but has to sacrifice key epistemic conceits in the process). But enlightened humanistic confidence looks like hubris to Romantic writers and artists, especially in the wake of the French Revolution and the international traumas it spawned. This is the mindset Poe resonates with: one that is highly skeptical of the ‘Man-is-the-rational-animal’ mythos. Anyone familiar with his biography can see why he gravitates toward a dark worldview. As a critic, he loves savaging fellow writers whose dispositions strike him as too sunny, and as a storyteller, his characters often confront—sometimes ironically, sometimes tragically—the limits of reason, a capacity Poe calls (I think with tongue-in-cheek ambivalence) ‘ratiocination.’

Dark reflections of a perverse mind

The ‘spirit of perverseness’ implies that neither divine ‘Grace’ nor humanistic ‘Reason’ can save us from a life of terror and suffering, especially when we ignore and repress our essential sinfulness. Whether you view history through a biblical or Darwinian lens, one thing is clear: humans aren’t naturally inclined to seek rational knowledge any more than we are given to loving and respecting each other universally. Modern cognitive science and psychology have shown us that the mind evolved to assist in feeding, procreation, and, of course, protecting the body from danger—not to seek objective truths. It evolved to help us band together in small tribal circles, fearing and even hating those who exist outside that circle. Over time we’ve been able to grasp how much better life would be if only we could rationally control ourselves and universally respect each other—and yet, “in the teeth of our best judgment,” we still can’t stop ourselves from committing vile and silly actions. Self-sabotage, Poe seems to argue, is our default setting.

Poe shifts Gothic terror from foggy graveyards and dark abbeys to broken brains and twisted minds. The true threats aren’t really lurking ‘out there.’ They’re stirring and bubbling from within, perturbing and overwhelming the soul, often with horrifying results. A Gothic mind lives in a Gothicized world—personifying its surroundings in terms of its own anxious and alienated disposition. ‘Evil’ only appears to be ‘out there.’ As literary and ecological theorist Timothy Morton points out, evil isn’t in the eye of the beholder. Evil is the eye of the beholder who frets over the corruption of the world without considering the perverseness generated by his own perceptual apparatus. It’s an Uroboric feedback loop that, left to its own devices, will spin out of control and crumble to pieces. The most disturbing implication of Poe-etic perversity is the sense of helplessness it evokes. Even when his characters are perceptive enough to diagnose their own disorders, they are incapable of stopping the Gothic effect. This is how I interpret the narrator’s ruminations in “The Fall of the House of Usher”:

 What was it…that so unnerved me in the contemplation of the House of Usher? It was a mystery all insoluble; nor could I grapple with the shadowy fancies that crowded upon me as I pondered. I was forced to fall back upon the unsatisfactory conclusion, that while, beyond doubt, there are combinations of very simple natural objects which have the power of thus affecting us, still the analysis of this power lies among considerations beyond our depth. It was possible, I reflected, that a mere different arrangement of the particulars of the scene, of the details of the picture, would be sufficient to modify, or perhaps to annihilate its capacity for sorrowful impression…There can be no doubt that the consciousness of the rapid increase of my superstition…served mainly to accelerate the increase itself. Such, I have long known, is the paradoxical law of all sentiments having terror as a basis. And it might have been for this reason only, that, when I again uplifted my eyes to the house itself, from its image in the pool, there grew in my mind a strange fancy…so ridiculous, indeed, that I but mention it to show the vivid force of the sensations which oppressed me. I had so worked upon my imagination as really to believe that about the whole mansion and domain there hung an atmosphere peculiar to themselves and their immediate vicinity—an atmosphere which had no affinity with the air of heaven, but which had reeked up from the decayed trees, and the gray wall, and the silent tarn—a pestilent and mystic vapour, dull, sluggish, faintly discernible, and leaden-hued…

Fall of the House of Usher (1839)

What is language? What can we do with it, and what does it do to us?

Posted in 1984, 99%, anxiety, barriers to critical thinking, Big Brother, Brain Science, Consciousness, critical thinking, Dystopia, Dystopian, emotion, freedom, George Orwell, humanities, irrational, Jason Reynolds, limbic system, Moraine Valley Community College, Neurology, Newspeak, Nineteen Eighty-four, Orwell, paranoia, Philosophical and Religious Reflections, Philosophy, Philosophy of Mind, politics, Politics and Media, rational animal, Rationalization, rationalizing animal, reason, resistance to critical thinking, theory, theory of mind, thoughtcrime, Two Minutes Hate, Uncategorized, Uroboros, Zombies on September 20, 2013 by Uroboros

In Orwell’s 1984, INGSOC’s totalitarian control of Oceania ultimately depends on Newspeak, the language the Party is working hard to develop and implement. Once in common use, Newspeak will eliminate the possibility of thoughtcrime, i.e. any idea that contradicts or questions absolute love for and devotion to Big Brother. Newspeak systematically scrubs away all those messy, gray areas from the English language, replacing them with a formal, logically-rigid system. For example, instead of having to decide whether to use ‘awesome,’ ‘fabulous,’ or ‘mind-blowingly stupendous’ to describe a situation, you would algorithmically deploy the Newspeak formula, which reduces the plethora of synonyms you could use to ‘good,’ ‘plusgood,’ or ‘doubleplusgood.’ Furthermore, all antonyms are reduced to ‘ungood,’ ‘plusungood,’ or ‘doubleplusungood.’
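Orwell’s grading scheme is regular enough to write down as a function. A minimal sketch: the function name and the integer grades are assumptions made for illustration, but the prefix and negation rules are exactly the ones the paragraph above describes.

```python
def newspeak(degree, negative=False):
    """Compose an evaluative Newspeak word: one root ('good'),
    graded by 'plus-'/'doubleplus-' and negated by 'un-'."""
    prefixes = {1: "", 2: "plus", 3: "doubleplus"}
    root = "ungood" if negative else "good"
    return prefixes[degree] + root

print(newspeak(3))                 # 'doubleplusgood' replaces 'mind-blowingly stupendous'
print(newspeak(2, negative=True))  # 'plusungood'
```

That the whole evaluative vocabulary collapses into a three-by-two table is precisely the point: there is nothing left for a speaker to choose, and so nothing left to think.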

Syme, a Party linguist, tells Winston, the novel’s rebellious protagonist, that the ultimate goal is to eliminate conscious thought from the speaking process altogether. The Newspeak term for it is ‘duckspeak’—a more mechanical form of communication that doesn’t require higher-level cognitive functions, like having to pick the word that best expresses your feelings or creating a new one. That sense of freedom and creativity will simply cease to exist once Newspeak has finally displaced ‘Oldspeak.’ “The Revolution will be complete,” Syme tells Winston, “when the language is perfect.” The Proles and the Outer Party (95% of Oceania’s population) will become a mass of mindless duckspeakers, the linguistic equivalent of ‘philosophical zombies.’

Newspeak implies that cognition depends on language—that symbolic communication isn’t merely a neutral means for sending and receiving thoughts. Instead, the words and sentences we use actually influence the way we think about and perceive the world. While Orwell was obviously inspired by the propaganda techniques used by the dictators of his day, perhaps he was also familiar with Nietzsche’s “On Truth and Lying in a Non-Moral Sense” or the work of anthropologists like Boas and Sapir, all of whom embraced some form of what is now called linguistic relativism, a theory which argues for the reality of what Orwell proposed in fiction: we experience the world according to how our language lets us experience it.

Linguist Lera Boroditsky

Linguistic relativism is on the rise in the contemporary study of language. The work of Lera Boroditsky and Daniel Everett, for example, provides strong empirical data supporting (at least the weak version of) linguistic relativism, challenging the Chomskyan paradigm, which posits a universalist account of how language is acquired, how it functions, and, by extension, how it relates to cognition and perception.

In my previous essay on the Uroboric model of mind, I asked about the connection between neuronal processes and symbolic systems: how can an abstract representation impact or determine the outcome of tangible physical processes? How can ionic thresholds in axons and the transmission of hormones across synaptic gaps depend upon the meaning of a symbol? Furthermore, how can we account for this in a naturalistic way that neither ignores the phenomena by defining them out of existence nor distorts the situation by positing physics-defying stuff? In short, how do we give an emergent account of the process?

First, we ask: what is language? Most linguists will say it means symbolic communication: in other words, information exchanges that utilize symbols. But what is a symbol? As you may recall from your grade school days, symbols are things that stand for, refer to, or evoke other things—the red hexagonal sign on a street corner, for example, prompts your foot to press against the brake, while the letters s, t, o, and p each refer to particular sounds, which, when pronounced together, mean ‘put your foot on the brake.’ Simple enough, right? But the facility with which we use language, and with which we reflexively perceive that usage, belies both the complexity of the process and the powerful effects it has on our thinking.

Cognitive linguists and brain scientists have shown that much of our verbal processing happens unconsciously. Generally speaking, when we use language, words just seem to ‘come to mind’ or ‘show up’ in consciousness. We neither need to consciously think about the meaning of each and every word we use, nor do we have to analyze every variation of tone and inflection to understand things like sarcasm and irony. These complex appraisals and determinations are made subconsciously because certain sub-cortical and cortical systems have already processed the nonverbal signals, the formal symbols, and decoded their meaning. That’s what learning a language equips a brain to do, and we can even identify parts that make major contributions. Broca’s area, for example, is a region in the left frontal lobe that is integral to both language production and comprehension. If a stroke damages Broca’s area, the sufferer may lose the ability not only to produce speech, but to comprehend it as well.

Left-brain language regions

Dr. Jill Bolte Taylor

One of the most publicized cases of sudden ‘language-less-ness’ is that of Dr. Jill Bolte Taylor, the Harvard brain scientist who, in 1996, suffered a stroke in her left hemisphere that impacted both Broca’s and Wernicke’s areas of her brain. She couldn’t remember who she was. She couldn’t use language. Taylor compares it to dying and being reborn, to being an infant in a grown woman’s body. Her insights into a language-less reality shed light on how words and sentences impact cognition. She says she lost her inner voice, that chatter that goes on ‘in’ the head. She no longer organized her experiences in a categorical, analytic way. Reality no longer showed up to her with the same fine-grained detail: it wasn’t divided and subdivided, classified and prejudged in terms of past associations or future expectations, in terms of self and other, us vs. them, and so on. She no longer had an ‘I’ at the center of her experience. Once the left-brain’s anxious, anal-retentive chatter went offline, right-brain processes took over, and, Taylor claims, the world showed up as waves of energy in an interconnected web of reality. She says that, for her at least, it was actually quite pleasant. The world was present in a way that language had simply dialed down and filtered out. [Any of you who are familiar with monotheistic mysticism and/or mindfulness meditation are probably seeing connections to various religious rituals and the oceanic experiences she describes.]

This has profound implications for the study of consciousness. It illustrates how brain anatomy and neural function—purely physical mechanisms—are necessary to consciousness. Necessary, but not sufficient. While we need brain scientists to continue digging deep, locating and mapping the neuronal correlates of consciousness, we also need to factor in the other necessary part of the ‘mystery of consciousness.’ What linguistic relativism and the Bolte Taylor case suggest is that languages themselves, specific symbolic systems, also determine what consciousness is and how it works. We need to identify not only the neuronal correlates of consciousness but the socio-cultural correlates as well. This means embracing an emergent model that can countenance complex systems and self-referential feedback dynamics.

Orwell understood this. He understood that rhetorical manipulation is a highly effective form of mind control and, therefore, reality construction. Orwell also knew that, if authoritarian regimes could use language to oppress people [20th-century dictators actually used these tactics], then freedom and creativity also depend on language. That is, they depend on our using it self-consciously and critically, on the language itself having freedom and creativity built into it, and on its users remaining vigilant in preserving that quality and refusing to become duckspeakers.

The Science of Myth and the Myth of Science

Posted in anxiety, archetypes, barriers to critical thinking, Brain Science, collective unconscious, Consciousness, Creationism, critical thinking, emotion, God, History, humanities, irrational, Jung, Knowledge, limbic system, Maori, Myth, Mythology, Neurology, paranoia, Philosophical and Religious Reflections, psychoanalysis, Psychology, rational animal, Rationalization, rationalizing animal, reason, Religion, religious, Repression, resistance to critical thinking, Science, social psychology, terror, Terror Management Theory, theory, theory of mind, Uroboros, V.S. Ramachandran, William James with tags on February 3, 2012 by Uroboros

Years ago in a mythology course I taught, a student once came up to me after class with an annoyed look. We’d just covered the Maori creation myth, and something about it had gotten under his skin. According to the myth, father sky, Rangi, and mother earth, Papa, formed out of primordial chaos and tangled in a tight, erotic embrace. Their offspring decided to pry Rangi and Papa apart in order to escape and live on their own. With his ax, Tane, the forest god, finally separated Father Sky and Mother Earth, and in that space, life grew and flourished.

The broad strokes of this creation myth aren’t unique. Ancient Egyptian, Chinese, Greek, and Norse stories (just to name a few) relate life’s origins to the separation of giant primordial parents.

“How could people believe that?” the student asked, shaking his head. It wasn’t his perturbed incredulity that struck me. Often, students initially find stories from ancient cultures to be, well, weird. It was his condescension. For him, ‘myth’ meant not just ‘false,’ but ‘silly.’ In his defense, it’s what it means for most of us. When we want to sneer at strange, fantastical beliefs, we call them ‘myths.’

The term is synonymous with ‘false.’

‘Myth’ originally meant the exact opposite, though. The Ancient Greek root of mythos referred to life’s deepest truths, something discussed and contemplated with a sense of awe and reverence, not incredulity and disdain. Seen in this light, myths are the stories humans tell in order to explain the unknown and make sense of the world. My thesis is that humans are essentially myth-making creatures and will continue to be so—no matter how scientific our stories get.

Scowls form on some students’ faces when they hear a professor say that science is, on a certain level, still mythological. Scientists are still storytellers, though, trying to turn the unknown into the known. Ancient and modern storytellers have different ways of approaching the unknown—different notions about what counts as a valid explanation.

Today, people (tend to) prefer creation stories that fit the scientific paradigm that’s proved so successful in explaining and predicting natural phenomena. But in dismissing past explanations, we overlook some interesting similarities. Ancient and modern stories share what psychologist Carl Jung called archetypal patterns. Jung theorized that humans share underlying patterns of thought because we all inherit the same neurological equipment. The anatomical differences between an ancient human brain and, say, Darwin’s brain are negligible. Setting the obvious differences between the Maori story and Darwin’s theory aside for just a moment, there are archetypal similarities between these accounts.

Darwinism says life began in a kind of primordial soup where, over time, inorganic molecules organized into the first living cell, and then single-celled organisms eventually separated into multicellular organisms, and from that, thanks to genetic mutation and the pressure of natural selection, lifeforms diversified and flourished. The Big Bang has this underlying pattern too: a ‘primordial atom,’ containing all matter, exploded and separated into the cosmic forms we see today.

I think the key difference between ancient and modern creation stories lies in the tendency to personify nature, or the lack thereof. The modern scientific method tries to remove the subjective factor from the equation. Once we stopped projecting our emotions upon ‘Mother Nature,’ we started telling different stories about how ‘she’ works.

Now scientists are investigating how myth-making itself works. Neurologists and evolutionary psychologists are exploring the biological basis of our ability to mythologize and the possible adaptive purposes informing our storytelling instinct. Let’s start by getting hypothetical and do a little ‘state of nature’ thought experiment. Imagine a prehistoric hunter startled by booming thunder. Now we know the meteorological explanation, but he doesn’t. He experiences what thunder feels like to him: anger. But who is angry?

The problem is addressed by the limbic system, the subcortical brain structure that initially processes emotion and memory. Potential dangers must be understood or anxiety will overwhelm the mind, rendering the hunter less able to cope and survive. The amygdala, the brain’s watchdog, primes the body for action—for fight or flight—while the hippocampus tries to associate feelings with memories in order to focus and better define both the stimuli and the appropriate response. This process is entirely unconscious—faster than the speed of consciousness.

The hippocampus recalls an experience of anger, perhaps one involving the hunter’s own father, and then the cerebral cortex, home of our higher cognitive capacities, gets involved. Somewhere in our cortical circuitry, probably in the angular gyrus, where neuroscientist VS Ramachandran says our metaphoric functions reside, storm images are cross-wired with paternal images. A myth is born: sky is father, earth is mother, and the cause-effect logic of storytelling in the brain’s left-hemisphere embellishes until the amygdala eases up, and the anxiety is relatively alleviated. At least the dread becomes more manageable. In neurochemical terms, the adrenaline and cortisol rush are balanced off and contained by dopamine, the calming effect of apparent knowledge, the pleasure of grasping what was once unknown.

From then on, thunder and lightning will be a little less terrifying. Now there is a story to help make sense of it. Storms are a sign of Father Sky’s anger. What do we do? We try to appease this force–to make amends. We honor the deity by singing and dancing. We sacrifice. Now we have myths and rituals. In short, we have a religion.

That’s why so many prehistoric people, who had no contact with one another, came to believe in primordial giants, and we are still not that far removed from this impulse. For example, why do we still name hurricanes? Sometimes, it’s just easier for us to handle nature if we make it a little more human. As neurologists point out, we are hardwired to pick up on patterns in the environment and attribute human-like qualities and intentions to them. Philosophers and psychologists call this penchant for projecting anthropomorphic agency a theory of mind. English teachers call it personification, an imaginative, poetic skill.

This is why dismissive, condescending attitudes toward myth-making frustrate me. The metaphoric-mythic instinct has been, and still is, a tremendous boon to our own self-understanding, without which science, as we know it, probably wouldn’t have evolved. I came to this conclusion while pondering a profound historical fact: no culture in human history ever made the intellectual leap to objective theories first. Human beings start to know the unknown by projecting what they’re already familiar with onto it.

It’s an a priori instinct. We can’t help it.

Modern science helps make us more conscious of this tendency. The scientific method gives us a way of testing our imaginative leaps—our deeply held intuitions about how the world works—so we can come up with more reliable and practical explanations. The mythological method, in turn, reminds us to be critical of any theory which claims to have achieved pure, unassailable objectivity—to have removed, once and for all, the tendency to unconsciously impose our own assumptions and biases on the interpretation of facts. The ability to do that is just as much a myth as the ‘myths’ such claims supposedly debunk. I’ll paraphrase William James here: The truth is always more complex and complicated than the theories which aim to capture it. Just study the history of modern science—the evolution of theories and paradigms over the last 350 years especially—to see evidence for the asymmetrical relationship between beliefs, justifications, and the ever-elusive Truth.

Laid-back, self-aware scientists have no problem admitting the limitations built into the empirical method itself: Scientific conclusions are implicitly provisional. A theory is true for now. The beauty and power of science hinges upon this point—the self-correcting mechanism, the openness to other possibilities. Otherwise, it’s no longer the scientific method at work. It’s politicized dogma peddling. It’s blind mythologizing.

The recent research into the neurology and psychology of myth-making is fascinating. It enhances our understanding of what a myth is: a story imbued with such emotional power and resonance that how it actually lines up with reality is often an afterthought. But what’s equally fascinating to me is the mythologizing that still informs our science-making.

I think it’s, of course, dangerous to believe blindly in myths, to accredit stories without testing them against experience and empirical evidence. But I also believe it’s dangerous to regard scientific theories as somehow above and beyond the mythological instinct. Like the interconnected swirl of the yin-yang, science and myth need each other, and that relationship should be as balanced and transparent as possible.

Uroboros. A universal symbol of balance and immortality.
