Archive for the Uroboros Category

No Faith in Superman: Lovecraft on ‘Nietzscheism’

Posted in cosmicism, critical thinking, Existentialism, horror fiction, Lovecraft, Nietzsche, nihilism, Philosophical and Religious Reflections, Philosophy, rational animal, reason, Speculative fiction, Uroboros on January 8, 2014 by Uroboros

Regarding a recent post on the overlapping ideas of Nietzsche and Lovecraft, Allan McPherson kindly pointed out that H.P. had in fact written a short essay on Nietzscheism, which is posted here on OHHAI’s tumblr page. It’s a typically Lovecraftian take on the problem of nihilism, i.e. equal parts pessimistic and elitist, flavored with some unfortunate hints of racism (you have to hold your nose here and there when you read it, something every Lovecraft fan is used to). It nonetheless deals explicitly with a crucial contemporary issue, one I’m exploring in my own speculative fiction series, Uroboros.

Lovecraft (1890-1937)

My question is this: are humans the kind of beings who can use our rational capacities and free will (granted we have such capacities) to create meanings that can ground and sustain our own existence? In other words, can we have values and purposes to which each individual can freely and clearly consent? Or are we essentially superstitious little creatures who need an authority to submit to, real and/or imagined?

What are your thoughts?

The Philosophy of Decomposition: Poe and the Perversity of the Gothic Mind

Posted in Ancient Greek, anxiety, Aristotle, barriers to critical thinking, Christianity, Consciousness, ecology, emotion, Enlightenment, Ethics, fiction, French Revolution, Freud, God, Goth, Gothic, Horror, horror fiction, irrational, Jesus, Literature, Morality, Philosophy, psychoanalysis, Psychology, rational animal, Religion, religious, Repression, resistance to critical thinking, Romanticism, Science, Speculative fiction, terror, tragedy, Uroboros, Writing on October 27, 2013 by Uroboros

Whether you think Edgar Allan Poe’s stories are expertly-crafted explorations of the dark side of human nature or morbid, overwrought melodramas, there is no doubt his work has had a tremendous impact on Western culture. Probably his most important contribution, apart from establishing the contemporary short story format and inventing the detective genre, is revitalizing the Gothic genre and pushing horror fiction in a more philosophically interesting direction. His stories are so enduring and influential because of the conceptual depth he added to generic tropes, redefining literature in the process. He accomplished this feat by perverting the Gothic.

Edgar Allan Poe (1809-49), Master of Gothic literature

By the time Poe arrived on the scene, Gothic fiction had already fossilized and become fodder for self-parody. What started with the fantastic absurdities of Horace Walpole’s The Castle of Otranto (1764) and culminated in the speculative complexity of Ann Radcliffe’s The Mysteries of Udolpho (1794) eventually led to Northanger Abbey (1817), Jane Austen’s metafictional send-up of what had become stale conventions: crumbling castles, tormented heroines, supernatural entities, and family curses. Although the external trappings of Gothic plots may have fallen into ruin, its themes remained relevant. According to Joyce Carol Oates, a master of the genre in her own right, Gothic fiction explores the fragmentation of the alienated mind by inscrutable historical and biological forces, forces that can overwhelm one’s ability to rationally understand the world and make intelligent choices. This makes the Gothic a critical antidote to naïve utopian visions of the future inspired by the Enlightenment, and of particular interest to American culture, whose intellectual basis is rooted in the rational pursuit of happiness. ‘Gothic’ suggests the fear of something primal and regressive that threatens to undermine mental and social stability. In order to be culturally relevant again, though, Gothic literature needed a writer who could reanimate its tropes. It needed a morbid, hypersensitive, and arrogant genius named Edgar Allan Poe.

Poe’s key twist is turning the tropes inward and starting with the macabre landscape within—“the terror of the soul,” he calls it. By the 1830s, Poe is focused on composing short fiction, crafting tightly-constructed tales, rendered in dense, pompous prose, spewing from the cracked psyches of unreliable narrators. This is the dark heart of many of his best stories: “Ligeia” (1838), “William Wilson” (1839), “The Black Cat” (1843), “The Tell-Tale Heart” (1843), and “The Cask of Amontillado” (1846), just to name a few (of course, his most accomplished story, “The Fall of the House of Usher” (1839), flips this dynamic: an unnamed and relatively reasonable narrator details the psychic disintegration of Roderick Usher). Poe’s disturbed, epistemologically-challenged protagonists aren’t the true innovation, though. Marlowe and Shakespeare pioneered that literary territory centuries before. The element that Poe adds—the novelty that both revitalizes and Americanizes the Gothic—is what he himself calls “the spirit of perverseness.”

The narrator in “The Black Cat” puts forth this concept to explain his violent deeds. He says perversity is “one of the primitive impulses of the human heart—one of the indivisible primary faculties…which give direction to the character of Man.” What is its function? It is the “unfathomable longing of the soul to vex itself,” the narrator says, “a perpetual inclination, in the teeth of our best judgment” to commit a “vile or a silly action” precisely because we believe it to be ‘vile’ or ‘silly.’ In “The Imp of the Perverse” (1845), the narrator claims that perversity is “a radical, primitive, irreducible sentiment,” so deep and pervasive that it is ultimately immune to the prescriptions of the analytical mind. In other words, Poe identified the disruptive and neurotic effects of ‘the Unconscious’ half a century before Freud burst onto the scene.

While these narrators claim that philosophers have ignored man’s irrational inclinations, we shouldn’t assume Poe, himself a well-read scholar, wasn’t influenced by obvious precursors to ‘the spirit of perverseness,’ namely Aristotle and St. Augustine. In the Nicomachean Ethics, Aristotle posits his theory of akrasia, the vice of incontinence, i.e. the inability to control oneself and do the virtuous thing even when one knows it is the right choice. This is his corrective to the Socratic-Platonic dictum that to know the good is to do the good: no one willingly does evil. To Aristotle, this is a distorted view of the human condition. We can know theoretically what the virtuous choice is—the wisdom Aristotle calls sophia—but that doesn’t automatically compel us to have phronesis, or practical wisdom, which is the ability to do the good. In other words, there is a gap between knowledge and action, a notion that surfaces again in Aristotle’s Poetics. In his analysis of drama, Aristotle identifies hamartia as a key characteristic of the tragic hero, referring to the flaws in judgment that lead to a character’s ultimate downfall. An archery metaphor that means “to miss the mark,” hamartia becomes the main word New Testament writers use to translate the Jewish concept of sin into Greek (they weren’t the first to do this: the writers of the Septuagint, the 2C BCE Greek translation of Hebrew scripture, had already made this move). By the fifth century CE, St. Augustine, the most influential Christian theologian of late antiquity, formulates his doctrine of original sin, describing humanity’s lack of self-control as innate, embodied depravity. For Augustine, when Adam and Eve disobeyed God, they condemned their progeny to bondage, chaining the human spirit to this corrupt, uncontrollable, and ultimately decaying flesh. Only Christ’s sacrifice and God’s loving grace, Augustine assures us, can liberate the spirit from this prison.

This is part of the philosophical lineage behind perverseness, despite Poe’s narrators’ claims to the contrary. There is, however, some truth to the critique if seen from a mid-19C perspective. From Descartes right through to Locke, ‘Reason’ is heralded as humanity’s salvation (of course, Hume and Rousseau poke skeptical holes in 18C Europeans’ over-inflated, self-aggrandizing mythology, and Kant manages to salvage some of the optimism, but has to sacrifice key epistemic conceits in the process). But enlightened humanistic confidence looks like hubris to Romantic writers and artists, especially in the wake of the French Revolution and the international traumas it spawned. This is the mindset Poe resonates with: one that is highly skeptical of the ‘Man-is-the-rational-animal’ mythos. Anyone familiar with his biography can see why he gravitates toward a dark worldview. As a critic, he loves savaging fellow writers whose dispositions strike him as too sunny, and as a storyteller, his characters often confront—sometimes ironically, sometimes tragically—the limits of reason, a capacity Poe calls (I think with tongue-in-cheek ambivalence) ‘ratiocination.’

Dark reflections of a perverse mind

The ‘spirit of perverseness’ implies that neither divine ‘Grace’ nor humanistic ‘Reason’ can save us from a life of terror and suffering, especially when we ignore and repress our essential sinfulness. Whether you view history through a biblical or Darwinian lens, one thing is clear: humans aren’t naturally inclined to seek rational knowledge any more than we are given to loving and respecting each other universally. Modern cognitive science and psychology have shown us that the mind evolved to assist in feeding and procreation and, of course, to protect the body from danger—not to seek objective truths. It evolved to help us band together in small tribal circles, fearing and even hating those who exist outside that circle. Over time we’ve been able to grasp how much better life would be if only we could rationally control ourselves and universally respect each other—and yet “in the teeth of our best judgment” we still can’t stop ourselves from committing vile and silly actions. Self-sabotage, Poe seems to argue, is our default setting.

Poe shifts Gothic terror from foggy graveyards and dark abbeys to broken brains and twisted minds. The true threats aren’t really lurking ‘out there.’ They’re stirring and bubbling from within, perturbing and overwhelming the soul, often with horrifying results. A Gothic mind lives in a Gothicized world, personifying its surroundings in terms of its own anxious and alienated disposition. ‘Evil’ only appears to be ‘out there.’ As literary and ecological theorist Timothy Morton points out, evil isn’t in the eye of the beholder. Evil is the eye of the beholder who frets over the corruption of the world without considering the perverseness generated by his own perceptual apparatus. It’s an Uroboric feedback loop that, left to its own devices, will spin out of control and crumble to pieces. The most disturbing implication of Poe-etic perversity is the sense of helplessness it evokes. Even when his characters are perceptive enough to diagnose their own disorders, they are incapable of stopping the Gothic effect. This is how I interpret the narrator’s ruminations in “The Fall of the House of Usher”:

 What was it…that so unnerved me in the contemplation of the House of Usher? It was a mystery all insoluble; nor could I grapple with the shadowy fancies that crowded upon me as I pondered. I was forced to fall back upon the unsatisfactory conclusion, that while, beyond doubt, there are combinations of very simple natural objects which have the power of thus affecting us, still the analysis of this power lies among considerations beyond our depth. It was possible, I reflected, that a mere different arrangement of the particulars of the scene, of the details of the picture, would be sufficient to modify, or perhaps to annihilate its capacity for sorrowful impression…There can be no doubt that the consciousness of the rapid increase of my superstition…served mainly to accelerate the increase itself. Such, I have long known, is the paradoxical law of all sentiments having terror as a basis. And it might have been for this reason only, that, when I again uplifted my eyes to the house itself, from its image in the pool, there grew in my mind a strange fancy…so ridiculous, indeed, that I but mention it to show the vivid force of the sensations which oppressed me. I had so worked upon my imagination as really to believe that about the whole mansion and domain there hung an atmosphere peculiar to themselves and their immediate vicinity—an atmosphere which had no affinity with the air of heaven, but which had reeked up from the decayed trees, and the gray wall, and the silent tarn—a pestilent and mystic vapour, dull, sluggish, faintly discernible, and leaden-hued…

Fall of the House of Usher (1839)

Sublimity and the Bright Side of Being Terrorized

Posted in Consciousness, conspiracy, critical thinking, emotion, Enlightenment, Ethics, Existentialism, fiction, freedom, Freud, God, Gothic, Horror, humanities, Literature, Lovecraft, Lovecraftian, Morality, nihilism, paranoia, Philosophical and Religious Reflections, Philosophy, Philosophy of Mind, psychoanalysis, Psychology, rational animal, reason, Religion, religious, Romanticism, superheroes, terror, Terror Management Theory, The Walking Dead, theory, theory of mind, Uroboros, Zombies on October 6, 2013 by Uroboros

Goya’s The Sleep of Reason Produces Monsters

We live in a terrorized age. At the dawn of the 21st century, the world is not only coping with the constant threat of violent extremism; we also face global warming, potential pandemic diseases, economic uncertainty, Middle Eastern conflicts, the debilitating consequences of partisan politics, and so on. The list grows each time you click on the news. Fear seems to be infecting the collective consciousness like a virus, resulting in a culture of anxiety and a rising tide of helplessness, despair, and anger. In the U.S., symptoms of this chronic unease can be seen in the proliferation of apocalyptic paranoia and conspiracy theories, coupled with record sales of both weapons and tickets to Hollywood’s superhero blockbusters, fables that reflect post-9/11 fears and the desire for a hero to sweep in and save us.

That’s why I want to take the time to analyze some complex but important concepts like the sublime, the Gothic, and the uncanny, ideas which, I believe, can help people get a rational grip on the forces that terrorize the soul. Let’s begin with the sublime.

18C Philosopher Immanuel Kant

The word is Latin in origin and means rising up to meet a threshold. To Enlightenment thinkers, it referred to those experiences that challenged or transcended the limits of thought, to overwhelming forces that left humans feeling vulnerable and in need of paternal protection. Edmund Burke, one of the great theorists of the sublime, distinguished this feeling from the experience of beauty. The beautiful is tame, pleasant. It comes from the recognition of order, the harmony of symmetrical form, as in the appreciation of a flower or a healthy human body. You can behold them without being unnerved, without feeling subtly terrorized. Beautiful things speak of a universe with intrinsic meaning, tucking the mind into a world that is hospitable to human endeavors. Contrast this with the awe and astonishment one feels when contemplating the dimensions of a starry sky or a rugged, mist-wreathed mountain. From a distance, of course, they can appear ‘beautiful,’ but, as Immanuel Kant points out in Observations on the Feeling of the Beautiful and Sublime, it is a different kind of pleasure because it contains a “certain dread, or melancholy, in some cases merely the quiet wonder; and in still others with a beauty completely pervading a sublime plan.”

This description captures the ambivalence in sublime experiences, moments where we are at once paradoxically terrified and fascinated by the same thing. It is important here to distinguish ‘terror’ from ‘horror.’ Terror is the experience of danger at a safe distance, the potential of a threat, as opposed to horror, which refers to imminent dangers that actually threaten our existence. If I’m standing on the shore, staring out across a vast, breathtaking sea, entranced by the hissing surf, terror is the goose-pimply, weirded-out feeling I get while contemplating the dimensions and unfathomable power before me. Horror would be what I feel if a tsunami reared up and came crashing in. There’s nothing sublime in horror. It’s too intense to allow for the odd mix of pleasure and fear, no gap in the feeling for some kind of deeper revelation to emerge.

Friedrich’s Monk by the Sea

While Burke located the power of the sublime in the external world, in the recognition of an authority ‘out there,’ Kant has a more sophisticated take. Without digging too deeply into the jargon-laden minutiae of his critique, suffice it to say that Kant ‘subjectivizes’ the concept, locating the sublime in the mind itself. I interpret Kant as pointing to a recursive, self-referential quality at the heart of the sublime, an openness that stimulates our imagination in profound ways. When contemplating stormy seas and dark skies, we experience both our nervous system’s anxious reaction to the environment and a weird sense of wonder and awe. Beneath this thrill, however, is a humbling sense of futility and isolation in the face of the Infinite, in the awesome cycles that evaporate seas, crush mountains, and dissolve stars without a care in the cosmos as to any ‘meaning’ they may have for us. Rising up to the threshold of consciousness is the haunting suspicion that the universe is a harsh place devoid of a predetermined purpose that validates its existence. These contradictory feelings give rise to a self-awareness of the ambivalence itself, allowing ‘meta-cognitive’ processes to emerge. This is the mind’s means of understanding the fissure and trying to close the gap in a meaningful way.

Furthermore, by experiencing forms and magnitudes that stagger and disturb the imagination, the mind can actually grasp its own liberation from the deterministic workings of nature, from the blind mechanisms of a clockwork universe. In his Critique of Judgment, Kant says “the irresistibility of [nature’s] power certainly makes us, considered as natural beings, recognize our physical powerlessness, but at the same time it reveals a capacity for judging ourselves as independent of nature and a superiority over nature…whereby the humanity in our person remains undemeaned even though the human being must submit to that dominion.” One is now thinking about one’s own thinking, after all, reflecting upon the complexity of the subject-object feedback loop, which, I assert, is the very dynamic that makes self-consciousness and freedom possible in the first place. We can’t feel terrorized by life’s machinations if we aren’t somehow psychologically distant from them, and this gap entails our ability to think intelligently and make decisions about how best to react to our feelings.

Van Gogh’s Starry Night

I think this is in line with Kant’s claim that the sublime is symbolic of our moral freedom—an aesthetic validation of our ethical intentions and existential purposes over and above our biological inclinations and physical limitations. We are autonomous creatures who can trust our capacity to understand the cosmos and govern ourselves precisely because we are also capable of being terrorized by a universe that appears indifferent to our hopes and dreams. Seen in this light, the sublime is like a secularized burning bush, an enlightened version of God coming out of the whirlwind and parting seas. It is a more mature way of getting in touch with and listening to the divine, a reasonable basis for faith.

My faith is in the dawn of a post-Terrorized Age. What Kant’s critique of the sublime teaches me is that, paradoxically, we need to be terrorized in order to get there. The concept of the sublime allows us to reflect on our fears in order to resist their potentially debilitating, destructive effects. The antidote is in the poison, so to speak. The sublime elevates these feelings: the more sublime the terror, the freer you are, the more moral you can be. So, may you live in terrifying times.

Friedrich’s Wanderer above the Sea of Fog

What is language? What can we do with it, and what does it do to us?

Posted in 1984, 99%, anxiety, barriers to critical thinking, Big Brother, Brain Science, Consciousness, critical thinking, Dystopia, Dystopian, emotion, freedom, George Orwell, humanities, irrational, Jason Reynolds, limbic system, Moraine Valley Community College, Neurology, Newspeak, Nineteen Eighty-four, Orwell, paranoia, Philosophical and Religious Reflections, Philosophy, Philosophy of Mind, politics, Politics and Media, rational animal, Rationalization, rationalizing animal, reason, resistance to critical thinking, theory, theory of mind, thoughtcrime, Two Minutes Hate, Uncategorized, Uroboros, Zombies on September 20, 2013 by Uroboros

In Orwell’s 1984, INGSOC’s totalitarian control of Oceania ultimately depends on Newspeak, the language the Party is working hard to develop and implement. Once in common use, Newspeak will eliminate the possibility of thoughtcrime, i.e. any idea that contradicts or questions absolute love for and devotion to Big Brother. Newspeak systematically scrubs away all those messy, gray areas from the English language, replacing them with a formal, logically-rigid system. For example, instead of having to decide whether to use ‘awesome,’ ‘fabulous,’ or ‘mind-blowingly stupendous’ to describe a situation, you would algorithmically deploy the Newspeak formula, which reduces the plethora of synonyms you could use to ‘good,’ ‘plusgood,’ or ‘doubleplusgood.’ Furthermore, all antonyms are reduced to ‘ungood,’ ‘plusungood,’ or ‘doubleplusungood.’
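The reduction Orwell describes is so mechanical that it fits in a few lines of code, which is precisely the point. Here is a minimal sketch in Python; the numeric intensity scale is my own illustrative assumption, not anything from the novel:

```python
def newspeak(intensity: int) -> str:
    """Collapse any evaluative adjective into the Newspeak scale.

    intensity runs from -3 to 3 (a hypothetical scoring of the original
    word's strength); positive means approval, negative disapproval.
    Zero is disallowed: Newspeak has no vocabulary for neutrality.
    """
    if not -3 <= intensity <= 3 or intensity == 0:
        raise ValueError("intensity must be in -3..3 and nonzero")
    # degree of intensity selects the prefix...
    prefix = {1: "", 2: "plus", 3: "doubleplus"}[abs(intensity)]
    # ...and the sign selects the only two stems the Party permits
    stem = "good" if intensity > 0 else "ungood"
    return prefix + stem

# 'mind-blowingly stupendous' -> newspeak(3)  == 'doubleplusgood'
# 'fabulous'                  -> newspeak(2)  == 'plusgood'
# 'dreadful'                  -> newspeak(-2) == 'plusungood'
```

That the entire expressive range of English evaluation collapses into a six-value lookup table is exactly the impoverishment the Party is after.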

Syme, a Party linguist, tells Winston, the novel’s rebellious protagonist, that the ultimate goal is to eliminate conscious thought from the speaking process altogether. The Newspeak term for it is ‘duckspeak’—a more mechanical form of communication that doesn’t require higher-level cognitive functions, like having to pick the word that best expresses your feelings or creating a new one. That sense of freedom and creativity will simply cease to exist once Newspeak has finally displaced ‘Oldspeak.’ “The Revolution will be complete,” Syme tells Winston, “when the language is perfect.” The Proles and the Outer Party (95% of Oceania’s population) will become a mass of mindless duckspeakers, the linguistic equivalent of ‘philosophical zombies.’

Newspeak implies that cognition depends on language—that symbolic communication isn’t merely a neutral means for sending and receiving thoughts. Instead, the words and sentences we use actually influence the way we think about and perceive the world. While Orwell was obviously inspired by the propaganda techniques used by the dictators of his day, perhaps he was also familiar with Nietzsche’s “On Truth and Lying in a Non-Moral Sense” or the work of anthropologists like Boas and Sapir, all of whom embraced some form of what is now called linguistic relativism, a theory which argues for the reality of what Orwell proposed in fiction: we experience the world according to how our language lets us experience it.

Linguist Lera Boroditsky

Linguistic relativism is on the rise in the contemporary study of language. The work of, for example, Lera Boroditsky and Daniel Everett provides strong empirical data that supports (at least the weak version of) linguistic relativism, challenging the Chomskyan paradigm, which posits a universalist account of how language is acquired, functions, and, by extension, relates to cognition and perception.

In my previous essay on the Uroboric model of mind, I asked about the connection between neuronal processes and symbolic systems: how can an abstract representation impact or determine the outcome of tangible physical processes? How can ionic thresholds in axons and the transmission of hormones across synaptic gaps depend upon the meaning of a symbol? Furthermore, how can we account for this in a naturalistic way that neither ignores the phenomena by defining them out of existence nor distorts the situation by positing physics-defying stuff? In short, how do we give an emergent account of the process?

First, we ask: what is language? Most linguists will say it means symbolic communication: in other words, information exchanges that utilize symbols. But what is a symbol? As you may recall from your grade school days, symbols are things that stand for, refer to, or evoke other things. For example, the red octagonal shape on the street corner provokes your foot to press against the brake, and the letters s, t, o, and p each refer to particular sounds, which, when pronounced together, mean ‘put your foot on the brake.’ Simple enough, right? But the facility with which we use language, and with which we reflexively perceive that usage, belies both the complexity of the process and the powerful effects it has on our thinking.

Cognitive linguists and brain scientists have shown that much of our verbal processing happens unconsciously. Generally speaking, when we use language, words just seem to ‘come to mind’ or ‘show up’ in consciousness. We neither need to consciously think about the meaning of each and every word we use, nor do we have to analyze every variation of tone and inflection to understand things like sarcasm and irony. These complex appraisals and determinations are made subconsciously because certain sub-cortical and cortical systems have already processed the nonverbal signals, the formal symbols, and decoded their meaning. That’s what learning a language equips a brain to do, and we can even identify parts that make major contributions. Broca’s area, for example, is a region in the left frontal lobe that is integral to both language production and comprehension. If a stroke damages Broca’s area, the sufferer may lose the ability not only to produce speech, but to comprehend it as well.

Left-brain language regions

Dr. Jill Bolte Taylor

One of the most publicized cases of sudden ‘language-less-ness’ is that of Dr. Jill Bolte Taylor, the Harvard brain scientist who, in 1996, happened to have a stroke in her left hemisphere, which impacted both Broca’s and Wernicke’s areas of her brain. She couldn’t remember who she was. She couldn’t use language. Taylor compares it to dying and being reborn, to being an infant in a grown woman’s body. Her insights into a language-less reality shed light on how words and sentences impact cognition. She says she lost her inner voice, that chatter that goes on ‘in’ the head. She no longer organized her experiences in a categorical, analytic way. Reality no longer showed up to her with the same fine-grained detail: it wasn’t divided and subdivided, classified and prejudged in terms of past associations or future expectations, in terms of self and other, us vs. them, and so on. She no longer had an ‘I’ at the center of her experience. Once the left-brain’s anxious, anal-retentive chatter went offline, right-brain processes took over, and, Taylor claims, the world showed up as waves of energy in an interconnected web of reality. She says that, for her at least, it was actually quite pleasant. The world was present in a way that language had simply dialed down and filtered out. [Any of you who are familiar with monotheistic mysticism and/or mindfulness meditation are probably seeing connections to various religious rituals and the oceanic experiences she describes.]

This has profound implications for the study of consciousness. It illustrates how brain anatomy and neural function—purely physical mechanisms—are necessary to consciousness. Necessary, but not sufficient. While we need brain scientists to continue digging deep, locating and mapping the neuronal correlates of consciousness, we also need to factor in the other necessary part of the ‘mystery of consciousness.’ What linguistic relativism and the Bolte Taylor case suggest is that languages themselves, specific symbolic systems, also determine what consciousness is and how it works. It means not only do we need to identify the neuronal correlates of consciousness but the socio-cultural correlates as well. This means embracing an emergent model that can countenance complex systems and self-referential feedback dynamics.

Orwell understood this. He understood that rhetorical manipulation is a highly effective form of mind control and, therefore, of reality construction. Orwell also knew that, if authoritarian regimes could use language to oppress people [20th-century dictators actually used these tactics], then freedom and creativity also depend on language. That holds, though, only if we use it self-consciously and critically, if the language itself has freedom and creativity built into it, and if its users are vigilant in preserving that quality and refuse to become duckspeakers.

The Science of Myth and the Myth of Science

Posted in anxiety, archetypes, barriers to critical thinking, Brain Science, collective unconscious, Consciousness, Creationism, critical thinking, emotion, God, History, humanities, irrational, Jung, Knowledge, limbic system, Maori, Myth, Mythology, Neurology, paranoia, Philosophical and Religious Reflections, psychoanalysis, Psychology, rational animal, Rationalization, rationalizing animal, reason, Religion, religious, Repression, resistance to critical thinking, Science, social psychology, terror, Terror Management Theory, theory, theory of mind, Uroboros, V.S. Ramachandran, William James on February 3, 2012 by Uroboros

Years ago in a mythology course I taught, a student once came up to me after class with an annoyed look. We’d just covered the Maori creation myth, and something about it had gotten under his skin. According to the myth, father sky, Rangi, and mother earth, Papa, formed out of primordial chaos and tangled in a tight, erotic embrace. Their offspring decided to pry Rangi and Papa apart in order to escape and live on their own. With his ax, Tane, the forest god, finally separated Father Sky and Mother Earth, and in that space, life grew and flourished.

The broad strokes of this creation myth aren’t unique. Ancient Egyptian, Chinese, Greek, and Norse stories (just to name a few) relate life’s origins to the separation of giant primordial parents.

“How could people believe that?” the student asked, shaking his head. It wasn’t his perturbed incredulity that struck me. Often, students initially find stories from ancient cultures to be, well, weird. It was his condescension. For him, ‘myth’ meant not just ‘false,’ but ‘silly.’ In his defense, it’s what it means for most of us. When we want to sneer at strange, fantastical beliefs, we call them ‘myths.’

The term is synonymous with ‘false.’

‘Myth’ originally meant the exact opposite, though. The Ancient Greek root of mythos referred to life’s deepest truths, something discussed and contemplated with a sense of awe and reverence, not incredulity and disdain. Seen in this light, myths are the stories humans tell in order to explain the unknown and make sense of the world. My thesis is that humans are essentially myth-making creatures and will continue to be so—no matter how scientific our stories get.

Scowls form on some students’ faces when they hear a professor say that science is, on a certain level, still mythological. Scientists are still storytellers, though, trying to turn the unknown into the known. Ancient and modern storytellers have different ways of approaching the unknown—different notions about what counts as a valid explanation.

Today, people (tend to) prefer creation stories that fit the scientific paradigm that’s proved so successful in explaining and predicting natural phenomena. But in dismissing past explanations, we overlook some interesting similarities. Ancient and modern stories share what psychologist Carl Jung called archetypal patterns. Jung theorized that humans share underlying patterns of thought because we all inherit the same neurological equipment. The anatomical differences between an ancient human brain and, say, Darwin’s brain are negligible. Setting the obvious differences between the Maori story and Darwin’s theory aside for just a moment, there are archetypal similarities between these accounts.

Darwinism says life began in a kind of primordial soup: over time, inorganic molecules organized into the first living cell; single-celled organisms eventually gave rise to multicellular ones; and from there, thanks to genetic mutation and the pressure of natural selection, lifeforms diversified and flourished. The Big Bang theory follows this underlying pattern too: a 'primordial atom,' containing all matter, exploded and separated into the cosmic forms we see today.

I think the key difference between ancient and modern creation stories lies in the tendency to personify nature, or the lack thereof. The modern scientific method tries to remove the subjective factor from the equation. Once we stopped projecting our emotions onto 'Mother Nature,' we started telling different stories about how 'she' works.

Now scientists are investigating how myth-making itself works. Neurologists and evolutionary psychologists are exploring the biological basis of our ability to mythologize and the possible adaptive purposes informing our storytelling instinct. Let's start by getting hypothetical and doing a little 'state of nature' thought experiment. Imagine a prehistoric hunter startled by booming thunder. We know the meteorological explanation, but he doesn't. He experiences what thunder feels like to him: anger. But who is angry?

The problem is addressed by the limbic system, the subcortical brain structure that initially processes emotion and memory. Potential dangers must be understood or anxiety will overwhelm the mind, rendering the hunter less able to cope and survive. The amygdala, the brain’s watchdog, primes the body for action—for fight or flight—while the hippocampus tries to associate feelings with memories in order to focus and better define both the stimuli and the appropriate response. This process is entirely unconscious—faster than the speed of consciousness.

The hippocampus recalls an experience of anger, perhaps one involving the hunter's own father, and then the cerebral cortex, home of our higher cognitive capacities, gets involved. Somewhere in our cortical circuitry, probably in the angular gyrus, where neuroscientist V.S. Ramachandran says our metaphoric functions reside, storm images are cross-wired with paternal images. A myth is born: sky is father, earth is mother, and the cause-effect logic of storytelling in the brain's left hemisphere embellishes the tale until the amygdala eases up and the anxiety is relatively alleviated. At least the dread becomes more manageable. In neurochemical terms, the rush of adrenaline and cortisol is offset and contained by dopamine, the calming effect of apparent knowledge, the pleasure of grasping what was once unknown.

From then on, thunder and lightning will be a little less terrifying. Now there is a story to help make sense of it. Storms are a sign of Father Sky’s anger. What do we do? We try to appease this force–to make amends. We honor the deity by singing and dancing. We sacrifice. Now we have myths and rituals. In short, we have a religion.

That’s why so many prehistoric people, who had no contact with one another, came to believe in primordial giants, and we are still not that far removed from this impulse. For example, why do we still name hurricanes? Sometimes, it’s just easier for us to handle nature if we make it a little more human. As neurologists point out, we are hardwired to pick up on patterns in the environment and attribute human-like qualities and intentions to them. Philosophers and psychologists call this penchant for projecting anthropomorphic agency a theory of mind. English teachers call it personification, an imaginative, poetic skill.

This is why dismissive, condescending attitudes toward myth-making frustrate me. The metaphoric-mythic instinct has been, and still is, a tremendous boon to our own self-understanding, without which science, as we know it, probably wouldn’t have evolved. I came to this conclusion while pondering a profound historical fact: no culture in human history ever made the intellectual leap to objective theories first. Human beings start to know the unknown by projecting what they’re already familiar with onto it.

It’s an a priori instinct. We can’t help it.

Modern science helps make us more conscious of this tendency. The scientific method gives us a way of testing our imaginative leaps—our deeply held intuitions about how the world works—so we can come up with more reliable and practical explanations. The mythological method, in turn, reminds us to be critical of any theory which claims to have achieved pure, unassailable objectivity—to have removed, once and for all, the tendency to unconsciously impose our own assumptions and biases on the interpretation of facts. The ability to do that is just as much a myth as the ‘myths’ such claims supposedly debunk. I’ll paraphrase William James here: The truth is always more complex and complicated than the theories which aim to capture it. Just study the history of modern science—the evolution of theories and paradigms over the last 350 years especially—to see evidence for the asymmetrical relationship between beliefs, justifications, and the ever-elusive Truth.

Laid-back, self-aware scientists have no problem admitting the limitations built into the empirical method itself: Scientific conclusions are implicitly provisional. A theory is true for now. The beauty and power of science hinges upon this point—the self-correcting mechanism, the openness to other possibilities. Otherwise, it’s no longer the scientific method at work. It’s politicized dogma peddling. It’s blind mythologizing.

The recent research into the neurology and psychology of myth-making is fascinating. It enhances our understanding of what a myth is: a story imbued with such emotional power and resonance that how it actually lines up with reality is often an afterthought. But what's equally fascinating to me is the mythologizing that still informs our science-making.

It is dangerous, of course, to believe blindly in myths, to accept stories without testing them against experience and empirical evidence. But I also believe it's dangerous to regard scientific theories as somehow above and beyond the mythological instinct. Like the interconnected swirl of the yin-yang, science and myth need each other, and that relationship should be as balanced and transparent as possible.

Uroboros. A universal symbol of balance and immortality.