Archive for the theory Category

Sublimity and the Brightside of Being Terrorized

Posted in Consciousness, conspiracy, critical thinking, emotion, Enlightenment, Ethics, Existentialism, fiction, freedom, Freud, God, Gothic, Horror, humanities, Literature, Lovecraft, Lovecraftian, Morality, nihilism, paranoia, Philosophical and Religious Reflections, Philosophy, Philosophy of Mind, psychoanalysis, Psychology, rational animal, reason, Religion, religious, Romanticism, superheroes, terror, Terror Management Theory, The Walking Dead, theory, theory of mind, Uroboros, Zombies on October 6, 2013 by Uroboros
http://en.wikipedia.org/wiki/The_Sleep_of_Reason_Produces_Monsters

Goya’s The Sleep of Reason Produces Monsters

We live in a terrorized age. At the dawn of the 21st century, we are not only coping with the constant threat of violent extremism; we also face global warming, potential pandemic diseases, economic uncertainty, Middle Eastern conflicts, the debilitating consequences of partisan politics, and so on. The list grows each time you click on the news. Fear seems to be infecting the collective consciousness like a virus, resulting in a culture of anxiety and a rising tide of helplessness, despair, and anger. In the U.S., symptoms of this chronic unease can be seen in the proliferation of apocalyptic paranoia and conspiracy theories, coupled with record sales of both weapons and tickets for Hollywood’s superhero blockbusters, fables that reflect post-9/11 fears and the desire for a hero to sweep in and save us.

That’s why I want to take the time to analyze some complex but important concepts like the sublime, the Gothic, and the uncanny, ideas which, I believe, can help people get a rational grip on the forces that terrorize the soul. Let’s begin with the sublime.

18th-century philosopher Immanuel Kant

The word is Latin in origin and means rising up to meet a threshold. To Enlightenment thinkers, it referred to those experiences that challenged or transcended the limits of thought, to overwhelming forces that left humans feeling vulnerable and in need of paternal protection. Edmund Burke, one of the great theorists of the sublime, distinguished this feeling from the experience of beauty. The beautiful is tame, pleasant. It comes from the recognition of order, the harmony of symmetrical form, as in the appreciation of a flower or a healthy human body. You can behold them without being unnerved, without feeling subtly terrorized. Beautiful things speak of a universe with intrinsic meaning, tucking the mind into a world that is hospitable to human endeavors. Contrast this with the awe and astonishment one feels when contemplating the dimensions of a starry sky or a rugged, mist-wreathed mountain. From a distance, of course, they can appear ‘beautiful,’ but, as Immanuel Kant points out in Observations on the Feeling of the Beautiful and Sublime, it is a different kind of pleasure because it contains a “certain dread, or melancholy, in some cases merely the quiet wonder; and in still others with a beauty completely pervading a sublime plan.”

This description captures the ambivalence in sublime experiences, moments where we are at once paradoxically terrified and fascinated by the same thing. It is important here to distinguish ‘terror’ from ‘horror.’ Terror is the experience of danger at a safe distance, the potential of a threat, as opposed to horror, which refers to imminent dangers that actually threaten our existence. If I’m standing on the shore, staring out across a vast, breathtaking sea, entranced by the hissing surf, terror is the goose-pimply, weirded-out feeling I get while contemplating the dimensions and unfathomable power before me. Horror would be what I feel if a tsunami reared up and came crashing in. There’s nothing sublime in horror. It’s too intense to allow for the odd mix of pleasure and fear, leaving no gap in the feeling through which some kind of deeper revelation might emerge.

Friedrich’s Monk by the Sea

While Burke located the power of the sublime in the external world, in the recognition of an authority ‘out there,’ Kant has a more sophisticated take. Without digging too deeply into the jargon-laden minutiae of his critique, suffice it to say that Kant ‘subjectivizes’ the concept, locating the sublime in the mind itself. I interpret Kant as pointing to a recursive, self-referential quality at the heart of the sublime, an openness that stimulates our imagination in profound ways. When contemplating stormy seas and dark skies, we experience both our nervous system’s anxious reaction to the environment and a weird sense of wonder and awe. Beneath this thrill, however, is a humbling sense of futility and isolation in the face of the Infinite, in the awesome cycles that evaporate seas, crush mountains, and dissolve stars without a care in the cosmos as to any ‘meaning’ they may have to us. Rising up to the threshold of consciousness is the haunting suspicion that the universe is a harsh place devoid of a predetermined purpose that validates its existence. These contradictory feelings give rise to a self-awareness of the ambivalence itself, allowing ‘meta-cognitive’ processes to emerge. This is the mind’s means of understanding the fissure and trying to close the gap in a meaningful way.

Furthermore, by experiencing forms and magnitudes that stagger and disturb the imagination, the mind can actually grasp its own liberation from the deterministic workings of nature, from the blind mechanisms of a clockwork universe. In his Critique of Judgment, Kant says “the irresistibility of [nature’s] power certainly makes us, considered as natural beings, recognize our physical powerlessness, but at the same time it reveals a capacity for judging ourselves as independent of nature and a superiority over nature…whereby the humanity in our person remains undemeaned even though the human being must submit to that dominion.” One is now thinking about one’s own thinking, after all, reflecting upon the complexity of the subject-object feedback loop, which, I assert, is the very dynamic that makes self-consciousness and freedom possible in the first place. We can’t feel terrorized by life’s machinations if we aren’t somehow psychologically distant from them, and this gap entails our ability to think intelligently and make decisions about how best to react to our feelings.

Van Gogh’s Starry Night

I think this is in line with Kant’s claim that the sublime is symbolic of our moral freedom—an aesthetic validation of our ethical intentions and existential purposes over and above our biological inclinations and physical limitations. We are autonomous creatures who can trust our capacity to understand the cosmos and govern ourselves precisely because we are also capable of being terrorized by a universe that appears indifferent to our hopes and dreams. Seen in this light, the sublime is like a secularized burning bush, an enlightened version of God coming out of the whirlwind and parting seas. It is a more mature way of getting in touch with and listening to the divine, a reasonable basis for faith.

My faith is in the dawn of a post-Terrorized Age. What Kant’s critique of the sublime teaches me is that, paradoxically, we need to be terrorized in order to get there. The concept of the sublime allows us to reflect on our fears in order to resist their potentially debilitating, destructive effects. The antidote is in the poison, so to speak. The sublime elevates these feelings: the more sublime the terror, the freer you are, the more moral you can be. So, may you live in terrifying times.

Friedrich’s Wanderer above the Sea of Fog

What is language? What can we do with it, and what does it do to us?

Posted in 1984, 99%, anxiety, barriers to critical thinking, Big Brother, Brain Science, Consciousness, critical thinking, Dystopia, Dystopian, emotion, freedom, George Orwell, humanities, irrational, Jason Reynolds, limbic system, Moraine Valley Community College, Neurology, Newspeak, Nineteen Eighty-four, Orwell, paranoia, Philosophical and Religious Reflections, Philosophy, Philosophy of Mind, politics, Politics and Media, rational animal, Rationalization, rationalizing animal, reason, resistance to critical thinking, theory, theory of mind, thoughtcrime, Two Minutes Hate, Uncategorized, Uroboros, Zombies on September 20, 2013 by Uroboros

In Orwell’s 1984, INGSOC’s totalitarian control of Oceania ultimately depends on Newspeak, the language the Party is working hard to develop and implement. Once in common use, Newspeak will eliminate the possibility of thoughtcrime, i.e. any idea that contradicts or questions absolute love for and devotion to Big Brother. Newspeak systematically scrubs away all those messy, gray areas from the English language, replacing them with a formal, logically rigid system. For example, instead of having to decide whether to use ‘awesome,’ ‘fabulous,’ or ‘mind-blowingly stupendous’ to describe a situation, you would algorithmically deploy the Newspeak formula, which reduces the plethora of synonyms you could use to ‘good,’ ‘plusgood,’ or ‘doubleplusgood.’ Furthermore, all antonyms are reduced to ‘ungood,’ ‘plusungood,’ or ‘doubleplusungood.’
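To make this ‘formula’ concrete, here is a minimal sketch of the kind of reduction described above, assuming a toy vocabulary: each Oldspeak adjective gets a polarity and an intensity score and is then mapped onto the good/ungood scale. The word list and scores are hypothetical illustrations, not anything specified in the novel.

```python
# Toy sketch of the Newspeak 'formula': collapse evaluative vocabulary
# into good / plusgood / doubleplusgood and their 'un-' antonyms.
# The vocabulary and intensity scores below are hypothetical.

OLDSPEAK = {
    "awesome": (+1, 2),
    "fabulous": (+1, 2),
    "mind-blowingly stupendous": (+1, 3),
    "fine": (+1, 1),
    "bad": (-1, 1),
    "dreadful": (-1, 2),
    "abominable": (-1, 3),
}

def newspeak(word: str) -> str:
    """Reduce an Oldspeak adjective to its Newspeak equivalent."""
    polarity, intensity = OLDSPEAK[word]
    prefix = {1: "", 2: "plus", 3: "doubleplus"}[intensity]
    root = "good" if polarity > 0 else "ungood"
    return prefix + root

for word in OLDSPEAK:
    print(f"{word!r} -> {newspeak(word)}")
```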

Syme, a Party linguist, tells Winston, the novel’s rebellious protagonist, that the ultimate goal is to eliminate conscious thought from the speaking process altogether. The Newspeak term for it is ‘duckspeak’—a more mechanical form of communication that doesn’t require higher-level cognitive functions, like having to pick the word that best expresses your feelings or creating a new one. That sense of freedom and creativity will simply cease to exist once Newspeak has finally displaced ‘Oldspeak.’ “The Revolution will be complete,” Syme tells Winston, “when the language is perfect.” The Proles and the Outer Party (95% of Oceania’s population) will become a mass of mindless duckspeakers, the linguistic equivalent of ‘philosophical zombies’.

Newspeak implies that cognition depends on language—that symbolic communication isn’t merely a neutral means for sending and receiving thoughts. Instead, the words and sentences we use actually influence the way we think about and perceive the world. While Orwell was obviously inspired by the propaganda techniques used by the dictators of his day, perhaps he was also familiar with Nietzsche’s “On Truth and Lying in a Non-Moral Sense” or the work of anthropologists like Boas and Sapir, all of whom embraced some form of what is now called linguistic relativism, a theory which argues for the reality of what Orwell proposed in fiction: we experience the world according to how our language lets us experience it.

Linguist Lera Boroditsky

Linguistic relativism is on the rise in the contemporary study of language. The work of researchers like Lera Boroditsky and Daniel Everett provides strong empirical data that supports (at least the weak version of) linguistic relativism, challenging the Chomskian paradigm, which posits a universalist account of how language is acquired, functions, and, by extension, relates to cognition and perception.

In my previous essay on the Uroboric model of mind, I asked about the connection between neuronal processes and symbolic systems: how can an abstract representation impact or determine the outcome of tangible physical processes? How can ionic thresholds in axons and the transmission of hormones across synaptic gaps depend upon the meaning of a symbol? Furthermore, how can we account for this in a naturalistic way that neither ignores the phenomena by defining them out of existence nor distorts the situation by positing physics-defying stuff? In short, how do we give an emergent account of the process?

First, we ask: what is language? Most linguists will say it means symbolic communication: in other words, information exchanges that utilize symbols. But what is a symbol? As you may recall from your grade school days, symbols are things that stand for, refer to, or evoke other things—for example, the red octagonal sign on the street corner prompts your foot to press against the brake, or the letters s, t, o, and p each refer to particular sounds, which, when pronounced together, mean ‘put your foot on the brake.’ Simple enough, right? But the facility with which we use language, and with which we reflexively perceive that usage, belies both the complexity of the process and the powerful effects it has on our thinking.

Cognitive linguists and brain scientists have shown that much of our verbal processing happens unconsciously. Generally speaking, when we use language, words just seem to ‘come to mind’ or ‘show up’ in consciousness. We neither need to consciously think about the meaning of each and every word we use, nor do we have to analyze every variation of tone and inflection to understand things like sarcasm and irony. These complex appraisals and determinations are made subconsciously because certain sub-cortical and cortical systems have already processed the nonverbal signals, the formal symbols, and decoded their meaning. That’s what learning a language equips a brain to do, and we can even identify parts that make major contributions. Broca’s area, for example, is a region in the left frontal lobe that is integral to both language production and comprehension. If a stroke damages Broca’s area, the sufferer may lose the ability not only to produce speech, but to comprehend it as well.

Left-brain language regions

Dr. Jill Bolte Taylor

One of the most publicized cases of sudden ‘language-less-ness’ is that of Dr. Jill Bolte Taylor, the Harvard brain scientist who, in 1996, happened to have a stroke in her left hemisphere, which impacted both the Broca’s and Wernicke’s areas of her brain. She couldn’t remember who she was. She couldn’t use language. Taylor compares it to dying and being reborn, to being an infant in a grown woman’s body. Her insights into a language-less reality shed light on how words and sentences impact cognition. She says she lost her inner voice, that chatter that goes on ‘in’ the head. She no longer organized her experiences in a categorical, analytic way. Reality no longer showed up to her with the same fine-grained detail: it wasn’t divided and subdivided, classified and prejudged in terms of past associations or future expectations, in terms of self and other, us vs. them, and so on. She no longer had an ‘I’ at the center of her experience. Once the left-brain’s anxious, anal-retentive chatter went offline, right-brain processes took over, and, Taylor claims, the world showed up as waves of energy in an interconnected web of reality. She says that, for her at least, it was actually quite pleasant. The world was present in a way that language had simply dialed down and filtered out. [Any of you who are familiar with monotheistic mysticism and/or mindfulness meditation are probably seeing connections to various religious rituals and the oceanic experiences she describes.]

This has profound implications for the study of consciousness. It illustrates how brain anatomy and neural function—purely physical mechanisms—are necessary to consciousness. Necessary, but not sufficient. While we need brain scientists to continue digging deep, locating and mapping the neuronal correlates of consciousness, we also need to factor in the other necessary part of the ‘mystery of consciousness.’ What linguistic relativism and the Bolte Taylor case suggest is that languages themselves, specific symbolic systems, also determine what consciousness is and how it works. It means we need to identify not only the neuronal correlates of consciousness but the socio-cultural correlates as well. This means embracing an emergent model that can countenance complex systems and self-referential feedback dynamics.

Orwell understood this. He understood that rhetorical manipulation is a highly effective form of mind control and, therefore, reality construction. Orwell also knew that, if authoritarian regimes could use language to oppress people [20th century dictators actually used these tactics], then freedom and creativity also depend on language. That holds, though, only if we use it self-consciously and critically, if the language itself has freedom and creativity built into it, and if its users are vigilant in preserving that quality and refuse to become duckspeakers.

The Science of Myth and the Myth of Science

Posted in anxiety, archetypes, barriers to critical thinking, Brain Science, collective unconscious, Consciousness, Creationism, critical thinking, emotion, God, History, humanities, irrational, Jung, Knowledge, limbic system, Maori, Myth, Mythology, Neurology, paranoia, Philosophical and Religious Reflections, psychoanalysis, Psychology, rational animal, Rationalization, rationalizing animal, reason, Religion, religious, Repression, resistance to critical thinking, Science, social psychology, terror, Terror Management Theory, theory, theory of mind, Uroboros, V.S. Ramachandran, William James on February 3, 2012 by Uroboros

Years ago in a mythology course I taught, a student once came up to me after class with an annoyed look. We’d just covered the Maori creation myth, and something about it had gotten under his skin. According to the myth, father sky, Rangi, and mother earth, Papa, formed out of primordial chaos and tangled in a tight, erotic embrace. Their offspring decided to pry Rangi and Papa apart in order to escape and live on their own. With his ax, Tane, the forest god, finally separated Father Sky and Mother Earth, and in that space, life grew and flourished.

The broad strokes of this creation myth aren’t unique. Ancient Egyptian, Chinese, Greek, and Norse stories (just to name a few) relate life’s origins to the separation of giant primordial parents.

“How could people believe that?” the student asked, shaking his head. It wasn’t his perturbed incredulity that struck me. Often, students initially find stories from ancient cultures to be, well, weird. It was his condescension. For him, ‘myth’ meant not just ‘false,’ but ‘silly.’ In his defense, it’s what it means for most of us. When we want to sneer at strange, fantastical beliefs, we call them ‘myths.’

The term is synonymous with ‘false.’

‘Myth’ originally meant the exact opposite, though. The term comes from the Ancient Greek mythos, which referred to life’s deepest truths, something discussed and contemplated with a sense of awe and reverence, not incredulity and disdain. Seen in this light, myths are the stories humans tell in order to explain the unknown and make sense of the world. My thesis is that humans are essentially myth-making creatures and will continue to be so—no matter how scientific our stories get.

Scowls form on some students’ faces when they hear a professor say that science is, on a certain level, still mythological. Scientists are still storytellers, though, trying to turn the unknown into the known. Ancient and modern storytellers have different ways of approaching the unknown—different notions about what counts as a valid explanation.

Today, people (tend to) prefer creation stories that fit the scientific paradigm that’s proved so successful in explaining and predicting natural phenomena. But in dismissing past explanations, we overlook some interesting similarities. Ancient and modern stories share what psychologist Carl Jung called archetypal patterns. Jung theorized that humans share underlying patterns of thought because we all inherit the same neurological equipment. The anatomical differences between an ancient human brain and, say, Darwin’s brain are negligible. Setting the obvious differences between the Maori story and Darwin’s theory aside for just a moment, there are archetypal similarities between these accounts.

Darwinism says life began in a kind of primordial soup where, over time, inorganic molecules organized into the first living cell, and then single-celled organisms eventually separated into multicellular organisms, and from that, thanks to genetic mutation and the pressure of natural selection, lifeforms diversified and flourished. The Big Bang has this underlying pattern too: a ‘primordial atom,’ containing all matter, exploded and separated into the cosmic forms we see today.

I think the key difference between ancient and modern creation stories is in the tendency to personify nature, or the lack thereof. The modern scientific method tries to remove the subjective factor from the equation. Once we stopped projecting our emotions upon ‘Mother Nature,’ we started telling different stories about how ‘she’ works.

Now scientists are investigating how myth-making itself works. Neurologists and evolutionary psychologists are exploring the biological basis of our ability to mythologize and the possible adaptive purposes informing our storytelling instinct. Let’s start by getting hypothetical and doing a little ‘state of nature’ thought experiment. Imagine a prehistoric hunter startled by booming thunder. Now we know the meteorological explanation, but he doesn’t. He experiences what thunder feels like to him: anger. But who is angry?

The problem is addressed by the limbic system, the subcortical brain structure that initially processes emotion and memory. Potential dangers must be understood or anxiety will overwhelm the mind, rendering the hunter less able to cope and survive. The amygdala, the brain’s watchdog, primes the body for action—for fight or flight—while the hippocampus tries to associate feelings with memories in order to focus and better define both the stimuli and the appropriate response. This process is entirely unconscious—faster than the speed of consciousness.

The hippocampus recalls an experience of anger, perhaps one involving the hunter’s own father, and then the cerebral cortex, home of our higher cognitive capacities, gets involved. Somewhere in our cortical circuitry, probably in the angular gyrus, where neuroscientist VS Ramachandran says our metaphoric functions reside, storm images are cross-wired with paternal images. A myth is born: sky is father, earth is mother, and the cause-effect logic of storytelling in the brain’s left hemisphere embellishes until the amygdala eases up, and the anxiety is relatively alleviated. At least the dread becomes more manageable. In neurochemical terms, the adrenaline and cortisol rush is balanced out and contained by dopamine, the calming effect of apparent knowledge, the pleasure of grasping what was once unknown.

From then on, thunder and lightning will be a little less terrifying. Now there is a story to help make sense of it. Storms are a sign of Father Sky’s anger. What do we do? We try to appease this force–to make amends. We honor the deity by singing and dancing. We sacrifice. Now we have myths and rituals. In short, we have a religion.

That’s why so many prehistoric people, who had no contact with one another, came to believe in primordial giants, and we are still not that far removed from this impulse. For example, why do we still name hurricanes? Sometimes, it’s just easier for us to handle nature if we make it a little more human. As neurologists point out, we are hardwired to pick up on patterns in the environment and attribute human-like qualities and intentions to them. Philosophers and psychologists call this penchant for projecting anthropomorphic agency a theory of mind. English teachers call it personification, an imaginative, poetic skill.

This is why dismissive, condescending attitudes toward myth-making frustrate me. The metaphoric-mythic instinct has been, and still is, a tremendous boon to our own self-understanding, without which science, as we know it, probably wouldn’t have evolved. I came to this conclusion while pondering a profound historical fact: no culture in human history ever made the intellectual leap to objective theories first. Human beings start to know the unknown by projecting what they’re already familiar with onto it.

It’s an a priori instinct. We can’t help it.

Modern science helps make us more conscious of this tendency. The scientific method gives us a way of testing our imaginative leaps—our deeply held intuitions about how the world works—so we can come up with more reliable and practical explanations. The mythological method, in turn, reminds us to be critical of any theory which claims to have achieved pure, unassailable objectivity—to have removed, once and for all, the tendency to unconsciously impose our own assumptions and biases on the interpretation of facts. The ability to do that is just as much a myth as the ‘myths’ such claims supposedly debunk. I’ll paraphrase William James here: The truth is always more complex and complicated than the theories which aim to capture it. Just study the history of modern science—the evolution of theories and paradigms over the last 350 years especially—to see evidence for the asymmetrical relationship between beliefs, justifications, and the ever-elusive Truth.

Laid-back, self-aware scientists have no problem admitting the limitations built into the empirical method itself: Scientific conclusions are implicitly provisional. A theory is true for now. The beauty and power of science hinge upon this point—the self-correcting mechanism, the openness to other possibilities. Otherwise, it’s no longer the scientific method at work. It’s politicized dogma peddling. It’s blind mythologizing.

The recent research into the neurology and psychology of myth-making is fascinating. It enhances our understanding of what a myth is: a story imbued with such emotional power and resonance that how it actually lines up with reality is often an afterthought. But what’s equally fascinating to me is the mythologizing which still informs our science-making.

It is, of course, dangerous to believe blindly in myths, to credit stories without testing them against experience and empirical evidence. But I also believe it’s dangerous to regard scientific theories as somehow above and beyond the mythological instinct. Like the interconnected swirl of the yin-yang, science and myth need each other, and that relationship should be as balanced and transparent as possible.

Uroboros. A universal symbol of balance and immortality.