NeuroLinguistic Programming (NLP): Cracking the Modal Code
We are what we think.
All that we are arises
With our thoughts.
With our thoughts,
We make our world.
—The Buddha
This series on Wisdom School is a concise introduction to the ideas and methods of a theory of communication and behavior called NeuroLinguistic Programming (NLP)[1].
A practical theory, it was developed in the 1970s by two pioneering thinkers, Richard Bandler and John Grinder. It has had a major impact on professional message makers, including leading figures from the worlds of polling, politics, psychology, self-help, sales and advertising.
Though the techniques of NLP are now used around the world to pitch everything from political candidates to children’s toys, it is also a powerful and transformative psychotherapeutic and relationship tool that can benefit individuals seeking greater control over their minds and more satisfaction in their relationships and professional lives. Seminars now offer to teach NLP to therapists, teachers, and parents. NLP can be used for everything from improving communication skills to defeating phobias. It has the power to change lives.
Just as NLP is for everybody, so is this series.
The fundamental insights of NLP are based on what it calls Representational Systems. For now, simply consider them the primary senses — sight, sound, feeling, taste, smell — through which the world enters our minds. Many people are shocked when they first discover that not everybody “sees” (or “hears” or “feels”) the world the same way they do. Understanding how and why this is so is the key to NLP’s transformative power.
We each have our own particular ways of experiencing life, and most people have a single representational system upon which they most heavily rely. We’ll explore this, and its consequences, in the following pages, and discover why knowing another person’s “primary representational system” is a vital bit of information if you want to work with, help heal, or even communicate with that person.
In the following articles in this NLP series, every word means precisely what it says. The tools of communication revealed herein are also used in its writing. You may spot many of these on your first read through, although they will probably be most visible, most clearly heard, and most easily picked out on a subsequent reading.
The experience of reading this series will permanently transform your understanding of the world.
Proceed with joyful enthusiasm!
II. MODALITIES: THE KEY TO COMMUNICATION
If you really want to establish rapport with someone, it is first necessary to step into the world they inhabit. True communication begins with a small act of empathic imagination.
To get to the building blocks that make up a person’s fundamental story about the world, we must first listen to how they speak. When we do this, and only when we do this, we begin to understand how they think.
So, how do people speak?
We talk about the world in pretty much the same way that we experience it.
Listen long enough and you’ll discover that communication begins with the senses.
Here’s an example from my own family. A few Christmases ago, my wife and I were fortunate enough to have all three of our adult kids with us, and we decided to all go to a movie. We had spread the newspaper out on the table, turned to the movie pages, and were trying to decide which movie to go to.
My visually oriented wife, Louise, said, “Well, let’s see which movie we should go see.”
One of our daughters, also very visual, immediately replied, “Well, I’d like to go see that one,” and she pointed to a particular movie ad.
We kept talking, and we all noticed that our son hadn’t made a suggestion. So we asked him, “Which movie do you want to go to?”
He said, “I’m not really sure.”
So our first daughter asked, “Well, which one looks best to you?”
And he answered, “I don’t know.”
Then our second daughter, who’s very auditory (like me), asked, “Well, which one sounds best to you?”
Again, he answered, “I don’t know.”
All of a sudden I realized what was going on and so I said to him, “Which one feels right to you?”
And he said, “Well, that one!” and pointed to one of the movies.
All along my son knew what he felt, but he couldn’t explain it in a visual sense or an auditory sense in response to visually or auditorily based questions.
This is very common. We experience the world through our senses. If we were unable to see, to hear, to feel, to notice balance, to smell, and to taste, we would have no experience of the world’s existence. All internal experience begins as some sort of external experience that we then internalize and place in a representational context.
What’s critical to deciphering others’ communication is realizing that we each have different ways of encoding, storing, and using that information. My son didn’t “hear” the rest of the family’s questions because he experiences the world primarily through feeling things rather than hearing or seeing them. Most people rely primarily on one of these sensory systems and use it as their main way of experiencing the world.
These senses, like the rest of the communication code, are value-neutral. Someone who is visually oriented can be conservative or liberal. That’s useful. It means that once you identify someone’s primary way of experiencing the world, you can use that tool to communicate with them at a very basic level even if you don’t share the same story about the meaning of the world.
Modalities
People in the communication field refer to sensory information used in this way as modalities. People who primarily see the world are using a visual modality. People who use their hearing to experience the world are using an auditory modality. People who feel the world are using what is called a kinesthetic modality.
We can usually use all of these sensory modalities when we need to, but most of us have one primary modality that acts as our first filter.
For example, my primary way of knowing the world is auditory. It’s probably why I enjoy doing radio. I think it’s also why I can write reasonably well and love to read — because that’s all auditory: when I’m writing I’m hearing my own voice inside my head going out on the page, and when I’m reading I’m hearing the voices of the author and the characters.
Most people in our culture are visual. My wife, Louise, who is very visual, reads about twice as fast as I do. When she’s reading, she’s making pictures in her mind, but she only rarely sounds out the words in her head the way I do.
No matter which sensory modality is primary for individuals, they disclose it in the way they think, talk, and write.
Visual people think/experience/talk/write in pictures. A visual person will say, for example:
“I see what you’re talking about.” “I now understand—it’s crystal clear!” “We need to shine the light of day on this to see it better.” “Let’s show our readers what we mean.” “Here’s how I see this happening.” They’ll even say on the telephone, when they can’t actually see the person to whom they’re talking, “I’ll see you later.”
Auditory people often prefer listening to the radio instead of watching television; they live in a sea of sound. When they communicate, an auditory person will say:
“I hear what you’re saying.” “We need to tell this story to as many people as possible so they can all hear it in a meaningful way.” “Here’s how I’d want the message to sound.” Auditory people often end phone calls with, “I’ll talk with you later.”
People whose primary way of understanding is kinesthetic experience the world through their feelings. When they see and hear things, they translate those visions and stories into feelings. Kinesthetic people will say:
“I get your point.” “Thanks for sharing.” “I like how you said it, and I want to take it to the next level.” They’ll often end a phone call with, “I’ll catch you later” or “Let’s stay in touch.”
There are a few of us who have as a primary sensory modality a sense other than seeing, hearing, or feeling. In my seventy-plus years on this planet, I’ve met two people who leaned heavily on their sense of smell and taste, who remembered events by what they ate there, and who described ideas as having “a good flavor” or with similar olfactory or gustatory metaphors. Both of these people had a weight problem, by the way, although that’s a pretty small sample from which to draw any conclusions. But such people are rare: most of us are visual, auditory, or kinesthetic.
Psychologist David Lemire found that 75 percent of adult learners in America are primarily visual. Having worked in aboriginal and indigenous societies on five continents, I’ve found that between 75 and 95 percent of these folks are primarily kinesthetic. Yet when they move from “the reservation” into “the city” and grow up going to “European” schools, they very often become primarily visual or auditory, like the average American.
This tells me that kinesthetic may be the “natural” human way of being, probably our most functional and useful way of being, but also a way of being that’s changed or distorted by the experience of our public schools and early life, where virtually 100 percent of information is presented either auditorily (teachers lecturing, students reading books) or visually (through metaphor, pictures, and TV).
Cracking the Modal Code
Sensory modalities are very important for communication because when we process new information, we tend to use our primary sensory system to create internal memory structures describing what the world means and how things are.
For example, when a memorable event like 9/11 occurs, we will each store that memory differently.
Even reading those numbers, people who are primarily visual may have flashed to pictures of the towers falling. People who are primarily auditory may remember the sounds of the towers falling, or the screaming and sirens, or a particular newscaster’s voice. People who are primarily kinesthetic will remember their shock at hearing the news and the surreal, inertial feelings they may have experienced for days afterward.
The memory is literally stored in our brains through these sensory modalities, and different parts of our brains process each. Watch a person’s eyes as they remember: when recalling an image from a past event, they’ll look up and to their left. When they’re remembering a conversation, their eyes will move to the side. When they’re remembering a feeling, they’ll typically look down and to their right. (These directions generally reverse for left-handed people.)
If you want to effectively communicate with someone, it’s important to first know which sensory modality is primary for them and then to use that modality to communicate. The key to cracking the modal code is simple once you understand modalities: people tend to communicate using metaphors from their primary modalities.
When a kinesthetic person communicates, he does so using metaphors and words based on physical experience, touch, and feelings. So when you want to communicate effectively with a kinesthetic person, use language and metaphor that likewise evoke physical experience, touch, and feelings. Similarly, talk in visual metaphor to visually oriented people, and in auditory metaphor to auditory folks.
If you want to communicate with a group of people, however, it’s useful to craft the message in such a way that it touches all three modalities. You can hear me do that on the radio (or see me do it on this page) by stringing all the modalities together into a sentence.
I’ll start out saying:
“I want to talk to you today about the stories that we tell ourselves, the way we view the world, and the way we all feel as Americans.”
So some visual people in the room are thinking, I see that.
Auditory people are saying, “Oh, yeah, I hear that. That makes sense.”
And the kinesthetic people say to themselves, “Feels good to me.”
Feeling Comes Before Thinking
When we communicate with each other, we often believe we are using the most rational parts of our minds. We think we know exactly why we say something in a certain way. We think we know what someone else is saying to us.
But we humans, being the product of a long evolutionary process, really have three brains. Effective communicators succeed because they speak to all three of them.
First, there’s the most primitive of our brains, sometimes referred to as the “reptilian brain” because we share it in common with reptiles like alligators and komodo dragons.
The reptilian brain has a singular focus: survival. It doesn’t think in abstract terms, and it doesn’t feel complex emotions. Instead, it’s responsible for fight or flight, hunger and fear, attack or run. It’s also nonverbal — you can stimulate it indirectly with the right words, but it operates purely at the level of visceral stimulus/response.
The second brain is one we share with the animals that came along after reptiles: mammals.
The “mammalian brain”—sometimes referred to as the “limbic brain” because it extends around the reptilian brain like a border (Latin limbus)—handles complex emotions like love, indignation, compassion, envy, and hope (among hundreds of others).
Anybody who has worked with animals or had a pet knows that mammals share these emotions with humans because we share this brain. Although a snake can’t feel shame or enthusiasm, it’s completely natural for a dog or cat. And, like the reptilian brain, the mammalian brain can be stimulated indirectly by words and is also nonverbal. It expresses itself exclusively in the form of feelings, most often felt in the heart or the gut.
The third brain — the neocortex (“new” cortex) — is a brain structure we share with the higher apes, although ours is a bit more sophisticated. Resting over the limbic brain (which is atop the reptilian brain), our neocortex is where we process abstract thought, words and symbols, logic, and time.
Recent research has shown that the brain creates information as well as processes it. A study published a few years back in Spain demonstrated that the brain uses the same mechanism to perceive a smile as to create one. This is pure biology: our senses detect a smile and our brain replicates it.
Does that mean we’re just machines? No. In a way scientists are just beginning to understand, these biological processes are also related to our feelings. When we see a smile, and when we smile back, we actually feel happier. Actually, anytime we smile, we feel happier. The leading psychologist of human emotion, Dr. Robert Zajonc, has shown that simply smiling — even faking a smile — can make you feel happy.
This is a very deep part of the communication code. Above and beyond all the amazing things we do, we humans are still animals, and our communication is tied to our biology. We react to sensations. That is why the primary modalities of truly effective communication all are based on the senses.
And within these modalities are sub-modalities that drill down even deeper into the brain.
How Feelings Affect Communication
Since the days of René Descartes, many in the Western world have believed, “I think, therefore I am.” We prioritize thinking. That point of view has led to some very important achievements. Some would say that America would not be a democracy today if the Founders hadn’t believed they could rationally determine the very best system of governance.
Rationality was at the core of our nation’s founding. One of the essential differences between the conservative and the liberal worldview, for example, is that liberals believe in rationality whereas conservatives believe that human events are ordered by forces that are beyond the ability of our rational mind to understand or control.
(This belief in rationality was so strong among the Founders that Thomas Jefferson even took the first four books of the New Testament — the four Gospels that tell the stories of Jesus — and cut out of them all supernatural events. What he strung back together were largely the words and the non-miraculous deeds of Jesus, and Jefferson’s handiwork has been continuously in print for more than two hundred years as The Jefferson Bible.)
Yet even for the staunchest believers in rationality, the reality is that when we make the most important life decisions, we almost always base them on our feelings rather than our rational thoughts. Some people call these feelings their “gut” or “intuition.” Some science-minded people like to call them “rational certainties.” Call them what you will; the decisions we make are based in small or large part on feelings.
Science, that most rational of endeavors, has finally acknowledged that feelings precede thinking. In a groundbreaking work published in American Psychologist, the journal of the American Psychological Association, Dr. Robert Zajonc argued persuasively that decision-making is based on our feelings.
Zajonc points out that we can “like something or be afraid of it before we know precisely what it is and perhaps even without knowing what it is.” That’s because we “think” first through the limbic brain. Zajonc writes:
“The limbic system that controls emotional reactions was there before we evolved language and our present form of thinking.” He continues, “It is rather…likely that the affective system retained its autonomy,” remaining separate from cognition. That means our ways of feeling precede and are different and separate from our ways of thinking.
Our five senses transmit sensory data directly to the reptilian brain, which translates the data into our most primitive forms of emotion like fight or flight. Those data are then transferred to the higher limbic and cortical brains. We feel before we even have a chance to think.
What that means for communication is that the most effective communicators rely on feeling-based (kinesthetic) communicative strategies. They reach people at the level of the limbic brain, at the seat of feeling.
The Submodalities Code
The communication code for our feelings is based in the sensory modalities. But within the primary modalities are submodalities that allow us to fine-tune and direct our specific impact and message toward the limbic brain.
Auditory submodalities have to do with how we hear. Do you hear the idea or event behind you? In front of you? Is it loud or soft, near or far, clear or muffled?
Kinesthetic submodalities have to do with where we experience our feelings in our bodies (there is always a physical correlate to kinesthetic content) and how we feel them. Our language reveals that we often feel ideas in parts of our body: “He felt that like a hot poker to the leg”; “He felt punched in the gut”; “Her heart raced.”
Visual submodalities have to do with how we see an idea or event. Is it big or small? Near or far? Bright or dim? Color or black-and-white? Still picture or a movie? Crystal clear or fuzzy?
Submodalities are the ways our brain sorts and then stores sensory information. Sensory information does not come in through the rational part of our brain, the left hemisphere, the part that is logical, the part that does quadratic equations. Instead, all the information we gather about the world comes in through our sensory reptilian and limbic brains and is first processed mostly in the irrational and nonspeaking right hemisphere of our neocortex.
We have experiences throughout the day; we see things, hear things, feel things. We have to figure out a way to make sense of and to store and save those experiences as memories. Throughout the course of the day, we write much of them down on a little one-day scratchpad in the reptilian brain called the hippocampus.
Then at night, as we dream, we process that information, which is one reason why our dreams seem irrational: we’re trying to figure out, Okay, what happened today that I need to store and what do I need to throw away? We attach emotional tags to ideas and events that we need to store.
We store information by emotion.
Imagine that the mind’s filing system is like a Pendaflex office filing system: your mind is filled with colored hanging folders—blue, yellow, red—each representing an emotion, and inside those folders go the smaller manila file folders representing specific event memories.
Speaking metaphorically, the big colored folders encode, sort, and store information by emotion. There’s a yellow folder for happiness, a green one for love, and a red one for anger, and the distinctions among folders are very subtle: there are thousands of variations of emotion for which we don’t even have words, and these are the many colored folders. Inside them are the smaller manila folders holding the individual experiences.
The way the larger, colored folders are organized is by submodality. Submodalities are the labels that define not only where the memories are but also how they exist.
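Purely as an illustration of this filing metaphor (not anything drawn from NLP’s literature), the structure can be sketched as a small data model: emotions act as the top-level colored folders, and each stored memory carries submodality labels describing how it is filed. Every name and field below is invented for the sketch.

```python
from dataclasses import dataclass, field

@dataclass
class Submodalities:
    """Labels describing HOW a memory is filed (invented fields for the sketch)."""
    distance_feet: float = 5.0   # near or far
    in_color: bool = True        # color or black-and-white
    is_movie: bool = True        # movie or still picture
    brightness: str = "bright"   # bright or dim

@dataclass
class Memory:
    """A 'manila folder': one specific remembered event."""
    description: str
    tags: Submodalities = field(default_factory=Submodalities)

# The big colored folders: one per emotion, each holding event memories.
filing_system: dict[str, list[Memory]] = {
    "anger": [Memory("driver flipped me off in traffic")],
    "happiness": [Memory("family movie night")],
}

# Retrieval sorts by emotion first, then by the submodality labels inside.
recent_anger = filing_system["anger"][0]
print(recent_anger.tags.in_color)  # True — filed as a color movie by default
```

The point of the sketch is only that the “address” of a memory has two parts: the emotion folder it sits in, and the submodality labels attached to it.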
Here’s how we crack the communication code: because we actually understand the world through these submodalities, you can use them to change how you—and others—feel and thus how we (or they) think.
Creating a Scalpel
You can actually change how someone feels about past events or ideas by helping them change their “locations” in their brain’s sorting system.
If your listener is willing, you can actually change how he thinks about an event in just a few minutes by asking him to refile the experience under a different submodality. If he sees his experience in color, ask him to change it to black-and-white. If he sees it close up, ask him to push it farther away. If he feels it as cold, ask him to make it feel warm.
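Continuing the same illustrative metaphor, the refiling step can be sketched as an operation that flips a memory’s labels: push the image farther away, drain the color, freeze the movie. The dictionary keys and the function are assumptions of the sketch, not part of any published technique.

```python
# A memory's submodality labels, modeled as a plain dict (invented fields).
memory = {
    "description": "surly waiter last week",
    "distance_feet": 5.0,
    "in_color": True,
    "is_movie": True,
}

def refile(mem: dict) -> dict:
    """Model the described steps as data changes: push the picture twice as
    far away, turn it black-and-white, and freeze-frame the movie."""
    shifted = dict(mem)  # leave the original record untouched
    shifted["distance_feet"] = mem["distance_feet"] * 2
    shifted["in_color"] = False
    shifted["is_movie"] = False
    return shifted

shifted = refile(memory)
print(shifted["distance_feet"])  # 10.0
```

In the text’s terms, the claim is that changing these labels changes where the experience is filed, and with it, how the experience feels.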
The point is to change how the experience is filed in the brain’s filing system. As your listener changes the submodalities, they are actually moving the memory to different physical parts of the brain. You can see this happen on an electroencephalogram (EEG) and in their eye movements. Different parts of the brain have different resources; they have different ways of dealing with things.
In the field of psychotherapy, if you want to heal a negative memory, you can let the brain do that work by moving the memory around. (I describe this technique in my book Walking Your Blues Away: How to Heal the Mind and Create Emotional Well-being.) Somewhere along the line, that memory will pass through a part of the brain where there is a resource — some skill or experience — that will be able to change the memory into one that’s more useful.
You’ll know when you’ve succeeded in truly changing the emotional charge and filing system associated with a memory because the story about the meaning of that memory changes.
Changing Submodalities
A Transcript from the Thom Hartmann Program, December 15, 2006
After describing how submodalities work, I asked for volunteers to call in who would be okay with my changing one of their memories on the air, using the submodality filing system. The first volunteer was Barbara, listening in Los Angeles.
Thom: Hi, Barbara! Thanks for calling in. What I’d like you to do is remember something in the recent past, the last week or so, that was a moderately unpleasant experience. The example that I would give is like somebody flipping you off in traffic or a surly waiter.
Barbara: Yeah, I got one.
Thom: You got one, okay, great. Now I want you to describe to me the submodalities associated with it. In other words, when you remember that experience, and you see the picture of that, do you see the picture in color or black-and-white?
Barbara: Color.
Thom: Okay. Describe in space where it is: in front of you, behind you, you know, where is it?… Is it like 5 feet in front of you, is it 20 feet behind you, is it 6 feet off to your left?
Barbara: In front of me.
Thom: It’s in front of you. And how far away?
Barbara: I would say 10, 5 feet.
Thom: 5 to 10 feet. Okay. Is there sound associated with it?
Barbara: Yeah, there’s sound.
Thom: Okay. And is it a movie or is it a still picture?
Barbara: It’s a movie.
Thom: Okay, great. So, what I’d like you to do, Barbara, is first of all push that picture, so we figured out some of the submodalities associated with it, right?
Barbara: Yeah.
Thom: What I’d like you to do is push that picture about twice as far away from you. Push it like 15, 20 feet away from you first of all.
Barbara: Okay. I’ve done it.
Thom: Okay, and now turn it black-and-white.
Barbara: Okay. But that’s hard. That’s hard, to turn it into black-and-white.
Thom: Oh, interesting. Okay, well you can leave it color if you want. Maybe your brain doesn’t want it black-and-white yet.
Barbara: No, my brain doesn’t want black-and-white.
Thom: Okay, well that’s cool. Leave it as color. I want you to scroll all the way to the end of the movie and freeze-frame it…
Barbara: Freeze-frame it, okay.
Thom: And now, in just a second I’m going to make a whoosh sound and I’m going to ask you to play it backward; and you know how you play a movie backward, everybody moves like in the old Charlie Chaplin movies and everybody talks backward like Donald Duck. And I want you to right now play it backward all the way back to the beginning with everybody going, everything going backward—whoosh. Like that.
Barbara: Yep. Yep, I did.
Thom: Okay, all the way to the beginning. Now freeze-frame it at the beginning; and do you see yourself in the picture or do you see it as if you were there?
Barbara: I see myself in the picture.
Thom: Okay, great. What I’d like you to do is I’d like you to paint rainbows across that picture now and put donkey ears on everybody in the picture except yourself…
Barbara: Okay.
Thom: Now, how do you feel about the experience right now?
Barbara: (Laughter.) What I did is I went back to the beginning and said, “Okay, this didn’t happen.”
Thom: Aha! So, you’re laughing now. A few minutes ago I’m guessing you weren’t laughing about this experience.
Barbara: No, I wasn’t.
Thom: Okay, number one, you’re laughing now, and, number two, what you just told me is that story you tell yourself about the experience has changed. Has it?…The story that you tell yourself about what happened there?
Barbara: Yeah.
Thom: Typically, what happens is stories change from “Oh, gee, I was a victim” to “Oh, that happened but I learned from it.” That kind of a transition, was it something like that?
Barbara: No, it was Okay, that happened; let’s just accept it and deal with it, that’s it, instead of being so anxious about it.
Thom: Okay, so you have transformed a memory by simply shifting the filing system that your brain uses for it.
Barbara: Yes! Okay, thank you, Thom.
Thom: You’re welcome.…Thanks, Barbara, for calling.
In the political field, the key is knowing how we use submodalities to encode data in the mind. They’re the “sharp edge” of the scalpel of modalities; and if we include them carefully, our message will have more impact.
For example, Ronald Reagan’s speechwriter quoted John Winthrop’s 1630 sermon “A Model of Christian Charity.” It was a powerful metaphor that John F. Kennedy had used before him, in a January 9, 1961, speech in which JFK said:
“I have been guided by the standard John Winthrop set before his shipmates on the flagship Arbella three hundred and thirty-one years ago, as they, too, faced the task of building a new government on a perilous frontier.
“‘We must always consider,’ he said, ‘that we shall be as a city upon a hill — the eyes of all people are upon us.’”
“Today the eyes of all people are truly upon us — and our governments, in every branch, at every level, national, state and local, must be as a city upon a hill — constructed and inhabited by men aware of their great trust and their great responsibilities.”
But Reagan’s speechwriter added a critical submodality to it — shining.
This so dramatically increased the impact of the image of the city on a hill that most Americans don’t remember that Reagan’s speech had essentially plagiarized Kennedy’s or that Kennedy had been quoting Winthrop. We think Reagan invented the concept and that the concept itself was world changing.
All because the submodality of shining was added into the visual metaphor of a city on a hill.
Here’s the key: if you can change the submodality under which an experience or idea is filed, you can change not only where the experience or idea is filed but also how someone experiences or thinks about it.
[1] NLP is a trademark registered in the United Kingdom by Richard Bandler; all uses of NLP on this website are meant to acknowledge that and any other applicable trademarks.
Modalities are deeply involved in how we structure education.
The modalities underlie theories called learning styles: visual, auditory, or kinesthetic. There are even taxonomies of learning that roughly correspond to each. These theories have faced growing questions about their validity: for learning to happen, there has to be some sort of coding of the perception, and that is where the simplicity unravels.
Coding ends up confounding the simplicity of learning styles. A concept is coded and stored in our "embodied mind" differently depending on its presentation. If it appears solely in written form (a code for the spoken language), it is stored differently than if it first appeared in a lecture (auditory) or through a combined technique. Add movement, which is what skilled teachers do, and there is a third component to the coding and storage.
The simplicity of learning styles is further confounded by what is called retrieval and re-coding. As we recall or retrieve something, perhaps by reading a passage aloud, giving a presentation, or answering a series of Socratic questions, we re-code the same knowledge differently. If the cycle is repeated, the knowledge moves from short-term to long-term memory and becomes automatic, stored in a different part of the mind. Adequately spaced retrieval and re-coding is the key technique used by skilled language-acquisition teachers.
A great book on this is Benedict Carey's "How We Learn." It is available as an audiobook for those of us more inclined to auditory learning.
Within educational design there is a movement called Universal Design for Learning, which seeks to present material in multiple modalities. The process of structuring learning for effectiveness is quite involved, far more than just having a smart person talk in front of a room.