Discover Interview: The Radical Linguist Noam Chomsky

Over 50 years ago, he began a revolution that's still playing out today.

By Marion Long
Nov 29, 2011 (updated Jun 29, 2023)

For centuries, experts held that every language is unique. Then one day in 1956, a young linguistics professor gave a legendary presentation at the Symposium on Information Theory at MIT. He argued that every intelligible sentence conforms not only to the rules of its particular language but also to a universal grammar that encompasses all languages. And rather than absorbing language from the environment and learning to communicate by imitation, children are born with the innate capacity to master language, a power imbued in our species by evolution itself. Almost overnight, linguists’ thinking began to shift.

Avram Noam Chomsky was born in Philadelphia on December 7, 1928, to William Chomsky, a Hebrew scholar, and Elsie Simonofsky Chomsky, also a scholar and an author of children’s books. While still a youngster, Noam read his father’s manuscript on medieval Hebrew grammar, setting the stage for his work to come. By 1955 he was teaching linguistics at MIT, where he formulated his groundbreaking theories. Today Chomsky continues to challenge the way we perceive ourselves. Language is “the core of our being,” he says. “We are always immersed in it. It takes a strong act of will to try not to talk to yourself when you’re walking down the street, because it’s just always going on.”

Chomsky also bucked against scientific tradition by becoming active in politics. He was an outspoken critic of American involvement in Vietnam and helped organize the famous 1967 protest march on the Pentagon. When the leaders of the march were arrested, he found himself sharing a cell with Norman Mailer, who described him in his book Armies of the Night as “a slim, sharp-featured man with an ascetic expression, and an air of gentle but absolute moral integrity.”

Chomsky discussed his ideas with Connecticut journalist Marion Long after numerous canceled interviews. “It was a very difficult situation,” Long says. “Chomsky’s wife was gravely ill, and he was her caretaker. She died about 10 days before I spoke with him. It was Chomsky’s first day back doing interviews, but he wanted to go through with it.” Later, he gave even more time to DISCOVER reporter Valerie Ross, answering her questions from his storied MIT office right up to the moment he dashed off to catch a plane.

You describe human language as a unique trait. What sets us apart?
 Humans are different from other creatures, and every human is basically identical in this respect. If a child from an Amazonian hunter-gatherer tribe comes to Boston, is raised in Boston, that child will be indistinguishable in language capacities from my children growing up here, and vice versa. This unique human possession, which we hold in common, is at the core of a large part of our culture and our imaginative intellectual life. That’s how we form plans, do creative art, and develop complex societies.

When and how did the power of language arise?
 If you look at the archaeological record, a creative explosion shows up in a narrow window, somewhere between 150,000 and roughly 75,000 years ago. All of a sudden, there’s an explosion of complex artifacts, symbolic representation, measurement of celestial events, complex social structures—a burst of creative activity that almost every expert on prehistory assumes must have been connected with the sudden emergence of language. And it doesn’t seem to be connected with physical changes; the articulatory and acoustic [speech and hearing] systems of contemporary humans are not very different from those of 600,000 years ago. There was a rapid cognitive change. Nobody knows why.

What first sparked your interest in human language? 
 I read modern Hebrew literature and other texts with my father from a very young age. It must have been around 1940 when he got his Ph.D. from Dropsie College, a Hebrew college in Philadelphia. He was a Semitist, working on medieval Hebrew grammar. I don’t know if I officially proofread my father’s book, but I read it. I did get some conception of grammar in general from that. But back then, studying grammar meant organizing the sounds, looking at the tense, making a catalog of those things, and seeing how they fit together.

Linguists have distinguished between historical grammars and descriptive grammars. What is the difference between the two?
 Historical grammar is a study of how, say, modern English developed from Middle English, and how that developed from Early and Old English, and how that developed from Germanic, and that developed from what’s called Proto-Indo-European, a source system that nobody speaks, so you have to try to reconstruct it. It is an effort to reconstruct how languages developed through time, analogous to the study of evolution. Descriptive grammar is an attempt to give an account of what the current system is for either a society or an individual, whatever you happen to be studying. It is kind of like the difference between evolution and psychology.

And linguists of your father’s era, what did they do? 
 They were taught field methods. So, suppose you wanted to write a grammar of Cherokee. You would go into the field, and you would elicit information from native speakers, called informants.

What sort of questions would the linguists ask? 
 Suppose you’re an anthropological linguist from China and you want to study my language. The first thing you would try to do is see what kind of sounds I use, and then you’d ask how those sounds go together. So why can I say “blick” but not “bnick,” for example, and what’s the organization of the sounds? How can they be combined? If you look at the way word structure is organized, is there a past tense on a verb? If there is, does it follow the verb or does it precede the verb, or is it some other kind of thing? And you’d go on asking more and more questions like that.
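
To make the “blick”/“bnick” contrast concrete, here is a minimal Python sketch of the kind of sound-combination rule such a linguist might catalog. The onset inventory, names, and letter-based treatment are all invented for illustration; a real description would state the constraint over phonemes and natural classes, not spelling.

```python
# Toy phonotactics: English permits the word-initial cluster /bl/
# but not /bn/, so "blick" is a possible English word while "bnick"
# is not. The onset set below is a tiny, invented sample.

ALLOWED_ONSETS = {"b", "bl", "br", "p", "pl", "s", "sn", "st", "str", "tr"}

VOWELS = set("aeiou")

def has_legal_onset(word: str) -> bool:
    """Check whether the word's initial consonant cluster is a permitted onset."""
    onset = ""
    for ch in word:
        if ch in VOWELS:
            break
        onset += ch
    return onset in ALLOWED_ONSETS

for w in ("blick", "bnick", "strick"):
    print(f"{w}: {'possible' if has_legal_onset(w) else 'impossible'}")
# blick: possible / bnick: impossible / strick: possible
```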

But you weren’t content with that approach. Why not?
 I was at Penn, and my undergraduate thesis topic was the modern grammar of spoken Hebrew, which I knew fairly well. I started doing it the way we were taught. I got a Hebrew-speaking informant, started asking questions and getting the data. At some point, though, it just occurred to me: This is ridiculous! I’m asking these questions, but I already know the answers.

Soon you started developing a different approach to linguistics. How did those ideas emerge?
 Back in the early 1950s, when I was a graduate student at Harvard, the general assumption was that language, like all other human activities, is just a collection of learned behaviors developed through the same methods used to train animals—by reinforcement. That was virtually dogma at the time. But there were two or three of us who didn’t believe it, and we started to think about other ways of looking at things.

In particular, we looked at a very elementary fact: Each language provides a means to construct and interpret infinitely many structured expressions, each of which has a semantic interpretation and an expression in sound. So there’s got to be what’s called a generative procedure, an ability to generate infinite sentences or expressions and then to connect them to thought systems and to sensory motor systems. One has to begin by focusing on this central property, the unbounded generation of structured expressions and their interpretations. Those ideas crystallized and became part of the so-called biolinguistic framework, which looks at language as an element of human biology, rather like, say, the visual system.
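
To illustrate what “generative procedure” means here, the sketch below encodes a tiny rewrite grammar in Python. Because two of its rules are recursive, a finite rule set yields unboundedly many distinct sentences. The grammar, vocabulary, and names are invented for this example, not drawn from any actual analysis.

```python
import random

# A tiny rewrite grammar. NP and VP are mutually recursive
# (an NP may contain a relative clause containing a VP), so the
# five rules below generate infinitely many distinct sentences.
GRAMMAR = {
    "S":  [["NP", "VP"]],
    "NP": [["the", "N"], ["the", "N", "that", "VP"]],  # recursive rule
    "VP": [["V"], ["V", "NP"]],                        # recursive rule
    "N":  [["eagle"], ["child"], ["linguist"]],
    "V":  [["swims"], ["sees"], ["knows"]],
}

def generate(symbol: str = "S") -> list[str]:
    """Recursively expand a symbol into a flat list of words."""
    if symbol not in GRAMMAR:      # terminal: an actual word
        return [symbol]
    expansion = random.choice(GRAMMAR[symbol])
    return [word for part in expansion for word in generate(part)]

random.seed(0)                     # reproducible sample output
for _ in range(3):
    print(" ".join(generate()))
# e.g. "the linguist that knows the child sees the eagle"
```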

You theorized that all humans have “universal grammar.” What is that? 
 It refers to the genetic component of the human language faculty. Take your last sentence, for example. It’s not a random sequence of noises. It has a very definite structure, and it has a very specific semantic interpretation; it means something, not something else, and it sounds a particular way, not some other way. Well, how do you do that? There are two possibilities. One, it’s a miracle. Or two, you have some internal system of rules that determines the structures and the interpretations. I don’t think it’s a miracle.

What were the early reactions to your linguistic ideas?
 At first, people mostly dismissed or ignored them. It was the period of behavioral science, the study of action and behavior, including behavior control and modification. Behaviorism held that you could basically turn a person into anything, depending on how you organized the environment and the training procedures. The idea that a genetic component entered crucially into this was considered exotic, to put it mildly. 

Later, my heretical idea was given the name “the innateness hypothesis,” and there was a great deal of literature condemning it. You can still read right now, in major journals, that language is just the result of culture and environment and training. It’s a commonsense notion, in a way. We all learn language, so how hard could it be? We see that environmental effects do exist. People growing up in England speak English, not Swahili. And the actual principles—they’re not accessible to consciousness. We can’t look inside ourselves and see the hidden principles that organize our language behavior any more than we can see the principles that allow us to move our bodies. It happens internally.

How do linguists go about searching for these hidden principles?
 You can find information about a language by collecting a corpus of data—for instance, the Chinese linguist studying my language could ask me various questions about it and collect the answers. That would be one corpus. Another corpus would just be a tape recording of everything I say for three days. And you can investigate a language by studying what goes on in the brain as people learn or use language. Linguists today should concentrate on discovering the rules and principles that you, for example, are using right now when you interpret and comprehend the sentences I’m producing and when you produce your own.

Isn’t this just like the old system of grammar that you rejected? 
 No. In the traditional study of grammar, you’re concentrating on the organization of sounds and word formation and maybe a few observations about syntax. In the generative linguistics of the last 50 years, you’re asking, for each language, what is the system of rules and principles that determines an infinite array of structured expressions? Then you assign specific interpretations to them.

Has brain imaging changed the way we understand language?
 There was an interesting study of brain activity in language recently conducted by a group in Milan. They gave subjects two types of written materials based on nonsense language. One was a symbolic language modeled on the rules of Italian, though the subjects didn’t know that. The other was devised to violate the rules of universal grammar. To take a particular case, say you wanted to negate a sentence: “John was here, John wasn’t here.” There are particular things that you are allowed to do in languages. You can put the word “not” in certain positions, but you can’t put it in other positions. So one invented language put the negation element in a permissible place, while the other put it in an impermissible place. The Milan group seems to have found that permissible nonsense sentences produced activity in the language areas of the brain, but the impermissible ones—the ones that violated principles of universal grammar—did not. That means the people were just treating the impermissible sentences as a puzzle, not as language. It’s a preliminary result, but it strongly suggests that the linguistic principles discovered by investigating languages have neural correlates, as one would expect and hope.

Recent genetic studies also offer some clues about language, right?
 In recent years a gene called FOXP2 has been discovered. This gene is particularly interesting because mutations in it correspond to some deficiencies in language use. It relates to what’s called orofacial activation, the way you control your mouth and your face and your tongue when you speak. So FOXP2 plausibly has something to do with the use of language. It’s found in many other organisms, not just humans, and functions in many different ways in different species; these genes don’t do one single thing. But that’s an interesting preliminary step toward finding a genetic basis for some aspects of language.

You say that innate language is uniquely human, yet FOXP2 shows a continuity among species. Is that a contradiction?
 It’s almost meaningless that there’s a continuity. Nobody doubts that the human language faculty is based on genes, neurons, and so on. The mechanisms that are involved in the use, understanding, acquisition, and production of language at some level show up throughout the animal world, and in fact throughout the organic world; you find some of them in bacteria. But that tells you almost nothing about evolution or common origins. The species that are maybe most similar to humans with regard to anything remotely like language production are birds, but that’s not due to common origin. It’s what’s called convergence, a development of somewhat analogous systems independently. FOXP2 is quite interesting, but it’s dealing with fairly peripheral parts of language like [physical] language production. Whatever’s discovered about it is unlikely to have much of an effect on linguistic theory.

Over the past 20 years you’ve been working on a “minimalist” understanding of language. What does that entail?
 Suppose language were like a snowflake; it takes the form it does because of natural law, with the condition that it satisfy these external constraints. That approach to the investigation of language came to be called the minimalist program. It has achieved, I think, some fairly significant results in showing that language is indeed a perfect solution for semantic expression—the meaning—but badly designed for articulate expression, the particular sound you make when you say “baseball” and not “tree.” 

What are the outstanding big questions in linguistics? 
 There are a great many blanks. Some are “what” questions, like: What is language? What are the rules and principles that enter into what you and I are now doing? Others are “how” questions: How did you and I acquire this capacity? What was it in our genetic endowment and experience and in the laws of nature? And then there are the “why” questions, which are much harder: Why are the principles of language this way and not some other way? To what extent is it true that the basic language design yields an optimal solution to the external conditions that language must satisfy? That’s a huge problem. To what extent can we relate what we understand about the nature of language to activity taking place in the brain? And can there be, ultimately, some serious inquiry into the genetic basis for language? In all of these areas there’s been quite a lot of progress, but huge gaps remain.

Every parent has marveled at the way children develop language. It seems incredible that we still know so little about the process.
 We now know that an infant, at birth, has some information about its mother’s language; it can distinguish its mother’s language from some other language when both are spoken by a bilingual woman. There are all kinds of things going on in the environment, what William James called a “blooming, buzzing confusion.” Somehow the infant reflexively selects out of that complex environment the data that are language-related. No other organism can do that; a chimpanzee can’t do that. And then very quickly and reflexively the infant proceeds to gain an internal system, which ultimately yields the capacities that we are now using. What’s going on in the [infant’s] brain? What elements of the human genome are contributing to this process? How did these things evolve?

What about meaning at a higher level? The classic stories that people retell from generation to generation have a number of recurring themes. Could this repetition indicate something about innate human language? 
 In one of the standard fairy tales, the handsome prince is turned into a frog by the wicked witch, and finally the beautiful princess comes around and kisses the frog, and he’s the prince again. Well, every child knows that the frog is actually the prince, but how do they know it? He’s a frog by every physical characteristic. What makes him the prince? It turns out there is a principle: We identify persons and animals and other living creatures by a property that’s called psychic continuity. We interpret them as having some kind of a mind or a soul or something internal that persists independent of their physical properties. Scientists don’t believe that, but every child does, and every human knows how to interpret the world that way.

You make it sound like the science of linguistics is just getting started.
 There are many simple descriptive facts about language that just aren’t understood: how sentences get their meaning, how they get their sound, how other people comprehend them. Why don’t languages use linear order in computation? For example, take a simple sentence like “Can eagles that fly swim?” You understand it; everyone understands it. A child understands that it’s asking whether eagles can swim. It’s not asking whether they can fly. You can say, “Are eagles that fly swimming?” You can’t say, “Are eagles that flying swim?” Meaning, is it the case that eagles that are flying swim? These are rules that everyone knows, knows reflexively. But why? It’s still quite a mystery, and the origins of those principles are basically unknown.
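
The eagle example is the standard demonstration of structure dependence, and the contrast can be written out in a few lines of Python. The sketch below hard-codes the declarative “eagles that can fly can swim”: a rule that fronts the first auxiliary in linear order produces the ill-formed question, while the structure-dependent rule, which fronts the main-clause auxiliary (its position supplied by hand here, standing in for a real parse), yields the question speakers actually form.

```python
# Declarative sentence: "eagles that can fly can swim".
# The relative clause "that can fly" contains the first "can";
# the main clause's auxiliary is the second "can" (index 4).
WORDS = ["eagles", "that", "can", "fly", "can", "swim"]

def linear_rule(sentence: list[str]) -> str:
    """Front the FIRST auxiliary in linear order (the wrong rule)."""
    i = sentence.index("can")
    return " ".join([sentence[i]] + sentence[:i] + sentence[i + 1:])

def structural_rule(sentence: list[str], main_aux: int) -> str:
    """Front the MAIN-CLAUSE auxiliary (the structure-dependent rule)."""
    return " ".join([sentence[main_aux]] + sentence[:main_aux] + sentence[main_aux + 1:])

print(linear_rule(WORDS))         # "can eagles that fly can swim"  <- ill-formed
print(structural_rule(WORDS, 4))  # "can eagles that can fly swim"  <- what speakers say
```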
