I was out with a friend last night, and he asked about the book I’m working on, and I said it was on syntax. So he asked, reasonably enough, what’s syntax?
Well, how do you answer that for a non-linguist? This is what I came up with.
Suppose you want to make a machine that spits out English sentences all day long. There should be no (or very few) repetitions, and each one should be good English.
How would you make that machine in the simplest way possible?
That is, we’re not interested in a set of rules that require the Ultimate Computer from Douglas Adams’s sf. We know that “make a human being” is a possible answer, but we’re looking for the minimum. (We also, of course, don’t want a machine that can’t do it— that misses some sentences, or spits out errors. We want the dumbest machine that works.)
One more stipulation: we don’t insist that the sentences be meaningful. We’re not conducting a conversation with the machine. It’s fine if the machine outputs John is a ten-foot-tall bear. That’s a valid sentence— we don’t care whether someone named John is nearby, or whether he’s a bear, or whether he’s a big bear or a small one.
That machine is a generative grammar.
The rules of Chomsky’s Syntactic Structures are in fact such a machine— though a partial one. Along with the book I’m creating a web tool that lets you define rules and then generates sentences from them— using the Syntactic Structures rules, or any other set. It works like a charm. But the SS rules were not, of course, a full grammar.
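If you’re wondering what such a rule-driven machine looks like in miniature, here’s a sketch in Python. The grammar below is a made-up illustrative fragment, not the actual rule set of Syntactic Structures: each rule says how to rewrite a symbol, and the generator expands S until only words are left.

```python
import random

# A tiny illustrative phrase-structure grammar. Each symbol maps to a list
# of possible expansions; symbols not in the table are words (terminals).
GRAMMAR = {
    "S":   [["NP", "VP"]],
    "NP":  [["Det", "N"], ["Det", "Adj", "N"]],
    "VP":  [["V", "NP"], ["V"]],
    "Det": [["the"], ["a"]],
    "Adj": [["tall"], ["small"]],
    "N":   [["bear"], ["linguist"], ["sentence"]],
    "V":   [["sees"], ["describes"], ["sleeps"]],
}

def generate(symbol="S"):
    """Expand a symbol by picking one of its rewrite rules at random."""
    if symbol not in GRAMMAR:          # a terminal: an actual word
        return [symbol]
    expansion = random.choice(GRAMMAR[symbol])
    words = []
    for part in expansion:
        words.extend(generate(part))   # recursively expand each piece
    return words

print(" ".join(generate()))
```

Run it a few times and you get things like “the small bear sees a linguist”— every output is well-formed by the grammar’s lights, whether or not it’s sensible, which is exactly the point made above about meaning.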
Now, besides the amusement value, why do we do this?
- It’s part of the overall goal of describing language.
- It puts some interesting lower bounds on any machine that handles language.
- As a research program, it will uncover a huge store of facts about syntax, most of them never noticed before. Older styles of grammar were extremely minimal about syntax, because they weren’t asking the right questions.
- It might help you with computer processing of language.
- It might tell you something about how the brain works.
I said we wouldn’t worry about semantics, but in practice generative grammar has a lot to say about it. Just as we can’t quite separate syntax from morphology, we can’t quite separate it from semantics and pragmatics.
You might well ask (and in fact you should!), well, how do you make such a machine? What do the rules look like? But for that you’ll have to wait for Chapter Two.
At this point I’ve written about 150 pages, plus two web toys. (One is already available— a Markov text generator.)
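For the curious, the core idea of a Markov text generator can be sketched in a few lines. This is a generic illustration of the technique, not the code of the web toy itself: record which word follows which in a sample text, then walk those links at random.

```python
import random
from collections import defaultdict

def build_chain(text, order=1):
    """Map each run of `order` words to the words that follow it."""
    words = text.split()
    chain = defaultdict(list)
    for i in range(len(words) - order):
        key = tuple(words[i:i + order])
        chain[key].append(words[i + order])
    return chain

def babble(chain, length=10):
    """Start from a random key and repeatedly pick a random successor."""
    key = random.choice(list(chain.keys()))
    out = list(key)
    for _ in range(length):
        successors = chain.get(tuple(out[-len(key):]))
        if not successors:             # dead end: no recorded successor
            break
        out.append(random.choice(successors))
    return " ".join(out)

chain = build_chain("the cat sat on the mat and the dog sat on the rug")
print(babble(chain))
```

Unlike the grammar machine above, this one has no notion of sentence structure at all— it only knows local word-to-word transitions, which is why Markov output drifts into nonsense so readily.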
I mentioned before that my syntax books didn’t extend much beyond 1990. Now I’ve got up to 2013, kind of. I read a book of that date by Andrew Carnie, which got me up to speed, more or less, on Chomsky’s middle period: X-bar syntax, government & binding, principles & parameters. The good news is that all this is pretty compatible with what I knew from earlier works, especially James McCawley.
I’m also awaiting two more books, one on Minimalism, one on Construction Grammar.
Fortunately, I’m not training people to write dissertations in Chomskyan (or any other) orthodoxy… so I don’t have to swallow everything in Chomsky. (But you know, rejecting Chomsky is almost a full-time job. He keeps changing his mind, so you have to study quite a lot of Chomsky before you know all the stuff you can reject.)