There are varying degrees of "correctness" in language. Some people's nerves become fried when people dare to even use split infinitives, as in "to boldly go where no one has gone before", while others gleefully use "ain't" without a second thought. Lately, some English teachers have begun to wonder whether it makes sense to teach people to use formal language and to call what is spoken by the general population "wrong". This, of course, causes some problems -- where do you draw the line? Is all formal grammar instruction bad just because some of it sounds stilted? On the other hand, teaching students that "if I was rich, I would be happy" is incorrect may seem unreasonable given that most people say it that way. When is it time to "go with the flow" and when is it time to stand firm and be old-fashioned? Where do you stand on this issue? Griz
Split infinitives put the modifier closer to the word being modified. Makes sense to me. The rule against them is a holdover from foreign languages where the "to" form is actually one word. Kind of difficult to split infinitives in that case.
Re 1. Actually, as it was explained to me, it was a holdover from Latin, which was often held up as the model for other languages by the grammar teachers of yore.
True, a lot of grammar is modeled after Latin, but infinitive forms appeared long before Latin. Most notably in ancient Greek, where the infinitive was just another inflected form of the verb.
All right, I'll take a stab at it. To me, the split infinitive is correct, regardless of what prescriptivists say. I also love the subjunctive, but the fact is that it is dying out in English, and I'm not going to resist change. But I am certainly not in favor of phasing out grammar teaching altogether. Griz
There're lots of ways to teach grammar. You can teach "correct grammar", prescriptive grammatizing, or you can teach "how to study grammar": how to find out why people put sentences together the way they do, and what the different types of phrases, subsentences, parts of speech, etc. are.
It can be useful to know the second, because you can find out things like how you might confuse someone with bad sentences. (By placing the modifiers too far from the words they modify, for example.) By knowing what the different parts of speech are -- knowing what a verb is, a noun, an article, etc. -- you can use a dictionary better, and you'll have a better chance of knowing what a new word means and how it can be used.
Knowing the prescriptive rules of grammar allows you to pass grammar tests, and may (or may not) help you impress people. If everyone agreed on all the "ideal" rules of grammar, it might help to regularize the language, so that everyone everywhere would say the same thing more or less the same way, and anyone anywhere else would be able to understand them.
However, people soak up the real rules of grammar when they learn to talk. Prescriptive grammar is basically either making people re-learn how to talk and write, or it's teaching them what they already know.
My stand on the grammar issue is predictable. As for whether or not sloppy speech should be encouraged, let me paraphrase Captain Kirk: It is far easier for a civilized person to behave like a barbarian than for a barbarian to act like a civilized person. If you learn to speak correctly, you can always change your style to suit the circumstances. There are no options the other way.
I think what some teachers and many linguists object to is the teaching of such prescriptive forms as "correct", rather than saying "this is how you would say it in this context." Griz
Well, if the word "correct" makes some linguists squeamish, how about calling it a "baseline" grammar, i.e. the one you use when you are unsure of the context?
If we all adhered to a strict 'grammatically correct' English, we'd still be inflecting more words than we do. I think there is a fine line to draw. General agreement in number and tense in noun/verb structures is something that I would consider to be important, whereas whether you say 'The man who I saw yesterday...' as opposed to 'The man whom I saw yesterday...' is a little less important.
I would tend to agree with you, Ty, and I think most language teachers do as well (other than the REALLY old-fashioned ones). But where do you draw the line? Griz
In the old days, the teachers didn't worry about where to draw the line. They simply taught correct (or baseline, for the squeamish) English and let the students decide what level of rules they wished to retain outside the classroom. It seemed to work, too. In the past 20 years, teachers have agonized over where to draw the line and have tolerated progressively more "incorrect" speech. Over the past 20 years, we have become inundated with students who can't speak, read, or write. Do you suppose there is any connection?
Actually, no, I don't "suppose" that. The real reason for being inundated with students who can't speak, read, or write belongs in the politics conference, not in the language conference. Griz
Not to mention that people have been bemoaning "bad language skills" since we were evolved enough to be elitists.
So, should the New York Times hire a writer who splits infinitives, dangles participles, and whose parts of speech don't agree? I hope not. E. B. White would roll over in his grave.
Now, katie, I didn't even APPLY for a job at the New York Times, and I think it's unlikely in the extreme that they'd ever hire me.
Re #12. Is education more inherently political than it is inherently educational? Re #13. Just because you've heard the warning before doesn't make it untrue or less valid. (I guess striving for perfection is evil ..... ooh .... ELITIST .... bad! Must be better if everyone is EQUALLY incompetent.)
What's 'perfection'--keeping language stale and immutable? Is 'perfection' using one perfectly grammatical expression over another because it's always been done that way? How about keeping linguistic rules that few people use, and which are rooted in a language we don't speak anyway?
Obviously not. Changes to the language will occur whether we want them to or not. Moderating these changes is a worthwhile pursuit, however. The question we need to ask is whether the change makes the language more understandable or not.
(I wasn't addressing you, Larry. I read the whole item in one swell foop after joining it late, and I was just adding my two cents to the discussion as a whole.)
(C'mon, Katie, couldn't you tell I was kidding?)
The descriptivist says, "Use the language as I describe it or you will be misunderstood." The prescriptivist says, "Use the language as I prescribe it or you will be wrong." The descriptivist sees everyone stopping at red lights and concludes therefrom that stopping at red lights is "standard". The prescriptivist says that stopping at red lights is the law, so obviously everyone has to do it. There are philosophical differences between the two, but in practice they're both telling me how to speak and write. jep is right: Kids learn standard usage by listening and reading long before they learn the names of the rules they've been following. We all start off as descriptivists in our native languages, but some of us end up prescriptivists. We fall in love with the rules.
And what is the "standard usage" that kids learn? If they grow up hearing gender-neutral language, they'll consider that 'normal', even though third person masculine is considered grammatically correct.
But at least once they learn the rules, they'll be able to see how awkward and silly some of the new-speak is.
Re #23: Emphasis on "some of" ...
A fair amount of formalspeak is pretty silly and awkward too. And not uncommonly, the proscribed forms turn out to have better roots in English than the prescribed forms. Shakespeare used double negatives quite cheerfully. "Ain't" turns out to be quite old. At best, formalspeak represents a sort of idealized form of 19th century oldspeak. At worst, formalspeak can result in some truly opaque and turgid prose, lifeless, without beauty, and devoid of meaningful information content. I think the most useful plan is to teach kids to cope with a wide variety of different -speaks, to recognize and appreciate the differences, and to give them the ability to use more than one in different contexts, according to which one will work best. The language you use to sell yourself on a resume is probably not what you should use to give a report to your boss that tries to say little with much, which in turn is probably not what you should use to write a love letter, which is not what you need to decipher advertisements in Newsweek. And none of those is going to help you should you choose to watch "Are You Being Served?" on PBS-30/Toledo 11pm weekdays. (Which, interestingly, has the only real claim on the name "English".)
And, of course, it helps in communicating with groups other than one's own, especially if those groups use a nonstandard dialect.
Yes, but wouldn't communication be SO much easier if everyone spoke the standard, rather than training kids and adults alike to keep adapting to an ever-increasing number of different dialects?
You're right. If we all spoke "black Detroit English," or East Manchester dialect, or even Jamaican patois, we'd understand each other much better. And figuring out what you want to have for dinner would be lots easier if all food tasted like boiled ham.
While I do adapt my use of language to the audience, no one is going to be adept at all the various dialects, not even you, Laurel. That being the case, it is useful to have some standard language. Seriously, think what the law would be like if you didn't have a baseline legal language. (re #27: It's probably a mistake to write any kind of report that tries to say little with much.)
As a dialectologist, I have no trouble understanding the idea that it would be easier if everyone spoke the standard form of the language when speaking with people who do not speak their own dialect. However, this does not, at least as far as I can tell, logically lead to the idea that the dialects are "inferior", which is a view commonly held by native speakers of the standard. In fact, many things can be expressed much more easily with dialect forms than with the standard form, in any language. But then there is the problem of education. How are we to teach *everyone* in the country who speaks a non-standard dialect to be proficient in the standard? Surely you can see the difficulties with this. It would be far more feasible in a country where there was a form of the language that was *universally accepted* to be the standard, and a form that few actually learned at home. Then the standard could be taught in kindergarten and first grade. This is the present situation in Germany, where extreme regional and social variation makes bidialectalism not only common, but a necessity in many cases. Yet in the United States, many of the non-standard dialects are purely social dialects, and forcing someone to speak the standard often reeks of cultural brainwashing.
Sometimes it seems as if "baseline legal language" is used as a barrier to effective communication. I'd hardly use that as a good example of why a standard is necessary. (For those who object to this characterization of legalese, I offer the example of old Egyptian hieroglyphics, which very late in their history (before they died out altogether) evolved into an almost completely non-understandable mish-mash of gibberish before the last few preservers of the heritage were completely overwhelmed by new ideas.) The real problem here is that formalspeak represents an attempt to create a single, static, and unchanging form of the language that will be the same everywhere and for all time. Unfortunately, the spoken language is still evolving, and while it's not really fragmenting into lots of small pieces (as suggested above), there are indeed many different streams of English that haven't merged into one despite the best efforts of the American advertiser. And that means the standard is under considerable pressure to evolve to "keep up with the times". Some of that pressure is just "sloppy-isms", violations of rules in the standard that probably weren't necessary in the first place and are rapidly disappearing today. Dangling prepositions are one example. Some of it represents new social ideas and so forth. Try using the word "man" or "he" today.
"Still" "evolving", Marcus? Do you mean to suggest the spoken language is getting *better*, Darwinian style, and will someda*stop* doing so? <jennie looks at Marcus in amazement>
The law would be MUCH more pleasant without the stupid legalisms, honestly. There is a big movement towards "plain English" in pleadings, documents, etc. to make it easier for non-lawyers to deal with the legal system.
I'm not sure I agree that having standards leads to a static language. The goal of having language standards is communication. When those standards fail to do that, they should be changed. That seems to be the way things work now. There are some arch-conservatives out there who bemoan every bit of slang, but there are also a large number of readers and writers who hold to most conventions and change things little by little when those rules come up short.
I said "evolving", not "improving". Evolution is a dynamic process, but in nature, you can find thousands of species that have been "evolving" for millions of years that look "identical" to what's in the fossil record. But "evolution" does provide for the possibility of change in response to changing environment, and ensures that, in most cases, the result of that process of change will be better suited than the old language. In this limited sense, evolution does represent indeed represent improvement. Chances are, whatever language Cro-Magnon man used was every bit as rich and varied as any modern tongue. Chances are, too, that it would be a rather poor language to describe how to make bread in a 20th century kitchen. (Vocabulary would likely be only the first of many problems.) On the other hand, it might well be a much better language for oral tales or discussing complicated kinship relationships. We might think English is "improved", but from the perspective of Cro-Magnon, it might well appear to be most degenerate and awful, and worse in quality in every way, from English.
See, Jennie, I *told* you there was a difference between "evolving" and "improving."
Language changes to fit the needs of the people using it. Language will always change, because people will always change.
It's meaningless to say whether language improves or declines in usefulness, because there's no standard by which to judge it. 20th century English wouldn't be better for Americans of the 19th century; they didn't need to talk about television, computers or nuclear power. They needed to talk about things most of us don't need to talk about any more. I think we can get our points across now about as well as they could get theirs across then.
Re #36: Then Katie's house has "pillars". :-)
Re #35: Interesting idea about Cro-Magnon language, although I doubt that it was as rich and varied as modern languages. Language arises from experience, and I bet modern humans have a much more varied and rich life than Cro-Magnons did. You can perhaps argue that English as we use it is not as rich and varied, but that does not mean the language itself is so limited.