My final post on language articles in newspapers (probably)

I've written a few posts about the (mis)treatment of linguistics and language in many newspaper articles. In short, it's bad. But then what should we expect? Journalism is largely page-filling these days (for more about churnalism read this excellent book). So I've decided this is my last article about bad linguistic journalism. Whatever I write, and no matter how much evidence can be brought to bear on these kinds of wrong-headed articles, history shows that they will continue to proliferate. However, as I will hopefully show, the tide of history is almost always on the side of usage.
 
The author of today's article ("There are lots of bacteria, but there is only one genetic code"), Dr Dixon, is a scientist and, unsurprisingly, also an editor (like Grammar Girl). Many journalists have a seemingly fetishistic obsession with prescriptivism regardless of the mountain of evidence against it. An example from an earlier post is Bill Bryson describing all the reasons why "whom" could be allowed to die a natural death but then stating "I, for one, would not like to see it go". This is pretty much the way with many writers. Scientists who would question almost any claim about their field and demand evidence seemingly have no problem swallowing linguistic rules without the slightest curiosity as to their validity.
 
Ironically, Dixon penned a piece complaining about the press' lousy coverage of science, yet he, a non-linguist, has set himself up as an arbiter of proper language usage. And what particular idiosyncrasies does he have a problem with? Before I dissect the article in detail I would quickly like to go over the mistaken argument from linguistic regularity, which, in short, is the assumption that languages are regular and logical. Like organisms, languages evolve, and as with organisms this doesn't lead to perfectly functioning "designed" languages but rather languages with a lot of inherent waste.
 
In animals, an example Dixon might understand is the recurrent laryngeal nerve. The nerve, especially in animals like giraffes, is massively wasteful, looping down and then back up the neck when in reality it only needs to travel a few inches. This is the result of evolution applying an "if it ain't broke" approach. Languages too have massive amounts of waste because they weren't designed either, though people like Dixon act as if they were. Phrases like "the reason why", "the end result" and "over-exaggerate" are redundant. So are things like the third person 's' on verbs, the word "whom", the word "fewer", the bizarre conjugations of the verb "be" and "do support", which only a handful of languages possess at all. As this chaos swirls around them, the Dixons of this world accept 95% of the disorder but vehemently oppose the last 5%, like victims of the Titanic complaining that their shoes are getting wet.

So let’s examine the article in a little more detail.

"She's developed something called anorexia."

“I was reading about that in the newspaper. It’s quite serious, isn’t it?”

“Yes, and more young women are getting anorexia these days.”

A simple enough conversation. What the speakers did not realise is that they were not talking about anorexia at all. Anorexia means loss of appetite. That is its definition in both medical and general dictionaries. There is, however, also a condition called anorexia nervosa – a psychological illness, commonest among female adolescents, in people who deliberately starve or use other methods, such as vomiting, to lose weight. But relatively suddenly, anorexia has lost its original meaning. In the media and in everyday conversation, anorexia now means anorexia nervosa


This reminds me of the "Frankenstein's monster" bores who insist on pedantically telling everyone and anyone who'll listen that Frankenstein is not the monster! "Anorexia" is understood by most speakers as shorthand for anorexia nervosa, just as "Frankenstein" is for his monster. Who cares about this? The good doctor, apparently. He goes on:

Language and the connotations of words and expressions evolve over time – helpfully so, when new distinctions and subtleties arise. But meanings also change simply as a result of ignorance or error. So when, some years ago, more and more people began to say “disinterested” when they meant “uninterested”, the misuse gradually became a normal meaning of that word

Dixon is thus quite happy for language to change, so long as it changes in ways he likes. The short-sighted nature of his rant can be illustrated by looking at some of the words he uses. "Error", for example, originally meant "to wander"; the misuse seems to have gradually become the normal meaning. More interesting is that Dixon is actually wrong, as his potted history falls a little short of the truth. Etymologically speaking:

Disinterested and uninterested share a confused and confusing history. Disinterested was originally used to mean “not interested, indifferent”; uninterested in its earliest use meant “impartial.” By various developmental twists, disinterested is now used in both senses. Uninterested is used mainly in the sense “not interested, indifferent.” It is occasionally used to mean “not having a personal or property interest.”


So rather than the meanings having shifted recently, it's perhaps more accurate to say that the words have switched back to their original meanings. But don't let facts get in the way of a good old language rant, Doc!

What seems to be happening today is that such shifts are occurring more and more quickly. Consider the word “issue”. I heard a cricket commentator saying that an Indian batsman was “having issues with” an opponent’s spin bowling. As recently as five years ago, he would have said the batsman was confused by the spinning ball, which he was failing to hit in the way he intended. “Issue” then meant something quite different. Since then, however, it has come to mean “problem”.

Dixon offers (as with all news articles, it should be said) no evidence for his (amazing if true) suggestion that the pace and amount of language change is increasing in English. He seems to have issues with the use of "have issues with" (hohoho), and Google Ngram does indeed show an increase in its use between 1990 and 2008. That said, if the alternative to "having issues with" really is "confused by the spinning ball, which he was failing to hit in the way he intended" then forgive me for calling it a welcome addition to English.
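As an aside, this sort of claim is easy enough to check for yourself. Below is a rough Python sketch of how one might pull the frequencies; it talks to the undocumented JSON endpoint that appears to sit behind the Google Books Ngram Viewer, so the URL, parameters and response shape are assumptions on my part rather than an official API, and may break without warning.

```python
# Rough sketch: query the (undocumented, assumed) JSON endpoint behind the
# Google Books Ngram Viewer for a phrase's yearly relative frequency.
import requests

def ngram_frequency(phrase, start=1990, end=2008, corpus="en-2019"):
    """Return the list of yearly relative frequencies for `phrase`."""
    resp = requests.get(
        "https://books.google.com/ngrams/json",
        params={
            "content": phrase,
            "year_start": start,
            "year_end": end,
            "corpus": corpus,   # corpus id is an assumption; older ids were numeric
            "smoothing": 0,
        },
        timeout=30,
    )
    resp.raise_for_status()
    data = resp.json()          # expected: a list of {"ngram": ..., "timeseries": [...]}
    return data[0]["timeseries"] if data else []

if __name__ == "__main__":
    series = ngram_frequency("having issues with")
    if series:
        print(f"1990: {series[0]:.3e}  ->  2008: {series[-1]:.3e}")
```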

What is especially surprising nowadays is that misuses of words can increasingly be found even in specialised communications – as in my own particular field of science. Long ago, when I was a microbiology student, I learned that the singular of bacteria was bacterium. Then, towards the end of the 20th century, print and broadcast journalists began to say “this bacteria”. And the alteration did not stop there. It is now affecting professional discourse too. In the past three months, I have seen “a bacteria” or “this bacteria” six times in research journals. I have even heard a speaker making the same mistake throughout his conference presentation.


Most of what is described here I have dealt with in detail in this post and so won't rehash it; suffice to say that what Dixon describes as a "mistake" is a change in usage, and also that it is altogether unreasonable to expect foreign words to retain their foreign morphemes in a host language. Note that Dixon admits that it was not evidence of usage that led to his knowledge about "bacteria" but rather that this is what he was taught, and what he unquestioningly accepted. When his friends tell him that "this spaghetti is delicious" does he, I wonder, correct them: "THESE spaghetti ARE delicious!"? Shoddy stuff for a scientist.

Sports commentators appear to be culpable in another area – the demise of the adverb. Within the past 10 years, firstly snooker commentators and then those in other sports began to tell us that “he hit that one strong”, that “she’s playing confident”, and that “he’s bowling accurate”. The habit is now spreading more widely.

This is a pretty old chestnut. This type of inflexible thinking is what leads people to avoid saying "I am good" (I am well!) and "This food is healthy" (This food is healthful!). A basic question to ask when people talk about "bad" grammar is whether communication is actually breaking down. As with the pedantic parental chide "two negatives make a positive" (they don't), the listener understands perfectly what the speaker is saying but just insists that language must, for some unknown reason, stand still at this moment in time and stop changing. Really quite a bizarre position for someone presumably well versed in evolutionary theory. The good Dr seems to be suffering from a linguistic form of the "golden age fallacy", namely that there was a time when the English language was perfect. Perhaps a little after Shakespeare and a bit before the Brontë sisters. Ever since then it's just been on the decline and will eventually reach a state where Morlock-esque yoof roam the streets of Neo-England burbling an incomprehensible text-like patois to each other as society collapses.

In many cases, although it is impossible to pinpoint the initial change
 

And of course this is true, though as a biologist he should be hanging his head in shame at such a statement. He's basically asking for a missing link which could never exist. Just as no Old World monkey gave birth to a human, no linguistic change can really ever be pinpointed. As McWhorter notes, Latin didn't die, it turned into French, Spanish and Italian, but there wasn't a day when people woke up saying "OK, now we're speaking French".

the reason why people begin to adopt erroneous usages so quickly is probably one of fashion and a desire to demonstrate familiarity with the modish vernacular. Consider “fantastic”, which is now a universal expression of hyperbole. Anyone interviewed in the media about anything that impresses or excites them will repeatedly call it “fantastic”. Over recent weeks, I have heard a celebrity chef describe a particular dish as fantastic (when he meant unusually succulent), a drama critic call an actor’s performance fantastic (when she meant disturbingly realistic) and a politician describe a party conference speech as fantastic (when he meant inspiring).
 

Reading this I'm almost tempted to think Dixon is pulling our collective legs. The idea that language use is something like wearing hipster jeans, and not related to a complex set of social and psychological factors, is quite staggeringly simplistic. Also, does he really have a problem with the word "fantastic"? Does he really expect people to use the word only for "characteristic of fantasy"? He goes on to talk about "literally", which I dealt with here and so won't go into, nor will I again deal with what he calls the "important differences between 'who' and 'whom'" (tl;dr = it isn't an important difference).

At the end he wonders "why are even the editors of scientific journals adopting fashionable but incorrect usages?" and this is where I would like to return to the point I made in the first paragraph. The reason that editors are (sensibly) adopting "fashionable" usages is that those usages will almost certainly triumph. And for the second part of this (rather long!) blog I'd like to take you and Dixon on a Dickensian tour of the ghosts of pedants past. I'm hopeful there is still time for the good Doctor to have a change of heart. So here it is, a list of complaining prescriptivists throughout the ages:


George Fox (1624-1691) wrote in his Epistle:


If you’re not getting that then Fox was complaining that only a ruddy great idiot would say “you” to one person, when as any fule kno the right word was “thou”.

Jonathan Swift (1667-1745) disliked the past tense 'ed' being pronounced softly (i.e. as we pronounce it now); he wanted it pronounced as a full syllable, like the 'ead' in 'dead'. He also didn't like the words "sham, banter, bully, bubble, cutting and shuffling".

Robert Lowth (1710-1787) thought that (formal) sentences shouldn’t end in prepositions. Somehow this opinion became accepted as a ‘rule’ by editor types at some point although almost everyone ignores it. Lowth also argued for “My wife and I” over “me and my wife”.

William Cobbett (1763-1835) wrote "A Grammar of the English Language" and complained about the use of the following as past tenses: "awoke, built, dealt, threw, swam and meant" (among many others), arguing instead for the more 'proper' "awaked, builded, dealed, throwed, swimmed and meaned". (McWhorter)

Strunk and White, authors of a famous style guide loved by the likes of Dixon and first published in 1919, proscribed the use of "hopefully", the verbs "host" and "chair", and the passive voice. Rather amusingly (or at least, this passes for amusing to sad folks like me), of the four examples they give of mistaken use of the passive voice, only one of them is actually passive. And this isn't their only mistake. As Pullum notes, "The book's contempt for its own grammatical dictates seems almost wilful." The awful situation, therefore, is that we have know-nothings telling others how they should be writing and, incredibly, being listened to.

Steven Pinker notes that the verbs parent, input, showcase, impact, and contact "have all been denounced in this century" (1994: 379).


So, if we examine the broad sweep of history we can see that most of the things that have been railed against have become normal and natural. Dr Dixon and people like him can get upset about usage, but his views about the word "fantastic" will, if they don't already, eventually seem preposterous to future generations. I might be losing the battles against this type of newspaper article, but history shows that the war is already won.

I couldn’t care fewer*

I got told off by a guy at work once for saying “We have less students this year”.



"For god's sake," he said, "fewer!"

Especially shameful, supposedly, because I'm an English teacher and so should know better. But doesn't the fact that I had lived a good thirty years without knowing better perhaps tell us something about this word? In fact, it's completely possible that large numbers of people will live and die in English without knowing that they are getting it 'wrong'. And the people they are talking to often don't know that they are also getting it 'wrong'. The only people who seem to be bothered are the ones getting it right.
 
Yeah Jane Moore!  You idiot!

And my god are they bothered. People actually get very worked up about this (check the tweets, right). You can read blogs about just how bothered here, here, here and here. Or here, here, or here. And some more here and here. People really hate this.


So what are the rules?

Grammar Girl (who I talked about here) gives us this handy guide, and it's actually fairly straightforward. Countable nouns use "fewer" and uncountable nouns use "less". If that isn't clear then look at this table:

Less:   time, money, bread
Fewer:  students, problems, potatoes


Simply put, things that you can count (1 monkey, 2 squirrels, 3 turnips, etc.) should be used with 'fewer'; with other things, like money (moneys?), you should use 'less'. Simple really, so why can't thick thickos like me (and supermarkets) get this into their thick thicko skulls?

Well, when we examine Grammar Girl's advice we find this interesting note:

There are exceptions to these rules
 

Oh yes? …do go on!

for example, it is customary to use the word less to describe time, money, and distance (2, 3). For example, you could say, “That wedding reception lasted less than two hours. I hope they paid the band less than $400.” So keep in mind that time, money, and distance are different, but if you stick with the quick and dirty tip that less is for mass nouns and fewer is for count nouns, you’ll be right most of the time

Ah-ha! So things are not actually that straightforward. I don't want to be right "most" of the time, dammit, I want to be right all of the time! OK, so just use "fewer" with count nouns, except for time, money and distance…right? Right, I've got it!

But what about weight? Can I say "I weigh 5kg less than last year" or should it be "I weigh 5kg fewer"? The latter sounds ugly, so I'm going to go ahead and add weight to those exceptions. OK so: time, money, distance and weight, got it!

Well, not quite; it also seems that you can't use "fewer" with singular count nouns. For example, "that's one less thing to worry about" should be wrong, but no one says "one fewer thing to worry about". So is this another exception, or do we have to make some ugly compromise like "Now I don't have so much to worry about"?

And what about "less" in the phrase "more or less"? Surely, regardless of what number was being referred to, a person would always say less, as in "I ate 10 of those cakes, more or less", but they would never say "more or fewer". So set phrases seem to be exempt as well. (This is turning out to be as useful as the I before E rule.)
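Just to underline how un-simple the "simple" rule has become, here is the rule plus its accumulated exceptions written out as a toy decision procedure. The Python below is purely my own illustration; the categories, set phrases and function are invented for this post, not anything Grammar Girl actually provides.

```python
# A toy model of "fewer vs less" as described above: the "simple" countable
# rule plus the exceptions that have piled up. Purely illustrative.

SET_PHRASES = {"more or less"}                          # set phrases are exempt
MEASURE_LIKE = {"time", "money", "distance", "weight"}  # countable, but still take "less"

def less_or_fewer(countable, singular=False, category=None, phrase=None):
    """Return 'less' or 'fewer' according to the rule-plus-exceptions."""
    if phrase in SET_PHRASES:        # "I ate 10 of those cakes, more or less"
        return "less"
    if category in MEASURE_LIKE:     # "less than two hours", "less than $400", "5kg less"
        return "less"
    if countable and singular:       # "one less thing to worry about"
        return "less"
    return "fewer" if countable else "less"   # the original "straightforward" rule

for noun, kwargs in [
    ("students", dict(countable=True)),
    ("bread",    dict(countable=False)),
    ("dollars",  dict(countable=True, category="money")),
    ("thing",    dict(countable=True, singular=True)),
]:
    print(noun, "->", less_or_fewer(**kwargs))
```

Three special cases bolted onto a one-line rule, and that's before we even get to "least number" versus "fewest amount".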
 
“Illiterate” signs? Hmm

Don't even get me started on the mind-boggling world of "least number/amount, fewest number/amount". I've never heard anyone get upset about this, but a Google search shows a huge state of disarray. If you're going to get upset at supermarket signs, then don't go anywhere near this one. "Workers in the UK take the least number of paid holidays" says the Daily Mail, noting later in the same article that they take the fewest. If 'holidays' are countable then it should be "fewest", no? And least number? Shouldn't it be 'least amount' and 'fewest number'? Or just 'fewest'? (head asplodes)

 
An important question that the people who get angry about this never seem to ask is: why does English even need two words for things being smaller in number/amount when we manage to get by fine with one word for things being larger in number? No one has a problem saying "more money, more friends, more time and more stupid grammar rules." No one gets confused and feels the need to invent a word to fill that gap. So why do people get so upset about this? Why the mindless observance of this useless rule?
 
Some people might say that we need to retain the historically correct rules of English. That’s a nice idea but as the Motivated Grammar blog notes, this so-called rule has only been around for a few hundred years:

As it turns out, this whole notion that fewer is countable and less is uncountable has been traced back to 1770 by the Merriam-Webster Dictionary of English Usage. And it wasn’t a rule back then, but rather a preference of a single author, Robert Baker.

That's right, if you're insisting on this in 2012 then you're basically peddling the preferences of some eighteenth-century dude. You're getting angry over something someone over 200 years ago didn't like the sound of. You might imagine yourself the arbiter of "good grammar" but you might as well be running around shouting "don't use the word bully, Jonathan Swift didn't like it!"

The earliest example of someone getting it "wrong" was Alfred (from Batman) the Great, who in 888 AD wrote "Swa mid læs worda swa mid ma, swæðer we hit yereccan mayon", or "with less words or with more, whether we may prove it". However, I don't think that people are concerned with historical accuracy at all; they are concerned, as always where language is involved, with showing that they are more educated, more discerning and thus better than those oiks who get it wrong. Thus, like so much maven prescriptivism, this is yet another foundationless linguistic shibboleth.

 
If we listen to these kinds of people we'll end up with supermarket signs saying "10 items or fewer", teachers saying "Write an essay of five hundred words or fewer" and people being forced to say "that's one fewer thing to worry about", and let's be honest, that just sounds crap. Ignore these pedants, and if they insist then tell them that you couldn't care fewer.




* Thanks to Florentina Taylor for pointing out that there is a difference between the adverbial use of “less” and the adjectival use.

 

The False Gods of Grammar

In a recent tweet Conan O’Brien asked:


One reply was from Grammar Girl (Mignon Fogarty), author of "Grammar Girl's Quick and Dirty Tips for Better Writing". Grammar Girl is a grammar expert, an editor and an MS graduate in biology, not linguistics, and while this shouldn't matter, I'll explain later why it does. Her reply was:

First off, it's important to say that this is absolutely correct, and she presents a completely accurate explanation of the differences on her website too. My issue is with the rule itself. A grammar expert can repeat learned rules, but it strikes me that someone with a background in science, like Grammar Girl, might want to peek a bit further behind the curtain and think about why those rules exist and whether they are worth following at all. These kinds of language 'rules', along with splitting infinitives and ending sentences with prepositions, only exist because people in authority have decided they should exist, and a small band of self-proclaimed "experts" (from the 16th century at least) have pronounced on their particular proclivities.

What’s wrong with who/whom?

A good place to start would be this piece about John McWhorter's (professor of linguistics) take on who/whom. "Whom" is a fossilized piece of Old English which is somehow still clinging to life. In "Myths, Lies and Half-Truths of Language Usage" he notes that many language experts, including the influential Robert Lowth, fought for the survival of "whom". However, McWhorter notes, Lowth also fought for the survival of sitten (sat), spitten (spat), wert (was) and chicken as a plural (I have two chicken). How many of these strike you as worth keeping?

If we look at other similar pronouns we can see how odd “whom” is:

Pronoun use            Subject    Object
Place                  where      where
Time                   when       when
Things                 which      which
People                 who        whom
General all-purpose    that       that

Linguistic quirks like this serve no purpose, as far as I can see, but to intimidate others and give people the chance to demonstrate their superior learning. The whole thing works like something of a catch-22. You can wilfully split your infinitives or refuse to use "whom" despite knowing the "rules", but the maven you're talking to may judge you as being less well educated, so you might feel obliged to use it anyway. It's also worth noting that who/whom has been a source of mistakes throughout history, with errors appearing in the Bible and in works by Shakespeare, Dickens, Churchill and Swift. So if you are confused, you're in good company. It's certainly no indicator of stupidity.

The Grammar


In grammatical terms, who/whom are pronouns; they often appear in relative clauses such as:
        The person who/whom you're talking to is a blithering idiot.



Grammar "experts" would tell you that because the word is an object here, it "ought to be" "whom" not "who". If we follow Grammar Girl's rule (above) we would say "I'm talking to him" and thus use "whom". This "ought to be" is what is called prescriptivism. But what does that mean? Steven Pinker (linguist and cognitive scientist) defines it like this:

The contradiction begins in the fact that the words “rule,” “grammatical,” and “ungrammatical” have very different meanings to a scientist and to a layperson. The rules people learn (or, more likely, fail to learn) in school are called prescriptive rules, prescribing how one “ought” to talk. Scientists studying language propose descriptive rules, describing how people do talk. 

Most of what I write here has been said before, notably by Pinker in his 1994 book The Language Instinct. Although this is a lengthy quote it is worth reproducing here:

[who/whom] is one of the standard prescriptivist complaints about common speech. In reply, one might point out that the who/whom distinction is a relic of the English case system, abandoned by nouns centuries ago and found today only among pronouns in distinctions like he/him. Even among pronouns, the old distinction between subject ye and object you has vanished, leaving you to play both roles and ye as sounding completely archaic. Whom has outlived ye but is clearly moribund; it now sounds pretentious in most spoken contexts. No one demands of Bush that he say Whom do ye trust? If the language can bear the loss of ye, using you for both subjects and objects, why insist on clinging to whom, when everyone uses who for both subjects and objects?

It also follows that if a person believes “whom” to be necessary when in an object position, shouldn’t they also extend that rule to spoken English? Look at the following sentences:

Who are you looking at?

Who do you think you are?

Both of these, according to the "rule", are incorrect. They should read "whom are you looking at" and "whom do you think you are". Now, if you think that this sounds odd and would rather say "incorrect" things like "who are you talking about?", then why on earth would you insist on using whom at all?

More expertise

Bill Bryson is another such language expert. His popular style guide, Troublesome Words, shows again how keen people are for an authority figure to tell them what the "rules" are. People seem to crave this kind of stuff (judging from the reviews). The section on who/whom is typical of much of the book. Bryson has done his homework and seems to understand the arguments against this kind of rule but inexplicably always chooses to support the rule anyway, because…well…he's fond of it:

 

English has been shedding its pronoun declension for hundreds of years; today who is the only relative pronoun that is still declinable. Preserving the distinction between who and whom does nothing to promote clarity or reduce ambiguity. It has become merely a source of frequent errors and perpetual uncertainty. Authorities have been tossing stones at whom for at least 200 years - Noah Webster was one of the first to call it needless - but the word refuses to go away. (Bryson 1984: 216)

Bryson then goes on to say, right after this barrage, “I, for one, would not like to see it go”.

As an interesting aside, Bryson also notes Grammar Girl's "him/he" rule but then points out that it doesn't always work. He offers the example of:
 

"They rent it to whoever needs it"

Apply the rule and we get "they rent it to him", so him = whom; but "whoever" is actually correct here.

In order to apply this "quick and dirty" rule correctly you have to have the grammatical knowledge that the whole clause "whoever needs it" is the object here, not "him" alone. That is, you should test "he needs it" to reach the correct pronoun, "whoever".

Confused?

Language Experts

 

The problem again with advice like this is that it is not based on any empirical findings but rather, as throughout history, on the predilections of "authorities" and the recitation of commonly accepted "rules", which usually originate in those same predilections or in a mistaken/superficial understanding of how the English language works. The real experts, professors of applied linguistics for example, are usually ignored, and words like "whom" are kept alive on the artificial respirator of prescriptivism.

I have shown above that linguists like Pinker and McWhorter have quite a different take on who/whom than "language experts" like Bryson and Grammar Girl. The difference is that Bryson and Grammar Girl are essentially more involved with journalism and publishing than linguistics. Writers and editors get their ideas from style guides like the Chicago Manual of Style and Strunk and White, which are often just rehashings of previously held prejudices and blackboard grammar rules. McWhorter comments that Strunk and White "made decisions based on how nice they thought something looked or sounded, just like arranging furniture." And while Grammar Girl and Bryson have made notable leaps forward, accepting, for instance, split infinitives, there is still a tendency to let personal preferences dictate rules:

 

She also tends to accept the word of authorities without questioning them. In this interview she notes that "like" is frowned upon but that she uses it anyway:

MF: I tend to use “like” as a conjunction. Technically, we’re supposed to say “It looks as if it’s going to rain” or “It looks as though it’s going to rain.” I tend to say “It looks like it’s going to rain.” That’s wrong, but I’ve been saying it that way my whole life and it’s a hard habit to break. I’m constantly correcting myself.



To a linguist, the idea of using something your whole life that is "wrong" is an astonishing notion, and one which Pinker gently mocks here:
 


Imagine that you are watching a nature documentary. The video shows the usual gorgeous footage of animals in their natural habitats. But the voiceover reports some troubling facts. Dolphins do not execute their swimming strokes properly. White-crowned sparrows carelessly debase their calls. Chickadees’ nests are incorrectly constructed, pandas hold bamboo in the wrong paw, the song of the humpback whale contains several well-known errors, and monkeys’ cries have been in a state of chaos and degeneration for hundreds of years. Your reaction would probably be, What on earth could it mean for the song of the humpback whale to contain an “error”? Isn’t the song of the humpback whale whatever the humpback whale decides to sing? Who is this announcer, anyway?

I have nothing against Grammar Girl personally. She's a popular, talented and successful person; if anything I'm a bit jealous. But I do wish she would turn over the grammatical rocks and look a bit deeper underneath. It's fine knowing the "rules", but it's more important to know where those rules come from and whether they are worth following. Admittedly the facts are perhaps not as crystal clear or as neat and satisfying as the "rules", but surely the facts are more important.

 

References (not hyperlinked)


 

Bryson, B. (1984) Troublesome Words. London: Penguin.