My final post on language articles in newspapers (probably)

I’ve written a few posts about the (mis)treatment of linguistics and language in many newspaper articles. In short, it’s bad. But then what should we expect? Journalism is largely page filling these days (for more about churnalism, read this excellent book). So I’ve decided this is my last article about bad linguistic journalism. Whatever I write, and no matter how much evidence can be brought to bear on these kinds of wrong-headed articles, history shows that they will continue to proliferate. However, as I will hopefully show, the tide of history is almost always on the side of usage.
 
The author of today’s article, “There are lots of bacteria, but there is only one genetic code”, Dr Dixon, is a scientist and, unsurprisingly, also an editor (like Grammar Girl). Many journalists have a seemingly fetishistic obsession with prescriptivism regardless of the mountain of evidence against it. An example from an earlier post is Bill Bryson describing all the reasons why “whom” could be allowed to die a natural death but then stating “I, for one, would not like to see it go”. This is pretty much the way with many writers. Scientists who would question almost any claim about their field and demand evidence have seemingly no problem swallowing linguistic rules without the slightest curiosity as to their validity.
 
Ironically, Dixon penned a piece complaining about the press’s lousy coverage of science, yet he, a non-linguist, has set himself up as an arbiter of proper language usage. And what particular idiosyncrasies does he have a problem with? Before I dissect the article in detail, I would quickly like to go over the mistaken argument from linguistic regularity, which, in short, is the assumption that languages are regular and logical. Like organisms, languages evolve, and as with organisms this doesn’t lead to perfectly functioning “designed” languages but rather to languages with a lot of inherent waste.
 
In animals, an example Dixon might understand is the recurrent laryngeal nerve. The nerve, especially in animals like giraffes, is massively wasteful, looping down and then back up the neck when in reality it only needs to travel a few inches. This is the result of evolution applying an “if it ain’t broke” approach. Languages too have massive amounts of waste because they weren’t designed either, though people like Dixon act as if they were. Phrases like “the reason why”, “the end result” and “over-exaggerate” are redundant. So are things like the third person ‘s’ on verbs, the word “whom”, the word “fewer”, the bizarre conjugations of the “be” verb and “do support”, which only a handful of languages possess at all. As this chaos swirls around them, the Dixons of this world accept 95% of the disorder but vehemently oppose the last 5%, like victims of the Titanic complaining that their shoes are getting wet.

So let’s examine the article in a little more detail.

“She’s developed something called anorexia.”

“I was reading about that in the newspaper. It’s quite serious, isn’t it?”

“Yes, and more young women are getting anorexia these days.”

A simple enough conversation. What the speakers did not realise is that they were not talking about anorexia at all. Anorexia means loss of appetite. That is its definition in both medical and general dictionaries. There is, however, also a condition called anorexia nervosa – a psychological illness, commonest among female adolescents, in people who deliberately starve or use other methods, such as vomiting, to lose weight. But relatively suddenly, anorexia has lost its original meaning. In the media and in everyday conversation, anorexia now means anorexia nervosa


This reminds me of the “Frankenstein’s monster” bores who insist on pedantically telling everyone and anyone who’ll listen that Frankenstein is not the monster! Anorexia is understood by most speakers as shorthand for anorexia nervosa, just as “Frankenstein” is for his monster. Who cares about this? The good doctor, apparently. He goes on:

Language and the connotations of words and expressions evolve over time – helpfully so, when new distinctions and subtleties arise. But meanings also change simply as a result of ignorance or error. So when, some years ago, more and more people began to say “disinterested” when they meant “uninterested”, the misuse gradually became a normal meaning of that word

Dixon is thus quite happy for language to change, so long as it changes in ways he likes. The short-sighted nature of his rant can be illustrated by looking at some of the words he uses. “Error”, for example, originally meant “to wander”; the misuse seems to have gradually become the normal meaning. More interesting is that Dixon is actually wrong, as his potted history falls a little short of the truth. Etymologically speaking:

Disinterested and uninterested share a confused and confusing history. Disinterested was originally used to mean “not interested, indifferent”; uninterested in its earliest use meant “impartial.” By various developmental twists, disinterested is now used in both senses. Uninterested is used mainly in the sense “not interested, indifferent.” It is occasionally used to mean “not having a personal or property interest.”


So rather than switching, it’s perhaps more accurate to say that they switched back to their original meanings. But don’t let facts get in the way of a good old language rant Doc!

What seems to be happening today is that such shifts are occurring more and more quickly. Consider the word “issue”. I heard a cricket commentator saying that an Indian batsman was “having issues with” an opponent’s spin bowling. As recently as five years ago, he would have said the batsman was confused by the spinning ball, which he was failing to hit in the way he intended. “Issue” then meant something quite different. Since then, however, it has come to mean “problem”.

Dixon offers (as with all news articles, it should be said) no evidence for his (amazing if true) suggestion that the pace and amount of language change is increasing in English. He seems to have issues with the use of “have issues with” (hohoho), and actually Google Ngram does show an increase in its use between 1990 and 2008. That said, if the alternative to “having issues with” really is “confused by the spinning ball, which he was failing to hit in the way he intended”, then forgive me for calling it a welcome addition to English.

What is especially surprising nowadays is that misuses of words can increasingly be found even in specialised communications – as in my own particular field of science. Long ago, when I was a microbiology student, I learned that the singular of bacteria was bacterium. Then, towards the end of the 20th century, print and broadcast journalists began to say “this bacteria”. And the alteration did not stop there. It is now affecting professional discourse too. In the past three months, I have seen “a bacteria” or “this bacteria” six times in research journals. I have even heard a speaker making the same mistake throughout his conference presentation.


Most of what is described here I have dealt with in detail in this post and so won’t rehash; suffice it to say that what Dixon describes as a “mistake” is a change in usage, and also that it is altogether unreasonable to expect foreign words to retain their foreign morphemes in a host language. Note that Dixon admits that it was not evidence of usage that led to his knowledge about “bacteria” but rather that that is what he was taught, and what he unquestioningly accepted. When his friends tell him that “this spaghetti is delicious” does he, I wonder, correct them: “THESE spaghetti ARE delicious!” Shoddy stuff for a scientist.

Sports commentators appear to be culpable in another area – the demise of the adverb. Within the past 10 years, firstly snooker commentators and then those in other sports began to tell us that “he hit that one strong”, that “she’s playing confident”, and that “he’s bowling accurate”. The habit is now spreading more widely.

This is a pretty old chestnut. This type of inflexible thinking is what leads to people avoiding saying “I am good” (I am well!) and “This food is healthy” (This food is healthful!) A basic question to be asked when people talk about “bad” grammar is whether communication is actually breaking down. As with the pedantic parental chide “two negatives make a positive” (they don’t), the listener understands perfectly what the speaker is saying but just insists that language must, for some unknown reason, stand still at this moment in time and stop changing. Really quite a bizarre position for someone presumably well versed in evolutionary theory. The good Dr. seems to be suffering from a linguistic form of the “golden age fallacy”, namely the belief that there was a time when the English language was perfect. Perhaps a little after Shakespeare and a bit before the Brontë sisters. Ever since then it’s just been on the decline and will eventually reach a state where Morlock-esque yoof roam the streets of Neo-England burbling an incomprehensible text-like patois to each other as society collapses.

In many cases, although it is impossible to pinpoint the initial change
 

And of course this is true, though as a biologist such a statement should have him hanging his head in shame. He’s basically asking for a missing link, which could never exist. Just as no old world monkey gave birth to a human, no linguistic change can really ever be pinpointed. As McWhorter notes, Latin didn’t die, it turned into French, Spanish and Italian, but there wasn’t a day when people woke up saying “OK, now we’re speaking French”.

the reason why people begin to adopt erroneous usages so quickly is probably one of fashion and a desire to demonstrate familiarity with the modish vernacular. Consider “fantastic”, which is now a universal expression of hyperbole. Anyone interviewed in the media about anything that impresses or excites them will repeatedly call it “fantastic”. Over recent weeks, I have heard a celebrity chef describe a particular dish as fantastic (when he meant unusually succulent), a drama critic call an actor’s performance fantastic (when she meant disturbingly realistic) and a politician describe a party conference speech as fantastic (when he meant inspiring).
 

Reading this, I’m almost tempted to think Dixon is pulling our collective legs. The idea that language use is something like wearing hipster jeans and not related to a complex set of social and psychological factors is quite staggeringly simplistic. Also, does he really have a problem with the word ‘fantastic’? Does he really expect people to use the word only for “characteristic of fantasy”? He goes on to talk about ‘literally’, which I dealt with here and so won’t go into, nor will I again deal with what he calls the “important differences between ‘who’ and ‘whom’” (tl;dr = It isn’t an important difference.)

At the end he wonders “why are even the editors of scientific journals adopting fashionable but incorrect usages?” and this is where I would like to return to the point I made in the first paragraph. The reason that editors are (sensibly) adopting “fashionable” usages is that those usages will almost certainly triumph. And for the second part of this (rather long!) blog I’d like to take you and Dixon on a Dickensian tour of the ghosts of pedants past. I’m hopeful there is still time for the good Doctor to have a change of heart. So here it is, a list of complaining prescriptivists throughout the ages:


George Fox (1624-1691) wrote in his Epistle:


If you’re not getting that: Fox was complaining that only a ruddy great idiot would say “you” to one person, when, as any fule kno, the right word was “thou”.

Jonathan Swift (1667-1745) disliked the past tense ‘ed’ being pronounced softly (i.e. as we pronounce it now); he wanted it pronounced like the ‘ea’ in ‘dead’. He also didn’t like the words “sham, banter, bully, bubble, cutting and shuffling”.

Robert Lowth (1710-1787) thought that (formal) sentences shouldn’t end in prepositions. Somehow this opinion became accepted as a ‘rule’ by editor types at some point although almost everyone ignores it. Lowth also argued for “My wife and I” over “me and my wife”.

William Cobbett (1763-1835) wrote “A Grammar of the English Language” and complained about the use of the following as past tenses: “awoke, built, dealt, threw, swam and meant” (among many others), arguing instead for the more ‘proper’ “awaked, builded, dealed, throwed, swimmed and meaned”. (McWhorter)

Strunk and White, authors of a famous style guide loved by the likes of Dixon and first published in 1919, proscribed the use of “hopefully”, the verbs “host” and “chair”, and the passive voice. Rather amusingly (or at least, this passes for amusing to sad folks like me), of the four examples they give of mistaken use of the passive voice, only one of them is actually passive. And this isn’t their only mistake. As Pullum notes, “The book’s contempt for its own grammatical dictates seems almost wilful.” The awful situation, therefore, is that we have know-nothings telling others how they should be writing and, incredibly, being listened to.

Steven Pinker notes that the verbs parent, input, showcase, impact, and contact “have all been denounced in this century” (1994:379)


So, if we examine the broad sweep of history we can see that most of the things that have been railed against have become normal and natural. Dr. Dixon and people like him can get upset about usage, but his views about the word “fantastic” will, if they don’t already, eventually be seen as preposterous by future generations. I might be losing the battles against this type of newspaper article, but history shows that the war is already won.

That’s so gay!

Reading (actually listening to) Steven Pinker’s massive The Better Angels of Our Nature, I came across an interesting section on the use of the word “gay” as a pejorative. There was something of a storm a few years back when Chris Moyles of BBC Radio 1 used it on air, and there is a lot of hand-wringing about this modern usage; I always feel a bit guilty when I say it.

However, Pinker presents evidence to suggest that perhaps we shouldn’t feel so bad. Referring to a survey of American views of homosexuality, he notes:
Many people have informed me that younger Americans have become homophobic, based on the observation that they use “That’s so gay!” as a putdown. But the numbers say otherwise: the younger the respondents, the more accepting they are of homosexuality. Their acceptance, moreover, is morally deeper. Older tolerant respondents have increasingly come down on the “nature” side of the debate on the causes of homosexuality, and naturists are more tolerant than nurturists because they feel that a person cannot be condemned for a trait he never chose. But teens and twenty-somethings are more sympathetic to the nurture explanation and they are more tolerant of homosexuality. The combination suggests that they just find nothing wrong with homosexuality in the first place, so whether gay people can “help it” is beside the point. The attitude is: “Gay? Whatever, dude.”(Pinker 2011:619)
 
In short, young people may use the word “gay” more often than older people, but they are also the group with the most tolerant views of homosexuals. It’s therefore difficult to equate using “gay”, in this way, with being homophobic. That’s not to say that it wouldn’t cause offence, but that it’s likely none was intended, and to reinforce this, when questioned, kids often adamantly deny they are homophobic.
 
Is it possible that this is an example of words not necessarily being linked to their literal meaning, as discussed by me here? Or could it be the birth of a new homonym, reminiscent of fat/fat (later spelled phat to avoid confusion, which mirrors the recent appearance of “ghey”), which was in use when I was an undergraduate, or funny/funny, which is, if you think about it, an incredibly inefficient word for communication, requiring, as it sometimes does, confirmation of meaning with the phrase “funny haha or funny peculiar?” These kinds of words show that languages evolve and are not regular or perfect but messy and constantly changing. Language is so gay!

Edit: I did come across an article which claimed that hearing the phrase “that’s so gay” may cause students to be more likely to suffer from “headaches, eating problems and feelings of isolation”. I haven’t had a chance to read the study yet.

 

The False Gods of Grammar

In a recent tweet Conan O’Brien asked:


One reply was from Grammar Girl (Mignon Fogarty), author of “Quick and Dirty Grammar Tips”. Grammar Girl is a grammar expert, an editor and an MS graduate in biology, not linguistics, and while this shouldn’t matter, I’ll explain later why it does. Her reply was:

First off, it’s important to say that this is absolutely correct, and she presents a completely accurate explanation of the differences on her website too. My issue is with the rule itself. A grammar expert can repeat learned rules, but it strikes me that someone with a background in science, like Grammar Girl, might want to peek a bit further behind the curtain and think about why those rules exist and whether they are worth following at all. These kinds of language ‘rules’, along with splitting infinitives and ending sentences with prepositions, only exist because people in authority have decided they should exist, and a small band of self-proclaimed “experts” (from the 16th century at least) have pronounced on their particular proclivities.

What’s wrong with who/whom?

A good place to start would be this piece about John McWhorter’s (professor of linguistics) take on who/whom. “Whom” is a fossilized piece of Old English which is somehow still clinging to life. In “Myths, Lies and Half-Truths of Language Usage” he notes that many language experts, including the influential Robert Lowth, fought for the survival of “whom”. However, McWhorter notes, Lowth also fought for the survival of sitten (sat), spitten (spat), wert (was) and chicken as a plural (I have two chicken). How many of these strike you as worth keeping?

If we look at other similar pronouns we can see how odd “whom” is:

Pronoun use           Subject   Object
Place                 where     where
Time                  when      when
Things                which     which
People                who       whom
General all-purpose   that      that
Linguistic quirks like this serve no purpose, as far as I can see, but to intimidate others and give people the chance to demonstrate their superior learning. The whole thing works like something of a catch-22. You can willfully split your infinitives or refuse to use “whom”, despite knowing the “rules”, but the maven you’re talking to may judge you as being less well educated, so you might feel obliged to use it anyway. It’s also worth noting that who/whom has been a source of mistakes throughout history, with errors appearing in the Bible and in works by Shakespeare, Dickens, Churchill and Swift. So if you are confused, you’re in good company. It’s certainly no indicator of stupidity.

The Grammar


In grammatical terms, who/whom are pronouns; they often appear in relative clauses such as:

        The person who/whom you’re talking to is a blithering idiot.



Grammar “experts” would tell you that because the word is an object here, it “ought to be” “whom”, not “who”. If we follow Grammar Girl’s rule (above) we would say “I’m talking to him” and thus use “whom”. This “ought to be” is what is called prescriptivism. But what does that mean? Steven Pinker (linguist and cognitive scientist) defines it like this:

The contradiction begins in the fact that the words “rule,” “grammatical,” and “ungrammatical” have very different meanings to a scientist and to a layperson. The rules people learn (or, more likely, fail to learn) in school are called prescriptive rules, prescribing how one “ought” to talk. Scientists studying language propose descriptive rules, describing how people do talk. 

Most of what I write here has been said before, notably by Pinker in his 1994 book The Language Instinct. Although this is a lengthy quote it is worth reproducing here:

[who/whom] is one of the standard prescriptivist complaints about common speech. In reply, one might point out that the who/whom distinction is a relic of the English case system, abandoned by nouns centuries ago and found today only among pronouns in distinctions like he/him. Even among pronouns, the old distinction between subject ye and object you has vanished, leaving you to play both roles and ye as sounding completely archaic. Whom has outlived ye but is clearly moribund; it now sounds pretentious in most spoken contexts. No one demands of Bush that he say Whom do ye trust? If the language can bear the loss of ye, using you for both subjects and objects, why insist on clinging to whom, when everyone uses who for both subjects and objects?

It also follows that if a person believes “whom” to be necessary when in an object position, shouldn’t they also extend that rule to spoken English? Look at the following sentences:

Who are you looking at?

Who do you think you are?

Both of these, according to the “rule” are incorrect. They should read “whom are you looking at” and “whom do you think you are”. Now if you think that this sounds odd and would rather say “incorrect” things like “who are you talking about?” then why on earth would you insist on using whom at all?

More expertise

Bill Bryson is another such language expert. His popular style guide Troublesome Words shows again how keen people are for an authority figure to tell them what the “rules” are. People seem to crave this kind of stuff (judging from the reviews). The section on who/whom is typical of much of the book. Bryson has done his homework and seems to understand the arguments against this kind of rule but inexplicably always chooses to support the rule anyway, because…well…he’s fond of it:

 

English has been shedding its pronoun declension for hundreds of years; today who is the only relative pronoun that is still declinable. Preserving the distinction between who and whom does nothing to promote clarity or reduce ambiguity. It has become merely a source of frequent errors and perpetual uncertainty. Authorities have been tossing stones at whom for at least 200 years (Noah Webster was one of the first to call it needless) but the word refuses to go away. (Bryson 1984: 216)

Bryson then goes on to say, right after this barrage, “I, for one, would not like to see it go”.

As an interesting aside, Bryson also notes Grammar Girl’s “him/he” rule but then points out that it doesn’t always work. He offers the example of:
 

“They rent it to whoever needs it”

Apply the rule and we get “they rent it to him”: him = whom (but “who” is correct).

In order to apply this “quick and dirty” rule you have to have the grammatical knowledge that the object of “rent” is the whole clause “whoever needs it”, not “him”. That is, you should say “he needs it” to reach the correct pronoun “who”.
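The substitution heuristic, and the way it trips up on clause boundaries, can be sketched in a few lines of code. (The function name and mapping below are my own illustration, not anything from a real grammar tool; the hard part, deciding which clause the pronoun belongs to, is exactly what the code cannot do for you.)

```python
# Toy sketch of the "him/he" substitution test for who vs. whom.
# The heuristic: recast the clause with 'he' or 'him' and see which fits;
# 'he' (a subject) points to 'who', 'him' (an object) points to 'whom'.

def who_or_whom(test_pronoun: str) -> str:
    """Map the test pronoun that 'sounds right' to who/whom."""
    return {"he": "who", "him": "whom"}[test_pronoun]

# "To who/whom am I speaking?" -> "I am speaking to him" -> 'him' fits:
print(who_or_whom("him"))  # whom

# The catch in "They rent it to whoever needs it": the pronoun is the
# SUBJECT of its own clause ("he needs it"), even though that whole
# clause is the object of "rent" -- so the correct test word is 'he':
print(who_or_whom("he"))   # who
```

The code makes the limitation plain: the rule only tells you what to do once you have already worked out whether the pronoun is a subject or object within its own clause, which is the very judgment that causes the confusion.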

Confused?

Language Experts

 

The problem again with advice like this is that it is not based on any empirical findings, but rather, as throughout history, on the predilections of “authorities” and the recitation of commonly accepted “rules” which usually again originate in the predilections of “authorities” or a mistaken/superficial understanding of how the English language works. The real experts, professors in applied linguistics for example, are usually ignored and words like “whom” are kept alive on the artificial respirator of prescriptivism.

I have shown above that linguists like Pinker and McWhorter have quite a different take on who/whom than “language experts” like Bryson and Grammar Girl. The difference is that Bryson and Grammar Girl are essentially more involved with journalism and publishing than linguistics. Writers and editors get their ideas from style guides like the Chicago Manual of Style and Strunk and White, which are often just rehashes of previously held prejudices and blackboard grammar rules. McWhorter comments that Strunk and White “made decisions based on how nice they thought something looked or sounded, just like arranging furniture.” And while Grammar Girl and Bryson have made notable leaps forward, accepting, for instance, split infinitives, there is still a tendency to let personal preferences dictate rules:

 

She also tends to accept the word of authorities without questioning them. In this interview she notes that “like” as a conjunction is frowned upon, but she uses it anyway:

MF: I tend to use “like” as a conjunction. Technically, we’re supposed to say “It looks as if it’s going to rain” or “It looks as though it’s going to rain.” I tend to say “It looks like it’s going to rain.” That’s wrong, but I’ve been saying it that way my whole life and it’s a hard habit to break. I’m constantly correcting myself.



To a linguist, the idea of using something your whole life which is “wrong” is an astonishing notion, and one which Pinker gently mocks here:
 


Imagine that you are watching a nature documentary. The video shows the usual gorgeous footage of animals in their natural habitats. But the voiceover reports some troubling facts. Dolphins do not execute their swimming strokes properly. White-crowned sparrows carelessly debase their calls. Chickadees’ nests are incorrectly constructed, pandas hold bamboo in the wrong paw, the song of the humpback whale contains several well-known errors, and monkeys’ cries have been in a state of chaos and degeneration for hundreds of years. Your reaction would probably be, What on earth could it mean for the song of the humpback whale to contain an “error”? Isn’t the song of the humpback whale whatever the humpback whale decides to sing? Who is this announcer, anyway?

I have nothing against Grammar Girl personally. She’s a popular, talented and successful person; if anything I’m a bit jealous. But I do wish she would turn over the grammatical rocks and look a bit deeper underneath. It’s fine knowing the “rules”, but it’s more important to know where those rules come from and whether they are worth following. Admittedly the facts are perhaps not as crystal clear or as neat and satisfying as the “rules”, but surely the facts are more important.

 

References (not hyperlinked)


 

Bryson, B 1984 Troublesome Words London: Penguin

McWhorter, J Myths, Lies and Half-Truths of Language Usage The Great Courses

Pinker, S 1994 The Language Instinct New York: William Morrow

Pinker, S 2011 The Better Angels of Our Nature New York: Viking