Sturgis’s Law #11

A very long while back, like in May 2015, I started an occasional series devoted to Sturgis’s Laws. “Sturgis” is me. The “Laws” aren’t Rules That Must Be Obeyed. Gods forbid, we writers and editors have enough of those circling in our heads and ready to pounce at any moment. These laws are more like hypotheses based on my observations over the years. They’re mostly about writing and editing, but I can’t help noticing that some of them apply to other aspects of life as well. None of them can be proven, but they do come in handy from time to time.

It’s been more than three and a half years since I blogged about Sturgis’s Law #10, and I’m only halfway through the list. Time to get cracking! As I blog about them, I add the link to Sturgis’s Laws on the drop-down from the menu bar. Here at long last is Sturgis’s Law #11:

The burden of proof is on the editor.

We editors live to make good prose better and awkward prose readable. We mean well and most of us are at least pretty good at what we do, but this has its downside: the writers we deal with are usually pretty good at what they do, and even when they’re not, they generally have a better idea of what they’re trying to get across than we do.

Newly fledged editors can be a bit, well, full of ourselves. I sure as hell was. I got hired for my first professional (i.e., paid) editing job on the basis of my knowledge of English grammar, usage, spelling — the basics, in other words. I was quickly introduced to “Chicago style,” which, then in its 12th edition, was still called A Manual of Style. (It became The Chicago Manual of Style with the 13th edition, a title it has kept through the current 17th.)

Oh dear! So many recommendations to remember and apply! I learned, I applied — and I got pretty obnoxious about some of it, notably the which/that distinction: That is used for restrictive clauses, which for non-restrictive, and which is invariably preceded by a comma. Thus —

The house that I grew up in had green shutters.

That house, which was built in 1956, is the one I grew up in.

In the first example, “that I grew up in” provides information essential to identifying the house. In the second, “which was built in 1956” is almost an aside: you could put it in parentheses or drop it completely. (For what it’s worth, the house I grew up in was built in 1956, but it had no shutters at all.)

Never mind that I’d lived almost three decades and learned to write pretty well knowing zip about the which/that distinction — now it became my litmus test for sorting writers into categories: those who “got it” and those who didn’t. This stood me in good stead when, almost two decades later, I started freelancing for U.S. publishers, because many of them include the which/that distinction in their house style, plus it’s in Chicago, which most of them use as a style guide.

Long before that, however, I’d learned that in British English “which” is often used for restrictive clauses and little if any confusion results; it also dawned on me that the distinction between restrictive/essential and non-restrictive/non-essential often isn’t all that important to the sentence at hand. Consider, for instance, the convention for setting off non-essential words with commas. I’m supposed to write “My dog, Tam, likes to ride in the car” because (1) I’ve only got one dog, and (2) it’s important that the reader know that. True, I’ve only got one dog, but if it’s important that the reader know this I’m not going to rely on commas to get the idea across. Besides, that’s an awful lot of commas for a short sentence.

I also learned that in turning which/that into a litmus test, I was acting perilously like the English-language grammarians and educators in the mid to late 19th century. Concerned by increasing literacy among the working classes, they came up with a bunch of rules to distinguish the properly educated from the riffraff. Most of those “rules,” like the injunction against splitting an infinitive or ending a sentence with a preposition, have been properly consigned to the dungheap by good writers and editors. Nevertheless, they’re tenacious enough to have been dubbed “zombie rules” because they don’t stay dead.

Me at work in my EDITOR shirt

While that first editorial job introduced me to the potential for editorial arrogance, it also presented a couple of antidotes. One was Theodore Bernstein’s The Careful Writer: A Modern Guide to English Usage. My paperback copy is in two pieces from years of frequent consultation. Since it was first published in the mid-1960s, it’s no longer quite as “modern,” but it’s still a good antidote for editors, educators, and other word people who are sometimes tempted to take ourselves and our esoteric knowledge a little too seriously. Bernstein is also the author of Miss Thistlebottom’s Hobgoblins: The Careful Writer’s Guide to the Taboos, Bugbears, and Outmoded Rules of English Usage, which I think is still in print.

Most important, that job required that each manuscript be “cleared”: you sat down side by side with the writer and went through the whole ms. line by line, answering the writer’s questions and explaining why you’d made this or that change. (These were pamphlets, brochures, training manuals, and such, ranging up to perhaps 40 pages in length, not full-length books.) These writers weren’t pros. Some were definitely more capable than others, and it wasn’t uncommon for the less capable to be the most defensive about edits. I learned to justify every change I made to myself so that I could explain it clearly to the writer.

When freelancing for trade publishers these days, I have zero direct contact with the authors of the book-length mss. I work on, but I know they’re going to see the edits I make and the queries I write. On most other jobs, I do deal directly with the author, but almost exclusively by email. That early experience has stood me in very good stead over the decades: I never forget that there’s a real human being on the other side of the manuscript.

For more about that first staff editor job, including how I got that T-shirt, see “1979: I Become an Editor” in my new blog, The T-Shirt Chronicles.

S Is for Style

Over the years of working with English as an editor and writer I’ve learned to be careful of the words “right” and “wrong.” When asked if something is right or not, I often begin with “It depends” — on your intended audience, on context, on which side of “the pond” (aka the Atlantic Ocean) you’re on, and so on.

We talk about the “rules of grammar” as if they’re hard, fast, and uncompromising, but they aren’t. Even the basic ones have their exceptions. Take “subject-verb agreement.” The subject should always agree with its verb in number, right? Most of the time, yes, but some nouns can be singular or plural depending on how they’re being used. For example, couple and family take a singular verb when referring to the unit, but a plural verb when their members are being emphasized.

The same principle applies to majority and many other words denoting groups of persons, places, or things: is the word referring to the group as a whole or to its constituent parts? (Tip: Is it preceded by the definite article the or the indefinite a(n)?)

  • The majority has voted to replace the bridge.
  • A majority (of participants or whatever) are coming to the party.
Arbiters of style, in hardcopy

Which brings me around to style. Style is far more flexible than grammar, and for this very reason publications, publishers, and academic disciplines adopt distinctive styles. These are often based on one of the major style guides. Most U.S. publishers use the Chicago Manual of Style, often with their own additions and exceptions. Most U.S. newspapers and periodicals start with the AP Stylebook. (AP stands for Associated Press, a nonprofit news agency that dates back to the mid-19th century.) Other common styles include MLA (Modern Language Association), which is especially popular in the humanities, and APA (American Psychological Association), widely used in the social sciences.

I’m on a first-name basis with Chicago, having used it since 1979, and I have a nodding acquaintance with AP. A significant difference between the two is in how they handle numbers. Chicago generally spells out numbers through one hundred. AP spells out one through nine but uses figures for 10 and up. Another is in the use of italics: Chicago employs them in a variety of ways, notably for titles of books, films, and other full-length works. AP style doesn’t use them at all. Before the digital age, italics couldn’t be transmitted “over the wires,” so AP style developed without them (and without boldface, for the same reason).

Unsurprisingly, Chicago, MLA, and APA styles devote a lot of attention to citations. All three are widely used by academics, whose writing is based on previously published work or unpublished work that can be found in manuscript collections. (Chicago began as the style guide of the University of Chicago Press. Though it’s widely used by trade publishers and even fiction writers, its scholarly origins are obvious in the chapters devoted to quotations and citation style.)

It’s no surprise either that AP devotes virtually no attention to footnotes, endnotes, and bibliographies. Reporters may quote from public documents, but their primary sources are interviews and public statements. They may have recorded backup, or they may rely on notes scribbled the old way in a notebook.

So what does this mean to you? The top two lessons I’ve learned over the years as a writer and editor are (1) right and wrong, correct and incorrect, are shiftier than one learns in school, and (2) nevertheless, rules and conventions are important. The better you know them, the more command you’ll have over your writing — which is a big plus when you decide to stretch, bend, or break them.

For U.S. writers of general nonfiction, creative nonfiction (e.g., memoir), and fiction, the Chicago Manual of Style is a good place to start. No, you don’t need to read it straight through. (I never have, and there are a couple of chapters that I’ve rarely ever looked at.) The further you get from scholarly nonfiction, the more flexible you should be about applying its recommendations. As I keep saying, these are guidelines, not godlines.

When I’m working, I usually have three dictionaries — Merriam-Webster’s, American Heritage, and Oxford/UK — open in my browser, along with the Chicago Manual of Style. I subscribe to the AP Stylebook and consult it from time to time. This reminds me continually that even “the authorities” differ. For colloquialisms and current slang, Google is only a click away.

I just realized that I haven’t said a thing about style sheets. Fortunately I wrote about them at some length a few years ago: “What’s a Style Sheet?” Short version: A style sheet is for keeping track of all the style choices one makes when copyediting a manuscript. It includes general choices about the styling of, e.g., numbers and the use of quote marks and italics. It also includes words, dozens of words: unusual words, words that aren’t in the dictionary, words for which there is more than one spelling. In biographies and history books, the list of personal names might be as long as the word list. When I turn in the completed copyediting job, my style sheet goes with it. When I receive a proofreading job, I get the copyeditor’s style sheet too.

For writers, keeping a style sheet is a handy way to maintain consistency, especially in a novel or other book-length work. It can also remind you to check the spelling of names and places. Publishers don’t encourage authors to submit style sheets with their manuscripts, but I wish they did.

G Is for Grammar

Grammar scares the hell out of many people. In the very late 1990s, when I started participating in online groups that weren’t oriented to editors and/or writers, people would sometimes apologize to me for their bad grammar or spelling. Once in a while someone would attack me for making them feel inferior. I was mystified. For one thing, their grammar wasn’t bad at all, and for another I wasn’t criticizing anyone’s grammar, spelling, or anything else.

Then I got it: I was using the same sig line I used in online groups of writers, editors, and other word people. It identified me as an editor. I cut “editor” out of my sig line. The apologies and attacks stopped.

Grammar gets a bad rap. (NB: I just took a little detour to look up “bad rap,” like why isn’t it “bad rep,” as in “reputation”? Check it out on the Merriam-Webster’s website.) Plenty of us learned in school that there’s only one right way to write and every other way is substandard. Taken to heart, that’s enough to paralyze anybody.

There’s no shortage of people who’ll sort you into a category according to how you speak or write. (Take a break here if you like to listen to “Why Can’t the English?” from My Fair Lady.) A common assumption seems to be that editors all come from this judgmental tribe. While it’s true that most of us who become editors were language adepts in school — we spot grammatical errors and misspellings as readily as musicians detect sour notes in a concert — the best editors I know put serious effort into learning more about how our language is used in the real world, and how writers use it.

Some grammars are descriptivist: they describe how a language is used by its speakers. Others are prescriptivist: they tell speakers of a language how they ought to be using it. Language changes over time, no doubt about it. It also varies across different populations, which is why both writers and editors need to consider the audience for whatever they’re working on.

Think of grammar as a tool in your toolkit. As tools go, it’s a pretty complex one and takes a while to master — it’s more like a piano than a screwdriver. On the other hand, a sentence has fewer moving parts than the human body, so learning the parts of speech takes a lot less time than learning all the bones and muscles. Understanding how the parts are supposed to work together makes it easier to recognize when a sentence isn’t working, how to fix it, and how to explain it all to someone else.

If you never learned to diagram sentences in school, or even if you did, you might find that diagramming helps you visualize how the parts of a sentence fit together. There are plenty of how-tos online, including this one.

Since my first editorial job four decades ago, my go-to reference for grammar questions has been Words Into Type. It hasn’t been revised in just about that long, though, and it can be hard to find, so I asked some editorial colleagues what their favorite references were. Here are a few of them:

  • The Copyeditor’s Handbook, 4th ed., by Amy Einsohn and Marilyn Schwartz, University of California Press. I’ve got the 3rd edition, the last one Amy completed solo before her death in 2014. And no, it’s not just for copyeditors.
  • Good Grief, Good Grammar: The Business Person’s Guide to Grammar and Usage, by Dianna Booher, Ballantine Books
  • The Blue Book of Grammar and Punctuation, by Jane Straus, Lester Kaufman, and Tom Stern, Wiley
  • The Gregg Reference Manual, by William Sabin, McGraw-Hill
  • The Little, Brown Handbook, by H. Ramsey Fowler, Jane E. Aaron, and Michael Greer, Pearson
  • The Chicago Manual of Style, 17th ed., University of Chicago Press. Also available by subscription online. I’ve been using it since the 12th edition, when it was still called A Manual of Style.
My go-to reference books

U Is for Usage

People are regularly accused of not knowing their grammar when the real issue is a possibly shaky grasp of usage.

Here’s Bryan Garner, whom I’ve invoked more often in the last week or so than in the previous 10 years, on grammar: “Grammar consists of the rules governing how words are put together into sentences” (Chicago Manual of Style, 16th ed., section 5.1).

And here he is on usage: “The great mass of linguistic issues that writers and editors wrestle with don’t really concern grammar at all — they concern usage: the collective habits of a language’s native speakers” (CMS 16, section 5.216).

Language eddies and ripples and never stops moving.

Those collective habits tend to change a lot faster than the underlying rules. Think of a river, a pond, or the ocean: the surface sparkles and ripples and can be quite turbulent, while what’s underneath moves more sedately or maybe not at all.

Usage isn’t uniform across speakers of a particular language either. Nowhere close. Much has been made of the differences between British English (BrE) and American English (AmE), but both BrE and AmE include great internal diversity, by nation, region, and other factors.

Usage that raises no eyebrows in one field may seem clunky, appalling, or even incomprehensible in another. Recently an editor queried the editors’ e-list we’re both on about a use of “interrogate” that raised her hackles; in her experience, suspects could be interrogated but not theories. Those of us who regularly edit in certain academic disciplines assured her that in those fields theories can be interrogated too.

The editors’ groups I’m in are not only international, they include editors from many fields, genres, and disciplines. So when we ask if a certain usage is OK or not, we mention the intended audience for whatever we’re working on: fiction or nonfiction? AmE or BrE? academic discipline? subject matter? Is the tone informal or formal?

Colloquialisms and, especially, slang can be especially tricky. Slang often arises within a particular group, and part of its purpose is to set that group off from others. A word that means one thing in the wider world may mean something else within the group. By the time the wider world catches on, it’s passé within the group. This poses a challenge for, say, novelists writing for teenagers and young adults, who in every generation come up with words and phrases that set the adults’ teeth on edge: how to come across as credible when, by the time the book appears in print (usually at least a year after it’s turned in to the publisher), the dialogue may strike its target audience as ridiculously outdated?

Dog in driver's seat

I’ll assure my insurance company that I’m wise enough to ensure my safety by not letting Travvy drive my car.

In English, usage gaffes often result when words sound alike; when their meanings are related, the potential for confusion grows. Consider this sentence: “I assured my friends that I’d ensured my own safety by insuring my car against theft.” “Insure” appears regularly for “ensure,” which means, more or less, “guarantee,” and to make it even more fun, my car is insured through Plymouth Rock Assurance — which works, sort of, because they’re assuring me that I’ll be covered in case of an accident.

By Googling frequently confused words I turned up lots of lists, including this one from the Oxford Dictionaries site. I see most of them pretty regularly, the exceptions being the ones that involve distinctively BrE spellings or words, like “draught,” “kerb,” and “barmy.” It’s missing one that I see a lot: reign/rein. The expression “rein in” has come adrift from its origin, which has to do with horses. “Reign in” sounds exactly the same, but written down it doesn’t make sense.

Working editors, especially copyeditors, store all these frequently confused words in our heads. We’re always adding to the collection — and discussing whether a particular word has graduated from confusable to acceptable, at least in certain quarters. These discussions can get quite heated.

The English-language dictionaries most commonly used these days are descriptive, not prescriptive. That is, they describe how speakers are actually using the words, not how they should be using the words. Take “imply” and “infer”: I can imply (suggest or hint) that something is true, but you can infer (deduce or understand) that I don’t believe it. “Infer” is used to mean “imply” often enough that this is listed as a meaning in both Merriam-Webster’s and American Heritage, though both dictionaries include a cautionary note about this usage.

My editorial mentor, circa 1980, railed against the use of “target” as a verb, which to me at the time, a generation younger, seemed unexceptional. A few years later, however, I and others were railing against the use of “impact” as a verb. What the hell’s wrong with “affect”? we asked. I’ve pretty much given up on that one, though I don’t use it myself.

I cheer loudly whenever an author uses “comprise” correctly, which isn’t very often, but mostly I’ve given up on that one too. Once upon a time the whole comprised the parts, and the list of parts was assumed to be comprehensive. If it wasn’t, you used “include.” So few people remember that distinction that if it’s important for readers to know that the list of parts is comprehensive, you’d better not rely on “comprise” alone to get the idea across.

Similarly, I was well on in my editorial career when I learned that “dogs such as Alaskan malamutes and Siberian huskies” was assumed to include malamutes and huskies, whereas “dogs like Alaskan malamutes and Siberian huskies” did not, presumably with the rationale that malamutes are not like malamutes; they are malamutes. No way would I expect a general readership, even a literate, well-informed readership, to know this.

At the same time, I do occasionally feel a little smug because I’ve got all this esoterica stored in my head. But I do try to keep it under control when I’m editing.

G Is for Grammar

We’re so quick to say that someone “doesn’t know their grammar” that it might be surprising how many of us aren’t entirely sure just what “grammar” is. This would include me. I just had to look it up (again). Here is what Bryan A. Garner, author of the “Grammar and Usage” chapter of the Chicago Manual of Style, has to say:

Grammar defined. Grammar consists of the rules governing how words are put together into sentences. These rules, which native speakers of a language learn largely by osmosis, govern most constructions in a given language. The small minority of constructions that lie outside these rules fall mostly into the category of idiom and usage.

In the very next paragraph he notes that “there are many schools of grammatical thought,” that “grammatical theories have been in great flux in recent years,” and that “the more we learn the less we seem to know.”

Button: grammar police enforce the syntax

Not to worry about all this flux and multiplicity, at least not too much. A couple of things to keep in mind, however, when someone accuses you of not knowing your grammar or when, gods forbid, you are tempted to accuse someone else: (1) spelling and punctuation are not grammar, and (2) some of the rules you know are bogus.

If you’re not sure which of the rules you know are bogus — well, I just Googled bogus grammar rules (without quote marks) and got 338,000 hits. Bogus rules are the ones we generally don’t learn by osmosis. They are stuffed down our throats by those in authority, often teachers or parents.

At the top of almost everybody’s list are the injunctions against splitting an infinitive and ending a sentence with a preposition. They’ve both been roundly debunked, but I still get asked about one or the other from time to time so I’m pretty sure they’re not dead yet. Plenty of writers and even editors still get anxious when a “to” is split from its verb or a preposition bumps up against a period/full stop.

The general purpose of bogus rules is not to help one write more clearly; it’s to separate those who know them from those who don’t. As literacy spread and anyone could learn to read and write, the excruciatingly well-educated upper classes confronted a dilemma: how in heaven’s name can we tell US from THEM? Hence the rules — about language, etiquette, and various other things.

Note, however, that the uppermost class can generally get away with anything, so the ones who follow and strive to enforce the bogus rules are often those a notch or two below in the pecking order. That’s how they demonstrate their loyalty to those at the top. Watch out for them.