Geoff Nunberg

Geoff Nunberg is the linguist contributor on NPR's Fresh Air with Terry Gross.

He teaches at the School of Information at the University of California at Berkeley and is the author of The Way We Talk Now, Going Nucular, Talking Right and The Years of Talking Dangerously. His most recent book is Ascent of the A-Word. His website is www.geoffreynunberg.com.

"Infobesity," "lumbersexual," "phablet." As usual, the items that stand out as candidates for word of the year are like its biggest pop songs, catchy but ephemeral. But even a fleeting expression can sometimes encapsulate the zeitgeist. That's why I'm nominating "God view" for the honor.

To judge from some of the headlines, it was a very big deal. At an event held at the Royal Society in London, for the first time ever, a computer passed the Turing Test, which is widely taken as the benchmark for saying a machine is engaging in intelligent thought. But like the other much-hyped triumphs of artificial intelligence, this one wasn't quite what it appeared. Computers can do things that seem quintessentially human, but they usually take a different path to get there.

A lot of things had to come together to turn Thomas Piketty's controversial Capital in the Twenty-First Century into the tome of the season. There's its timeliness, its surprising accessibility and the audacity of its thesis, that capitalism inevitably leads to greater concentrations of wealth at the very top.

"There goes the neighborhood." Every so often that cry goes up in San Francisco, announcing a new chapter in American cultural history, as the rest of the country looks on. There were the beats in North Beach, then the hippies in the Haight, then the gays in the Castro. Now it's the turn of the techies who are pouring into my own Mission neighborhood, among other places. Only this time around, the green stuff that's perfuming the air is money, not weed.

When I took the SATs a very long time ago, it didn't occur to us to cram for the vocabulary questions. Back then, the A in SAT still stood for "aptitude," and most people accepted the wholesome fiction that the tests were measures of raw ability that you couldn't prepare for — "like sticking a dipstick into your brain," one College Board researcher said.

I feel a little defensive about choosing "selfie" as my Word of the Year for 2013. I've usually been partial to words that encapsulate one of the year's major stories, such as "occupy" or "big data." Or "privacy," which is the word Dictionary.com chose this year. But others go with what I think of as mayfly words — the ones that bubble briefly to the surface in the wake of some fad or fashion.

Even taken together, the charges didn't seem to amount to that big a deal — just a matter of quoting a few factual statements and a Wikipedia passage without attributing them. But as Rand Paul discovered, the word "plagiarism" can still rouse people to steaming indignation. Samuel Johnson called plagiarism the most reproachful of literary crimes, and the word itself began as the name of a real crime. In Roman law, a plagiarius was someone who abducted a child or a slave — it's from "plaga," the Latin word for a net or a snare.

Evidently it was quite fortuitous. Just a couple of days after MTV's Video Music Awards, Oxford Dictionaries Online released its quarterly list of the new words it was adding. To the delight of the media, there was "twerk" at the top, which gave them still another occasion to link a story to Miley Cyrus' energetic high jinks.

The likes of you and me can't buy Google Glass yet. It's available only to the select developers and opinion-makers who have been permitted to spring $1,500 for the privilege of having the first one on the block. But I've seen a few around my San Francisco neighborhood among the young techies who commute down to the Google and Facebook campuses in WiFi-equipped shuttle buses or who pedal downtown to Zynga and Twitter on their fixies.

"This is just metadata. There is no content involved." That was how Sen. Dianne Feinstein defended the NSA's blanket surveillance of Americans' phone records and Internet activity. Before those revelations, not many people had heard of metadata, the term librarians and programmers use for the data that describes a particular document or the record it's linked to.

Mass shootings, bus crashes, tornadoes, terrorist attacks — we've gotten adept at talking about these things. Act of God or act of man, they're all horrific. At least that was the word you kept hearing from politicians and newscasters describing the Boston bombings and the explosion at the fertilizer plant in Texas.

It's a funny thing about dictionaries. First we're taught to revere them, then we have to learn to set them aside. Nobody ever went wrong starting a middle-school composition with, "According to Webster's ..." but that's not how you start an op-ed commentary about terrorism or racism. When it comes to the words that do the cultural heavy lifting, we're not about to defer to some lexicographer hunched over a dusty keyboard.

Has there ever been an age that was so grudging about suspending its disbelief? The groundlings at the Globe Theatre didn't giggle when Shakespeare had a clock chime in Julius Caesar. The Victorians didn't take Dickens to task for having the characters in A Tale of Two Cities ride the Dover mail coach 10 years before it was established. But Shakespeare and Dickens weren't writing in the age of the Internet, when every historical detail is scrutinized for chronological correctness, and when no "Gotcha!" remains unposted for long.

Where does the phrase "the whole nine yards" come from? In 1982, William Safire called that "one of the great etymological mysteries of our time."

He thought the phrase originally referred to the capacity of a cement truck in cubic yards. But there are plenty of other theories.

"Big Data" hasn't made any of the words-of-the-year lists I've seen so far. That's probably because it didn't get the wide public exposure given to items like "frankenstorm," "fiscal cliff" and YOLO.

Mitt Romney was on CNN not long ago defending the claims in his campaign ads — "We've been absolutely spot on," he said. Politics aside, the expression had me doing an audible roll of my eyes. I've always associated "spot on" with the type of Englishman who's played by Terry-Thomas or John Cleese, someone who pronounces "yes" and "ears" in the same way — "eeahzz." It shows up when people do send-ups of plummy British speech. "I say — spot on, old chap!"

When you consider how carefully staged and planned the debates are and how long they've been around, it's remarkable how often candidates manage to screw them up. Sometimes they're undone by a simple gaffe or an ill-conceived bit of stagecraft, like Gerald Ford's slip-up about Soviet domination of eastern Europe in 1976, or Al Gore's histrionic sighing in 2000. Sometimes it's just a sign of a candidate having a bad day, like Ronald Reagan's woolly ramblings in the first debate with Walter Mondale in 1984.

I have a quibble with the title of David Skinner's new book, The Story of Ain't. In fact, that pariah contraction plays only a supporting role in the story. The book is really an account of one of the oddest episodes in American cultural history, the brouhaha over the appearance of Webster's Third New International Dictionary in 1961.

People are saying that Mitt Romney's selection of Wisconsin Rep. Paul Ryan as his running mate creates an opportunity to hold what Ryan likes to call an "adult conversation" about entitlement spending. In the present political climate, it would be heartening to have an adult conversation about anything. But bear in mind that "entitlement" doesn't put all its cards on the table. Like a lot of effective political language, it enables you to slip from one idea to another without ever letting on that you've changed the subject.

Sometimes it's small government you need to keep your eye on. Take Middleborough, Mass., whose town meeting recently imposed a $20 fine for swearing in public. According to the police chief, the ordinance was aimed at the crowds of unruly teenagers who gathered downtown at night yelling profanities at people, not just someone who slams a finger in a car door. But whatever the exact idea was, nobody thought it was a good one.

There was something anticlimactic to the news that the AP Stylebook will no longer be objecting to the use of "hopefully" as a floating sentence adverb, as in, "Hopefully, the Giants will win the division." It was like seeing an obituary for someone you assumed must have died around the time that Hootenanny went off the air.

"My choice of words was not the best," Rush Limbaugh said in his apology. That's the standard formula for these things — you apologize not for what you said but for the way you said it.

If the word of the year is supposed to be an item that has actually shaped the perception of important events, I can't see going with anything but occupy. It was a late entry, but since mid-September it has gone viral and global. Just scan the thousands of hashtags and Facebook pages that begin with the word: Occupy Wall Street, Occupy Slovakia. Occupy Saskatoon, Sesame Street, the Constitution. Occupy the hood.

Steve Jobs did his last product launch last March, for the iPad 2. At the close, he stood in front of a huge picture of a sign showing the intersection of streets called Technology and Liberal Arts.

It was a lifelong ideal for Jobs, the same one that had drawn him to make his famous 1979 visit to the Xerox Palo Alto Research Center, or Xerox PARC for short. That was where a group of artistically minded researchers had developed the graphical user interface, or GUI, which Apple's developers were to incorporate into the Lisa and the Macintosh a few years later.