Brian Christian’s feature article in the current issue of The Atlantic, “Mind vs. Machine,” is billed on the cover as “Why Machines Will Never Beat the Human Mind,” which nicely captures the distance between what the magazine’s editors think will sell and Christian’s rather different point, reflected in the title of his forthcoming book, from which the piece was adapted: The Most Human Human: What Talking With Computers Teaches Us About What It Means to Be Alive. You can win an award for being “the most human human,” as Christian himself has done, if you participate in the Loebner Prize, an annual event that recreates the Turing Test, and snatch victory from the jaws of artificial-intelligence engineers more frequently than the other human contestants do. The crux of Christian’s report is that what makes the Turing Test compelling is the insight that it generates into human complexity. The funhouse aspect of the exercise — trying to fool judges into thinking that they’re talking to people when they’re in fact talking to machines — makes for good headlines, but if Christian is correct, we can expect a jockeying back and forth between man and motherboard in which human beings, regularly losing the title to ever-smarter computers, just as regularly figure out how to win it back.
Christian reminds us of Alan Turing’s brilliant condensation of the thorny question that emerged after World War II: would the new computing machines ever be capable of thought?
Instead of debating this question on purely theoretical grounds, Turing proposed an experiment. Several judges each pose questions, via computer terminal, to several pairs of unseen correspondents, one a human “confederate,” the other a computer program, and attempt to discover which is which. The dialogue can range from small talk to trivia questions, from celebrity gossip to heavy-duty philosophy — the whole gamut of human conversation. Turing predicted that by the year 2000 computers would be able to fool 30 percent of human judges after five minutes of conversation, and that as a result, one would “be able to speak of machines as thinking without expecting to be contradicted.”
The millennium turned without smiling on Turing’s forecast, but, in 2008, a computer program came within a hair of winning the Loebner Prize. This inspired Christian to participate in 2009 — and not only that, but to go for the “most human human” award while he was at it. There was nothing frivolous about his undertaking; it’s quite clear that he didn’t sign up for the test so that he could write a breezy article about it. He was motivated by the fear that human beings were giving up too easily — weren’t, in fact, trying to win. While the AI teams poured boundless time and effort into the design of their simulators, the human confederates were being advised, fatuously, to “just be yourself.” As Christian says, it’s hard to tell whether this pap reflected an exaggerated conception of human intelligence or an attempt to fix the fight in the machines’ favor.
To the extent that “just be yourself” means anything, it is better expressed in one word: “Relax.” That’s what coaches always seem to be telling their athletes before the big fight, and for highly trained minds and bodies, it’s probably sound. You can’t show your stuff to true advantage if you’re worrying about what you’ve got. But ordinary people — this is what “ordinary” means — don’t have any stuff to show. What “just be yourself” says to them is “don’t sweat it.” So, on one side, we have ardent engineers, with their brilliant insights and excruciating attention to detail — and probably some serious funding. On the other, “don’t sweat it.” Rocket scientists versus slackers — not much of a contest.
Christian doesn’t follow this peculiar asymmetry (not in The Atlantic, anyway), but what’s at work here is the same decayed snobbishness with which the Educational Testing Service insists that special preparatory courses and other preliminary efforts are irrelevant to success on its examinations. This is patently untrue, but the cachet of the ETS achievement and aptitude tests remains bound up in the idea that success in life does not require specialized training. This was the lesson taught to us by the great English gentlemen of Victorian fact and fiction, men who, by following their whims as far as fortune allowed, acquired skills and insights of almost universal application. Boy Scouts varied this theme by straining to remain semper paratus while carrying the lightest backpack. Executive suites are still stuffed with affable generalists who have learned what they know about life from playing golf. In this clubby atmosphere, study and preparation, “boning up” of any kind, look like a kind of cheating.
Even Christian is blown sideways by the gale force of this prejudice; he sounds crashingly unsportsmanlike:
And so another piece of my confederate strategy fell into place. I would treat the Turing Test’s strange and unfamiliar textual medium more like spoken English, and less like the written language. I would attempt to disrupt the turn-taking “wait and parse” pattern that computers understand, and create a single, flowing duet of verbal behavior, emphasizing timing. If computers understand little about verbal “harmony,” they understand even less about rhythm.
If nothing was happening on my screen, whether or not it was my turn, I’d elaborate a little on my answer, or add a parenthetical, or throw a question back at the judge — just as we offer and/or fill audible silence when we talk out loud. If the judge took too long composing the next question, I’d keep talking. I would be the one (unlike the bot) with something to prove. If I knew what the judge was about to write, I’d spare him the keystrokes and jump in.
It’s almost funny how shot through this passage is with the air of deception. All these conscious little tricks, all designed to “fool,” you almost think, the judge into regarding Christian as exactly what he is: a real person.
“Mind vs. Machine” shares some invaluable observations about vernacular discourse. In heated exchanges, for example, people respond more and more exclusively to whatever has just been said, and less and less to the overall tenor of the argument. Researcher (and three-time winner of the “most human computer” prize) Richard Wallace has discovered that “most casual conversation is ‘state-less,’ that is, each reply depends only on the current query, without any knowledge of the history of the conversation required to formulate the reply.” This is a windfall for programmers, because a sudden lurch into ill-tempered language is all-too-convincing evidence of a human-nature tantrum, and very easy for a machine to fake. Christian draws a very practical lesson:
Aware of the stateless, knee-jerk character of the terse remark I want to blurt out, I recognize that that remark has more to do with a reflex reaction to the very last sentence of the conversation than with either the issue at hand or the person I’m talking to. All of a sudden, the absurdity and ridiculousness of this kind of escalation become quantitatively clear, and, contemptuously unwilling to act like a bot, I steer myself toward a more “stateful” response: better living through science.
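Wallace’s finding is easy to make concrete. Here is a minimal sketch, entirely my own illustration rather than anything from Wallace’s ALICE or from Christian’s article, of a stateless responder: each retort is computed from the current message alone, so the program can sustain a quarrel indefinitely without remembering a single earlier line of the conversation.

```python
import re

# A stateless responder: every reply is a function of the current
# message alone. No transcript or conversation state is kept anywhere.
RULES = [
    (re.compile(r"\byou (never|don't|won't)\b", re.I),
     "Oh, so now it's all my fault?"),
    (re.compile(r"\bcalm down\b", re.I),
     "Don't tell me to calm down!"),
    (re.compile(r"\?\s*$"),
     "Why do you keep asking me that?"),
]
DEFAULT_RETORT = "Whatever. Have it your way."

def reply(message: str) -> str:
    """Map the last message, and only the last message, to a retort."""
    for pattern, retort in RULES:
        if pattern.search(message):
            return retort
    return DEFAULT_RETORT

if __name__ == "__main__":
    for line in ("You never listen to me!",
                 "Please calm down.",
                 "What do you want from me?"):
        print("judge:", line)
        print("bot:  ", reply(line))
```

Give reply a second argument carrying the transcript and you have crossed into the “stateful” territory Christian steers himself toward; leave it out, and the bot bickers as convincingly as any of us at our worst.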
I hope that Christian’s book will make that final point more clearly and happily than his article does. I was deeply put off by a passage that I read before I knew what Christian was up to, when, that is, it seemed that he was doing nothing more interesting than moaning about the possibility that we might some day be overtaken by our mechanical creations.
The story of the 21st century will be, in part, the story of the drawing and redrawing of those battle lines, the story of Homo sapiens trying to stake a claim on shifting ground, flanked by beast and machine, pinned between meat and math.
That’s pungent prose, but the metaphor of military conflict could hardly be less welcome — or less apposite to Christian’s far more gracious point, which is that computers, instead of supplanting us, can show us how to be better at what we already are.
***
If you believe that human beings are the Lords of Creation, then there is nothing to worry about when you sit down to dinner; but if you believe rather that we’re just one species among many, then eating becomes tragic, because it requires us to kill. My own view is that the only way to draw a line between eating flesh and eating anything at all is to subscribe to a variant of the pathetic fallacy, according to which animals, being more like us than plants, merit kinder treatment — so it’s okay to finish your vegetables. We have to wonder what the editors of The Atlantic were thinking when they assigned a passel of recent “foodie” books to B. R. Myers, the Green and vegan professor of North Korean literature. Oh, they were probably hoping for exactly what he delivered: a steaming denunciation of the lot. It’s easy to see why the prim Myers would dislike the louche Anthony Bourdain or the spiritual Kim Severson. But Michael Pollan?
The moral logic in Pollan’s hugely successful book now informs all food writing: the refined palate rejects the taste of factory-farmed meat, of the corn-syrupy junk food that sickens the poor, of frozen fruits and vegetables transported wastefully across oceans — from which it follows that to serve one’s palate is to do right by small farmers, factory-abused cows, Earth itself. This affectation of piety does not keep foodies from vaunting their penchant for obscenely priced meals, for gorging themselves, even for dining on endangered animals — but only rarely is public attention drawn to the contradiction.
If you can find a passage in which Michael Pollan endorses any of the crimes enumerated in the second sentence, please write to Myers to thank him for the tip. Otherwise — and I’m fairly confident that it will have to be “otherwise” — you must still deal with Myers’s attack on everyone else mentioned in his review. I feel none of Myers’s hostility to today’s chic food writers, but I have lost interest in what they have to say, partly because they don’t begin to be honest about the economic elitism that underpins their outlook — those simple, slow-food pleasures are luxury goods, and always will be — and partly because, without getting excited about it, I do agree with Livy (referenced by Myers) that “the glorification of chefs” is probably unhealthy. Writing about food ought to be modest — that’s one of the appeals of Julia Child’s books. Child agreed with the fundamental French precept that there is one (1) right way to do everything, and she sought to convey the rules as clearly as possible to heterodox Americans; but she never raised her voice or succumbed to rapture. Today’s foodies haven’t got Child’s good manners.
The more lives sacrificed for a dinner, the more impressive the eater. Dana Goodyear: “Thirty duck hearts in curry — The ethos of this kind of cooking is undeniably macho.” Amorality as ethos, callousness as bravery, queenly self-absorption as machismo; no small perversion of language is needed to spin heroism out of an evening spent in a chair.
Well, I couldn’t put it down.
***
In his favorable review of Sean McMeekin’s The Berlin-Baghdad Express: The Ottoman Empire and Germany’s Bid for World Power, Christopher Hitchens identifies the people who ought to read this book (which would include me):
If asked to discuss some of the events of that period that shaped our world and the world of Osama, many educated people could cite T. E. Lawrence’s Arab Revolt, the secret Anglo-French Sykes-Picot Agreement portioning out the post-war Middle East, and the Balfour Declaration, which prefigured the coming of the Jewish state. But who can speak with confidence of Max von Oppenheim, the godfather of German “Orientalism” and a sponsor of holy war? An understanding of this conjuncture is essential. It helps supply a key to the collapse of the Islamic caliphate — bin Laden’s most enduring cause of rage — and to the extermination of the Armenians, the swift success of the Bolshevik Revolution, and the relative independence of modern Iran, as well as the continuing divorce between Sunni and Shia Muslims.
Check!