US Represented

Turn Off the Bubble Machine

Requiem for Literacy

I first noticed that the medium was indeed supplanting the message in 1967. Just back in the world, I was spending a lot of time at demonstrations against the war in Vietnam. We’d march around and hold our fists in the air, with or without the two-digit peace sign, and chant various slogans. Sometimes the cops would roust us. Sometimes there was tear gas. But always, after the demonstration ended, we’d disperse to our various abodes in time to catch the 6 o’clock news to see if the demonstration had really happened. If it didn’t make the news, then it hadn’t.

After all, if you spent the afternoon marching in front of a bunch of offices and Federal buildings, some few hundred largely indifferent people might have seen you and some few, perhaps, might have been moved by your chants. But if you made the evening news, then millions of largely indifferent people would see you. You would have been real. By virtue of being videotaped. No tape, no demonstration. Everyone felt that way, so completely that we never even talked about it, let alone questioned it. Television was reality.

As the years went on, I began to understand how this had come about, mainly thanks to Jerry Mander’s great book Four Arguments for the Elimination of Television. I began to write about my own observations of television’s power to shape or replace people’s individual perceptions of reality with perceptions that served the interests of those who owned or controlled broadcasting rights or who paid for broadcast content. I also wrote a lot about what television language, which was essentially the language of advertising, was doing to the English language. My observations had about the same effect as Mander’s, which is to say none.

Our words had no power to stem the onslaught of television, or of the internet that has succeeded it, incorporated it, and will shortly replace it, because television, by the time we began trying to write about its ill effects, had begun to reduce literacy. The internet has continued to contribute to that reduction in a number of ways. I wrote many of these pieces while still harboring some faint, hardly conscious hope that they might at least raise their readers’ awareness of their surroundings. I no longer harbor such a hope. Billions world-wide have embraced these media of so-called communication with wholehearted fervor, and wouldn’t listen to any criticism of their precious “devices” even if they could read it, which, increasingly, they will be unable to do.

In Sleeping Dogs, Thomas Perry’s protagonist, who has been living in England for ten years, returns to New York City, where he observes ” . . . the spray-painted graffiti on the walls. The colors had gotten better, the viridian greens and new shades of orange, and the gold and silver metal-flake, but the script was now so ornate that he couldn’t read any of it. When it occurred to him that it might be a different language, he decided it should still be organized into words. It looked more like the samples of Sumerian and Phoenician . . . than like any modern language . . . . The gradual replacement of words with colors and pictures had accelerated during his time away, and made moving around a kind of puzzle.”

That was written thirty years ago, and the replacement has continued to accelerate. If I pick up almost any piece of mechanical or electronic equipment today, its operations are indicated by tiny drawings, usually black on black or white on white or some other nearly unreadable combination, drawings whose meaning is not immediately apparent. Or, for that matter, apparent after long study. They might as well be Sumerian or Phoenician. Vestiges of language remain in the form of acronyms, which, like the names of this week’s Pop Stars, many people seem to recognize. So when I fail to interpret the headline, “SAGO statement on newly released SARS-CoV-2 metagenomics data from China CDC on GISAID,” it’s merely another sign that I’ve overstayed my welcome in this society.

Aldous Huxley saw pretty clearly what was coming, and explained why it was coming. He saw that technological development would have to be devoted increasingly to controlling a population far too numerous for such antique experiments as self-government:

“Self-government is in inverse ratio to numbers. The larger the constituency, the less the value of any particular vote. When he is merely one of millions, the individual elector feels himself to be impotent, a negligible quantity. The candidates he has voted into office are far away, at the top of the pyramid of power. Theoretically they are the servants of the people; but in fact it is the servants who give orders and the people, far off at the base of the great pyramid, who must obey. Increasing population and advancing technology have resulted in an increase in the number and complexity of organizations, an increase in the amount of power concentrated in the hands of officials and a corresponding decrease in the amount of control exercised by electors, coupled with a decrease in the public’s regard for democratic procedures. Already weakened by the vast impersonal forces at work in the modern world, democratic institutions are now being undermined from within by the politicians and their propagandists.

”’Both parties,’ we were told in 1956 by the editor of a leading business journal, ‘will merchandize their candidates and issues by the same methods that business has developed to sell goods. These include scientific selection of appeals and planned repetition . . . . Radio spot announcements and ads will repeat phrases with a planned intensity. Billboards will push slogans of proven power . . . . Candidates need, in addition to rich voices and good diction, to be able to look ‘sincerely’ at the TV camera.’”

Twenty years before he wrote that, Huxley had imagined that the glut of people, ever growing and ever rendered more inessential by the mechanization of mass production, could be most effectively governed by conditioning them into near infantile beings and keeping them perpetually distracted by idiotic but technically sophisticated amusements, freely available drugs and zipless sex. The alternate reality on offer today provides the same menu of distractions.

The written word is a very recently invented technology, so I suppose the human race can learn to do without it. If we are not also to learn to do without the history of our species or the ability to follow or create coherent sequential thoughts, we will have to relearn the powers of memorization that were vastly weakened by the adoption of written language and that have been eliminated from most school curricula, pilloried as “rote learning.”

I suppose that publishing these cries of alarm will be an exercise in futility. I’m sure that English will continue to be taught and studied in future schools for a while, but I expect it will, within a few more generations, descend to the condition occupied by Latin during my school days. A diminishing number of worshippers will keep teaching English, and a diminishing number of dweebs will take their classes, until they peter out completely, as the classical languages have done during my lifetime.

In a hundred years or so, there’ll be no one left able to read this or anything else written in English. This sentence will look like this to them: Uijt tfoufodf xjmm mppl mjlf uijt up uifn. Since they’ll have had no reason or demand to learn the alphabet, they won’t have the simple tool needed to decode it. Or anything else from the past, for that matter. The majority of humans have never felt much need to remember or consult the past, so perhaps, after all, nothing much will be new in the brave new world.

My, what a dire introduction. I remind myself of something Kurt Vonnegut said in one of his letters: “The late humorist James Thurber wrote a fable set in a medieval court, and he has the Royal Astronomer report that all the stars are going out! It turns out that he is simply going blind. I am probably making the same mistake.”

I hope I am making the same mistake, too. I’d hate to imagine my grandchildren and their grandchildren living in a world in which they no longer can have their awareness drawn to what they see and hear and taste and feel every day. I’d hate for them to take it for granted. I want them to be able to receive the messages that keep them from doing that, like this one:

Under the Vulture Tree
                –David Bottoms

We have all seen them circling pastures,
have looked up from the mouth of a barn, a pine clearing,
the fences of our own back yards, and have stood
amazed by the one slow wing beat, the endless dihedral drift.

But I had never seen so many so close,
hundreds, every limb of the dead oak feathered black,
and I cut the engine, let the river grab the boat
and pull it toward the tree.
The black leaves shined, the pale fruit blossomed
red, ugly as a human heart.
Then, as I passed under their dream, I saw for the first time
its soft countenance, the raw fleshy jowls
wrinkled and generous, like the faces
of the very old who have learned to empathize with everything.                                                                                                      

And I drifted away from them slow, on the pull of the river,
reluctant, looking back at their roost,
calling them what I’d never called them, what they are
those dwarfed transfiguring angels,
who flock to the side of the poisoned fox, the mud turtle
crushed on the shoulder of the road,
who pray over the leaf-graves of the anonymous lost,
with mercy enough to consume us all and give us wings.

***

The Perfect Drug

[This was written 37 years ago. Since then, average household tv viewing time has risen to 7 hours, 35 minutes. The hole in the ozone layer over Antarctica has decreased somewhat, but the loss of ozone in the stratosphere has continued. Large chunks of the polar caps are falling into the sea daily, and if much of the West is not rapidly becoming a desert, it’s giving a good imitation. Of course, it could all burn down before it achieves desert status.]

I got home from work that night a little after ten, pacified Ralph the Cat with his customary mound of reeking Kal Kan, flipped on the tube and collapsed in my chair to watch the news.

The local lead stories were done with. It was time for the national filler piece, pillaged from the 6 o’clock network news. Tonight’s story concerned chlorofluorocarbons. Some scientists had convened a news conference earlier in the day, after completing the most ambitious study ever undertaken of the effects of CFC’s on the ozone layer.

They presented computer-enhanced photographs of the ozone layer over the South Pole, showing large, gaping rents. They predicted some results of the rapidly proceeding destruction of the ozone layer: a 30% increase in skin cancers, dramatic climate changes, particularly in the temperate zones, which will become intemperate, not to say hostile to agriculture; these and other developments to occur within 30 years. From now.

They made these predictions as unequivocally as any scientists I’ve ever heard. They weren’t saying “maybe;” they were saying “will.” Questioned about U.S. use of CFC’s, their spokesman replied that while the U.S. had banned their use in aerosol propellants, other countries had not. “And there are no national boundaries in the atmosphere,” he added.

Then the anchorperson was back to add the local angle. “And for Colorado,” she said through her perpetual, mindless smile, “they say this means we’ll be living in a desert pretty soon!” Then it was time for the pre-commercial sports teaser. The sportscaster, on location at a rain-beleaguered golf tourney on Long Island, stood grinning in the deluge. “Well, John,” the anchorperson trilled, “no danger of a desert out where you are, is there?”

The commercial that followed concerned the peculiar virtue of Heileman’s Old Style beer: it is brewed with artesian water. Artesian water, a voice reminiscent of Orson Welles’ asserted orotundly, is uniquely pure because it has been trapped far below the earth’s surface for a whole bunch of millennia, and nothing has ever disturbed its purity. Naturally this was all hogwash, but “artesian” has a pure, dignified ring to it, and the message was accompanied by a few hundred thousand bucks worth of film of water trickling among some pretty old-looking rocks.

At this point, I turned off the tv and contemplated the six or so minutes of television I’d just experienced. Something about them seemed perfect.

***

Roof Leaking? No Problem!

Suppose George is sitting around in his home. His neighbor rushes in to inform him that his roof has a series of great holes in it. George looks up – it’s true! Great holes everywhere! All kinds of stuff is going to be coming down through them. He’d better do something!

“Why are you smiling?” George asks his neighbor.

“I hear it’s a beautiful day in Vladivostok!” she says, her smile unabated. This disorients George. Maybe, he thinks, he’s getting excessively worried about his roof. His neighbor’s next remark tends to confirm this speculation.

“Oh, I forgot,” she chirps merrily. “I have some lovely Amway products you should buy from me. Then let’s get drunk and naked and smear them all over ourselves!” And she starts pulling bottles of dishwashing fluid and Dom Perignon out of her décolletage.

Well, as Nelson Algren used to inquire, what would you do?

Obviously, whatever you did in the next few hours, pretty soon you’d set to doing something about your roof, if you had half a brain. Even if that same neighbor kept coming over, day and night, and kept coming on, you’d be a fool to let your house be destroyed.

I suppose you could just move in with your neighbor. That would work. Unless her roof had giant holes in it, too. Then, if you just kept rolling around on the floor, smashed out of your kug, people might begin to think you had some kind of a drug problem.

***

Some Kind of a Drug Problem

Enough analogies. What I experienced during those six minutes of television was:

1. A “story” about facts which exist in the objective world. In this case, the facts are life-threatening. They exist whether I ignore them or not. If I do ignore them, my child is going to be sucking roots for a living, at best. Information of the utmost urgency, demanding action of every sane individual.

2. An abrupt end to that story and a refocusing of my attention on another “story”– the wet golfer story, the facts of which also exist in the objective world, but which are of little consequence to anyone except the golfers.

3. An utterly bogus “message” designed to sell me a drug that will relieve any residual anxiety the CFC story might have left me with. All I’d have to do was run down to the corner store and grab a couple of six-packs.

Like many other recreational drugs, television gives its user the illusion of experience without the perils or rewards of experience. Like every other drug, television seems to “enhance” reality, granting its user visions unavailable to unaltered consciousness. (How else did I get to Long Island, or deep below the earth’s surface?) Like every other addict, the television user develops tolerance: he requires ever greater doses to achieve the desired effect. (Average time spent in front of television sets per household is now 7 hours per day; that time has risen about 3 minutes per year over the past 10 years.)

Other cultures than ours communally ingest mind-altering drugs and experience shared, collective visions. (See, for example, Andrew Weil’s account of one such South American culture in The Natural Mind.) They do so under carefully controlled conditions, and they do so rarely. Ours is the first culture to enter into a communal drug experience with apparently uncheckable abandon, supplanting normal consciousness with television consciousness more than seven hours of every day.

***

Who Owns the Ju-Ju?

There is another distinction between “primitive” psychotropic drug use and television use. The visions induced in South American rain forests are truly collective and communal; they spring from the interplaying minds of the people who choose to ingest the drug together. They do not, as television does, spring from the will and conscious intent of a set of interlocking conglomerates that control the visions they broadcast.

Television programming is not free. The large corporations who “sponsor” television programming spend immense sums to give television away to us. The corporate creatures who comprise our government carefully control access to this “free” medium. Their interests are largely identical. One of their primary shared interests is the erasure of history and the establishment of the Eternal Now.

The federal government desires the disappearance of history because people will not demand action if they can’t remember what needs to be acted upon. If the people are residing in the Eternal Now, the government is freed of the obligation to actually address the many potentially fatal conditions with which we’ve saddled ourselves and to go on about its business of pillaging, looting, supporting thugs and bankrupting our children’s children.

The large corporations desire the Eternal Now because it creates a void of meaning and purpose in life, which they can fill with suggestions for one activity – buying. That is the only activity television suggests, urges, promotes: get out there and buy something.

For a people inclined from their beginnings toward an anti-historical bent, television has proved the perfect disaster. In fewer than forty years, it has captured almost completely an entire nation, turning us into a mass of passive, unimaginative, timid, valueless, puerile toadies. 

***

Lost in the Ozone

If George really got into his Amway/champagne/neighbor “lifestyle,” it would be difficult to get him to see his behavior objectively, or to consider the direction it was taking him. He would find life with huge chunks of ceiling lying on rotting carpets quite normal, and would not notice as things got worse.

When the great majority of a nation is hooked on the identical drug – and perceives that drug not as a drug, but as its principal source of information about reality – what then? How do you make contact? How can you get through to George that, although the Pope has just married J-Lo in a poolside ceremony, the roof’s still leaking?

***

Book Talk and Ad Blat

Newsweek’s recent [1975] essay, “Why Johnny Can’t Write,” laid Johnny’s latest failure at several doorways, but the largest parcel of blame wound up in front of the English classroom. English teachers, according to Newsweek, can’t, don’t or won’t teach Johnny to write decently.

Since I’m an English teacher, I suppose I should be expected to charge to the defense of my profession. I’m more inclined to think that my profession has been flattered. A severe decline in literacy is a large social change, and it’s rare that English teachers are considered agents of important change in our society.

It’s rare because they aren’t. Teachers are paid by the public to instill the prevailing cultural values in students, and that is what teachers mostly do, or else they get fired or quit in disgust. If English teachers are turning out illiterates, they are doing so at the behest of their society, not in defiance of it.

Twenty years ago, Alvin Toffler estimated that the average American received 560 advertising messages each day. One can assume that number has increased considerably. Needless to say, the average American receives no such number of “book-talk” messages each day, or each year, for that matter. (“Book-talk” is a Newsweek coinage, presumably meaning “English.”) These advertising messages may not edify, but certainly they instruct.

Here is one advertisement which flanked the text of “Why Johnny Can’t Write”:

                        We dry harder. Dry Gilbey’s. Dry Boissiere. When a great dry gin and a great dry vermouth get together, the result is – almost inevitably – a great dry martini. So a Gilboissiere martini has to be a great dry martini. Dry it…you’ll like it.

The authors of this text feel certain that their audience will be familiar with the other texts (ads for rental cars and stomach products) to which they punningly allude. Their certainty demonstrates the universality of advertising language in America. What poet could allude, even without puns, to any “book-talk” phrase with such confidence? Johnny, and John, and Joan, and Juan are exposed to the language of advertising so frequently that they can appreciate puns and allusions in that language; they have learned that language well. What principles of literary style have they learned from the professors of ad blat?

“We dry harder. Dry Gilbey’s. Dry Boissiere.” From the first three units, Johnny can learn that a sentence is any collection of words which begins with a capital letter and ends with a period. Johnny may therefore be perplexed when his sentences are criticized because they lack subjects or predicates.

“So a Gilboissiere martini has to be a great dry martini.” From this, Johnny can learn that a conjunction is a word used to begin a sentence, and that “must” is a musty word rendered obsolete by the locution “has to” (as in “I have to go down to the sea again,” or “Shoot, if you have to, this old, grey head, but you don’t have to hurt my country’s flag”).

Good prose can be written in non-standard English, as many major American authors have demonstrated. Is Johnny learning good, if non-standard, prose from Newsweek’s advertisers?

“When a great dry gin and a great dry vermouth get together, the result is – almost inevitably – a great dry martini.” In the first place, that is nonsense, and if Johnny believes it I hope to avoid his bar. In the second place, “almost inevitably” is high-sounding blather. What does it mean? How almost? One adverb vitiates the other – on purpose. The authors knew they were writing hokum, since ingredients never guarantee a decent cocktail, but they wanted to appeal to the 2-Easy-Steps-to-Sainthood sloth in all of us, so they wrote “almost inevitably” instead of “sometimes.” Johnny is no fool. He gets the idea: if you want to make a statement you can’t support, use fancy words and mush up their meanings with qualifiers.

An advertisement for particular brands of alcohol presumably asserts that they are superior brands. The form of the “Gilboissiere” ad appears to make such an assertion, but it does not: (1) a great gin plus a great vermouth make a great martini; (2) a martini made from our gin and our vermouth must be great. Even without almost all the inevitability asserted parenthetically, this phony syllogism says nothing about the merits of the brands it promotes. See, Johnny? You don’t have to really prove anything or make any sense; just sound confident.

The language of this particular advertisement, in its perversions of diction, grammar, syntax and logic, is perfectly typical of ad blat. The language daily bombarding Johnny is a language designed to obscure, not to communicate; to bamboozle, not to convince.

If we look beyond advertising to the leaders of the nation, who presumably act as exemplars for Johnny as he learns good habits of speech and writing, we find only more of the same perversions. The obscenities excised from the Nixon transcripts were widely remarked. The more terrifying obscenity – the systematic avoidance of truth and reality by recourse to ad blat – was neither excised nor viewed with much alarm. Senator McGovern, offering not a choice but an echo, celebrated his triumph in several primaries by alluding to an Alka-Seltzer commercial. The conduct of nearly all political campaigns – because nearly all are conducted by advertising people – has become a battle to see who can say least most strikingly.

More influential national leaders, such as talk-show hosts and football commentators, offer little more in the way of literacy, so we are left with the newspapers and magazines. Here is part of a book review from the literary section of The Denver Post:

                        Want a “front-burner” gift for the under-40 set? Buy them a cookbook. The young may not dance much anymore, but we are told they do cook, and they do dig those cookbooks. This season finds at least a couple dozen new ones to give the under-40 set…. Leader of them all is the new classic, From Julia Child’s Kitchen, hotly pursued by The Better Homes and Gardens New Cookbook. But these are just the leaders.

Now, speaking as one of the under-40 set, I do dance quite a bit, and I can’t see how a “set” can take a plural pronoun, or how a book can become a classic before it’s even in circulation, or quite how the metaphor of hot pursuit applies to books, but except for those quibbles, I think this prose is all reet, and I dig it. It’s all reet because its purpose is not to review books. Its purpose is to peddle books, and for that purpose it is written in the appropriate dialect – ad blat.

My students are surrounded by such language. The writers and speakers of such language are, by and large, well rewarded with money and fame. The language of ad blat is in my students’ ears and eyes, and it comes out through their fingers when they write. I don’t blame them for that, and I don’t blame English teachers much, unless they also speak and write in ad blat, which some of them do. If decent, honest, clear speech and writing are required only in English classrooms, most students won’t learn to write decently, honestly, or clearly, because they don’t plan to spend their lives in English classrooms.

Quivering with alarm, Newsweek informs me that, “at Harvard, one economics instructor has been so disturbed at the inability of his students to write clearly that he now offers his own services to try to teach freshmen how to write.” Estimable, but what has he been doing up until now? Has he been ignoring his students’ writing errors and “reading for content”? Has he been writing “Awk” in the margin when he felt a sentence was unclear but couldn’t or didn’t want to figure out or explain why it was unclear? If it is unusual for even academics to care whether their students write decently, how in the nation can English teachers convince their students that they need to master this particular skill?

Clearly, if you teach English, you keep trying to convince them, or you find another line of work. Any teacher who doesn’t do one or the other ought to be blamed, since teachers who really do only instill prevailing values aren’t worthy of the name. Until, however, society as a whole shows some signs of valuing clarity, precision, grace and honesty over deliberate muddiness, bombast and half-truth, I doubt that Johnny’s writing will improve much, no matter how herculean the efforts of his English teachers. Johnny and his English teachers have only the average amount of courage, and they know what happens to literate, honest leaders, magazines, and television shows: generally, they get cancelled.

***

Meeting the Deductible

(Presented at a Community College Composition Conference, c. 2001)

Imagine that you are any of the following persons:

  1. a 19-year-old boy who, after flunking out of high school as a third-year freshman, has hung out with similar friends for a few years, taken a few unrewarding fast food jobs which you’ve lost through general fecklessness, faced the awful future, obtained a G.E.D. and enrolled at a community college, having been told all your life that Education is the Key to Success;
  2. a single mother of two kids, 3 and 5, working full time to support them, taking night classes at a community college in order to prepare yourself for some yet-to-be-determined, more remunerative, less demeaning job;
  3. a forty-year-old father of three, also working full time, attending night classes at a community college in hopes of making a career change into the burgeoning field of computers, where the geetus spouts as freely as a Texas gusher;
  4. a 37-year-old, newly divorced mother of two teenagers, not unintelligent but seventeen years removed from reading anything but People or writing anything but memoranda regarding your endless chain of duties; sad, unsure, looking for a place to stand – any place that doesn’t shift and tilt – so you can start something you’ve heard referred to as “a new life.”

You can be any one of those you want.

You walk into your Freshman English I class, and the instructor directs you to open your book to page 38. “I want you to critically analyze this little essay,” says the prof. “You have an hour and fifteen minutes. Your writing sample will determine whether or not you’re ready for this class.”

You don’t feel particularly ready, willing, or able. You hate “English.” Tough. You open the book – it’s called The Postmodern Wordsmith, a title both incomprehensible and extraordinarily unappealing – and face the essay, entitled “It’s Deductible.” Its author is someone named Gordon Kahn. This is what you see in the first paragraph:

“There were times, and Paris was the place, when nbibsbkbit and their cfhvnt, cattle cbspot from the Bshfoujof, and people named Spuitdijme would spend so much money that it would efnpofujaf the gsbod or fogffcmf it for long periods. In New York, during the Cbczmpojbo 31’s, many a cpoboab was bupnjafe between the upper and ofuifs njmmtupoft of Qbsl Bwfovf and Cspbexbz.”

“Oh, my sweet Savior,” you say to yourself, “come aid me now.” But He is silent, and you perceive that you must try to survive unaided.

Two cities mentioned: Paris and New York. One is not in the U.S., one is. People named Spuitdime spending big. So it’s some people you’re supposed to have heard of spending big time in some big cities. For long periods of time. Some time in the past. There’s nothing to do but plough onward, hoping that future paragraphs will cast some light on this one, or that the Savior will appear, miraculous though tardy.

But what you find, ploughing on, is only more chaos and confusion. Paragraph after paragraph is interrupted by clots of weird code you can’t pray to decipher.

Oh, my sweet Savior, you think.

________________________________

The essay whose first paragraph I’ve encoded above appeared in a trade magazine called The Screenwriter in 1945. It is a jeu d’esprit, the kind of thing former newspapermen such as Gordon Kahn used to toss off at the drop of a hat when they needed a few bucks to keep the wolf from the door. Kahn takes the fact that a new class of servants – financial managers – had come to Hollywood in the early 30’s, bringing some financial stability to the actors, writers and directors, and comments on that fact for 4 pages. He has fun with those who take money too seriously and with those who don’t take it seriously at all. Mostly, he has fun with his own gift of literary embroidery, freely mixing contemporary slang with playful allusions to ancient and modern history and cleverly adapted clichés. Hundreds of thousands of comparable words were written in 1945.

Uncoded, the paragraph reads:

“There were times, and Paris was the place, when maharajahs and their begums, cattle barons from the Argentine, and people named Rothschild would spend so much money that it would demonetize the franc or enfeeble it for long periods. In New York, during the Babylonian 20’s, many a bonanza was atomized between the upper and nether millstones of Park Avenue and Broadway.”

I cannot know how much of this makes sense to you professional readers. It all made sense to me, although I had to look up “begum.” (After I’d looked it up, I was little wiser, because I’d already figured it had to mean something like “a well-rewarded female companion to a rich fellow from the Indian peninsula.”) I could figure out what a “begum” might likely be from the context, because I understood the context.

If the context had appeared to me in the coded version I’ve presented to you, I would have had little chance of figuring that one word out. If you’re a student with the dearth of reading experience most of our students have, what would you do with this paragraph, which appears a little later in “It’s Deductible”?

“Tbujoht cbolt offered lovely desk calendars, pof-xbz xbmmfut and Spzbm Tqpef piggy banks in an endeavor to trp the xboupo dollar. The qbsbmf of the hsbttimqqffs and the bou was recounted in many foujdjoh versions and in brilliant technicolor – without snaring a tjohmf dmbn cfijoe the xjdlfu.” You now have about an hour to write a critical analysis of this essay. It pretty much all reads like this. Hit it.
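A side note for the technically inclined reader: setting the coded paragraph beside its plain version above suggests the “code” is nothing more exotic than a Caesar cipher with a shift of one – “Spzbm Tqpef” back-shifts to “Royal Spode.” Here is a minimal sketch in Python, assuming that one-letter shift (the function names are mine, and a few of the hand-encoded words above drift from the rule):

```python
def shift(text: str, offset: int) -> str:
    """Shift each letter by `offset` places in the alphabet,
    preserving case; punctuation and spaces pass through unchanged."""
    out = []
    for ch in text:
        if ch.isalpha():
            base = ord('A') if ch.isupper() else ord('a')
            out.append(chr(base + (ord(ch) - base + offset) % 26))
        else:
            out.append(ch)
    return ''.join(out)

def encode(text: str) -> str:
    return shift(text, 1)   # "wanton" -> "xboupo"

def decode(text: str) -> str:
    return shift(text, -1)  # "Spzbm Tqpef" -> "Royal Spode"
```

Which is the point of the exercise: the mapping is trivial, yet a reader without the key is exactly as lost as a student without the cultural referents.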

_________________________________

I might be a more modern kind of English teacher, and not expect too much of you. I might give you two weeks to read this essay and formulate your response to it. During those two weeks, we’d have a chance to discuss the essay. We would talk about what the words meant when they were written. We would talk about the attitude the writer took toward the words he put on the paper. 

To be more realistic, I should say that I would talk about those things, making largely unsuccessful attempts to involve you in the conversation. I would talk about them because they seem to me necessary considerations, given the time and place the words were written.

You, on the other hand, would not join much in the conversation, because to you neither other times nor places have much if any meaning. To you, all is present. You know that the future can be better than the past, and that your occasional attendance in some rooms can help make it so. That you are presented with this strange mix of code with simple English seems an affront. This is why there are lawyers, you think.

(When I started writing this, I was guessing. Guessing based on almost thirty years of teaching community college English, but guessing. After I wrote the above, I thought I should check my guesses, and so I presented an unencoded version of the above paragraph to 71 Freshman English students. I asked them to underline any words or phrases in the paragraph they felt they didn’t know or would have to look up. I told them that their responses would be anonymous and that they would have nothing to do with the particular class they were members of. This is just Science, I told them.

“Maharajahs and their begums” baffled 96%, “people named Rothschild” 41%, the “Babylonian 20’s” 45%. The “atomized bonanza” was Greek to 73% and millstones failed to grind for 53%. Every phrase or word encoded above proved unfamiliar to at least 25% of the 71 readers.)

English, like any other language, is a code. We don’t think of it like that because we learned it before we knew we were learning. As we learned it, we were learning all sorts of history and civics and art and music and religion and science and geography and what-all. A big mess of information came with learning the language, during the time most people over forty or so were learning it.

Then came television, and we allowed the teaching of language to pass over to it. And television was operated by advertising agencies whose purpose was to sell people things they didn’t necessarily need. The advertising agencies were paid by companies who wanted to sell things to those people. 

The only way to sell useless things to people is to confuse them into thinking they need the things you have to sell. This is the art of advertising. Radio had prepared people to listen to messages regarding such things. It had not prepared them to see the things dance and sing and defeat villains.

 When Babo’s Foaming Cleanser angels chased the dirt right down the drain, America applauded; who would not? The entire history of theatre had been reprised in 30 seconds, and, as it always had, it worked perfectly: a threat to the polis had been repelled. Good boogie. When a colorfully loutish fellow groaned, “I can’t believe I ate the whole thing,” a new phrase became part of the ephemeral American lexicon, and it was soon being alluded to by a Presidential candidate.

The downside to this new mode of English instruction, from the teaching perspective, or from the learning perspective, for that matter, was that the new English didn’t include any reference to the past; it referred only to itself.

And so the children who grew up learning this new version of English learned a sort of English, but did not learn any of the history that the language had previously contained and required for its understanding. They learned, instead, a new code, whose only referents were products available at the chain stores advertised by the agencies whose masters paid for their messages.

The language of advertising involves the making of claims not meant to be taken as “true,” involves the replacement of reasoned argument by catch-phrases, and is meant to evoke emotion, not thought. Without the accompanying visual images, it immediately reveals itself for what it is: empty, utterly cynical, manipulative.

It has become the language of our culture. If you doubt this, consider the language spoken by politicians, sportscasters, everyone on television and radio, and, sad to say, by most educational administrators and an increasing number of teachers. In our arena, the cutting edge was only yesterday interfacing with the world-class state of the art. So long as it was remembered that “art” had solely to do with The Art of the Deal.

Little wonder these children of the new English view language with little or no respect or affection or interest; the language they’ve acquired from our culture deserves to be dissed.

Further, our current students grew up in a country which had always believed, as Henry Ford did, that history was bunk and that it was perpetually morning in America. They could hardly be blamed for believing that nothing that had been written before their current moment was worth reading. They could hardly be blamed because they’d been told that if it was worth knowing, it was in the computer somewhere. If it was in the computer, why should they know it? It’s known. Television commercials reinforced this certainty. Ancient-looking geezers mourned that they would not live to see the wonderful cyberfuture, as images of nubile lasses danced behind them across the screen; ghetto children exulted in their mom’s subsidized purchase of a new computer – now they, too, could be a parcel of the information age.

The “kids” I’m talking about are my kids and my students. They would say I was speaking “harshly.” 

I am. But I am not harsh about them. I’m harsh about us. We are the ones who’ve sat still for this nonsense. Sat still for it, or stayed silent in its face, or embraced it. None of those seems an admirable response to a large omnivore squatting over your face.

______________________________________

I hesitate to continue by considering the contribution their public school education has made to my students’ intellectual vacuity, but continue I must. Most of my young students these last few years have been taught some simple, powerful lessons by their lives in school: 1. Show up where you’re supposed to be, more or less on time, three days out of five. 2. Don’t commit felonies when someone’s around to watch. 3. When sufficiently badgered, put some words on a piece of paper. They don’t have to make sense as long as they’re handed in on time. 4. You won’t have to do this often, because most of your work will consist of filling in blanks with the “right” answers found elsewhere in a canned exercise. 5. Follow steps 1 through 4 for four years, and you’re good to go.

My syllabus tells my students on its first page what will happen on every day of the semester: what I expect them to have read, when I expect them to have read it, when I expect them to submit a paper about a particular reading. Because my classes meet twice a week, I arrange my syllabus in two columns. The left-hand column is for the first day of the week the class meets; the right hand column is for the second day. Each day is dated.

And during nearly every class, I hear anxious queries: When is this essay due? What are we doing today? We are?

My students lack this concept: you can derive useful information by reading printed words. From this, I can only conclude that their previous schooling has failed them in about the most basic way I can imagine anything called “education” could fail.

Saying that, I don’t mean to attack public education, and especially I don’t mean to attack public school teachers. The system and its workers are functions of a culture which has abandoned all values but the pecuniary and all satisfactions but the immediate and gross. Public education does its best to care for the kids formed by this culture, whose parents have no time to do anything but chase the bucks to buy urban assault vehicles and access to the internet, upon which they can pursue their aborted sex lives in chat rooms and poor quality images.

_________________________________

I’ve talked about my students as if they were a blob, which they’re not. Yet I get these idiot questions about what’s due when, and oh, was I supposed to read that? from obviously intelligent students more often than from those obviously less gifted. The obviously less gifted never ask questions. They know they’re in over their heads, and they refer themselves to their previous educational experience: Shut up! Don’t draw attention to yourself! Someone might ask you something! You’ll be good to go!

The culture seems unlikely to change in the direction of a massive insurrection on behalf of literacy. Teachers of English, so long as such a discipline remains, seem equally unlikely to change. We’ll quote Orwell and E.B. White and persist.

We’ll persist in believing that humans live by, with and through the language they speak and hear and write, and that, without mastery of that language, they’ll likely be enslaved by it. That one picture is not only not worth one thousand words, it is worthless without the words its viewer says about it to himself or herself, and then says to another person.

We believe that one of the great joys and purposes of our species occurs when one human mind says something that another human mind always wanted to say, but didn’t, because it lacked essential information or experience or courage or the words to say it in.

_________________________________

It’s also why I’ve come to believe that I need to be teaching reading far more than to be teaching writing. Our profession has been going through a generational period of emphasis on “critical writing.” I’ve gone along with that, not unwillingly. The phrase has meant to me that you need to read carefully, think about what you’ve read, and respond to it from your experience and knowledge, and decide what parts of what you’ve read seem useful and true, and what parts not.

But if the “knowledge” part is removed, and the “experience” part is removed – unless “experience” can be defined as watching illuminated dots screaming inanities at you – and the “reading” part is removed . . . what then? 

My students don’t know how to read. They don’t respect words; what they know about words is that they’re tools used by someone to sell them something, or used by people pretending to be their peers (viz. Marilyn Manson et al.) to encourage them in the universal belief of the young that their elders are evil, stupid clowns and the authorities their elders respect are without either power or merit.

So I’ve come to redefine my job. What I hope to do now is to teach people that worlds of meaning and value and beauty can be entered through words, but that they have to devote serious, sustained effort in order to open the door to those worlds.

______________________

I currently teach a lovely essay by Frank Conroy called “Think About It.” In the last section of the essay, Conroy talks about his chance encounter with Justice William O. Douglas. They converse about the Dennis case, which was then before the Court. When we discuss this essay, I always ask my students if they know what the Dennis case concerned, and, to this date, not one student has known.

“How can you understand this essay of Conroy’s, then?” I ask them.

The occasional brave student will respond, “Oh, we get the gist of it.”

Well, you can’t do that, in this particular essay, because Justice Douglas’s explanation of the Dennis case, such as he explains it, is elliptical in the extreme, and refers frequently to the language of the Smith Act, and if you don’t know what the Smith Act is, you don’t have a prayer of understanding what he’s talking about.

So the first thing we need to do is teach our students that every word counts, and that guessing is a perilous enterprise.

I currently teach a column by Ben Wattenberg, in which he asserts that Americans believe that America “stands for something very special, and they believe they know the reasons why, which turn out to be that same old stuff: political, personal and economic liberty under a constitution.”

I ask my students what the word “special” means. Then I ask them what “very” special might mean; how much more “special” than “special” would “very special” be? Then I ask them why, if “special” in this context means “excellent,” “outstanding,” “different from any other,” Wattenberg describes the reasons for that “specialness” as “that same old stuff.” I ask them about the connotations of that phrase. Hardly any of them know the word “connotation,” or recognize that it’s the major manipulative property of this or any language. I ask them, what if he’d said, “that same old garbage”? Would that change your attitude toward the words that follow?

“It’s Deductible,” the piece from which we started all this, alleges that various foreigners “would spend so much money that it would demonetize the franc or enfeeble it for long periods.” Is this “true”? I might ask my students. If you don’t know, how could you find out? You have the place – “Paris” – but you don’t know what “times” the author refers to. Does the rest of the paragraph give you any clues about what might be the “times”? If you knew that the times were the “20’s,” and the place was Paris, and the proper name “Rothschild,” would you have enough information to enable you to find out what in the nation that sentence means?

Then I return to the original question, “Is this ‘true’?” Since it is not “true,” I ask them, why did Gordon Kahn allege that it was? Why did he exaggerate in such a manner? Did he intend to deceive his readers? Did he intend to amuse them? These kinds of questions lead, sometimes, to a better understanding of the idea of a “voice” on the page.

But “voice” should never appear in quotation marks when it comes to reading, a point Donald Hall made clear in a superb essay published fifteen years ago in Newsweek called “Bring Back the Out Loud Language.” His point was that language is essentially a spoken art, and that we’ve lost the ability to hear the words on the page, and thereby lost our ability to understand what the hell the words mean. For example, if you can’t hear the corrosively grievance-laden irony in Richard III’s tone in his opening soliloquy, “Now is the winter of our discontent / made glorious summer by this sun of York / and all the clouds that lour’d upon our house / in the deep bosom of the ocean buried . . . ” it will seem to you simply a piece of nauseating political flattery – when, in fact, it’s a multiple death sentence.

To address this loss of the inner ear, I give my students a short pack of poems at the beginning of the semester and require that they select one to memorize, speak to the class, and analyze in a paper. And we talk – usually at greater length than the syllabus projects – about what that experience was like. It’s the most dreaded and ultimately the most popular of my assignments; students are amazed by it: first by their courage, then by the fact that language sounds like something, then that the way it sounds has to do with the way it means.

My favorite experience of this process, so far, occurred when a cabinetmaker who’d lost the use of his forearms and come back to school to change professions chose Elizabeth Bishop’s poem “The Fish” to read, and stood up in front of God and everybody and read it in a flawless East Texas accent, which gave it a completely new feeling. “Doug,” I said, after some dumbstruck moments, “I thought you told me you’d lived your whole life in Colorado. Where’d you come up with that accent?” “I just thought the poem sounded like some old fellah that’d had a lot of experience,” he said, “and . . . I don’t know . . . the voice just was there.” I think the poem’s author, Elizabeth Bishop, an old girl who’d had a lot of experience herself, would have liked that answer.

If I were teaching our essay “It’s Deductible,” I might stop at the phrase “Babylonian 20’s,” and ask the students to work first on “Babylon” and then on “20’s.” The only students likely to have a referent for “Babylon” are those severely addicted to reggae and those severely addicted to the Old Testament. The majority will have no referent at all, and thus no way to comprehend what the sentence means. I try to convey to my students that language is highly allusive, and that the allusions can sometimes be sussed out from context, but mostly have to be tracked down, at some personal cost but to great personal benefit. 

From that same paragraph of “It’s Deductible,” I might examine the latter part of the same sentence: “…many a bonanza was atomized between the upper and nether millstones of Park Avenue and Broadway.” Having discovered the denotations of “bonanza” and “atomized,” we might proceed to a discussion of metaphor – how one thing can be implicitly equated with another thing, and whether we should take such implied comparisons as literally true.

The visual images by which our children have grown up surrounded – from the inescapable television screen to the video games to the computer screens to the movies and the movies on video, etc., etc. – have done something else – they have occupied the children’s imaginations, as the Nazis occupied Europe.

I remember a day twenty years ago: I was teaching an English 060 class, a class for people who’d missed out on the knowledge of sentences and punctuation and what’s a verb and what’s a paragraph. We were trying to look closely at the parts of a metaphoric sentence that compared a woman with a spider. The comparison was an unusually favorable one.

“See, you have to use your imaginations,” I found myself spouting, in a kind of “Win this one for the Gipper” tone. “You have to see what the words are saying.”

A young woman all the way in the back suddenly raised her hand, looking in a startled way at her own sudden demand for attention, as if it might have been the first such bid she’d ever made. She’d been sitting back there – way  back, as far back as she could get without breaking through the wall – all quarter, never saying a mumbling word.

But now she said, “Mr. Malcolm . . . . I don’t have an imagination.”

The sorrow in her voice like to break your heart.

But the look in her eyes is what I remain unable to forget. Her eyes were like the rabbit’s when she looks out through the mesh of her cage and sees the raccoon working away at that very mesh. They weren’t even frightened; they were already resigned and dimming. She had spoken a terrible truth, and every one of us knew it.

Imagination, I sometimes tell my students, doesn’t mean thinking about gryphons and faeries and unicorns. It means the ability to form your own pictures in your own mind. And if you can’t do that, then whose pictures are you seeing all the time up there?

I also read them this sentence: “I can’t understand anything in general unless I’m carrying along in my mind a specific example and watching it go. Some people think in the beginning that I’m kind of slow and I don’t understand the problem, because I ask a lot of these ‘dumb’ questions.” Then I tell them that this “slow” fellow is Richard Feynman, who holds a Nobel Prize in Physics. Then we practice the art of creating specific examples from general statements and “watching them go.” And we are frequently startled to see how one specific example can reveal the bankruptcy of a perfectly splendid-sounding generalization.

________________________

Those are a few aspects of reading I try to teach my students – precision in definition, alertness to connotation and tone, hearing the words as well as seeing them, leaving no allusion untraced and no metaphor unexamined, developing the imagination. I don’t think I need to point out that these are also aspects of writing. They’ve proved eye-opening to quite a few. They can’t, need it be said, make up for the twenty or thirty or more years of reading that most of my students don’t have under their belts. Only a whole bunch more reading can do that.

And so my primary goal has become to trick, cajole, entice, inspire and otherwise motivate my students to fall in love with reading. If I don’t do that, I haven’t done anything, except, perhaps, to create a few more clever persuaders.

I don’t think we need any more of those.                  

***

Gratitude to Teachers

A friend of mine who’s still teaching English told me lately about a student who felt truly indignant that he should have to do anything to obtain a passing grade. Doing anything included coming to class. He held the extreme position of a common belief: that nothing deserves attention unless it frequently explodes, and that appearing in the vicinity of the classroom 65% of the time, refraining from open expressions of boredom and revulsion, and slapping some words on paper under duress ought to guarantee at least a B.

As Japanese students conclude each grade, they sing a song thanking their teachers for deigning to teach them. They value learning, and therefore those who help them learn, not from sentiment but from realism. Their culture understands that the reality of this world is danger, and that humans need maps and tools to cope with danger.

“The most dangerous animal yet roaming free,” as Nelson Algren called us, is the animal that uses language. Over the past hundred years, our language has fallen into the hands of paid liars. They are called advertising executives and public relations professionals and other euphemisms. Hyperbole and euphemism and fake logic and mindless catch-phrases, the stock in trade of these paid liars, have so pervaded our speech and thought that many people unconsciously believe that language is solely composed of such devices, employed to bamboozle or intimidate someone into doing what you want them to do.

Therefore lies, even when revealed, fail to make people indignant. Lies are expected, and no cause for alarm or anger. People who expect lies have a hard time telling shit from Shinola, because they believe everything is shit. This belief makes it easy for them to lie to themselves – as easily as they lie to everyone else. People who lie to themselves have no purchase on reality, and they make easy prey for professional liars.

In a dangerous world, realistic persons would be very grateful to anyone who could help them learn how to recognize the language of paid liars, how to recognize when it has infected their own language, and how to use language in the service of truth, told in clear and simple ways. That is the job of English teachers, though not all of them see it that way.

Students pay teachers to teach them how to think straight about the real world. They pay teachers to not let them get away with lies. They pay them to make them miserable, not to make them feel good. It’s miserable to discover that there’s no end to the work of learning not to be a liar and not to be fooled by liars.

But if you don’t learn how the language is used as a weapon, you probably won’t learn how to use it as a shield. And the world, gradually or suddenly, will eat your life right on up.

***

Next Step in the Old Song and Dance

Like most people who’ve worked in the same field for thirty-five years, I’m always eager to hear criticism of my profession from people who’ve never spent a day working in it, such as columnist Mike Rosen. Higher education has a lot of critics these days, but you just can’t beat Mike for logical fallacies and distortion of evidence.

Within the first paragraph of a recent column, “Voters have the power to diversify CU faculty” [The Gazette, March 4, 2005], Rosen describes CU faculty as a “bureaucratic,” “self-serving,” “omniscient professoriat.” He’s just warming up. Pillorying the entire faculty of a large state university can hardly satisfy a broom so sweeping as Rosen’s. In the next paragraph he gets around to “higher education” as a whole, in which “some of the main problems these days” are “specifically, the tyranny of the tenured left and the paucity of conservative professors within liberal arts faculties.”

Now that’s sweeping. It covers 1.6 million professors and graduate assistants who teach nearly 16 million students. In 2001, 45% of those professors nationwide were tenured [US Bureau of Labor Statistics, 2/27/04]. According to one of the studies of faculty political attitudes Rosen later quotes (Rothman et al.), 72% of the sample are “left/liberal.” That means that about 32% of those 1.6 million professors – the tenured 45% times the “left/liberal” 72% – are exercising a “tyranny of the tenured left” over the other 68%, not to mention the 16 million students. A mighty determined lot they must be.

Of course, the denomination “left/liberal” raises a few questions. The other study Rosen refers to (Horowitz), categorizes any faculty member registered as a Democrat as “left/liberal.” That means, I suppose, that they’re in bed with such flaming radicals as Joseph Lieberman, Joseph Biden and Max Baucus.

The words “study,” “survey,” and “research” ought to immediately prompt two questions: Who paid for the “research”? What was its methodology?

Rosen and other self-appointed journalistic experts on education currently make almost daily reference to two studies. The first to appear was “Political Bias in the Administrations and Faculties of 32 Elite Colleges and Universities,” a production of David Horowitz’s Center for the Study of Popular Culture [http://www.frontpagemag.com/Content/read.asp?ID=55]. The second is “Politics and Professional Advancement Among College Faculty,” by Stanley Rothman, S. Robert Lichter, and Neil Nevitte [http://www.bepress.com/forum/vol3/iss1/art2].

Since 1989, Horowitz’s CSPC has received a total of $11,461,000 from the Scaife, Bradley and Olin Foundations. Lichter’s Center for Media and Public Affairs has received $2,177,000 from the same three foundations. Rothman’s Center for the Study of Political and Social Change has received $1,208,000 from the same three foundations. [Media Transparency Grant Data Matrix. http://www.mediatransparency.org] All three foundations exist primarily to fund right-wing think tanks, centers, academic chairs and campaigns. Lichter’s CMPA is associated with George Mason University, whose foundation since 1989 has received donations of $23,454,786, the lion’s share of that from the Bradley Foundation.

These foundations are not interested in producing research for the sake of finding unknown truth. They are interested in funding “research” that will produce predictable outcomes that can be couched in sound bites that appear to “prove” the conclusions their funders desire to promote. The people who perform these “surveys” (except Nevitte, a Canadian with his own polling firm) have long histories of providing their funders with the kinds of “information” they expect, and of resolutely publicizing their “findings,” while misrepresenting them. In short, the answers to the first question – who paid? – suggest that the methodologies and conclusions of both “studies” ought to be carefully examined.

Horowitz first “generated a list of 32 elite colleges and universities,” including “the entire Ivy League,” other northeastern colleges, and colleges and universities in California. Oddly, all these institutions are located in strongly Democratic states. Not so oddly, since Horowitz studied only voter registration, he found that the “overall ratio of Democrats to Republicans we were able to identify at the 32 schools was more than 10 to 1.”

In his Executive Summary – the part that’s meant to provide talking points to the print and radio pundits – Horowitz asserts that such a ratio “makes a prima facie case” that “there is a grossly unbalanced, politically shaped selection process in the hiring of college faculty.” But anyone who bothers to look at his actual figures discovers that while they show 40% of the studied faculty and administrators are registered Democrats, 55% are unaffiliated and 28% fall into the piquant category Horowitz calls “Too Many” – that is, those for whom he found “multiple results for the same name.” If this “study” shows anything, it shows a selection process that seems to favor the apolitical over members of either of the two major parties, a tyranny of the bland rather than a tyranny of the left.

In addition to assuming that all Republicans and Democrats hold “a predictable spectrum of assumptions, views and values,” Horowitz also limited the faculty studied to “tenured or tenure-track professors of Economics, English, History, Philosophy, Political Science and Sociology departments.” He asks us to believe that this limited number of a limited group of professors from a limited number of carefully selected institutions – 32 out of a total 4,074 degree-granting institutions in the US [nces.ed.gov/programs/digest/d03/ch_3] – “strongly suggest that the governance of American universities has fallen into the hands of a self-perpetuating political and cultural subset of the general population, which seems intent on perpetuating its control.” His figures indicate no such thing.

Rothman, Lichter and Nevitte’s study is based on a sample of American professors. The survey from which it is taken, apparently created by Nevitte, does not seem to be available for inspection, so its methodology remains unknown. The study’s authors have extracted from this survey 1643 faculty members, of whom 1183 responded to queries regarding their party affiliations and attitudes toward a number of the Right’s favorite wedge issues, such as abortion, homosexuality and environmental protection “despite higher prices, fewer jobs.” Only full-time professors were selected, and those only from “doctoral, comprehensive and liberal arts” institutions. As Horowitz did, Rothman et al. thereby eliminate all professors at institutions that enroll 39% of the entire student body in higher education. (Rothman blandly explains this by citing another study finding “that two-year colleges housed the fewest liberal faculty” [Rothman, 2]. In other words, including such faculty would not have produced the desired result.)

In spite of these obvious attempts to skew the survey to show “that liberals and Democrats outnumber conservatives and Republicans by large margins” and “conservatives and Republicans teach at lower quality schools than do liberals and Democrats,” Rothman et al. are forced to admit that “The results do not definitively prove that ideology accounts for differences in professional standing. It is entirely possible that other unmeasured factors may account for these variations” [Rothman et al.].

That both the Horowitz and Rothman studies fail to show what they purport to show and were obviously designed to show is, of course, only important if you are looking for evidence regarding their subject. If you are instead looking for sound bites tricked out as “research-based,” Horowitz and Rothman are eager to provide them. Horowitz’s has already been quoted. Rothman places his sound bite conveniently in the Abstract: “This suggests that complaints of ideologically-based discrimination in academic advancement deserve serious consideration and further study.” Carefully distributed throughout the study are further quotable quotes, such as “political conservatives have become an endangered species in some departments” and “conservatives have a legitimate complaint.” [13]

The Rothman study was released on March 29. The next day, writing in the Unification Church’s Washington Times, Joyce Howard Price quoted the study’s compiler Lichter: “This is the richest lure [sic] of information on faculty ideology in twenty years. And this is the first study that statistically proves bias [against conservatives] in the hiring and promotion of faculty members.”  In one day, results that do not “definitively prove” have become results that “statistically prove.” 

On the same day, syndicated columnist Cal Thomas summarized Rothman’s findings and quoted an AAUP director, Jonathan Knight: “Knight added that he is not aware of ‘any good evidence’ linking the personal views of professors to what they teach. He must be living in an ivory tower without Internet access,” Thomas continued, because “A quick Google search of ‘liberalism on college campuses’ brings a wealth of good evidence that what is being taught on many of them is anti-American, anti-religious, anti-Israel, pro-gay rights and abortion, often to the exclusion and ridicule of opposing views.” In short, the Rothman “research” that occasioned Thomas’s column does not provide any evidence that Democrats or liberals promote their views in their classrooms, or that they are favored in hiring or promotion, so Thomas must forget his original source to suggest that “a wealth of good evidence” exists elsewhere.

One doubloon from this wealth he names is www.campus-watch.org. This website is a project of the Middle East Forum, an organization entirely concerned with promoting a “pro-Israel” stance and attacking any expressions of criticism of Israeli policy. 95% of its funding has been provided by the Bradley Foundation. Another is Thomas Reeves “of the Wisconsin Policy Research Institute” [position not described], quoted by Thomas as alleging that “conservatives are discriminated against routinely and deliberately in faculty hiring, making some highly qualified teachers virtually unemployable because of their political and social views.” Thomas provides no documentation for these claims. The Wisconsin Policy Research Institute has received 91% of its funding, to the tune of $6,685,000, from the Bradley Foundation.

Though Thomas doesn’t mention it, the greatest wealth of information a Google search would produce is to be found on the website of Students for Academic Freedom, a “student” group funded by Horowitz’s Center for the Study of Popular Culture (and thus by Scaife, Olin and Bradley). This operation places ads in student newspapers nationwide soliciting student complaints about bias in the classroom. Its website provides a convenient “Academic Bias Complaint Form,” which suggests what students should be looking for, including “Required readings or texts covering only one side of issues,” “Introduced controversial material that has no relation to the subject,” “Mocked national political or religious figures,” and “Allowed students’ political or religious beliefs to influence grading.”  These solicitations have produced quite a few pages of student complaints, and they make instructive reading.

An anonymous complaint of “Introduced Controversial Material” alleges: “This is in the school paper in a [sic] article about professors incorporating the election into the classroom. One was [sic – probably “way”] in which [professor’s name] says she plans on educating her English students is to view and discuss ‘Bowling for Columbine’ by filmmaker Michael Moore. Currently, sh [sic] sponsors the Amnesty International Club at [the college], and as a member of the Dallas Peace Center she frequently demonstrates for peace causes.” Those who recognize opposition to torture and support of peace as sure signs of a tyrannical leftist need hear no more.

Another complainant at Ohio State has far more specific complaints: “This complaint applies to the discriminating [sic] nature of grading of [sic] my English teacher. She knows I’m an advancer of conservative ideas b/c I where [sic] a ‘W’ t-shirt to class on sometimes [sic]. Ever since the 1st day of class when I wore my ‘W’ shirt she has treated me cold [sic] and been discriminating [sic] in grading my essays. On the last one, I wrote about how family values in the books weve [sic] read aren’t good. I know the paper was pretty much great because I spell checked it and proofred [sic] it twice. I got an [sic] D- just because the professor hates families and thinks its [sic] okay to be gay.” Certainly anyone who spell-checks and proofreds twice must deserve a high grade.

The web pages provide many more such complaints, none of them validated by any outside observer. Students who dislike, disagree with or are given low grades by professors are thus provided a national forum for their gripes. That these gripes may be perfectly sincere does not mean that they have any validity. Some are quite well written and do indeed describe actions or speech by professors that suggest possible bias, stupidity or downright lunacy. But the reader has no way to assess their accuracy. The site is rather like a national forum set up for employees to anonymously complain about their bosses. But doubtless they provide all or part of the “wealth of good evidence” Thomas has in mind.

This “wealth” of misrepresented research, unsubstantiated charges and meaningless anecdote is being used not only by newspaper pundits and television talking hairdos, but by David Horowitz as ammunition to persuade both state governments and the United States Congress to put into law his creation, the “Academic Bill of Rights.” [ABOR] This document is currently under consideration by at least eight state legislatures and by Congress.

The ABOR [http://www.studentsforacademicfreedom.org.abor.html] begins with a historical review of statements supporting “academic freedom” by the American Association of University Professors and the U.S. Supreme Court, followed by a definition of the term worth quoting in full: “Academic freedom consists in protecting the intellectual independence of professors, researchers and students in the pursuit of knowledge and the expression of ideas from interference by legislators or authorities within the institution itself. This means that no political, ideological or religious orthodoxy will be imposed on professors and researchers through the hiring or tenure or termination process, or through any other administrative means by the academic institution. Nor shall legislatures impose any such orthodoxy through their control of the university budget.” It’s hard to imagine any academic who would contest this definition. Unfortunately, the provisions of the proposed bill undermine or contradict this statement of principle.

Provision 1 states “All faculty shall be hired, fired, promoted and granted tenure on the basis of their competence and appropriate knowledge in the field of their expertise [so far, so good] and, in the humanities, the social sciences, and the arts, with a view toward fostering a plurality of methodologies and perspectives” [my italics]. What the independent clause gives, the qualifying phrase takes away, injecting the faculty member’s “perspective” into the criteria for hiring, firing, promotion and tenure. As chair of a search committee for a faculty position, how am I to discover the applicants’ “perspectives” except by quizzing them all about their religious, political, and philosophical beliefs? And how is my committee to “foster a plurality” of “perspectives” without hiring or rejecting faculty based on those exact criteria?

Provision 3 states, “Students will be graded solely on the basis of their reasoned answers and appropriate knowledge of the subjects and disciplines they study, not on the basis of their political or religious beliefs.”  Again, who would dispute this principle? But how, exactly, can the basis for grading a student paper be measured? It is notable that no mention is made here of the quality of the students’ expression of their “reasoned answers and appropriate knowledge.” A student such as the one from Ohio State quoted above would be empowered by this provision to protest a low grade – which may have been based on the student’s evident want of literacy – to both administration and to the courts.

The most dangerous provisions are 4 and 5, both of which open the door to the very imposition of “political, ideological and religious orthodoxy” by a variety of pressure groups, and by legislatures “through their control of the university budget,” that the ABOR’s preamble forbids. Provision 4 reads, in part, “Curricula and reading lists in the humanities and social sciences should reflect the uncertainty and unsettled character of all human knowledge in these areas by providing students with dissenting sources and viewpoints where appropriate.” Provision 5 continues, “Faculty will not use their courses for the purpose of political, ideological, religious or anti-religious indoctrination.”

Let us say that I am teaching a course in 20th Century American Literature, and let us say that the majority of my students have previously limited their reading of American Literature to the works of Stephen King, Dean Koontz and Danielle Steel. (Such a condition is far from impossible.) If I do not include these authors in my reading list, have I failed to provide students with dissenting sources and viewpoints? Who is to say, after all, since human knowledge in the judgment of literary worth is uncertain and unsettled, that Sherwood Anderson is more worthy of study than Danielle Steel?

Or let us say that I am teaching a course on the Clinton “welfare reform,” and make passing reference to Jesus’ assertion, “Verily I say unto you, Inasmuch as ye have done it unto one of the least of these my brethren, ye have done it unto me” [Matthew 25:40]. Is his remark “relevant” to the discussion? Am I engaging in “religious indoctrination”? 

Or let us say that I am teaching a course on Germany’s conduct of WWII, and include in my reading list the autobiography of Rudolf Hoess, in which he describes the process by which he “. . . personally arranged . . . the gassing of two million persons between . . . 1941 and the end of 1943, during which time I was the Commandant of Auschwitz” [Hoess, Rudolf, Commandant of Auschwitz, Cleveland and New York: World Publishing Company, 1960, p. 17]. Because human knowledge is “unsettled,” as it surely is, am I obliged to include in the same reading list a work by British “historian” David Irving, who has publicly and repeatedly argued that “No documents whatever show that a Holocaust had ever happened” [http://www.powells.com/review/2005_03_04]? Certainly Irving’s denial of the Holocaust is a “dissenting source and viewpoint.” And certainly I will eventually encounter a student who holds this viewpoint dear to heart, and will wish to sue me for neglecting it.

In sum, this proposed legislation would open the door to precisely the “interference by legislators or authorities within the institution itself” with hiring of faculty and curriculum that it claims to forbid. It is a Trojan horse, very cleverly designed by its author to disguise its purposes and inevitable effects.

Mike Rosen, far less cautious than Horowitz or Rothman, plainly states the goals of this campaign: “The sacred cow of tenure is under review, along with the limits of academic freedom and the shameful lack of ideological balance within college faculties . . . . Academic freedom is not absolute . . . . That means hiring conservative professors to balance the, now, left-lopsided scales . . . . Here’s the perfect remedy. Convert CU into a bastion of conservative thought . . . . ”  Cal Thomas is equally clear: “In matters of race and gender, colleges practice affirmative action . . . . Why won’t they do the same for conservative professors and students . . . ?”

The process here is a familiar one. Major right wing foundations supply money to scholars who will perform “studies” that “prove” the existence of a “Left-Wing tyranny.” These “scientific studies” are then deployed by sympathetic publicists who selectively quote them and repeat their misrepresented “findings” in print and on the air until repetition establishes them in their readers’ and listeners’ minds as “fact.” Then the “facts” and “evidence” are used to intimidate academic institutions into diluting or suppressing expressions of “left wing” opinion on campus, and into hiring right wing faculty.

The process is familiar because it is identical to that employed in the campaign to convince Americans that a “liberal bias” controls “the media,” and to intimidate “the media” into abandoning all real questioning of the statements of those in political power and to fill their ranks with right wing ideologues. This campaign was largely funded by the same right wing foundations, carried out by the same “researchers,” and drilled into the public by the same publicists. [http://media.eriposte.com/2-2.htm]

No one who has worked in higher education could deny that some faculty members express their particular ideologies in class. I have known some who did so. Their ideologies ranged from extreme left to extreme right. I have never known a student who has been either swayed or intimidated by such expressions, nor have I ever seen a case of punitive grading based on ideological grounds. I have never known of a faculty member either hired or dismissed on the basis of political or cultural ideology. Do such things happen? I think undoubtedly they do, since bias and injustice are endemic to human beings and their institutions. The SAF student complaints illustrate not only the possible existence of such incidents, but the fact that they are scarcely limited to bias from the left:

“One time the professor answered my question about bias in the media in the US: ‘It is unlike the Arab World.’  Another time he was describing Chomsky as ‘insane’ and Edward Said as a rock thrower at Israelis because as he said ‘people there generally throw rocks.’ The class was in American politics and introducing remarks about Arabs because I am one of them is inappropriate,” writes one of the SAF’s anonymous complainants.

“When discussing the platforms of various candidates for the 2004 presidential election, my professor said that any candidate who took an anti-war position in their campaign was akin to a terrorist,” writes another anonymous student. “Was repeatedly forced to repeat ‘darwin is a loon’ on all assignments. Any arguments presenting evidence of evolution were denounced as ‘Satanism.’ We were also forced to watch video’s of IDF forces butchering Palestinian kids while the teachers said that ‘judgment’ was being meted out. We were also told that we are not allowed to question the president and that God had appointed him to lead the Christian armies in smiting the Arabs so that we can steal all their oil. We were also told that global warming is fake, and when I presented evidence to the contrary, the teachers accused me of witchcraft,” reports a third.

Again, all three of these reports are anonymous and unsubstantiated. They do, though, suggest that attempts to ideologically indoctrinate students may come from the right as well as from the left.

Do they typify what goes on in American colleges and universities? My own experience is that they do not. I, and most of my colleagues, have been far too busy trying to help students understand what they read and learn to express their understanding coherently to have had time or inclination to indoctrinate them into our various views. And “various” is an accurate description, running the gamut of political and social opinions. No real evidence, other than the sort of silly, sloppily defined and frequently self-contradictory “research” of hired guns such as Horowitz, Rothman and Lichter and the unsubstantiated anecdotes of aggrieved students, exists to support the prevalent characterizations of higher education as a “tyranny of the tenured left.”

And of course that will not matter. “The tyranny of the tenured left” will be repeated by Sean Hannity and Mike Rosen and Rush Limbaugh and Cal Thomas and Thomas Sowell and Bill O’Reilly and the rest until, through mere repetition, it becomes an established “fact” in the minds of all those people who know nothing about higher education except that it drains their tax dollars. Legislators, state and national, will eagerly parrot the phrase as they seek to introduce legislation that will put them in control of both personnel decisions and curriculum, and give them further excuses to transfer tax dollars from education to the coffers of the wealthy who fund their campaigns.

And the actual ends of Scaife, Bradley and Olin will be accomplished. Dissent from academia, such as it faintly exists, will be stifled, and the country may continue, undisturbed, in a lockstep march back to the way things were under the presidency of Karl Rove’s ideal chief executive, William McKinley [See James Moore and Wayne Slater, Bush’s Brain, Wiley, 2003].

***

The Crowded Theater in the Global Village

Danny Moore is one of twelve members of Colorado’s Independent Congressional Redistricting Commission. The other eleven members – four Democrats, four Independents, and the three other Republicans – recently voted to remove Moore from his role as chair after some of his Facebook posts had been revealed by the Colorado Springs Gazette and other news media.

Among other things, he wrote that “mass mail-in ballots can be controlled by the people you give them too (sic)” and that “the media, blue-state officials, social media, Judge Judy, the establishment, the Intelligence Community, and the Global elite” had somehow defeated Trump, who, he said, had received more votes than Biden. (He neglected to mention the part played in what he referred to as “the Democrat steal” by Judge Crater, the international Communist professor conspiracy, John Travolta, designated hitters, or the Oxford English Dictionary people.)

Asked about these posts, Moore asserted that he is not a conspiracy theorist. He said he had “read articles” that made similar claims. “I don’t know if these things are true or not, but in my circle we share these things between us and we debate these things,” he said, adding that “My opinion is no more or less than anyone else’s opinion, but in this country we have to be able to express our First Amendment rights, without fear of being canceled by one side or the other.”

When I was teaching Freshman Comp, back when the last dinosaurs were expiring, I required my students to write six or seven essays each semester. I’d explain to them that their essays would be graded, once I’d subtracted points for mechanical and grammatical errors, on the basis of 30 points for clarity, 40 points for organization, and 30 points for support. I’d tell them what I meant by each: “clarity” meant that I didn’t have to read a sentence more than once to understand what it was trying to say. “Organization” referred to such matters as the existence of a thesis sentence, topic sentences for paragraphs, and a concluding paragraph whose conclusion referred back to the thesis. “Support” had to do with marshaling factual evidence from reliable sources and/or with avoiding common logical fallacies in supporting arguments.

When I got to that third criterion, I nearly always encountered some version of Moore’s assertion of the inalienable value of his opinions. “Everyone has a right to an opinion” was the standard formulation. I’d come to expect it. I was prepared.

I’d choose some student who looked unlikely to be armed or psychotic, walk up and peer at him or her, and say, “You are very sick. You are clearly suffering from Mogo on the Gogogo. You need to leave class and get to a doctor immediately.”

The student, or other students, would of course question my competence to make such a diagnosis, and I’d respond that it was my opinion, and I had every right to it. They’d ask me if I was a doctor, or if I had had medical training. I’d grin idiotically and say, “Nope!” This would lead to a discussion which usually concluded with general recognition that not all opinions are of equal value. Sometimes I’d attempt to cement that point by reading the passage from The Sun Also Rises in which Jake Barnes is drunkenly ruminating about Life, and thinks to himself, “You paid some way for everything that was any good . . . . Either you paid by learning about them, or by experience, or by taking chances, or by money . . . . ” And I’d suggest that the First Amendment did, indeed, grant all citizens the right to spout any foolish opinions they wished to spout, but that didn’t mean that every opinion was of equal value, or of any value at all, if the right to it hadn’t been earned. Some people got it.

If I were teaching Freshman Comp in these days of the Digital Hegemony, I’d revise my grading system to 30-30-40, so that mindless blithering filled with mindless ad hominem labeling, post hoc arguments, and reference to unexamined “sources” and “studies” would guarantee a failing grade to even the most elegantly written and faultlessly organized essay.

Danny Moore’s bio page on the Commission’s website says, “Danny is a graduate of numerous Navy leadership and technical schools. He holds a Bachelor of Science in Organizational Management from Colorado Christian University and an MBA in International Business from the University of Phoenix.”

I must conclude that neither the Navy nor Colorado Christian nor the University of Phoenix requires English composition in their programs. Or else I must conclude that English composition has come to hold students to very different standards than it did in my day, which was not all that long ago. Moore would never have gotten beyond my freshman comp class.

I suppose it’s perfectly possible that Freshman Comp is no longer a requisite in many degree programs. Before I left the profession, fewer and fewer classes had any writing requirements. Shortly after I retired, the State Board essentially did away with the requirement that students meet minimum standards of literacy before being admitted to regular college classes.

This is a most melancholy trend. Until our rewiring is complete, and our thoughts can be provided us by the Masters of the Universe (“Elon Musk’s Neuralink implanted a chip into a monkey’s brain and now he ‘can play video games using his mind,’” a recent headline happily observed), we are stuck with the marvelously imperfect, extremely hazardous tool of language with which to find and make our way through life.

The imperfections – the likelihood of misunderstandings, ambiguities, emotional manipulations, etc. – are nearly endless. The hazards – lies, misrepresented or misinterpreted truths, unscientifically produced “scientific evidence,” emotional manipulations, etc. – are equally numerous. Anything close to the truth is damned hard to find, and if you don’t know that, you’ve never tried to investigate a crime or lived with a spouse for more than a year.

The ability to use language responsibly and honestly is not inborn. We grow up learning the language from those around us, whoever they are, and without the aid of teachers most of us wind up not as masters of language but at its mercy, which it is not likely to offer. One of the most effective tools for examining language, our own or others’, is seeing it written down on a page, where it’ll hold still long enough that we have time to question it. That’s why for hundreds of years we considered learning to write with some degree of precision and clarity a necessary part of anyone’s education. Until playing video games with our minds becomes a satisfactory and sustainable lifetime’s work, learning to write will remain necessary.

Danny Moore, by all accounts, is a genial, companionable man. He has honorably served his country in the Navy for much of his life, and now runs his own successful business. But it’s clear from his remarks in defense of his Facebook posts that he doesn’t think any better than Archie and Fred on their tenth beer down at the corner bar. This seems not to have bothered the panel of six retired judges who appointed him to the Commission, and perhaps they were neither required to consider nor chose to consider the intellectual competence of the pool members from whom they made their selections. It should, I think, bother the rest of us citizens of Colorado, whose representation will be significantly affected by the decisions of Moore and his fellow commissioners.

It should bother us as well, I think, that the systems of public and higher education we fund are failing to equip students with the basic tools of literacy which living in the global village does not render less but rather much more necessary.

***

These Are Not Maggots

A Self-Help Manual for the Tenuously Still-Employed

In Sergei Eisenstein’s film Battleship Potemkin, that ship’s crew finally rebels over its working conditions, in particular over the food they’re given. They haul a side of beef, writhing with maggots, before the ship’s doctor:

“Dr. Smirnov’s eye peering through the doubled pince-nez fills the screen. (Cut to . . . )

“White maggots swarm over the surface of the rotted meat.

“Dr. Smirnov’s pince-nez, a blur in one corner of the screen, come into sharp focus as they are carried toward the maggots. (Cut to . . . )

“Dr. Smirnov leans back, looks up at Vakulinchuk, and gestures deprecatingly toward the hanging carcass as he pronounces his medical opinion. (Cut to . . . )

“The doctor’s hand and pince-nez tap the maggoty meat to emphasize his findings. (Cut to . . . )

“TITLE: ‘These are not maggots.'”

It is the voice of authority, oblivious to or contemptuous of any facts which might contradict its foregone conclusions. It is the voice of all who speak for the dictator.  (Thus far, the US has no dictator as such; its dictator is, rather, corporate in nature. Few people choose to challenge the dictator or the dictator’s lackeys, preferring to join in the popular fiction that they live in a democracy. So most of us require a different voice, the voice of the bureaucrat, to keep us in subjugation.)

That voice doesn’t directly assert the power behind it, or, for that matter, directly assert anything at all. Instead, it throws a stifling, damp, felt language blanket over its victims, who are of course referred to as “colleagues.” Though he was speaking of the tyrant, Clive James’ observation applies as well to the tyrant’s straw bosses: “. . . the . . . overlord’s power to bore was a cherished and necessary component of his repressive apparatus.”

If the bureaucratic straw bosses must engage in a direct confrontation with the people who do the actual work in an organization, that power to bore is generally the only component needed. You will recognize the drill: the straw bosses begin the meeting with a series of extremely long, jargon-laden “updates.” These plod on until the more desperately ego-ridden workers are writhing in their seats. When the meeting is opened for discussion of whatever complaint (excuse me, whatever Issue) it was called to address, it is these workers who will grab the floor with their own interminable remarks, some of which may relate remotely to the subject of the meeting. By the time these people run out of steam – and they always have a lot of steam – the rest of the workers are also writhing in their seats, desperate for a drink or a smoke or some welcome silence or some normal human conversation. The meeting disperses with no dangerous outcomes, because it has no outcomes at all, except possibly the formation of a committee or two to further study the Issue.

Committee meetings resemble larger meetings in most respects. In addition, they generally feature at least one PowerPoint presentation, which requires the straw boss, or some committee member serving as the straw boss’s poodle, to project an endless series of slides bearing an outline of something numbingly obvious, and to read each slide aloud in the interests of redundancy and amplified humiliation. While committee meetings are nearly always maggot-laden, they require recall of another film to enable a normal human worker to endure them.

That film is Deep Blue Sea, one of the ten worst movies ever made. In the nadir of a career that’s had plenty of low points, Samuel L. Jackson plays Mr. Franklin, possibly a corporate CFO, possibly a successful entrepreneur, possibly a token executive in a firm or charitable foundation occupying a high-rise in New York City, whose possible President, CEO or Owner is an elderly white man who seems to have been taxidermied shortly before his one scene.

In that scene, an ostensible actress, evidently lobotomized in preparation for her performance, plays a genius researcher leading a project that may lead to a cure for Alzheimer’s, to which her father has succumbed to provide her motivation. She helicopters down to the NYC high-rise to find that Mr. Franklin, or possibly his moribund superior, is pulling the plug on her research, which Franklin or the recently dead man has been financing. She begs for a couple of days’ grace, and she pilots Franklin back to the research station, an abandoned WWII submarine fueling station in the middle of the Atlantic. Or the Pacific, or some ocean. During the flight, in some of the most awkwardly disguised exposition ever written, Franklin reveals that at some time in his past, he was a member of a climbing party caught in an avalanche in the Alps, and that he was responsible for saving some of the party.

The fueling station has been converted to a largely underwater holding facility for a bunch of Mako sharks, whose brains have been somehow enlarged so that the researchers can harvest some chemical peculiar to shark brains that, we are to believe, accounts for the longevity of the species. This is the chemical that will cure Alzheimer’s in an undisclosed manner. An unanticipated side-effect of enlarging the Makos’ brains has been that they have gotten really smart. In fact, the escape of one of the sharks, and its subsequent attack on a pleasure boat full of bikinied women, prompted the decision to pull the plug on the project.

Well, wouldn’t you know it, a few hours after Franklin touches down on the station, a massive typhoon arrives, knocking out the facility’s electrical systems and trapping Franklin and the researchers below decks with a bunch of devious, angry sharks. The sharks are out for revenge. Turns out they haven’t appreciated having their brains enlarged or living in underwater cages. They make short work of the cages, the underwater bulkheads give way, and the research crew must scramble and thrash around through the rising water, trying to find safe haven. One after another, they fail to do so. (Their meagerly established characters have been so uniformly vacuous, repellent, and annoying that the theater fills with cheers as each one disappears down a shark’s gullet.) Finally cornered, the surviving principals take to arguing fruitlessly with each other, and Mr. Franklin, survivor of alpine avalanche, steps up to take command. Standing at the edge of a pool in which, in better days, the sedated sharks had had their brain chemicals harvested, Franklin says,

“You think water’s fast? You should see ice. It moves like it has a mind. Like it knows it killed the world once and got a taste for murder. When the avalanche came, it took us a week to climb out. And some way, we lost hope. Now I don’t know exactly when we turned on each other, I just know that 7 of us survived the slide, and only 5 of ’em made it out. Now we took an oath, that I’m breaking now, swore that it was snow killed the other 2. But it wasn’t. Nature can be lethal, but it doesn’t hold a candle to Man. Now you’ve seen how bad things can get, and how quick they can get that way. Well, they can get a whole lot worse. So we’re not going to fight any more! We’re gonna pull together, and we’re gonna find a way to get out of here! First, we’re gonna seal off this –” CHOMP!!!! And the last we see of Mr. Franklin is his foot sticking out of a submerging Mako’s teeth. You think ice is fast?

It’s a true highlight of World Cinema. Many people must share that assessment, since that scene has been enshrined numerous times on YouTube, so you will have no trouble finding it, and won’t even have to endure the rest of the film to do so (https://www.youtube.com/watch?v=uz1J9PUcMQ0).

The memory of that scene will carry you through the most stultifying committee meeting. You’ll find it no trouble at all to mentally replace Mr. Franklin with whatever insufferable clown is prating away in Inspirational mode and watch that clown erased from existence in less than one second by a large, ill-disposed shark.

This advice should suffice, unless you are one of the few surviving members of the dwindling subspecies of people who suffer from the illusion that language was invented to convey clear meaning. If you are, you’ll understand the following rant by one of my former fellow wage slaves, Mark Prokto:

“Deep Beneath Mongolia a series of caves exists, a full thirty miles below the surface of the earth, hence heated by the magma itself. In these caves, the alpha-ur-humans are being brought to us. From their little petri dishes they blossom into creatures in all outer aspect no different from earthlings.

“For their first thirty years, they are coddled, petted, and surrounded by only two messages: ‘You are superior to all earthlings you will encounter – except, of course, for those with whom you share the Secret Handshake,’ is the first. The second is not so much one message as an always-changing series of messages, which these homunculi necessarily believe constitute human conversation: a few months ago, for example, these messages consisted of the word ‘robust,’ and the phrase ‘at the end of the day,’ but they change rapidly, depending on the date of Expulsion Day. On Expulsion Day, the homunculi, dressed appropriately for various climes, are put into a rocket-propelled elevator and blasted out through a tube in Mongolia into the near stratosphere. They rise and spread like dandelion seeds over the surface of the earth, and drift down into the executive offices of every goddam place on earth. And immediately, they begin to speak: ‘At the end of the day,’ they say, ‘the bottom line is this: it is what it is.’

“And that’s who’s running the show. Your show. My show. The show.

“Kurt Vonnegut has another version in his new book. He thinks who’s running the show is ‘C students from Yale.’ Naahh. It’s Mongolians.”

If you do somehow retain the belief that words are supposed to bear identifiable meanings, you define the uses of language far too narrowly. Dr. Smirnov’s denial of self-evident truth, depending on his role as representative of Authority, is a use that most, if not all, parents will instantly recognize, since they’ve had recourse to it themselves: BECAUSE I SAID SO!!!

Those same parents will recall as well that language can be used to soothe, bore or send off to dreamland – hence the once universally known nursery rhymes and lullabies that have only lately been replaced by machines programmed to emit numbing sounds. The most militant employees find it difficult to retain their fervor when drifting off to dreamland to the soothing strains of the “Think Outside the Box” mantras of those On the Cutting Edge.

A third use of language against the interests of clear communication is the ubiquitous practice of Ad Speak, of the sort inculcated in the Mongolian homunculi. Ad Speak comprises a wealth of techniques: qualifying phrases such as “As much as” and “As little as,” sesquipedalian verbiage conveying manifestly erroneous assertions, “proof” by reference to irrelevant or non-existent authorities . . . .

If you have not yet learned the trick of tuning out meaningless or dishonest babble, you clearly haven’t watched enough messages from the pharmaceutical industry, and you will need to find a proactive approach to enduring meetings. For example, several of my colleagues and I used to get through assemblies led by a particular administrator by silently toting up his uses of the word “agenda” (approximately one per every two sentences). Whoever remained alert longest and counted the most “agendas” won. Donna Leon, a writer whose ear is tuned most delicately, astutely and precisely to the varying tones of dishonesty, suggests a more elaborate version of this approach in Fatal Remedies:

[Deputy Commissioner Giuseppe Patta is Commissario Guido Brunetti’s superior at the Venice Police Headquarters. Signorina Elettra is ostensibly Patta’s secretary, factually the most intelligent, able, witty person in the building. She sees everyone and everything clearly, and very carefully dispenses her observations to her allies in competence.]

“There in Lyon [at a training course at Interpol headquarters] Patta had . . . sampled the various managerial styles on offer by bureaucrats of the different nations. At the end of the course he’d returned to Italy . . . head bursting with new, progressive ideas about how to handle the people who worked for him. The first of these, and the only one so far to be revealed to the members of the Questura, was the now weekly ‘convocations du personnel,’ an interminable meeting at which matters of surpassing triviality were presented to the entire staff, there to be discussed, dissected and ultimately disregarded by everyone present.

“When the meeting had first begun two months ago, Brunetti had joined the majority in the opinion that they would not last more than a week or two, but here they were, after eight of them, with no end in sight . . . .

“His salvation had come, as had often been the case in the last years, from Signorina Elettra . . . . she had come into his office . . . and asked Brunetti, with no explanation, for ten thousand lire [approximately one U.S. dollar].

“He had handed it over, and, in return, she’d given him twenty brass-centered five-hundred-lire coins. In response to his questioning look she’d handed him a small card, little bigger than the box that held compact discs.

“. . . . it was divided into twenty-five equally sized squares, each of which contained a word or phrase, printed in tiny letters. He’d had to hold it close to his eyes to read some of them: ‘Maximize,’ ‘prioritize,’ ‘outsource,’ ‘liaison,’ ‘interface,’ ‘issue’ and a host of the newest, emptiest buzz-words to have slipped into the language in recent years.

“‘What’s this?’ he’d asked.

“‘Bingo,’ was Signorina Elettra’s simple answer . . . . ‘All you have to do is wait for someone to use one of the words on your card – all the cards are different – and when you hear it, you cover it with a coin. The first one to cover five words in a straight line wins’ . . . .

“And since that day the meetings had been tolerable . . . . Each week, too, the words changed, usually in conformity with the changing patterns or enthusiasm of Patta’s speech: they sometimes reflected the Deputy Commissioner’s attempts at urbanity and ‘multi-culturalism’ – a word which had also appeared – as well as his occasional attempt to use the vocabulary of languages he did not speak; hence, ‘voodoo economics,’ ‘pyramid scheme’ and ‘Wirtschaftlicher Aufschwung’.” 

It can, of course, be argued that such approaches to enduring the abuses and humiliations fostered by Authority are pusillanimous and shameful. Perhaps, indeed, they are so. But it can also be argued that some degree of abuse and humiliation has always been the price of admission to gainful employment, and that putting up with it by turning it into an occasion for a few laughs is a sane, if not particularly admirable, strategy. Socrates said that in a democracy, you have three classes. “The third class will be the ‘people,’ comprising all the peasantry who work their own farms, with few possessions and no interest in politics. In a democracy this is the largest class . . . . “

In our time, the peasantry no longer work their own farms, but rather work at various jobs in government or retail sales or, increasingly, as carbon-based servo-mechanisms to various robotic processes.

Nevertheless, in my long, if admittedly limited, experience, I have found that most working people still retain a good deal of pride in the performance of their work. They have little or no interest in politics because they perceive that bosses are always going to be bosses and mostly jerks. They hope to evade the attention and interference of the bosses so that they can do their work in the best ways they’ve learned to do it. If they’re lucky, unduly interfering with their work won’t too often serve their bosses’ personal agendas.

If they’re not lucky, they may run afoul of one of the more pernicious sets of bosses, the certifiable psychopaths. (The Harvard Business Review observes placidly that “Chances are good there’s a psychopath on your management team.” Its essay on “Executive Psychopaths” goes on to say that “Many of psychopaths’ defining characteristics—their polish, charm, cool decisiveness, and fondness for the fast lane—are easily, and often, mistaken for leadership qualities. That’s why they may be singled out for promotion.”)

But you have a pretty fair chance of avoiding the truly psychopathic boss. After all, “Nathan Brooks, a forensic psychologist . . . conducted a study surveying 261 senior professionals in the U.S. supply chain management sector. Brooks, along with his colleagues at Bond University in Queensland, and the University of San Diego, found that 21 percent of senior level professionals showed ‘clinically significant’ levels of psychopathic traits.” Four out of five bosses, then, are just garden-variety jerks, who can be finessed or ignored without undue peril. Or, if not ignored entirely, creatively endured, using some of the techniques I’ve suggested. Imagination is not a frill; it’s often a vitally useful tool. Cultivate yours.

***

Mother *%*)(#!  Tongue

During the afternoon of my 21st birthday, I was standing by the garden along one side of our house with my father, Walter McCollum. We were looking at the peony buds as they prepared to bloom into great globes of cream and magenta. The sepals were alive with black ants. “Those *%*)(#! ants!” my father observed. In 21 years, I had never, once, heard the word “*%*)” cross my father’s lips. I think I’d always assumed that he didn’t know the word “*%*).” His utterance ushered me into the company of adult men.

My dad was a cusser, but a cusser with a code. “God damn” and “hell” he felt acceptable within our family, even around me and my sister. Even those two he eliminated when other men’s wives or distant acquaintances came around. You’d have thought him ignorant of the rich palette of English profanity. I may have thought that, too, though I probably never thought about Dad’s language at all, mired in the slough of youthful solipsism as I was. I don’t recall Dad ever talking about his code, but he somehow, if only by example, made it perfectly clear to his kids. You didn’t cuss around anyone but close friends and family, and then only with a limited selection of curses. No point taking a chance on offending someone, unintentionally or not.

During the 1940s and 50s of the last century, middle class behavior was tightly bound by all sorts of standards and rules. A fellow would never go outdoors without his hat on or fail to remove it once indoors. Same went for a suit and tie, except you could keep both on indoors. If a kid wished to refer to a non-family adult, that kid had better preface the adult’s name with a “Mister” or “Mrs.” or “Miss.” Other rules extended to all language usage, and while they were not written, so far as I know, they were widely recognized as valid and binding.

How binding? An actor – I think it was Sterling Hayden – in a Playhouse 90 production in the early 50s got so carried away during one dramatic scene that he injected a “damn!” into his lines. Television was live, back then, so the director had no way to bleep out the offending word, and within a week, the network had received upwards of 40,000 letters objecting to the profanation of the airways and demanding a range of recompense, mainly the blackballing of the offending actor.

No sentient being in the country existed who hadn’t encountered the word “damn” pretty much every day, yet in our public life we all had to pretend no such word existed. We of the baby boom generation found this sort of hypocrisy absurd and unconscionable, and as the 50s melted into the 60s, we began to make our feelings clear by ostentatious public indulgence in vulgar language. We viewed these indulgences as political statements, arguing a case that no barrier should be left standing between private and public speech or behavior. Still, for most of the 60s, George Carlin’s “7 Words You Can’t Say on Television” remained literally accurate. TV and radio upheld the old standards, though “damn” and “hell” seeped into the acceptable category. The recording industry, whose major market consisted now of boomer kids, was the first to allow some of those 7 words, but not until the 70s had begun did “*%*)” make its first appearance on a major label, when Harry Nilsson sang, “You’re breaking my heart/ You’re tearin’ it apart/ So *%*) you.” The floodgates had opened.

By the early 90s, they’d been obliterated. When my wife’s daughter and some of her friends came out in the back yard with a boom box, I was sitting under the pergola, hidden from their view by the surrounding vines. The boom box was playing a work by their current favorites, The Insane Clown Posse. The Posse’s works seemed to consist solely of the sentiment, “*%*) the mother-*%*)(#! mother*%*)$%@” repeated a great many times. I was of course properly revolted, now that I’d become a responsible adult hypocrite. The group’s dependence on repetitious vulgarity and objectless rage struck me as pathetic and stunted. What struck me even more forcefully was the kids’ blithe indifference to their discovery of me sitting there as the bombardment of “*%*)”s went on and on. Growing up after the sudden appearance of rap on the airwaves, they had not a notion that other people might be offended by such language, nor that any barrier might be desirable between their standards and those of the unknown rest of the world. Growing up, as well, after the sudden appearance of beepers and flip phones and then the internet and the smart phone, they had no notion that a distinction had once existed between private and public life. All was public, now. Transparency ruled.

While I find it appalling and offensive to hear people walking down the street talking loudly into their cell phones and the ears of any and every passerby about the most intimate aspects of their lives, I recognize that I’m simply a disgruntled survivor of a moribund social order, just like everyone else who’s managed to survive for more than seventy years. Mores, including a sense of privacy, ingrained over several generations, taken as natural components of “human nature,” prove to be just mores, and dissolve as quickly as Spring snowfalls. We, the survivors, are left to growl and grouse. Duke Ellington wrote our theme song many years ago: Things Ain’t What They Used to Be.

They never were, of course, most especially in the Golden Age of my youth, the 50s. While we’d gotten beyond some of the insanities of the Victorian age – we could openly refer to a piano’s legs, rather than calling them “limbs,” and we could even view women’s lower legs, though they still had to be swaddled in nylon stockings – it wasn’t only “dirty” words that were censored. Many words for the most basic elements of reality, such as birth and death, were verboten still. When Lucille Ball, who starred in one of the most popular TV series of the decade, got pregnant and kept filming her shows through the next nine months, the script was never allowed to describe her condition with the word “pregnant.” In 1956, when she judged it was time to educate her son about what were known as “the facts of life,” my best friend’s mother summed them up in these words: “The man puts his banana in the woman’s purse.” Homosexuality was still being referred to as “the love that dare not speak its name.” No one was allowed to die. We all had to Pass On.

All this bowdlerizing brought about much mischief. Boys and girls who knew no more about sex than they did about bananas found out a good deal more through unwanted pregnancies, for which the only recognized remedy was marriage, the word “abortion” still unpronounceable, and so the act beyond contemplation. Homosexuals, if discovered, were hounded from their jobs or denied employment and openly scorned and pilloried. Large numbers of people who’d shunned for many years recognition of death as a fact of life spent inordinate sums to have their permanently sleeping loved ones gussied up like high-end whores and laid out in boxes that cost more than their cars.

A dear friend in my college years, Stella Stickle, observing the growing proliferation of vulgarity and profanity, suggested that “calling a spade a spade doesn’t mean you have to call it a goddam bloody shovel.” Shortly after she made that observation, I entered the Army, where nearly everyone found it impossible to produce an utterance on any subject without using “*%*)” or its derivatives at least five times per sentence. Every spade wore its garland of “*%*)s.” The word soon lost any force, or for that matter meaning, it had ever had, and it became impossible to use it to express outrage or contempt or anger or anything, really. Like any word repeated over and over, it became just a meaningless sound. It seems to me a similar fate has befallen our whole arsenal of profanity and vulgarity, and that seems to me a shame.

In one of his fables for our time, James Thurber wrote of “The Bear Who Let It Alone.” The bear would go in a bar and have two drinks and go home. He was quite proud of his self-control until the alcohol dissolved it and he became a hopeless lush. “He would reel home at night, kick over the umbrella stand, knock down the bridge lamps, and ram his elbows through the windows. Then he would collapse on the floor and lie there until he went to sleep. His wife was greatly distressed and his children were very frightened.”

Ultimately the bear saw the light, reformed, and became a famous anti-alcohol crusader. To demonstrate the regenerative powers of sobriety, he would “stand on his head and on his hands and he would turn cartwheels in the house, kicking over the umbrella stand, knocking down the bridge lamps, and ramming his elbows through the windows. Then he would lie down on the floor, tired by his healthful exercise, and go to sleep. His wife was greatly distressed and his children were very frightened.” Thurber’s moral: “You might as well fall flat on your face as lean over too far backward.”

It does seem to me that we’ve emulated that bear in our rapid shift from Victorian squeamishness about language to our unrestrained embrace of unbridled vulgarity, which has the effect of rendering vulgarity impotent to express anything at all and of leaving our ability to talk clearly and honestly about matters of life and death no better off than in Victorian times. We seem to have made no *%*)(#!  progress at all.

***

Keeping Apprised

An old friend called me this morning and related a long and complicated tale of legal woe involving the music business. I had little advice to offer. I’ve spent a lifetime studiously avoiding any kind of business dealings. He thought he might want me to rewrite some lead sheets, but wanted to ask his attorney if that would be necessary.

“Well, keep me apprised,” I said, thinking to sign off. I could have said, “Keep me filled in,” or “Keep me up to date,” or “Let me know what your lawyer says,” but “apprised” was the first word that popped into my head.

“Is that the way you say it? ‘Apprised?’” my friend asked.

“Yup,” I said.

“Are you sure?”

“Of course I’m sure,” I said. “I’m an English teacher. I’m the pro from Dover.”

I should have said I’m a retired English teacher, and I’ve been away from the game long enough that I thought I’d better go check my recollection. Time for my almost daily visit to the 1941 Webster’s New International Dictionary, Unabridged that’s lived on its own little reading stand in every house I’ve inhabited since I was born. It needs its own stand because it weighs nearly 16 pounds and stands 6 inches tall, unopened.  Here’s what it had to say:

“apprise (v.t.) (a-prīz‘) apprised, apprising, also apprize (F. appris, fem. apprise, past participle of apprendre, to learn, teach, inform, see apprehend, apprentice) to give notice, verbal or written (to a person), to inform. Often followed by of ; also, to give notice of (a thing) 

            Knock, and apprise the Count of my approach    Byron

Syn. – acquaint, advise, inform. See inform.”

I’d had to fetch my granddad’s magnifying glass to read the tiny type, but that wasn’t terribly burdensome. In fact, I always like using that big, round magnifier with its tarnished brass ferrule and walnut handle worn smooth by more than a hundred years of use. I can see my granddad’s rheumy old eyes peering through it at the morning paper. He was a dear man who lived with my family during his last years after his beloved wife died. I somehow inherited his magnifying glass along with the Webster’s Unabridged, and now my rheumy old eyes are glad I did.

I could, of course, have simply googled “apprise,” and instantly found this:

“apprise [əˈprīz] VERB inform or tell (someone).

“I thought it right to apprise Chris of what had happened”

synonyms:

inform · notify · tell · let know · advise · brief · intimate · make aware of .”

This would have advised me that my use of “apprise” was correct, that it meant what I’d intended it to mean. It would not have told me whether the verb required an object or not, or whether its principal parts were regular or not, or that it had come into the English language, like so many other words, with the Norman invaders, or the names of its French-English cousins, or by which preposition it should often be followed. Or that it had been good enough for Byron.

Had I taken Webster’s advice to see “inform,” and pursued it through seven variant meanings, meaning 7.2 would have acquainted me with the fact that “Inform, apprise may often be used interchangeably. But inform, the general term, emphasizes the actual imparting of facts or knowledge of whatever sort; apprise, the more formal and less common term, frequently carries the implication of giving notice of something.” This seems, no matter how long I ponder it, a distinction without an appreciable difference to me, and an example of the kind of refinement of meaning to the point of no return that gives intellectuals a bad name.

I’d spent a working lifetime advising my students not to use the more formal, less common term, but here I was, doing exactly that. The fine novelist William Humphrey had elegantly explained my reasons for that advice. He had taken to reading books about fishing, and found that “The angler had metamorphosed into the ichthyologist, and the prevailing prose reflected the change – if mud can be said to reflect. I found myself correcting it as I had done freshman themes in my years as a professor. You had to hack your way through it as through a thicket. Participles dangled, person and number got separated and lost, cliches were rank, thesaurusitis and sesquipedalianism ran rampant, and the rare unsplit infinitive seemed out of place, a rose among nettles. Yet, instead of weeding their gardens, these writers endeavored to grow exotics in them: orchids, passionflowers. Inside each of them was imprisoned a poet, like the prince inside the toad. What came out was a richness of embarrassments: shoddy prose patched with purple – beautifully written without first being well written.”

Many people write so if they want to impress (or con) their English teachers. I didn’t want to read any more of such writing, so I advised my students against using words just because they looked impressive. I adjured them to take Mark Twain’s advice: “I notice that you use plain, simple language, short words and brief sentences. That is the way to write English – it is the modern way and the best way. Stick to it; don’t let fluff and flowers and verbosity creep in. When you catch an adjective, kill it. No, I don’t mean utterly, but kill most of them – then the rest will be valuable. They weaken when they are close together. They give strength when they are wide apart. An adjective habit, or a wordy, diffuse, flowery habit, once fastened upon a person, is as hard to get rid of as any other vice.” (Letter to D. W. Bowser, 3/20/1880)

I didn’t stop adjuring there. Hemingway had implied similar advice: “Poor Faulkner. Does he really think big emotions come from big words? He thinks I don’t know the ten-dollar words. I know them all right. But there are older and simpler and better words, and those are the ones I use.” (Plimpton, G., ed., Writers at Work, Second Series, Viking, 1963) So, in their ways, had Katherine Anne Porter – “But there is a basic pure human speech that exists in every language. And that is the language of the poet and the writer . . . . You have to speak clearly and simply and purely in a language that a six-year-old child can understand; and yet have the meanings and the overtones of language, and the implications, that appeal to the highest intelligence – that is, the highest intelligence that one is able to reach.” (Plimpton, Ibid) – and William Carlos Williams – “I couldn’t speak like the academy. It had to be modified by the conversation about me. As Marianne Moore used to say, a language dogs and cats could understand. So I think she agrees with me fundamentally. Not the speech of English country people, which would have something artificial about it; not that, but language modified by our environment; the American environment.” (Cowley, Ibid) So had Pat Conroy: “I dislike pretentious words, those highfalutin ones with a trust fund and an Ivy League education. Often they were stillborn in the minds of academics, critics, scientists. They have a tendency to flash their warning lights in the middle of a good sentence. In literary criticism my eye has fallen on such gelatinous piles as ‘antonomasia,’ ‘litotes,’ or ‘enallage.’ I’ve no idea what those words mean nor how to pronounce them nor any desire to look them up.” (Pat Conroy, My Reading Life, Doubleday, 2010)

I’m with Conroy. I did look up “enallage,” and read several tortured definitions of the term which left me none the wiser. At best. “A figure of speech used to refer to the use of tense, form, or person for a grammatically incorrect counterpart.” Oh. Could you find a better example of a gelatinous pile?

So I share with these fine writers a distaste for fancified, pretentious, or otherwise overly complicated terminology. Or so I’ve always thought, until my friend’s question got me thinking about the words I use that probably seem to exemplify such qualities.

Just last week I received two letters from old friends, one as voracious a reader as I am, one a professional writer. The former remarked on the appearance of “synecdoche,” “elisions,” and “septuagenarian” in my most recent letter. The latter, having read a memoir I’d sent him, wrote, “It also reminds me I haven’t used the word ‘pusillanimous’ in any book lately. I must make a note.” (I expect he was kidding.) I hadn’t used any of those words in order to impress my correspondents. They’d simply been the first words that occurred to me.

Obviously “synecdoche” and “elision” occurred to me because I had been an English teacher, and those are two shards of jargon from that world. Like all professional jargon, they’re designed to fence out the vulgar lay public. All “synecdoche” means is “taking the part for the whole.” “Elisions” are simply sentences or passages from which someone has eliminated some words, sometimes in the interest of brevity, sometimes in the interest of distortion.

For example, had I been writing about Babe Ruth, I might have called him “the colossus of the diamond,” and been responsible for a metaphorical synecdoche. I would have meant that Ruth towered over the game of baseball in his day – represented by the “diamond” – as Chares’ Colossus had once towered over the harbor of Rhodes. In Ruth’s day, sportswriters would have indulged in such language without a second thought, knowing that the sports fans who read them had been educated sufficiently in world history to get the comparison and sufficiently in the English language to understand that “the diamond” here meant “the game of baseball.”

“Septuagenarian,” I must admit, is no improvement on “70-79-year-old” except that it’s more rhythmically pleasing and doesn’t take so long to pronounce. I learned it, I’m sure, reading books from earlier centuries, when nearly all literate people had also been schooled sufficiently in Latin that the meaning of “septuagenarian” would be immediately apparent to them. I’m sure I had to look it up when I first ran across it, and it stuck with me, probably, because I liked the way it felt in my mouth. Same reason “pusillanimous” – another import from Latin – stuck with me: it sounded as contemptuous as I wanted it to sound. (Since in the memoir my writer friend referred to I was describing my own pusillanimity, I felt I’d earned the right to contempt.)

I wasn’t using any of those words to try to impress anyone. They were parts of my reading and writing and speaking vocabularies – probably not everyday parts, since I only occasionally needed them – words of long enough acquaintance that they didn’t strike me as any more exotic than “cat” or “dog.” If someone else had to go look them up in the dictionary, that didn’t seem such a bad thing.

Malcolm X wrote in his autobiography about his discovery of the dictionary during his final stay in prison: “With every succeeding page, I also learned of people and places and events from history. Actually the dictionary is like a miniature encyclopedia . . . . I suppose it was inevitable that as my word-base broadened, I could for the first time pick up a book and read and now begin to understand what the book was saying . . . . months passed without my even thinking about being imprisoned. In fact, up to then, I never had been so truly free in my life.”

On the one page of my Webster’s that held “apprise,” I found seven words I’d never seen before, or at least didn’t remember seeing. Some were highly specialized, some listed as obsolete. I couldn’t imagine finding myself in circumstances that ever required my using some of the others. But then, I’d never imagined I’d find myself officiating at a wedding between a stockbroker and a Venezuelan heiress in the glowering presence of the latter’s adolescent son. Let alone had I imagined the elisions that could be made in the wedding ceremony if you just wanted, pusillanimously, to get it over with. Life is indeed full of surprises, and knowledge can only be called “useless” until you need to know it. Same goes for words. Life will keep you apprised of those truths.

***

Forget You: The Problem with Cancel Culture and Literary Censorship

On the landing of a high school stairway one day, I happened to witness the unhappy ending of a friend’s romance with one of our school’s acknowledged beauties. He had been pleading whatever case he thought he had for a while when she issued her verdict, with the haughty finality available to the beautiful young. “Forget you,” she said, and swayed off up the stairway. I imagine she did forget him, having found bigger fish to fry.

Watching various Confederate monuments pulled down, reading calls for the elimination of all sorts of books from school and college reading lists, I feel uneasy. I don’t feel uneasy because I harbor admiration for the Confederate cause, or for the racial or sexual slurs that used to be published without a second thought on anyone’s part. I feel uneasy at the enterprise of trying to erase the past, however benighted it may have been. I think “Forget you” is an unrewarding stance to take toward the past, not to mention a dangerous one.

Nathan Bedford Forrest’s bust serves as a fine example of statuary under assault. A pioneer of guerrilla warfare, Forrest was also known as a slave dealer, responsible for a revolting massacre by troops under his command, and one of the first leaders of the Ku Klux Klan. Surely not a man deserving of public honor. But he undeniably existed. If by removing his bust we think to expunge him from the historical record, to disappear him, then we remove as well the possibility of understanding how such a man came to exist, what forces formed (or deformed) his character – forces which are clearly still in operation today, deforming other characters into similar shapes.

We lose something else as well: the possibility of understanding an extremely complex character. The few commonly alleged facts about Forrest give a far from complete picture of the man. Greg Tucker, Tennessee Historical Society historian, offers one addition to our view of Forrest’s nature and character:

“Retired Confederate Gen. Nathan Bedford Forrest was an outspoken advocate for the civil rights of the freedmen in postwar Tennessee.

“This advocacy and his popularity with the Memphis black community were resented by some of his white contemporaries who spread false rumors to discredit the general and further their own political interests.

“As president of the Selma, Marion & Memphis Railroad, he employed former slaves as construction engineers, crew foremen, train engineers and conductors. Blacks were hired as managers, as well as laborers.

“When Forrest’s cavalry surrendered in May 1865, sixty-five blacks were on Forrest’s muster roll, including eight in Forrest’s Escort, the general’s handpicked elite inner circle. Commenting on the performance of his black soldiers, Forrest said: ‘Finer Confederates never fought.’

“Forrest detractors allege that the Confederate general was the ‘founder of the KKK.’ This is factually incorrect. The 19th century Ku Klos was founded as a fraternal organization on Dec. 24, 1865, in Pulaski by Thomas M. Jones, a Giles County judge; Frank O. McCord, publisher of the Pulaski newspaper; and four other Confederate veterans. Though not present at a Ku Klos meeting in Nashville in 1867, Forrest was elected as grand wizard of the organization.

“There is no evidence that Forrest ever wore any Klan costume or ever ‘rode’ on any Klan activity. He did, however, on Oct. 20, 1869, order that all costumes and other regalia be destroyed and that Klan activity be ended.

“This was confirmed by the U. S. Congress in 1871: ‘The natural tendency of all such organizations is to violence and crime, hence it was that Gen. Forrest and other men of influence by the exercise of their moral power, induced them to disband.’ See U. S. Congressional Committee Report (June 27, 1871).

“When Forrest died in 1877, Memphis newspapers reported that his funeral procession was over two miles long. The throng of mourners was estimated to include over 3,000 black citizens of Memphis.”

If Forrest were to be disappeared from all libraries and all mention of him deleted from histories of the Civil War and of the antebellum and postbellum South, we would remain in ignorance of both his evil doings and of his active embrace of and by the black people who knew him first hand. And our tendency to deify or pillory others would remain unchallenged by another set of contradictory facts.

We would lose a great deal more than that if the efforts of school districts in Pennsylvania, Virginia, Minnesota, Mississippi and New Jersey to remove Huckleberry Finn from their public school curricula are aped throughout the country. The most recent of these efforts, in New Jersey, is justified by its sponsors thus: “The novel’s use of a racial slur and its depictions of racist attitudes can cause students to feel upset, marginalized or humiliated and can create an uncomfortable atmosphere in the classroom.”

Well, then! All we need do is chuck that novel in the trash and our history of racist attitudes, our justifications of slavery by denying the full humanity of slaves, our pain and outrage and guilt will be erased. Of course, we’ll have to throw in a few hundred more books with it – from Uncle Tom’s Cabin to Native Son to Their Eyes Were Watching God to Invisible Man to The Underground Railroad (both books by that title) to Another Country . . . . This list could go on for a long time, this list of books that contain racial slurs in the mouths of characters who would have used them or contain mention of bigotry, lynching, segregation, slavery – all those shameful aspects of our history that make people uncomfortable because they ought to be made uncomfortable by them. If our history, like human history in general, is a nightmare from which we are trying to awake, we are not likely to awake from it by pretending there was and is no nightmare.

Here is a fairly representative paragraph from Huckleberry Finn. An old doctor who has been treating Tom Sawyer’s gunshot wound addresses the men who have captured Jim and want to hang him:

“Don’t be no rougher on him than you’re obleeged to, because he ain’t a bad nigger. When I got to where I found the boy, I see I couldn’t cut the bullet out without some help, and he warn’t in no condition for me to leave, to go and get help, and he got a little worse and a little worse, and after a long time he went out of his head, and wouldn’t let me come anigh him, any more, and said if I chalked his raft he’d kill me, and no end of wild foolishness like that, and I see I couldn’t do anything at all with him; so I says, I got to have help, somehow; and the minute I says it, out crawls this nigger from somewheres, and says he’ll help; and he done it, too, and done it very well. Of course I judged he must be a runaway nigger, and there I was! and there I had to stick, right straight along, all the rest of the day, and all night. It was a fix, I tell you! I had a couple of patients with the chills, and of course I’d of like to run up to town and see them, but I dasn’t, because the nigger might get away, and then I’d be to blame; and yet never a skiff come close enough for me to hail. So there I had to stick, plumb till daylight this morning; and I never see a nigger that was a better nuss [nurse] or faithfuller, and yet he was resking his freedom to do it, and was all tired out, too, and I see plain enough he’d been worked main hard, lately. I liked the nigger for that; I tell you, gentlemen, a nigger like that is worth a thousand dollars – and kind treatment, too. 
I had everything I needed, and the boy was doing as well there as he would a done at home – better, maybe, because it was so quiet; but there I was, with both of ‘m on my hands; and there I had to stick, till about dawn this morning; then some men in a skiff come by, and as good luck would have it, the nigger was setting by the pallet with his head propped on his knees, sound asleep, so I motioned them in, quiet, and they slipped up on him and grabbed him and tied him before he knowed what he was about, and we never had no trouble. And the boy being in a kind of a flighty sleep, too, we muffled the oars and hitched the raft on, and towed her over very nice and quiet, and the nigger never made the least row nor said a word, from the start. He ain’t no bad nigger, gentlemen; that’s what I think about him.” (352-354)

Ten “n*****s” in one paragraph. Is the word offensive, hurtful, disgusting, vicious, dehumanizing? All of the above. Will reading it, discussing it, discomfit some students? Undoubtedly. Is the use of this word evidence that the author was a racist? Or is it evidence that he was doing his best to accurately portray American society along the banks of the Mississippi before the Civil War? It could be both, if read carelessly. That’s one very good reason for continuing to “teach” it – to steer students away from inattentive, unreflective reactions to what they read or hear.

Far from promoting racism, that paragraph perfectly illustrates the utter insanity of racism, demonstrates in the doctor’s tortured reasoning how powerlessly stupid it makes a clearly decent man who can’t see beyond his cultural conditioning, even when presented with direct, personal evidence that refutes that conditioning. It also demonstrates the evil underlying a slave economy, the monetization of human life. Trying to defend Jim from potential lynching, the best argument the doctor can find is “a nigger like that is worth a thousand dollars.” And then, devastating because thrown in as an afterthought, “and kind treatment, too.” Does Twain explain any of this to the reader? No, because he’s a story-teller, not a social psychologist. His job is to show, not to explain.

Ernest Hemingway wrote that “All modern American literature comes from one book by Mark Twain called Huckleberry Finn. American writing comes from that. There was nothing before. There has been nothing as good since.” That’s the sort of sweeping profundity you utter midway through the first of one drink too many, but there’s some truth in it. For any number of reasons, including its accuracy as a socio-historical document, Huckleberry Finn is a very great piece of American writing, and certainly a progenitor of many other great books, from Winesburg, Ohio to Never Come Morning to Slaughterhouse-Five. Among countless others. And if our education system doesn’t preserve these treasures, even if the treasures contain horrors and shameful beliefs and actions, preserve them and help people learn to understand and value what they have to say about the horrors, the shames, and also the beauties and proud moments of our past, then who will do that job?

Ralph Ellison said, “at best Americans give but a limited attention to history. Too much happens too rapidly, and before we can evaluate it, or exhaust its meaning or pleasure, there is something new to concern us. Ours is the tempo of the motion picture, not that of the still camera, and we waste experience as we wasted the forest.” If we shatter our monuments, burn our books, dismiss large parts of our past as the work of devils in human form, are we not continuing to waste our experience, whatever genteel lipstick we might apply to these activities?

***

Attack of the Killer Virus

“Our inventions are wont to be pretty toys, which distract our attention from serious things. They are but improved means to an unimproved end.”
– Henry David Thoreau, Walden

“The emergence of AIDS, Ebola, and any number of other rain-forest agents appears to be a natural consequence of the ruin of the tropical biosphere . . . . In a sense, the earth is mounting an immune response against the human species.” 
– Richard Preston, The Hot Zone

I begin writing this on Bastille Day, 2001, perhaps an appropriate anniversary for considering computer viruses and the virus of computers. 

While many of the odd chaps who create computer viruses seem bent merely on demonstrating their prowess, like long-ago math whizzes who wore slide-rule tie clips and didn’t hesitate to whip them into action, some creators seem to be more revolutionary: they aim to bring the system down.

In either case, the viruses they create seek to convince the computer they enter that they are a legitimate part of its mind: nothing to worry about, sir – just need to drop in and make a minor adjustment. The minor adjustment the serious viruses aim to make, of course, happens to be the obliteration of the computer’s very self.

Lately the viruses are devoting greater attention to the power of self-replication – a significant power for any self-respecting virus. What started as a sort of Yippie phenomenon – let’s all get high and scream “Off the Pigs” in front of the Dean’s office – may be aspiring toward the Winter Palace.

If you use a networked computer to do some of your work, these viruses manifest themselves as the kind of petty irritation that adolescents are amused to perpetrate upon adults. So far, the adults have dealt with them accordingly, hiring Vice-Principal McAfee to swat them down and Officer Krupke to arrest the more serious offenders who go so far as to drop cherry bombs in the digital commodes. And progress whizzes on down the great highway.

Yet the adolescents are undeterred; dozens of viruses are born each week. 

Do they somehow mirror the digital organism they attack?

II – Ebola

“Let the school administrator announce that he has ordered computers for eight hundred illiterate sophomores, and lo, they have become educated.” 
– Lewis Lapham, Imperial Masquerade

The Ebola virus enters the body and essentially devours its innards until no distinction between organs remains, blood flows everywhere, and the host perishes choking up his own insides. Not to put too fine a point on it.

The message the virus brings to the body is this: Only I and my kind deserve to live. Auslanders, join our cause or die! Our victory is certain! Join us!

I think about this message when I listen to my fellow instructional administrators talk. They say, We must join the virus faster! More convincingly! How can we subdue resistance? How can we make the virus more efficient? How can we make it work in our behalf? How can we get these recalcitrant employees to embrace the virus?

Occasionally, I ask them for instruction. How is it, I ask, that this virus is improving the intelligence, the skills, the humanity of our students? Or of ourselves, for that matter? They look at me when I ask these questions as if I were a Martian. I have the sensation that I speak from inside a glass box, like the one that surrounded Eichmann at his trial. Except that my box has no microphone inside it, nor is the glass bulletproof. My lips move, but there is no translator, and even if there were one, the jurors all have their pinkies stuffed in their ear holes. Their fear and hatred penetrate the glass.

III – Tools

“Simply by turning to a computer when confronted with a problem, you limit your ability to recognize other solutions.
When the only tool you know is a hammer, everything looks like a nail.”
– Clifford Stoll, Silicon Snake Oil

I use computers. Got one at work. I type on it (great typewriter), I keep records in it (great file cabinet, if not reliable over the long run), I converse with my fellow workers through it (much preferable to the telephone for doing business). I think the computer is a good tool for those purposes. I could live without it, though I’d be unhappy if I had to go back to liquid paper every time I made a typing error.

But if the Big Virus struck, and the whole network went down, vomiting bits and bytes all over the landscape, I could continue to do what I do for a living.  A tool isn’t a job. Anyone who can’t tell the difference doesn’t know much about the job.

My father was a superior craftsman and, lacking patience, a poor teacher. He did manage to teach me to handle a paint brush and both rip and cross-cut saws. I can paint a window frame without bothering with masking tape, and cut a straight line through any plank you want, to this day. Those are skills at my command any time they’re needed, and they depend only on my being able to find the right brush or tool. They don’t depend on my ability to locate a source of energy, a power cord long enough to reach a socket, or six hundred dollars with which to buy a bench saw.

Those skills are not as simple as a master of them makes them look. They involve many of the multiple intelligences Howard Gardner talks about. You have to understand how paint works its way to the tips of the brush hairs, and which kinds of hair will do the best job of instructing the paint to go where you want it.  You have to understand surfaces of all sorts. You have to understand the relations between your vision and point of view and your hand, and between your hand and your tool. You have to cultivate patience and timing, and know when to rest. To learn to paint out of a corner or properly rip a plank, you have to become an enlarged human being.

If you buy a professional masker and a spray gun because you’ve been told they’re the tools of the professional, if you buy a bench saw and a radial arm saw for the same reason, you pay for them not only with dollars, but with a reduction in the exercise of your human capacities.

It’s obvious that my argument about human capacity lost to the bench saw could as well be applied to the hand saw. Before hand saws, how did we cut wood into desirable shapes and lengths? If you pursue the question back far enough, you get to the place in history where we must have done those things with our hands and teeth, or else done without that precious little tusk cabinet in the corner of the cave. Is that what I’m recommending?

No. What I recommend is that we remember that the job comes before the tool; that no tool is good for every job; that all tools that become part of a culture offer gains in efficiency and productivity and losses in self-sufficiency and intelligence.

I once helped build a cabin on the side of a 10,500 foot mountain. The only power tool we used was a 10” circular saw powered by a small generator. We cut the trench for the foundation footing out of the granite mountain side with picks, sledges, and a big cold chisel named Steely Dan IV, in homage to William Burroughs. It was a fine experience, from which I learned much about the expandability of human endurance. I also learned that I would never willingly do it that way again, and when we went to build a second cabin, the generator had grown considerably, and the power tools were screaming above the growl of the backhoe.

Had we been entirely and purely devoted to self-sufficiency, we could have built that cabin entirely by hand. We’d still be working on it, of course, and would by now have diminished our ranks through falls, coronaries, homicides and other accidents incidental to a mad enterprise. Or we could have simply put up tee-pees and spared a lot of work and expense.

But the job was to build a cabin on a steep mountainside, upon the deck of which we could sit in the evening and watch the shadows on the mountainside across the valley, drinking gin and smoking dope and playing guitars until dinner had been cooked on the cast-iron wood stove and we could go to bed quiet and comfortable and unworried about the occasional bear. So we used a truck and a generator and an electric hand saw, the tools we needed to achieve the result we wanted within our youthful lifetimes.

The job comes before the tool.

The latest use of the computer network at my place of employment is the computerized calendar, a function of the e-mail package. It evidently allows you to know what you are supposed to be doing and where you’re supposed to be doing it. It also allows a raft of other people to enter obligations into your schedule anytime it occurs to them to do so, which I find irritating and unmannerly.

This description may be neither adequate nor fair-minded. I have thus far refused to so much as open this marvelous new tool, because I haven’t felt any need for it. I haven’t felt any need for it because I have my own calendar in my head, and it rarely fails me. I supplement my mental calendar with little scraps of paper that I stuff in my shirt pocket during the course of a day and read the next morning.  That technology – memory, caring about the obligations I assume, and random notation – has been working just fine for the past thirty years. The first piece of advice Big Bill Tilden gave in his book on tennis was, “Never change a winning game.” 

To learn to use “Calendar” would require a few hours of my time. To use it would require that I spend maybe a quarter hour of every working day. But I don’t see any need to waste that time, because I make every scene I’m supposed to make using my system. So do all the people I work with, except for the lamebrains who couldn’t and wouldn’t learn to use “Calendar” if you gave them a pretty. “Calendar” is a tool that takes more time to master and use than tools already universally available – memory, caring, writing reminders.

No tool is good for every job. Some tools are simply stupid, as needlessly complex as Rube Goldberg’s labor-saving inventions, though seldom as amusing.

I find it deeply objectionable that this Calendar “solution” was mandated before any problem existed. People missed meetings, to be sure. They missed them because they spaced them out, or because something more vital interceded, or because they got stuck in traffic, or because they didn’t want to make them and provided a colorful excuse to cover their absence. I never noticed an epidemic of absences, and I fail to see how an electronic digital computerized state of the art cutting edge calendar will in any way alter the behavior of the spacy, the truly responsible, the traffic becalmed, or the imaginatively recalcitrant. The solution, which will be no solution, preceded evidence of a problem, and will cost the institution which mandates its use many lost hours of productive work – lost to more screwing around with computers. So there’s no gain in productivity – a loss, in fact – because the job didn’t come before the tool, the problem didn’t precede the solution. And, for the obedient souls who religiously consult “Calendar,” there’ll be a further loss in self-sufficiency, as their mental calendars atrophy.

I’m trying here to work toward an outline for a cost-benefit analysis of computers.  It looks to me as if, before computerizing an activity heretofore undigitalized, we might want to ask:

Does the job require the use of this tool?

Will this tool do the job better than the tool we’re currently using?

Will its use to do this job actually improve the outcome?

Will what is lost in human capacity be less valuable than what is gained in “efficiency”?

The last question worries me the most.

IV – Calculation

“If Farmer A can plant 300 potatoes an hour, and Farmer B can plant potatoes fifty percent faster, and Farmer C can plant potatoes one third as fast as Farmer B, and 10,000 potatoes are to be planted to an acre, how many nine-hour days will it take Farmers A, B, and C, working simultaneously, to plant 25 acres?

Answer: I think I’ll blow my brains out.” 

– Kurt Vonnegut, “Flowers on the Wall”

When I was a boy, and the snow through which I walked unshod for miles to school stood higher than this piddly powder they call snow today, I spent a number of years, five days a week, memorizing and reciting the multiplication tables through the 12’s. I don’t recall enjoying the experience. I’m quite certain, even through the haze of Golden Age nostalgia, that it seemed to me and all my peers a dead waste of our precious lives.

And how very wrong we were can be discovered by anyone seeking change for a five from the young product of modernized math behind the counter whose register has momentarily quit calculating. They didn’t name it a “counter” for nothing.

Another way to discover how wrong we were is to ask people to think about numbers. Hardly anyone under the age of 50 will even try. If those multiplication tables don’t reside in your hard-drive, boyo, then nothing that followed subtraction in your math “education” ever made much sense to you. Now did it?  Fess up.

Tell you why I know that. About 90% of the people who take the math placement test at my community college can’t do fractions, can’t comprehend decimals, and are left utterly baffled by percentage problems. Why? Because all those lesser parts of one embody division, and division is the other face of multiplication. And they never learned the damn tables, so division never came clear to them. So they don’t know how to ask questions of numbers that will produce sensible answers, and they don’t recognize whether an answer is sensible or not, even if they arrive at one. When confronted with a statement involving numbers, their brains essentially shut down.

V – Questions

“It is the nature, and the advantage, of strong people that they can bring out the crucial questions and form a clear opinion about them.
The weak always have to decide between alternatives that are not their own.” 
– Dietrich Bonhoeffer, Resistance and Submission

I answer the phone a lot at my work, and I sit at the front desk, to which people come seeking all sorts of information. The seekers represent a random cross section of American humanity. They all have questions.

About a year and a half ago, I began to notice something peculiar, first on the phone, then with the people who came to ask me questions in person: they all said exactly the same thing first. “I have a question,” they said.

For a couple of months, I played along.

“Yes,” I would say, “what would you like to know?”

“It’s about college,” they would say.  Or, “It’s about classes.”  Or, “It’s about financial aid.”

So far, I only knew that they had a question. “What is it about college (classes, financial aid, etc.) that you would like to know?” I’d sweetly enquire, in my patient, customer-service tone.  And then, having lost a good half minute from my rapidly dwindling store, I might hear an actual, answerable question.

After a few months, this new mode of query had become so invariable that my curiosity went to work on it, and I realized what was happening.

They thought I was an internet search engine.

They thought that asking questions meant that you entered a key word or two in the box, mouse-clicked “search” or “go get it,” and a menu of choices would magically appear before you, saying, in effect, “Is this what you had in mind? Or this? Or this? . . . ”  And then they could scroll down – except, of course, that I was supposed to be scrolling through these refinements of their initial questioning feeling – until something looked as if it might be worth another click of the mouse.

Once I’d figured that out, I started experimenting with silence.

“Hello?  I have a question?”

Silence.

“It’s about college?”

Silence.

“It’s about, like, my bill?”

Silence.

Silence is a great power, and most such conversations would revert to a human level after 4 or 5 silences, and my interlocutors would get the info they wanted, and I’d be no more enraged than I always am.

Out of sheer exasperation, people used to teach their children how to ask useful, precise questions. Then television came along, and the kids were plunked down on Sesame Street, where every question came supplied with an immediate answer – except, of course, for any questions the little shavers in front of the tube might have that the script writers hadn’t considered important. Then the Internet arrived, and everyone from the outhouse to the White House assured us that all the answers were in there; you just had to point and click until you homed in on the answer you sought. Therefore, you didn’t need to learn how to ask questions.

If you don’t know how to ask questions, you don’t know how to learn anything beyond whatever was beaten into you before you could talk and whatever people with access to the various transmitting media want you to know. If you don’t know how to ask questions, you’re cannon fodder. Or another grateful Wal-Mart customer, secure in the belief that you’ve seen the world, for how could there be anything not contained within that gigantic maze of an emporium? It’s just a matter of pointing your cart ahead of you and following its clicking wheels down one aisle and up the next until the latest desideratum magically appears. The pain of thought needn’t plague you.

VI – Faces

“All the papers in Andalucia devoted special supplements to his death, which had been expected for some days. Men and boys bought full-length colored pictures of him to remember him by, and lost the picture they had of him in their memories by looking at the lithographs.”
– Ernest Hemingway, “Banal Story”

An actual face gets dirt on it, and twists itself into all sorts of shapes that reveal what’s going on within its owner. Watching an actual face is like watching a piece of country: the weather passes through it, changing it; the light illuminates its best features one moment, obscures them the next. After a long time watching a particular piece of country – a meadow, a grove of trees, a bend in a stream, a face – you have burned into your brain your perception. It’s not a photograph. It’s much more. It’s your memory.

The Internet and e-mail subtract the face. Every moment spent sitting in front of a tube is one less moment spent perceiving the face of another. Does this matter?

“Not only humans have faces,” writes James Hillman in The Force of Character and the Lasting Life. “We do not own them all . . . . Ancient Egyptians imagined the sky as a vast face with the sun and the moon as eyes. The Navajos say something is always watching us.

“If we no longer imagine that ‘objects stare back,’ then the things around us spark no ethical challenge, make no appeal. They are not partners in dialogue, with whom an I-Thou relationship exists. Once the soul of the world loses its face, we see things rather than images. Things ask no more of us than to be owned and used, becoming possessions.”

As I copied this passage, I was seeing the face of the student who gave it to me, a face oval and relaxed around eyes that truly look at whatever comes before them, a face that engages in no Miss America gymnastics to appear winning, gay or lovely – and so is lovely as the faces of children are – but lovelier for the depth of experience within those steadily looking eyes. Had this student been one of twenty or thirty or a hundred students “on line,” known to me only through her disembodied words – eloquent as in her case they are – and had she recommended to me, on line, the magazine in which Hillman’s remarks appeared, I doubt I’d have read them. People are all the time trying to get me to read things, and I hardly ever welcome or honor their suggestions. I’m always working on my own reading program, which is generally obsessive or frivolous or both. But when this student handed me this magazine, I read it. I knew her, I knew she knew me, at depths that made me receive the magazine as a gift, a tribute, a challenge, a suggestion – I didn’t know which, but I knew it would be real, because I knew this person as I never would have if I hadn’t spent enough time looking in her face to know the somebody who was home there.

VII – Images of Faces

“Because he can walk into a dark room, and every bulb in that room can be burned out, and there’s no matches,
and believe me, you will feel that room light up when that face of his gets inside it.” 
– Lou Clayton on Jimmy Durante, Gene Fowler, Schnozzola

It might be suggested that it’s already possible for me to see that student and for her to see me, through the miracle of the digital fiber optic cutting edge Telescreen of the future.

To which I say that a face in front of a camera isn’t a face; it’s a performance.  Need I elaborate?  Consider every photograph of yourself you’ve ever seen.

In actual human communication, we become ourselves only when we forget our faces because we’re focused on what’s being communicated, whether it’s getting across, how to get it across. And then the light within remodels the very skin and turns the receptor called the eye into a transmitter, the act of reception called listening into an act of transmission. Then words have sweat and musk and acid; then words can touch like fingertips, rake like nails, poke like old friends out of patience.

VIII – Receptors

“The lights must never go out.
The music must always play,
Lest we should see where we are….”

– W. H. Auden, “September 1, 1939”

The great paradox in the “communications revolution” that began when we harnessed electricity is that the more “communications” we receive, the less we are able to receive them, or find time to do anything of value with the ones that do get through the glut.

An example: I teach a class in American music, a history of ragtime, blues and jazz. I taught this class for maybe three semesters before it dawned on me that hardly any of my students had any idea what instruments they were “hearing.”  My first clue that I might be assuming too much when I assumed that everyone knew the sound of a trumpet came when one student identified Bix Beiderbecke’s Mozartean solo on “Jazz Me Blues” as the work of Vic Spiderback.  This engaged my curiosity, and I began asking students what instruments they were listening to. The horror, the horror. They couldn’t tell a flute from a glockenspiel. More to the point, they couldn’t tell a trombone from a clarinet.

These students had not only been exposed to more music than any humans in the history of the world, they’d heard every conceivable musical instrument in the world, and dozens not in the world, through the miracle of studio electronics. And not one out of a hundred had an idea in the world what instruments were producing a given set of sounds. And a majority of these students thought of themselves as musicians – wrote songs, played in bands, all that. As I pondered their inability to hear, I thought suddenly of Horace Butler.

Horace was a bass player I’d known in a previous incarnation who played in a rhythm and blues show band. They had stacks of Marshall amps and speakers and put on a big show with choreography and lights and what not. Horace played in this band because he was a musician and the band worked enough to enable him to pay his rent and eat.

Horace played bass in the band wearing plastic ear plugs. He had figured out somewhere along the line that the cilia in the human ear can only be subjected to so many decibels before they begin to break, and that, once broken, they don’t regenerate and they aren’t replaced. So to defend his hearing, this musician had to reduce his ability to hear the music he was helping to produce.

In other words, to hear, you sometimes have to deafen yourself.

My students had all lived their lives as Horace was living his when I knew him, surrounded by insistent sound, rhythm, vibrating, throbbing flashes of light, sound, light, sound. Showtime, baby. That had been their reality throughout their lives: blasting lights; screaming sound waves. But they hadn’t had earplugs.

So they’d learned how to not listen. They’d learned how to not look. They’d learned to deaden their perceptions in self defense.

IX – Shutting Down

“You can’t get away from TV. It is everywhere. The hog is in the tunnel.” 
– Dr. Hunter S. Thompson, Generation of Swine

Every time I enter the world of the Internet, I’m met by something that looks like the back page of a comic book from the days of my youth, only crasser. The top of the page has moving graphics directing me to look here, no, look there; the sides of the page are lined with little billboards far less witty than the old Burma Shave signs; the bottom of the screen always has yet another moving graphic urging me to some kind of expenditure of time or money.

I’m long trained to ignore all this Barnum stuff. I use the Internet for reasons of my own, and I don’t even notice the bells, gongs, dancing girls, pointed guns, slashing swords, slacking thighs or yearning lips that reach out to me as I march up the information highway toward my goal. 

In other words, in order to abide using the Internet, I’ve learned to not see more. To not-see. More.

When Muzak arrived, bringing the Gift of Music into public establishments theretofore free of irrelevant aural stimuli, people began to learn to not listen. When television became ubiquitous in American homes, and then in public places, people had to learn to not-listen and not-see. This did not mean that they didn’t take anything in. It meant that they learned to take stuff in without being consciously aware of what they were taking in.

If you passively absorb the stimuli your environment provides, you have attained the spiritually advanced state of a rooted plant. If you learn to endure the bombardment of external stimuli by deadening your senses, you have progressed to the state of a rock. If you have learned to combine these two skills, you have reached the pinnacle of modern human consciousness exemplified by the citizens of 1984’s Oceania. You absorb the unavoidable stimuli provided by those who control the transmitters. You are unaware of any other random stimuli emanating from accidental sources, such as yourself or other people or birds or insects. Or, if briefly aware, you are programmed to tell yourself that such stimuli are beneath the notice of a citizen of such an advanced state as yours.

However, you are still, in fact, neither a plant whose roots might survive your lopping nor a rock impervious to all but time. You are a large, soft mammal whose survival depends on a once highly-developed repertoire of senses, a marvelous ability to move upright through the field of gravity, and an active, nearly paranoid interest in what’s going on in your immediate neighborhood.

All of which, the virus assures you, are tools no longer necessary in the Information Age. 

Oh, no, the virus assures you – you need no longer fool with such gross activities as smelling, touching, listening, looking, wondering. WE CAN TAKE CARE OF IT.  SIT BACK. CHILL. ABSORB. ABSORB ME. THE MEDIUM IS THE MESSAGE.

X – Not Ebola Again

“For the time being, however, the worship of the higher technology serves the cause of barbarism.” 
– Lewis Lapham, Imperial Masquerade

When the hemorrhagic viruses first began probing human hosts, humans reacted pretty rationally. These babies are out to kill us all, said humans. We’d better study the way they work, we’d better find out how to contain them when they show up on Main Street, we’d better learn how to fight them. They don’t mean us no good.

Does a virus that deadens our senses and turns our brains into passive recorders of meaningless electronic impulses, that further reduces our contacts with other humans and with the natural world and encourages passive acceptance of the destruction of the environment upon which our survival depends mean us more good?

As I approach a conclusion, on May Day, 2001, the first wave of the mighty computer virus has gone into sudden retreat. I don’t know why. Perhaps enough people have noticed that the shining promises of the Information Age flacks are empty. That the electronic revolution has not reduced our slaughter of trees, but increased it. That the age of communications has resulted in ever-decreasing communication, because everyone shouting at once does not constitute communication – just Talk Soup and showers of meaningless Factoids. That the ready availability of “information” is of no use to those who haven’t the time or the training to sort, collate, compare, contrast and evaluate it, because they’re running too fast to make payments on the debt they’ve already accrued so that they’ll be able to further burden themselves when they buy the next cutting edge upgrade. That any amount of information is useless to you if you don’t know yourself, and therefore don’t know what you want to do with the information.

Or maybe we’re all just waiting for Smellovision and the Feelies to hit the market, our vision and hearing having been completely numbed, rendered incapable of further titillation.

Turning over our active, animal selves to the ministrations of media has not led us to fuller humanity; it has gone a long way toward reducing us to humanoid creatures who have no thoughts or feelings of their own, derived from unique, direct experience, but only faint, random, confused imprintings and aborted impulses. This does not constitute progress. It constitutes a terrible loss of our infant capacities to serve our planet as responsible stewards, and to serve ourselves and each other as wise and loving friends.

I have not so much failed as consciously refused to “document” many of the accusations I’ve made here. I don’t think they need documenting. Rather than asking for “expert” corroboration or “scientific” supporting data, I hope you’ll consult your own perceptions of how your life is, how your senses are operating, how your human relationships have changed since the beginning of the Information Age.

If nothing has ever satisfied you as deeply as cybersex, if a bunch of electrified dots that your brain assembles into the image of a cardinal sitting on a bare, black branch against a snowy field seems preferable to feeling the cold in order to see the actual cardinal (even if the dots don’t include the smell of the nearly frozen creek behind you or the splush your boots made breaking through its surface in the grey dawn), if, when you type in the question, “What’s it all about, Alfie?” and the answer comes up “Burt Bacharach,” you feel confident that you’re good to go – why then, you’re not here, having long ago thrown this piece of anti-technological drivel into the trash.

If you’ve felt enough unease with our current situation to read this far, then perhaps you’ll agree with me that it’s time we began to clarify our perceptions of what needs defending and of who constitutes the enemy. Perhaps you’ll begin to think about the computer and the internet in terms of costs and benefits. 

“Never change a losing game;” Big Bill Tilden wrote, but he didn’t stop there. The rest of the sentence was, “always change a losing game.”

***

Turn Off the Bubble Machine: E.M. Forster’s Virtual Reality

I stole my title from a Stan Freberg routine in which he parodies the Lawrence Welk Show that used to be broadcast from the Aragon Ballroom in California. At the end of the show, Welk’s Bubble Machine, which produced the visible signs of his “champagne music,” goes berserk, emitting such a Vesuvius of bubbles that the pier is elevated from its moorings and the Aragon is borne out into the dark Pacific, with Welk’s desperate Dakota twang crying, “Turn off-a da Bubble Machine! Turn off-a da Bubble Machine.”

But I have a different bubble machine in mind, as did E.M. Forster when, in 1909, he wrote his long tale, “The Machine Stops.” This bubble machine creates bubbles, true enough, but they don’t float themselves out to sea. Rather, they seal individuals off from experience – experience of the natural world, of other individuals, of their own bodies and emotions – encasing them within transparent walls of images and “ideas.”

“The Machine Stops” envisions a world in which humans live far beneath the surface of the earth, each one sequestered in a hexagonal cell to which all necessities and approved pleasures are supplied by The Machine. Food, air, light, water, music, literature and human company of a sort are available to each cell’s resident at the touch of a button.

In one such cell we find Vashti, described as “a swaddled lump of flesh – a woman, about five feet high, with a face as white as a fungus.” Vashti receives a call from her son, Kuno, whose cell is halfway around the world, on a device which allows her to both hear and see him, after a fashion. Kuno asks her to visit, so that she may explain to him the harm in his desire to visit the surface of the earth. Loath to leave her cell, Vashti responds that such a desire may hold no harm, but is “contrary to the spirit of the age.”

While children are separated from their mothers immediately after birth in Forster’s brave new world, some unacknowledged vestige of maternal love eventually impels Vashti to undertake the journey to meet with Kuno. She is transported to him within a series of sealed chambers, including an airship that inadvertently gives her glimpses of the repellent sky, the ocean, Greece and the Himalayas; to all of these, her immediate and automatic response is the same: “No ideas here.”

Kuno reveals to her what he refused to communicate electronically: he has visited the surface of the earth without having first applied for an Egression Permit, and has, after his recapture by The Machine, been threatened with Homelessness. This means that he will be ejected from the cell world and left on the surface of the earth, where he will perish immediately from the poison of unmodified air – or so Vashti believes, even though Kuno tells her that not only had he begun to acclimatize to the air on the surface, he had seen human creatures living there. Repulsed and despairing, Vashti leaves her son to his fated Homelessness and returns to her cell.

Some time later, Kuno, who has somehow been spared, again calls Vashti with a message that baffles, terrifies and enrages her: “The Machine stops,” Kuno says. Vashti cuts her ties with the “man who was my son,” reckoning him irretrievably mad.

But Kuno’s prediction comes to pass. The music begins to fade and falter; the water turns ever fouler; the poetry machine emits gibberish. Complaints from the cell-dwellers, directed to the Committee of The Mending Apparatus, multiply. The communications apparatus breaks down, panic overtakes the cell-dwellers, and Kuno and Vashti are reunited somehow during the final chaos, which is ended when an airship crashes through the surface, exploding tier after tier of the underground world.

This tale scarcely even meets the typical science fiction story’s minimal level of characterization – we see only two characters, Vashti and Kuno, and they are little more than types: Vashti, the conformist, Kuno, the questioning rebel. But Forster had something in mind other than a simple dystopian tale of the future; at the very beginning and the very end of the story, he describes it as “a meditation.” His meditation concerns the relationships between humans and their tools – the benefits those tools provide and the costs they exact.

The inhabitants of Forster’s subterranean paradise have available both necessities and pleasures at the touch of a button: “There were buttons and switches everywhere” in Vashti’s chamber – “buttons to call for food, for music, for clothing. There was the hot-bath button, by pressure of which a basin of (imitation) marble rose out of the floor, filled to the brim with a warm deodorized liquid. There was the cold-bath button. There was the button that produced literature. And there were of course the buttons by which she communicated with her friends. The room, though it contained nothing, was in touch with all that she cared for in the world.”

In her satisfaction, Vashti is not alone, not a member of some privileged class; in each cell throughout the entire system, the amenities are identical: “…thanks to the advance of science, the earth was exactly alike all over.” In short, half of the Communist Manifesto’s great vision – to each according to his needs – has been completely realized. After a fashion.

Thirty-seven years after Forster’s meditation, George Orwell was moved by an article in a popular magazine about “Pleasure Spots of the Future” to observe: “It is difficult not to feel that the unconscious aim in the most typical modern pleasure resorts is a return to the womb. For there, too, one was never alone, one never saw daylight, the temperature was always regulated, one did not have to worry about work or food, and one’s thoughts, if any, were drowned by a continuous rhythmic throbbing.” Forster was perhaps thinking of this connection to the womb as he imagined the appearance of the cell-dwellers: Vashti, that “swaddled lump of flesh . . . white as a fungus” is also to be thought of as “without teeth and hair;” in other words, these citizens whose every need is supplied by The Machine have essentially reverted to homunculi.

With the exception of the deviant Kuno, the citizens not only do not miss contact with the natural world, they actively fear and loathe it. When Kuno first requests that Vashti visit him in person, she demurs because, “I dislike seeing the horrible brown earth, and the sea, and the stars when it is dark.” After she is driven by residual mother-love to make the trip, her first sight of the airship she will take is even worse: “Yet as Vashti saw the vast flank of the ship, stained with exposure to the outer air, her horror of direct experience returned.” To her, “All the old literature, with its praise of Nature . . . rang false as the prattle of a child.” This attitude is also encouraged by the faceless committees in charge of The Machine.

It is when Kuno first sees the constellation Orion from an airship that his curiosity about nature is fired, for he sees in this grouping of stars “’. . . that they were like a man.’”  Kuno, in other words, has begun to imagine that as a human he is somehow a part of nature. Vashti responds to this notion, “’It does not strike me as a very good idea, but it is certainly original.’” She senses that Kuno’s identification with nature is “contrary to the spirit of the age;” in fact, she senses that it is profoundly subversive. In a world devoted entirely to the satisfactions that can be mechanically provided, any interest in, identification with or contact with Nature threatens the system with potential discontent.

Vashti’s abhorrence of direct experience operates as well in the sphere of human contact unmediated by the machine – that is, “direct experience” of other humans. Yet Forster notes near the beginning, “She knew several thousand people; in certain directions human intercourse had advanced enormously.” Vashti can communicate with these people instantaneously and at will through the medium of what amounts to an interactive computer network. Further, she can avail herself of the knowledge of all others, as well as that stored up from the past, and she can share her own ideas with everyone who chooses to hear them. After she repels Kuno’s request to come see him, she delivers her lecture on Australian music, which is “well received.”

So the other half of the Manifesto’s vision – from each according to his abilities – has also been realized. Forster’s delicate qualifier regarding the improvement of human communication, “in certain directions,” indicates his attitude, but he lets Kuno phrase his criticism more directly. Trying to explain his desire to speak with Vashti in person, Kuno says, “’I see something like you in this plate [the “screen” in which people appear to each other], but I do not see you. I hear something like you through this telephone, but I do not hear you.’” As the conversation continues, Vashti fancies that Kuno looks sad. “She could not be sure, for the Machine did not transmit nuances of expression. It only gave a general idea of people – an idea that was good enough for all practical purposes, Vashti thought. The imponderable bloom, declared by a discredited philosophy to be the actual essence of intercourse, was rightly ignored by the Machine, just as the imponderable bloom of the grape was ignored by the manufacturers of artificial fruit.”

And Vashti’s horror of direct experience of other humans is nearly overpowering. The worst thing about airship travel for her is the need “to submit to glances from the other passengers,” and when the airship’s flight attendant touches her, she cries, “’How dare you! You forget yourself!’” Her distaste becomes clearer still when she reaches the end of her journey. “And if Kuno himself, flesh of her flesh, stood close beside her at last, what profit was there in that? She was too well-bred to shake him by the hand.”

Finally, the instant availability of human communication has a paradoxical side-effect: the more quickly and easily people can communicate with each other, the more impatient they become with the slightest delay. When Vashti takes Kuno’s call, her first words are, “’Be quick! . . . Be quick, Kuno; here I am in the dark wasting my time.’”

In Forster’s imagined world, human communication has become nearly instantaneous and potentially universal; that portion of human communication which is received by sight and hearing is available to all, and all but the aberrant find it “good enough for all practical purposes.” Speed and ease, however, bear costs: the loss of “nuance” (in other words, of emotion) and a terrible fear of experiencing what other senses bring – fear of smell, of touch, of “the bloom” that defines the grape but is itself undefinable. Meaning that it cannot be reduced to an “idea.”

In his book Four Arguments for the Elimination of Television, Jerry Mander asks his readers to perform an experiment: “Please go look into a mirror. As you gaze at yourself, try to get a sense of what is lost between the mirror image of you, and you. You might ask someone to join you facing the mirror. If so, you will surely feel that other person’s presence as you stand there. But in the reflection, this feeling will be lost. You will be left with only the image . . . . What is missing from the reflection is life, or essence.”

But Vashti and her fellows seem to sense no loss in their condition. They feel, instead, that they are in complete control of their lives. When Kuno telephones Vashti, “The woman touched a switch and the music was silent.” Music is simply a commodity at the command of individual whim. When Vashti’s refusal to come see him irritates Kuno, “His image in the blue plate faded . . . . He had isolated himself.” What a male paradise; if a woman disputes your desires with illogical arguments, flip a switch and disappear! Every man’s dream has finally been realized.

But at what cost does this mastery of life come? The cost is pretty high, in Forster’s vision. And the highest cost can be seen in Vashti’s complete loss of self-knowledge. When Kuno appears on her “plate,” Vashti’s “white face wrinkle[s] into smiles,” and she is impelled to visit him in person by the thought that “there was something special about Kuno – indeed there had been something special about all her children – and, after all, she must brave the journey if he desired it.” But when Kuno telephones her toward the end of the tale with the cryptic message, “’The Machine stops,’” she says to a friend, “’A man who was my son believes that the Machine is stopping.’” When she must choose between the Machine and her son, her natural but perfectly unconscious maternal feelings are readily expendable.

Because scarcely any self is left, a terrible emptiness is evinced by the need for constant and immediate mental stimulation. After Kuno isolates himself, Vashti immediately turns off the isolation switch which has allowed her to talk to only one person, and “all the accumulations of the last three minutes burst upon her. The room was filled with the noise of bells, and speaking-tubes. What was the new food like? . . . Had she had any ideas lately? Might one tell her one’s own ideas? . . . To most of these questions she replied with irritation – a growing quality in that accelerated age.” Constant, instant, inescapable communication seems to fill a void for these people, but because it does not truly fill the void, it becomes increasingly irritating, and the people become increasingly impatient with the current communication which is preventing the next communication from arriving.

The Committee of the Machine senses that this void is becoming a problem – Kuno’s unauthorized visit to the earth’s surface alerts them to this danger – and so it undertakes to provide the citizens with a new religion, the worship of the Machine and its instructional manual, the Book of the Machine.

“Those who had long worshipped silently,” Forster observes, “now began to talk. They described the strange feeling of peace that came over them when they handled the Book of the Machine, the pleasure that it was to repeat certain numerals out of it, however little meaning those numerals conveyed to the outward ear, the ecstasy of touching a button, however unimportant, or of ringing an electric bell, however superfluously. ‘The Machine,’ they exclaimed, ‘feeds us and clothes us and houses us; through it we speak to one another, through it we see one another, in it we have our being. The Machine is the friend of ideas and the enemy of superstition: the Machine is omnipotent, eternal; blessed is the Machine.’”

And so the “creation of man,” the Machine which serves all human needs and desires that can be served by reason, becomes not servant but master: “The word ‘religion’ was sedulously avoided, and in theory the Machine was still the creation and the implement of man. But in practice all, save a few retrogrades, worshipped it as divine.” However, in Forster’s imagination, the Machine proves a false god, for even reason has its limits. As the Machine is failing, the Committee of the Mending Apparatus, besieged by complaints, must finally issue the mournful bulletin, “The Mending Apparatus is in need of repair.” This marvelous admission may epitomize Forster’s skepticism toward the primacy of reason, but the costs to humanity he has noted along the way have done so far more completely.

Forster saw with remarkable clarity that “science” and technology were erecting barriers between humans and the natural world, between humans and other humans, and between humans and their own experience of their humanity. He saw that if people learned that they needed machines to communicate with other people, the machines would take on a life of their own. He saw that people cut off from direct experience would become infinitely malleable, nearly identical and dead to all stimuli except those available to the intellect through the ears and eyes.

“The Machine Stops” is short on characterization because it describes a world in which character – that is, individuality – has nearly vanished, replaced by “ideas.” That is, by constructs of words entirely disconnected from direct experience of life, and which cannot be checked against direct experience of life because no one has any. In this world, people are not only shielded from direct experience (and this is the benefit of technology), they are prevented from having it (and this is the cost). People have created the Machine to mediate between them and Nature, them and each other, them and themselves; the Machine has thus become their master.

I might be excused if this reminds me of the title of a textbook in use at my college: A World of Ideas. The “ideas” in Forster’s story are pretty well summed up in a well-received lecture supporting the abolition of travel to the earth’s surface: “Those who still wanted to know what the earth was like had after all only to listen to some gramophone, or to look into some cinematophote . . . . ’Beware of first-hand ideas . . . . First-hand ideas do not really exist. They are but the physical impressions produced by love and fear, and on this gross foundation who could erect a philosophy? Let your ideas be second-hand, and if possible tenth-hand, for then they will be far removed from the disturbing element – direct observation.’”

Here are the observations of two students quoted by Peter Sacks in his book Generation X Goes to College:

“Lectures are just one person talking, and it’s kind of just not really any tone. Something that’s loud and flashes or something like that, it grabs your attention. When somebody is just standing there just talking, it makes you want to fall asleep . . . . I think the media is out of control. Technology is moving so fast. We need to take a breath and stop for a while and give people time to catch up,” says Angie, apparently unaware of a self-contradiction that would have stupefied Walt Whitman.

To which Frederick adds, “Higher education doesn’t work any more. It doesn’t challenge. We (students) think the media is more substantial than you the teacher. We don’t value what teachers say and do. We’re afraid of what you will say and do; it’s so personal. With media it’s so impersonal. We don’t want to be personal any more with anybody. We don’t want to confront our emotions. Machines are easier. If we can get it from machines, we don’t have to get it from a person. The media is passive, safer. It doesn’t really affect us.”

These remarks demonstrate how completely many of our children have been encased in their bubbles, and how completely their minds have come to resemble Vashti’s in their fear of others and of the self, in their febrile need for “something that’s loud and flashes or something like that.”

Can we somehow turn these tools – the TV, the VCR, the LCD projector, the computer, the internet – to our purposes as human beings? I don’t think so.

For one thing, they are meant to encase people so that those people can be supplied with pre-approved thoughts and values and needs, and they are meant to make sequential thought essentially impossible. If only something that’s loud and flashes can get your attention, then your attention becomes merely a stultified blur, anxiously awaiting the next bang or flash. For another, they are meant to encase people in the belief that they, the media themselves, are reality – “more substantial,” as Frederick says, than other humans, and far less threatening.

Third, film, television and the internet all place the highest possible premium on speed, in order to reduce those periods of stultified blur to the minimum. “’Be quick, Kuno; here I am in the dark wasting my time,’” says Vashti. “In a perfect world, everything would be different,” opines a recent Dodge commercial; in other words, in a perfect world change would be perpetual, erasing any vestigial impulse to reflect upon whether the new everything was in fact an improvement upon the supplanted everything.

I recently received a glossy, jazzy, multi-color brochure from Adelphia, encouraging me to “Experience the breathtaking speed” of their internet cable connection. They feel sure I will wish to “feel the rush of video, sound, graphics, and tons of information screaming in and out of [my] computer.” In my “more gratifying Internet” experience, they assure me, “Web pages appear in a flash, as fast as you can click on them. Files that took minutes or even hours to download now arrive in mere seconds. So let slowpokes stare at half-filled screens. You’ve got better things to do!”

Nowhere in the brochure is there any suggestion of what those “better things” might be, or, indeed, any mention at all of the content of these Web pages and files. The point is to keep those moments of time wasted in darkness at bay.

In Slowness, Milan Kundera writes, “There is a secret bond between slowness and memory, between speed and forgetting. Consider this utterly commonplace situation: a man is walking down a street. At a certain moment, he tries to recall something, but the recollection escapes him. Automatically, he slows down. Meanwhile, a person who wants to forget a disagreeable incident he has just lived through starts unconsciously to speed up his pace, as if he were trying to distance himself from a thing still too close to him in time. In existential mathematics, that experience takes the form of two basic equations: the degree of slowness is directly proportional to the intensity of memory; the degree of speed is directly proportional to the intensity of forgetting.”

The intensity of forgetting is visible on a daily basis. It can be seen in Angie’s and Frederick’s ability to contradict themselves in succeeding paragraphs without noticing that they’ve done so. It leads to sentences like the one a student of mine wrote about William Bennett’s apology for the War on Drugs: “Bennett’s essay is filled with fallacies, which makes it very persuasive.”

It may be, of course, that my student did not forget that he’d accused Bennett of mendacity before he praised him for his persuasiveness. I find in many of my best students a profound acceptance of lying, illogic and gross appeals to emotion as the norms of communication. This is scarcely cause for wonder, since they have been raised in a bombardment of advertising and promotion, dialects dominated by those very characteristics.

For those reasons, I do not think we can make use of these technologies to teach our children to care about either writing well or thinking well. I don’t think we can use them to teach our children anything except further dependence upon technology. It seems clear to me, as it did to Forster, that our worship of The Machine is rapidly reducing us to the condition of Vashti – that is, of fungi in human form.

Teaching Freshman Comp for thirty years provides a remarkable opportunity to view the contents of each successive year’s minds. In my experience, this has resembled the opportunity to watch a photograph un-develop; the images have grown fainter and fainter, fewer and fewer, as the ideas have become increasingly “far removed from the disturbing element – direct observation.”

If the claims of The Machine’s promoters are remotely true, then the generation of students we teach today, and have been teaching for at least five years, must be the brightest, best-informed, best educated students in the history of the known universe. Their easy access to “information” has certainly been greater than any preceding generation’s. Can it be that access to “information” is not necessarily the key to knowledge or to wisdom? What is “information”? 

I have spent the last 25 years living with a severe back injury I sustained moving one of those accursed hide-a-beds, and being generally stupid. When my injury is about to get serious with me, it sends me signs through very circuitous routes. For example, if I start experiencing the sensation of nausea, I know that one of my vertebrae between 12 and 17 is out of line. It’s not that I ate something I shouldn’t have; it’s that I need to lie down on the floor and straighten out my sixteenth vertebra. My occasional sensations of nausea are information; my knowledge of what to do about them is not information; it’s something else.

It is knowledge derived directly from physical experience combined with the teachings of Moshe Feldenkrais, whose book Awareness Through Movement taught me how to become my own back specialist. One of the things he taught me was that “symptoms” don’t necessarily manifest themselves where they originate. Another was that using motion as a sort of imaginative x-ray, I could trace pain or discomfort to its source, if I was willing to take the time and expend the energy that imagination required.

Learning to imagine (see in your head) your own body resembles learning to ask the kinds of questions required by anything that can be called reading or anything that can be called writing. Such learning means close attention to detail, retention of a number of apparently unrelated details in the mind over time, and seeking relationships among those details that make them mean something. These activities require time and patience. There’s no way around it.

I could have consulted a back specialist or a chiropractor when I first injured my back, and perhaps spared myself the three months during which I could only get around on all fours or the two years during which I was one wrong move away from that condition. I could have, except that I couldn’t afford to, my financial position being, as the sportscasters say, day-to-day.

I’m grateful I invested my time in reading Feldenkrais and teaching myself what his words meant. Had I gone to a doctor, I would still be in thrall to the medical profession, obliged to fork over large sums whenever my back got feeling poorly. Equally likely, I’d be a permanent cripple, sections of my spine fused by some helpful surgeon. As it is, my back is stronger and more reliable than it was when I was 20, because I pay attention to its messages and know what they mean when I get them.

What conclusions do I wish to assert from this tedious personal history? 

That we always face a basic choice between relying on our own human powers (which we have let atrophy as we have fallen ever deeper into idolizing our tools) and relying on The Machine, otherwise known as Cutting Edge Technology or “Science.”

That Cutting Edge Technology costs a great deal of money (which would better be applied to supporting human teachers and students) while developing our human powers costs only time – the time it takes to develop, study and refine individual perception, knowledge, memory, imagination, concentration.

That the money spent on Cutting Edge Technology represents gigantic amounts of time spent by countless people. We mortgage our future time to the demands of Bill Gates. Perhaps only as we approach the end of it do we realize that time is our only actual currency.

The benefits of Cutting Edge Technology as tools for teaching or learning or living are self-canceling. If we and our children are pouring down cup after cup of legal speed to enable us to work longer hours so that we can afford to buy the latest version of “something that’s loud or flashes,” we will find that we don’t have time to make any thoughtful use of that loud, flashing something.

Not being able to afford the latest Scientific Breakthrough might be the best break available to the human race.

Our children have grown up in a world of electronic images and sounds that have supplanted direct experience and terribly stunted their powers of perception of the other and of themselves. They have been most effectively instructed to believe that the Present is the only reality, and so they have no collective and precious little personal past. (I’ve never been able to forget the answer one student gave to the question a Newsweek reporter asked in 1983: “What do you know about John F. Kennedy?” “He’s dead,” the student replied; “What’s to know?”)

These children do not need further instruction in the art of passive viewing. They do not need to be told that education means picking up the capsulized “messages” spoon-fed them by some inordinately expensive substitute for an overhead projector or a book; they don’t, for that matter, need an overhead projector. They do not need to be encouraged to believe that knowledge, understanding, or wisdom reside in packages instantly available at the touch of a button.

But these are the messages we give them, every time we fail to protest the purchase of the latest software “upgrade” (in which the Talking Paper Clip appears for the first time in three dimensions, yet more sublimely certain it knows what you want to do better than you do) and the latest hardware “upgrade” the software mandates. These are the messages we give them when we replace direct contact with “media” in our classrooms and our homes.

I pay enough attention to job announcements to have noticed that technological savvy has become almost mandatory for those seeking work as teachers. While I despise this development, I recognize that no young teacher can afford to appear to reside anywhere but on the Cutting Edge. So I address this, finally, to my fellow Old Teachers: can we afford to let this worship of The Machine continue unquestioned?

I can’t; don’t know about the rest of you. I can say that students need teachers, not entertaining electronic images. I can say that the purpose of humanity is the development of each individual to that person’s greatest capacities, and that that development can only happen if the individual spends a lot of time paying attention to his or her immediate, sensory world. I can say that learning to see and hear and smell and feel the world, and then think about what those senses bring in, is a demanding life’s work, and doesn’t leave time to worry about how to wire up the camcorder to the electric fan so that the wind can become visible. I can turn off the Bubble Machine.

***

Dealin Doug: The Huckster’s Appeal

For about as long as I’ve lived in my town, local and cable channels have been infested by a used car dealer who calls himself Dealin Doug. He’s stocky, with a jutting jaw and an aggressive approach to the camera. He doesn’t speak. He shouts. He likes to dress up in thematic costumes he’s never earned the right to wear – baseball and football uniforms, Uncle Sam rigs, and so on. His pitches are based on gross exaggeration. They are meant to be annoying and succeed admirably.

I’ve never lived in any town, East, Middle or West, that hasn’t had some version of Dealin Doug – often more than one. Intentionally boorish, loud, obnoxious and aggressive salesmen of cars or furniture or carpets or roofing or something, they seem to have been turned out from the same factory, year after year after year. Why tamper with success? And their shtick must be successful, must appeal in some way to a sufficient number of people to keep the product going out the door, or it would long ago have been mothballed. What can be appealing about their appeal?

I’ll use the name Dealin Doug to represent all the hucksters I’ve seen over many years. I don’t know anything specific about Doug. He may be, in his private life, a loving husband and father, a samaritan, a shining example of humanity at its best. People in Chicago used to say, “Even a blind pig finds a truffle now and then,” by which I think they meant to imply that every human contains a range of virtues and vices. I’m sure television hucksters are no exception. Some of them must like puppies. So when I speak of Dealin Doug, I don’t mean the “real” Dealin Doug, but the act, the persona, the character he’s chosen to play as a salesman. It’s a character who’s had many names, though it’s always essentially the same character: the pitchman, the carnival barker, the fast-talking, high-pressure salesman.

Dealin Doug’s central – indeed, his only – pitch is the offer of A Great Deal on whatever he’s selling. What makes his deals Great? He does not comment on the original quality of his used cars, or on their maintenance records, or on any factor which might be of interest to a prospective buyer, except price. The deals he offers are Great because they offer the cheapest prices available, or so he alleges. In other words, he assumes his prospective customers are morons, since only morons believe that because a product is priced lower than others in its category it represents the best value. While I suppose that there may be such fools among the population, I can’t believe there are enough to finance Dealin Doug’s advertising budget or keep his business afloat.

A certain number of older people – older even than my generation – may be willing to take Dealin Doug at his word, assuming that his word must be good or he’d have been driven out of business. That used to be a fairly reasonable touchstone. In a predominantly small-town America, a local merchant who engaged in shady practices soon became known and eventually shunned. If someone had been in business for ten years or more, you could assume with some degree of certainty that he was mostly honest.

My dad was raised in such a world and harbored such beliefs, and my dad was no moron. His belief in the reliability of reputation extended even to the products of national corporations. Until very late in his life, he was a Chevrolet man – never bought anything but a new Chevy every few years. He’d dealt with a local Chevrolet dealer for many of those years and come to trust him completely, so when he decided to buy me a car after I got out of the Army, he went to this dealer, who gave him a Good Deal on a fairly recent Chevy wagon from the dealer’s newly opened used car division. It looked beautiful – the exterior was clean, unblemished, undented, unscratched, waxed and polished.

All it lacked was lipstick, since it proved to be a pig the first time I stopped at the Mississippi River and checked the oil, which I’d topped up before leaving Chicago for Denver. And of which all 8 quarts had gone missing in fewer than 200 miles. The pig was out of the poke, and I learned that my dad’s lessons in business ethics might need a little updating.

I don’t think there are many people left who take salesmen at their word. During my lifetime, we’ve come to expect that most people do a lot of lying, especially when they’re trying to sell a product or a candidate or a version of some event. So customer gullibility doesn’t explain the success of Dealin Doug’s act.

The longer I consider that act, the more baffling that success becomes. Dealin Doug seems incapable of speaking without using his hands. His gestures are limited. He thrusts out both spread hands at the camera, as if throwing a large medicine ball at the viewer’s face. He jerks both hands up from the elbows into fists. He pokes one fist, index finger extended, at the camera, as if poking the viewer in the chest. He employs these three gestures from beginning to end of each message.

I might be engaging in projection, but I can’t believe that very many, if any, people enjoy having large objects thrown at them, being threatened with right or left jabs, or being poked in the chest. And yet they find Dealin Doug amusing, or sincere, or something – whatever it is, they’re drawn down to his car lot. I can’t fathom it.

But then I think about the first time I was exposed to what was called a “music video” on MTV. The friend who compelled me to watch it, and about fifteen more like it (just like it), told me it was the new new thing in the Music Biz.

The video makers had learned valuable lessons from advertisers of other products. The images didn’t have to have anything to do with the words of the songs. Words meant almost nothing. The images were what counted, and it was important to keep them changing from one to the next at the rate of about an image per second. (I’d guess that’s down to about an image per quarter second now, though I keep forgetting my stopwatch.) If you bombard the viewer with images faster than anyone can consciously process them, they’ll keep that viewer riveted, trying to catch up, trying to make some sense of the moving slide show, trying maybe, at first, to connect the images with the song they accompany.

In the early days of MTV, some ambitious videos attempted to tell the story of the song, or at least to illustrate significant moments in that story, but those ambitions were soon abandoned. They were replaced by shots of the bands or individual performers pretending to perform. These shots were not meant to show how the music was actually being produced, or, really, to show anything except a disorienting, over-stimulating succession of views of essentially static musicians – the message was in the shots themselves: Look here! Now look here! Look at the same guitar from the floor! Now run up so close the tuning pegs are the size of tree trunks! Now roll across the stage looking up! Now look down from the catwalk! Wow!  Isn’t this exciting? We don’t know where we are, or were, or where we’ll be next!  But it’s sure to be somewhere different!

When rap came along, the dominant video style changed again. Most rappers meant to convey BadAss Autonomy, so they weren’t content to serve as mere camera fodder for videographers. Most of them were adolescents, or had retained adolescence, and so their technique for seeming menacing was largely the first thing a kid thinks of doing when faced with a lens, which is to push his face aggressively into it. If he happens to be holding a guitar or an electric bass that resembles a semi-automatic rifle, as more and more do, he pushes that into the lens. Rappers in rap videos, in other words, approach the viewer on the other end of the camera much as Dealin Doug approaches his viewers. I find their appeal as mysterious as Dealin Doug’s. What is it that people enjoy about being threatened and bullied and yelled at?

My age is showing (again). I’m assuming most people react to the images on their screens and their accompanying sounds as I do. I react to these sights and sounds as if they were Real, as if the performers were physically present in my home, hectoring me, shouting at me, getting all in my face. I often wish to shoot my tv, and at the first appearance of Dealin Doug (I never even go near MTV or its imitators), I do the tv-land equivalent and go for my remote and flip to some other channel. Poof! Dealin Doug dead.

Most people, reared on television from birth, are far more sophisticated. They have long ago learned that nothing on tv is really real, that threats of violence, acts of violence represent no real danger. They are merely stimulants for the adrenal glands, whose adrenaline and accompanying steroids burn off rapidly, and demand frequent replacement.  As for the nonsense Dealin Doug spouts, the hatred and sexist viciousness of most rappers . . . they don’t really mean anything. If some hero gets shot on the screen, no sweaty-da – he’ll be back next week, good as new. People who can’t remember a life without television are experts in suspending their disbelief in the service of their own entertainment, which has consisted, more and more as television programming has mutated, of goosing their adrenal glands ever more frequently.

All television does this by the kind of constant, rapid shifting of the viewer’s point of view accomplished by changing camera angles and lenses that I have described rap videos employing. These shifts keep the viewer both startled and disoriented, struggling to keep oriented in a “reality” very much unlike any individual’s reality, which is confined to one point of view, a most limited one. I cannot, for example, see across thousands of miles or look down on myself from above or see around corners or through walls. But television cameras can, and do, at a rate that rivets the viewer.

This does not suffice to hold viewers forever. We get used to these tricks, and bored with them over time. So our other primary sense, hearing, must be enlisted to keep the glands firing. If you doubt me, perform this experiment: turn on a television set and blank out or cover or simply turn away from the screen, and just listen to whatever channel comes up for five minutes. From a dramatic show – nine times out of ten that will mean some kind of cop show – you’ll hear a constant accompaniment of “musical” sounds of a quite specific nature: either low, whale-like electronic moaning if suspense is to be built or screaming electronic tones toward the high end of the audible spectrum, all accompanied by random percussive effects that don’t induce any sense of a rhythmic pattern – they’re meant to induce the startle reflex. More adrenaline, please. If you happen to bring up a sports channel, whatever sport is on display or under discussion, the accompanying soundtrack will be stadium-rock, mock heroic/military, with biting brass fanfares and fuzztone electric guitars with that 30 foot Marshall sound. If a “news” show comes up, at least three people will be trying to holler over each other most of the time. If it’s a series of commercial Messages – the odds are good it will be – there’ll be some kind of intrusive, unsettling sounds going on behind the claims and disclaimers. Not even the weather channel is free of quasi-musical goadings, pokes and gooses.

All of these desperate efforts to keep the old adrenaline flowing have been going on long enough that they’ve receded from most people’s conscious notice. We’ve learned well the arts of half-seeing and not-listening, hence the increasingly hysterical nature of nearly all television programming. We’ve gotten used to that, too. As I learned in the Army, you can quickly get used to almost any conditions, no matter how appalling they might initially seem.

I think I still take note of the unsettling, jarring nature of television images and sounds, when I do, because I grew up on radio. Radio didn’t lack for intrusive sound tracks, but they had to be kept in the background because radio was still a literate medium. It had to depend on words to convey whatever it wanted to convey, and the words couldn’t be obscured by extraneous sounds, musical or otherwise, or the listener couldn’t follow them. The listener had to pay close enough attention to the words to recreate by means of imagination what the words were describing.

By the time I was old enough to listen to the radio, the words that counted were describing products: cigarettes, laundry soaps, cars, cooking products – the whole post-War cornucopia of our madly over-productive economy. At first, as radio had begun to become a commercial enterprise, sponsors had limited themselves to stating their sponsorship of various programs, rather as modern corporations buy naming rights to athletic stadiums. But soon they began taking air time away from their programs to advertise their wares.

The first radio advertisements were straightforward. “This is Ralph Tuxedoface for Sparkle Cereal, the only cereal you can find in the dark to eat before dawn. You should buy some. It’s cheaper than anything and mighty delicious. And now, the Loves of Linda Lively asks the question: Can a girl from a little strip mining town in East Puke, Nevada, find happiness as the wife of the Shah of Bratiphoor, India?” After twelve minutes of dialogue proved insufficient to find an answer to that question, Ralph might come back to briefly remind listeners what outfit had ponied up for their free entertainment, namely the generous makers of Sparkle Cereal, the Reputable Grain Cartel.

As radio metastasized, so did the advertising industry, and radio advertising pitches became increasingly sophisticated. First appeared mini-dramas, in which actresses voiced anxieties over their hair, their skin, their pies, and other actresses provided them with the Reputable Grain Cartel’s newest solution: shampoo with omoolionts (a never-before-heard-of boon to shininess), soap with pulverized neat’s feet – no more dry skin, ever! – Frespo shortening, made at that new million-dollar atomic churning factory you’ve all been hearing about. When these fairly straightforward dramatic appeals began to bore their writers, post-modernism came on the scene, years before it raised its scaly head out of the French swamp.

The first post-modern ad started off in standard fashion:

Mrs. Potsdam: Oh, Harriett, I wish I knew the secret to your pie crust. Jim won’t even touch my pies. He says they remind him of moose turds!

Harriett: Well, my dear, I just don’t know…I’ve been making them the same for so long –

But here, a sudden interruption: the main announcer, who has introduced the radio show and then told you to wait to hear it while you heard this word from the sponsor, leaps into the mini-drama:

Announcer pretending not to be the announcer: Well, ladies, perhaps I can explain the source of the wonderfulness of Harriett’s pie-crust!

Harriett and Mrs. Potsdam (together): Who are you???

Announcer pretending not to be the announcer: Oh, just someone who knows a lot about pies, and….

And he goes on to explain how Frespo’s new million-dollar atomic churning process makes for a shortening like no other, one that gives your every baking effort three new, desirable qualities: they will be lighter, tastier, and more digestible. Nothing at all like moose turds, he does not add, though the implication lingers.

Harriett and Mrs. Potsdam are appropriately grateful. They’ve entirely forgotten their initial shock and anger that some unknown oaf has suddenly turned up in Harriett’s kitchen, and that he’s even now moaning in quiet ecstasy as he scarfs another piece of Harriett’s latest pie, and Mrs. Potsdam resolves to go out and get her some of that Frespo shortening at the earliest opportunity, even if she has to steal Harriett’s ration book to do it.

This new advertising style served a couple of purposes. It apparently sold the product, since such confusions of carny pitching and drama proliferated for many years. But what it was really selling was the infestation of your private dwelling by salesmen yammering incessantly at you about how you should be spending more money on more shit you hadn’t realized you needed. It was selling the monetization of all life, public and private, the idea that buying and selling were the primary purpose of any and all human activities. It served a third purpose: it introduced a new kind of mentality to humans, a frame of mind unable, or perhaps unwilling, to separate fact from fiction.

The idea that humans knowingly, willingly suspend their critical faculties, their tendency toward disbelief, goes back at least to Aristotle, who pondered such suspensions in theater audiences. Surely everyone at the Globe “knew” that Hamlet hadn’t “really” skewered old Polonius behind those drapes that they “knew” didn’t drape anything but the back wall of the set, which they “knew” was a set. Theater audiences purposely put aside their skepticism; they spent a couple of hours letting on that they were watching a bunch of long-dead nobles spout poetry along with (fake) blood. It was a way of resting their imaginations, on which they otherwise had to depend for entertainment, for illustrations to the stories they told or read.

Going to the movies offered a similar exercise in suspended disbelief. You paid your way into an ornate, gaudy theatrical palace and watched moving images which could perform all sorts of entertaining actions much more convincingly than stage actors could. They had the help of film technology. They could fly through outer space. They could hang from a clock face on a skyscraper. They could murder a thousand Indians with one six-gun. They could put their face up close to yours, look deep into your eyes, offer you unending pleasures. But you knew that after a couple of hours, you’d walk out of the movie theatre and be right back in the beige of Quotidian Street, changing the litter box and stoking the furnace. The images in movie theaters remained quite clearly separate from “real” life.

Television changed all that. Early sets either sat on top of a piece of furniture or inhabited a piece of furniture in what was then known as “the living room.” In the early days there were few shows to watch, and they were events. When the big networks quickly established dominance over most programming and began presenting rosters of popular shows – many, for a few years, visual versions of popular radio shows with an already established audience base – family life began to be organized, at least during the evening hours, around “what was on.” Much family strife ensued over which programs to watch, but as sets proliferated and grew more affordable, more and more households began to hold two or more sets, one for the adults, one for the kids. By the 1960s, in more and more homes, televisions were left on all day and well into the night, whether anyone was “watching” them or not. They had become as unconsciously accepted parts of home as beds or chairs or ovens or air.

Fifty years of film technology didn’t go unused by tv for long, especially after videotape was perfected. Pretty soon, all the miracles once limited to movie screens performed themselves in every tv room. (It only took about ten years before the television set was granted its own, dedicated room in most middle class homes.) Household products grew limbs and eyes and voices and cavorted entertainingly about, singing their own praises. Heads of distant states speechified and gesticulated right below the sleeping cat or the pothos plant. Joe Friday raced all the way across L.A. to apprehend a felon, who was convicted by the time you’d been convinced to run out for a pack of Chesterfields. Horses talked. Cars talked and gave birth. Time and space became malleable. Miracles became part of everyday life and soon seemed not particularly miraculous, since they were available at the press of a button without leaving home. Daily life began to seem tedious and unrewarding in comparison. E.B. White, viewing an early demonstration of the medium at the 1939 World’s Fair, foresaw what it would lead to: “Television will enormously enlarge the eye’s range, and, like radio, will advertise the Elsewhere . . . .  More hours in every twenty-four will be spent digesting ideas, sounds, images – distant and concocted. In sufficient accumulation, radio sounds and television sights may become more familiar to us than their originals. A door closing, heard over the air; a face contorted, seen in a panel of light – these will emerge as the real and the true; and when we bang the door of our own cell or look into another’s face the impression will be of mere artifice. I like to dwell on this quaint time, when the solid world becomes make-believe . . . when all is reversed and we shall be like the insane, to whom the antics of the sane seem the crazy twistings of a grig.”

“Ideas, sounds, images – distant and concocted” – but ever more “lifelike,” as the technology constantly improved. Today the images are nearly life-size, once you can afford the 86 inch LED flat screen. The illusion that you’re watching something “real” grows ever more powerful. Yet you “know” it isn’t “really real.”

Once you begin having to use such phrases as “really real,” you’re well on the way to deciding that there is no fixed reality, only versions of it that please or displease you. You are of course aware, on some level, that though the law of gravity, say, has many displeasing consequences, it remains in force whether it pleases you or not. You are aware, on some level, that Dealin Doug doesn’t necessarily believe his own promises and claims and boasts, whether they please you or not. But you have been subjected to such a deluge of advertising and opinion posing as “news,” nearly every word of which has been contradicted by the accompanying images or by your own experience of life, that you are reduced to believing, in the words of writer Jonathan Dee, “that words can be made to mean anything, which is hard to distinguish from the idea that words mean nothing.”

In a world of ever-increasing uncertainty, many people feel more and more desperate for some kind of certainty, and the appeal of people able to convey that they are dead certain of the truth and value of what they’re saying grows. Does Dealin Doug contradict last week’s sales pitch with this week’s? No matter . . . it’s the conviction with which he asserts that this week’s, then next week’s, then next week’s deal is the greatest deal ever that moves those people desperate for some kind of certainty. Is Dealin Doug something of a buffoon, bursting out of his Uncle Sam suit? No matter . . . he appears to be fully at home in that suit. We can share that feeling of confidence with him for a while, and feel grateful for it. We might even feel grateful enough to buy a used car from him.

***

My People Don’t Do That

Dorsey Templeton was a math teacher at my community college. He’d served in the 101st Airborne during the Korean War, gone to college under the GI Bill, and been an All-State football player, on both sides of the ball. His heritage was Osage and Cherokee, and he never stopped teaching wherever he could about Native American history. He was a big man who bore himself in the military style.

I happened to be standing next to him along the wall of the gymnasium the day our latest new President had called us to assemble for her introduction. She had clearly attended a New President Workshop, where she had learned that it was important to establish Warm Personal Relationships with her faculty. She proposed to do this by coming around the big circle of teachers and hugging each of us in turn.

When she came bearing down on Dorsey, he folded his arms and stood back against the wall. The new President kept coming, arms outstretched. She was finally stopped in her tracks when Dorsey’s bass voice drummed out, “My people don’t do that.” As I recall, she tried to smile, gulped, and moved on to the next target.

My People Don’t Do That. I’ve treasured that statement ever since, and frequently found occasion to repeat it. My people and Dorsey’s were often deadly opponents, but we shared at least that visceral aversion to public, physical displays of affection, especially affection from people we didn’t know.

My people were of the 20th Century. I could say, with some truth, that my people were Midwesterners, descended from farm and small town folks, and that they generally shared a cautious outlook on life. They were like the people described by a Cubs fan named Jack Wiers: “We were taught to believe you never promoted yourself. You let your work speak for itself. There is a Midwest sensibility there.”

My particular people were, more specifically, Chicagoans, and one of the first things I learned, before I knew I was learning anything, was summed up in a Chicago saying: Don’t Put Your Business in the Street. In other words, keep your private life private.

You could find all sorts of fault with that sense of privacy. It often led to the concealment of evil behavior within families. It led to the press’s “gentlemen’s agreement” to stay mum about the moral or mental shortcomings of public figures. It led to egregious and long-lasting failures to deal with all sorts of discriminatory behavior. Yet I’m not seeing a lot of evidence that our rapid and nearly complete rejection of privacy has brought about great improvements in any of those areas.

A preference for privacy was hardly limited to Midwesterners or Chicagoans. Robert Frost wrote in one of his journals, “After babyhood self-improvement becomes a private matter. Physical, mental, or moral, please attend to it where I can’t see you if you care to avoid my disgust.” Later in the century, Katharine Hepburn observed, “The right to privacy – Fifty years from now this word as we have understood it – will have no meaning at all – if our world continues in its present direction . . . . Talk – tell it – it is never your fault – We’ll fix the blame – Mama – Papa – Uncle Sam – Teacher – Employer – They are responsible . . . . you have a public geared to listen – read – speak – about the most intimate details of another’s life (to say nothing of their own) and geared to ‘understand’ any vagary – because nothing is either right or wrong . . . .”

Hepburn saw what was coming, sure enough. Over the past few months, the internet has enabled me to learn the following: Rihanna has a new and “weird” habit since welcoming a baby boy With A$AP Rocky. Shakira is moving to Miami with her sons after hammering out a custody agreement. Chanel West Coast is sharing the name and image of her baby girl. Steven Spielberg wept on the set of The Fabelmans—a lot. Kim Kardashian has revealed the sex with Pete Davidson that was Inspired by her Grandma. Kylie Jenner’s daughter, 4, is storming to the top of the best dressed lists in her silver dress and chunky trainers – with her own $991 handbag. And though Natasha Lyonne has confirmed her split from Fred Armisen, she insists that they’re “still talking all the time.”

Learning these things about people I wouldn’t know even if I knew their last names, I wept all the time – a lot. I’m very sensitive, you see. You need to know that. But I can’t really get with this new century. I run around in a state of constant bafflement at people’s newfound propensity for publicizing details of their private lives. I don’t engage in antisocial media, but I gather that billions of people are emulating the Kardashians by Telling All to the world at large.

Back before Facebook and the rest had started, Lewis Lapham put his finger on the motives of all these Non-Secret Secret Sharers: “In order to fuel the engines of publicity the media suck so much love and adulation out of the atmosphere that unknown men must gasp for breath. They feel themselves made small, and they question the worth, even the fact, of their existence . . . . At any one time the ecology of the media can bear the weight of only so much celebrity, and as the grotesque personae of the divinities made for the mass market require ever more energy to sustain them, what is left for the weaker species on the dark side of the camera?”

Answer: put everyone on the lighted side of the camera through the miracles of digital technology. Now anyone can prove his or her existence at the expense of a few hundred dollars and a great deal of time: voilà, my website/blog/page of selfies, baby photos, etc. See, I really do exist. And I can claim your fleeting attention by revealing my grandma’s sex secrets, the details of my separation agreement, or the clothes I bought little Pootifac Gulf of Mexico Jaws Johnson for her six-month birthday.

Maybe some or all of the new internet celebrities are deriving satisfaction and confidence and a sense that they’re valuable from putting their business on the internet street. To me, their activities seem generally tawdry and ephemeral and silly, but that’s undoubtedly the common view geriatrics like me take of succeeding generations’ activities and choices. As Katharine Hepburn went on to observe, after lamenting the abandonment of privacy she saw growing around her, “Bobby Kennedy wants to climb his brother’s Canadian Mountain – to be the first to get to the top – leave a token there – This was warm and thrilling and mysterious – until he sold or gave it to Life magazine – and the television went along – the act had to lose a lot of its meaning to him by being publicized – or at least it would have to me – Who am of another generation.”

Me, too. So while I could tell you about the precious little BMW I gave my grandson Agoraphobia Blitzkrieg for his graduation from prison, I guess I won’t. I need to walk my dog. And besides, my people don’t do that.

***

Oh Brave New World That Has Such Cornholes in It

“‘It suddenly struck me the other day,’ continued Bernard, ‘that it might be possible to be an adult all the time.'”
Aldous Huxley, Brave New World

In 1963, when I was an undergraduate at a midwestern university, a famous poet (at that time such a thing still existed) was spending a year on campus as Poet in Residence. He was widely rumored to be homosexual (“gay” had not then become a synonym). A couple of English grad students and I went to a party in his apartment near the university. Next day, we were sitting around the graduate student lounge when another grad student, who hadn’t made it to the awkward soirée, came in and, leering at his compadres, asked, “Did he cornhole ya?”

“Cornhole” was then vulgar slang for anal sex, and that was the only meaning I knew for the word for much of my life. So I was somewhat startled when, surfing through the channels on my television set, I came upon the National Cornhole Championship on one of ESPN’s many spawn. The Championship wasn’t showing at that moment. A commercial message was portraying a number of hysterically happy senior citizens disporting themselves after they’d ingested some lethal drug. I waited impatiently through this to see what the National Cornhole Championship would consist of.

Essentially, it consisted of people throwing beanbags at a plank with a hole in it. I faintly recalled tossing beanbags around with my friends when I was five or six years old. I hadn’t seen one since, until now. I’d thought beanbags were something kids outgrew before they reached puberty. How had they entered the world of Professional Sports?

My initial internet search term, “Cornhole Game History,” led me first to the site “letsplaycornhole.com,” which offered what it described as a “True History of Cornhole Game.” That history, while shy – in fact, utterly devoid – of documentation, proved to be written in a prose so eccentrically and remotely related to English that I must quote from it liberally.

First alleging that more and more people are playing Cornhole, the site states that, “no one really knows the origins of the game and the cornhole game name” but that “there are multiple accounts all claiming to explain the history of cornhole game.”

The first account suggests that Cornhole began in Cincinnati, Ohio, but the site makes short work of that claim: “Although the annual cornhole tournament is normally held in Ohio, that in itself doesn’t make a solid claim to the game’s originality in Ohio.” Kentucky’s putative parentage is as easily dismissed: “Controversially, the same history also shows the game having traveled multiple paths throughout its entire history.” So much for Kentucky. In fact, so much for any “Midwestern town in Illinois or Indiana.” The whole debate over the origins of the game and its monicker “might all be but a moot. Cornhole game actually originated in Germany several hundred years ago and here’s all the evidence to show for it.” And here it is, verbatim:

“There’s a strong correlation between the origins of cornhole as an ancient civilization game and its founders, the early German emigrants.

“The civilization is thought to have taken massive interest in tossing rocks at holes dug in the ground as a past time which eventually fuelled the birth of cornhole as a game.

“Matthias Kueperman, an ancient German farmer is believed to have been the person who invented and perfected the game. It was in the year 1325 and Kueperman is said to have done all these in Bavaria in his own backyard.

“Cornhole historians hold that while Kueperman was taking a stroll during one of those fine spring days, he observed a group of kids really having fun throwing some heavy rocks into a hole dug in the ground. That instilled worry in him seeing that the kids could easily get hurt. Yet he lacked the will to stop them from doing what they seemed to enjoy best.

“From there henceforth, Kueperman decided to come up with a safer game based on what he had just observed.

“The said stones were found to weight about 1.13 pounds, which in old German language is an equivalent of 1 Pfund. The weight is seen to have been ideal for achieving 16 feet or thereabouts when tossed. Note that the stone’s hardness is to date potentially risky for every person who engages in the game.” Well, then. That clearly and convincingly (if you’re well short of reaching puberty) explains the game’s origins, but what about its name?

“The cornhole game name origin has been somewhat controversial, though, with a number of people rooting it to Jebediah McGillicuddy, another Midwestern farmer. It’s believed that Jebediah, a corn farmer, invented cornhole in an effort to counter boredom after tending to his chores whereby he engaged in the game with his friends and family so as to have quality time. However, there isn’t any solid evidence to support this claim. In fact, this story, unlike the Kueperman’s, is only heard from those residing in Kentucky.” Well, we know what they’re like.

Now for a brief return to world history: “Later on, this whole new invention resulted in unforeseen repercussions as the making of the goal board eventually brought forth deforestation which raised alarm amongst woodworkers.

“As a consequence, noble wood merchants looked for help from their lord which later on resulted in the implementation of the corn laws of Britain during the 15th century. Exorbitant taxes were imposed on corn imports which in effect took a tool on the production of corn bags. Soon after, cornhole tournaments became costly.

“As a result, the game became quickly forgotten and slowly disappeared into oblivion. Thankfully, this never lasted long as the cornhole game, hundred years later, resurfaced again in the regions of Cincinnati.

“According to historians, Cincinnati bears strong German roots and so it’s believable that Matthias Kueperman’s story bears weighty truths in it . . . . All these truths sum up everything you might want to know about cornhole’s true origin and the history of cornhole game name came about . . . . It’s rare nowadays to observe any college football and not see multitudes of people playing cornhole.”

Call me a cynic, but anytime I encounter such a plethora of passive voices (“is thought, is believed, is said, is seen, it’s believed, is only heard, were imposed, became quickly forgotten, it’s believable”) in a few paragraphs, it takes a tool on my credulity. But it did so only because I failed initially to recognize that I was reading what Jules Henry dubbed “pecuniary pseudo truth – which may be defined as a false statement made as if it were true, but not intended to be believed. No proof is offered to a pecuniary pseudo-truth, and no one looks for it” (Henry, 47).

I remained unsure I’d found the true history of this latest professional sport, even though my skepticism might only be a moot. Delving on, I came upon more plausible accounts. Stacey Moore, I found, “talked about how his family had started a semi-pro basketball league that ended up failing. Stacey was able to take those learning on what had worked before and transferred it to the ACL. He started his research and development by going to college tailgates to see if there was proof of concept.” Evidently those tailgates provided sufficient proof of concept that “[Moore’s American Cornhole League] have a 3 year deal on ESPN and are one of the most popular sports on the network. Even celebrities are interested” (https://upreneur.com/2020/10/21/stacey-moore-commissioner-founder-of-the-american-cornhole-league/).

Another claimant to the Originator’s title, Frank Geers, who founded the American Cornhole Organization in 2005, explained his impetus thus: “I got involved with cornhole as an extension of my marketing company Harris Hawk 15 years ago, when I was looking for better ways to help my clients market themselves. I stumbled across the idea of cornhole. It was a simple yet fun game, and the boards and bags were billboards waiting to happen. We could logo the product to help market our clients’ brands” (https://musebycl.io/sports/bag-man-how-frank-geers-growing-sport-cornhole).

“Billboards waiting to happen” – what better example could be found of what Rex Sorgatz has dubbed “Toyetic . . .  a nasty neologism, forged for this synthetic era of synergistic entertainment experiences . . . .  An adjective initially coined to describe a movie’s potential to generate revenue from toys . . . the definition of the term has evolved to encompass all possible merchandising opportunities for any type of media property” (Sorgatz, 213).

I was beginning to get the idea that the sport of cornhole might be more about marketing than about Matthias Kueperman’s concern for the safety of rock-throwing children. This suspicion was cemented into certainty as I kept investigating cornhole sites and finding that the first thing they all wanted to show me was a photographic catalog of products (a typical sample: $130–$225 for variously illustrated boards with holes in them; $20–$40 for beanbags; cornhole score tower & drink holder combo, $40). In most cases, that was also the only thing they wanted to show me.

In Brave New World, Aldous Huxley had envisioned the necessary connection between consumption and games: “‘Strange,’ mused the Director [of Hatcheries and Conditioning], as they turned away, ‘strange to think that even in Our Ford’s day most games were played without more apparatus than a ball or two and a few sticks and perhaps a bit of netting. Imagine the folly of allowing people to play elaborate games which do nothing whatever to increase consumption. It’s madness. Nowadays the Controllers won’t approve of any new game unless it can be shown that it requires at least as much apparatus as the most complicated existing games’” (Huxley, 31). The Controllers would have abhorred beanbag and insisted it be transmogrified into Cornhole.

“. . . even in Our Ford’s day” suggests the reason the Controllers would have embraced Cornhole. The society imagined in Brave New World derives from the techniques of mass production that Henry Ford did so much to pioneer. The World State’s motto is “Community, Identity, Stability,” and the Director explains why endless, and endlessly increasing, consumption is the necessary foundation for all three: “The machine turns, turns and must keep on turning – for ever. It is death if it stands still. A thousand millions scrabbled the crust of the earth. The wheels began to run. In a hundred and fifty years there were two thousand millions. Stop all the wheels. In a hundred and fifty weeks there are once more only a thousand millions; a thousand thousand thousand men and women have starved to death” (Huxley, 42).

While mass production has not alone been responsible for the explosion of human population, it has certainly accompanied that explosion and become its sole and necessary economic system. When he considered advertising aimed directly at children in 1963, Jules Henry observed, “In contemporary America children must be trained to insatiable consumption of impulsive choice and infinite variety” (Henry, 70). In Huxley’s imagined society, this training is carried out in the World Government’s human hatcheries/conditioning centers by means of sleep teaching, the nightly repetition of messages such as “Ending is Better Than Mending” that teach future citizens to abhor the old and throw it away as soon as possible, that the machine may keep on turning.

Another necessity in the Brave New World is that human language be reduced to a near-infantile level, in order to assure stability and render critical thought or its expression impossible. So people learn to wear such items as zippicamiknicks and zippyjamas, to sing such popular love ballads as “Hug me till you drug me, honey/Kiss me till I’m in a coma/ Hug me, honey, snuggly bunny/ Love’s as good as soma.” In our own society, rejection of the past and the degradation of language, though the government contributes to both efforts, are primarily the responsibility of the advertising industry. Cornhole Worldwide is doing its part to reduce the language to infantile babble by offering an exhaustive list of

“Best Cornhole Slang You Must Know:

#1. Cornament: A cornhole tournament

How to use it: ‘I’m having a cornament this weekend. Do you want to join the bracket?’

#2. Holy Moly Triple Cornholy: Three cornholes in a row

How to use it: ‘Holy moly triple cornholy! That was one of the best turns I have ever seen!’

#3. Cornstar: An extremely confident cornhole player

How to use it: ‘They are a family of cornstars! They never miss shot with their perfect technique!’

#4. Cornholed: When a stray bag hits you

How to use it: ‘Watch where you throw, I got cornholed right in the face!’

#5. Skunk / Whitewash / Shutout: Finishing the game with zero points

How to use it: ‘We skunked in the cornament. We couldn’t get a bag on the board.’

#6. Shucked: What you are if you are losing or lost the game

How to use it: ‘We are going to be shucked unless we do something fast.’

#7. The Great Cornholio*: Four cornholes in a turn

How to use it: ‘Did you see that?! They got a Great Cornholio!’

*Also known as a Gusher, Jumanji, Double Deuce, Catorce, Cornzilla, Four Bagger, 12 pack, Golden Sombrero, and Galbraith.

#8. ‘Get that corn out of my face!’: What you say when you stop your opponent from scoring

How to use it: ‘You really thought you could make that shot? Get that corn out of my face!’

#9. Cornfusion: Disagreement about scoring and points

How to use it: ‘There was a lot of cornfusion after my dad got three cornholes in a row.’

#10. Corn on the cob / Leprechaun / Four-leaf clover: When all four bags land on the board

How to use it: ‘You must be pretty lucky to get a leprechaun on your first toss.’

#11. Sally*: A weak toss

How to use it: ‘He calls himself a cornstar but he only throws sallies.’

*Also known as Candy Corn, Short Toss, Suzy, Mary, Corn Patty, and Weak Sauce.

#12. Nothing But Hole/ Airmail: A bag straight in the hole that doesn’t ever touch the board

How to use it: ‘She is the Michael Jordan of cornhole. She airmailed it with nothin’ but corn.’

#13. Cornholer: One who plays cornhole fanatically

How to use it: ‘I’m a huge cornholer! I even own a custom cornhole board and play every weekend!’

#14. Cornado: A player that has the highest points and is on a roll

How to use it: ‘Get that corn out of my face! I’m the cornado here.’

#15. Dirt Bag: A bag on or touching the ground (and anyone who cheats in cornhole)

How to use it: ‘I really thought I had that throw, but it turned out to be a dirt bag.'”

Perhaps I’m merely reacting like those English-teacher fuddy-duddies back in the 1960s who made such a fuss about the slogan “Winston Tastes Good Like a Cigarette Should,” insisting that it should be “. . . As a Cigarette Should.” Perhaps the fractured grammar, syntax and spelling that characterize the various Cornhole sites, and the fanciful historical characters and events (Matthias Kueperman, Jebediah McGillicuddy, 15th-Century corn laws) they advance as “true history,” are merely more instances of Henry’s pecuniary pseudo truth, just Business As Usual. Perhaps I am silly to quail at internet headlines like the following:

“Tinder makes it easier to see if you vibe with someone’s Spotify taste

Now DuckDuckGo is building its own desktop browser

Clippy will return as an emoji in some Microsoft apps

BoohooMAN Drops DaBaby After His Hateful And Ignorant Comments Go Viral

Google has a cute little Wordle Easter egg

Yes, Topanga is married to the Cinnamon Toast Crunch shrimp tail guy”

Goo-goo talk has, after all, infested popular culture since at least the 1920s – boop-oop-a-doop. But the ascendance of pecuniary pseudo truth to our primary, acceptable (and almost universally accepted) form of discourse might have a few more seriously negative consequences. Author Jonathan Dee, considering the language of advertising, observed, “The real violence, though, lies not in the ways in which these messages are forced upon us but in the notion they embody that words can be made to mean anything, which is hard to distinguish from the idea that words mean nothing” (Dee, 67). Social critic Neil Postman argues that our growing replacement of written language with imagery (which, we are discovering, can be infinitely manipulable) has been disastrous: “Some ways of truth-telling are better than others, and therefore have a healthier influence on the cultures that adopt them . . . the decline of a print-based epistemology and the accompanying rise of a television-based epistemology has had grave consequences for public life . . . we are getting sillier by the minute” (Postman, 24). If you don’t believe him, ask BoohooMAN.

Given the spreading, unfocused rage and assertions of “alternate facts” that have swept the world over the past few years, “sillier” might be putting it too mildly. It appears to me undeniable that more and more people are accepting absurdities and paranoid fantasies as reality, and acting upon them in the real world, where the rest of us must contend with their madness.

I have no idea what the “far right” (whatever that may mean) finds objectionable about butterflies, for example, yet I am informed by Google news that “Texas butterfly sanctuary forced to close after far-right threats.” Not only butterflies are threatened: “Two Los Angeles officers fired for ignoring robbery to play Pokémon Go.”

In fact, pretty much everyone is threatened: “Battlefield 2042 Reportedly Sold 4.23 Million Units in its First Week,” Google news reports happily, and “Call of Duty is getting a ‘new Warzone experience’ in 2022.” The carnage in such games becomes ever more convincingly “realistic,” visually speaking, and more and more players seem to be transferring their taste for imaginary violence into the real thing. In a few short days at the end of last year, “Oxford school shooting – latest: Suspect ‘used gun his dad bought on Black Friday’,” “A 14-year-old was chased and shot 18 times while waiting for a bus in Philadelphia,” “Tennessee shooting at high school basketball game leaves 1 dead, 1 critical,” and “Suspect in Michigan high school shooting charged with first-degree murder.”

When television was demonstrated at the New York World’s Fair in 1939, E. B. White foresaw our brave new world of alternate reality with remarkable clarity: “Television will enormously enlarge the eye’s range, and, like radio, will advertise the Elsewhere . . . . More hours in every twenty-four will be spent digesting ideas, sounds, images – distant and concocted. In sufficient accumulation, radio sounds and television sights may become more familiar to us than their originals. A door closing, heard over the air; a face contorted, seen in a panel of light – these will emerge as the real and the true; and when we bang the door of our own cell or look into another’s face the impression will be of mere artifice. I like to dwell on this quaint time, when the solid world becomes make-believe . . . when all is reversed and we shall be like the insane, to whom the antics of the sane seem the crazy twistings of a grig” (White, 3).

I think it’s most accurate to view Cornhole as a transitional stage, standing between actual human games or sports and the virtual games of our emerging digitized future. Like Brave New World‘s Centrifugal Bumble-puppy or Obstacle Golf, Cornhole refers back to those games we used to play sheerly for fun and exercise, suggesting that they are similar products of humanity (as we are still in the habit of viewing ourselves) rather than tools of the digital economy. Such games are meant to reassure those dwindling few citizens who can recall a different past that they are living in an improved version of that past, that they are still humans, still protagonists with free will. Very likely such games will fade out rather quickly as virtual reality capability is universally wired into the next generation of infants. Who needs beanbag when you can fly to Betelgeuse and fornicate with the heavy-breasted aliens of your choice after you’ve blasted the competition to pieces? Beanbag can only lead to cornfusion.

Works Cited (in order of appearance):

Aldous Huxley, Brave New World, Harper Perennial Classics, 1998

Jules Henry,  Culture Against Man, Random House, 1963

Rex Sorgatz, The Encyclopedia of Misinformation, Abrams, 2018

Jonathan Dee, “But Is It Advertising?” Harper’s, January 1999

Neil Postman, Amusing Ourselves to Death, Viking, 1985

E.B. White, One Man’s Meat, Harper’s, 1944

***

Now It Can Be Told

“Dwayne Hoover’s body was manufacturing certain chemicals which unbalanced his mind. But Dwayne, like all novice lunatics, needed some bad ideas, too, so that his craziness could have shape and direction. Bad chemicals and bad ideas were the Yin and Yang of madness.”
Kurt Vonnegut, Breakfast of Champions

When Vonnegut offered that diagnosis in 1973, the bad chemicals he had in mind were ones produced spontaneously within the human body. While often disputed, the notion of innate chemical/biological causes of mental illness had become at least widely familiar by the 1970s. The specific bad ideas Vonnegut had in mind were contained in a book by his fictional alter ego Kilgore Trout.

There are far more bad chemicals making the rounds of the US these days, and far more bad ideas as well. Here are some. You will see them here in print for the very first time, as a shadowy cabal has until this moment succeeded in suppressing them:

1. I am not really Malcolm McCollum, retired professor. That was merely the shell I was sent to inhabit among you. I am in fact a former resident of the globular cluster M89.

2. I was sent here many years ago to reveal the existence of a terrible conspiracy of caninophiles who seek to totally dominate your planet in order to corner the world puppy population for their own sickening uses and raise taxes.

3. Among the leaders of this conspiracy are Nancy Pelosi, Warren Burger, Noam Chomsky, Simon Bolivar, the Dixie Chicks, Tom Cruise, Alex Rodriguez, Erica Jong and Mister Ed.

Some readers might question my veracity on one point or another. Some might doubt the existence of a place called Globular Cluster M89. They would be wrong to doubt it. Globular Cluster M89 not only exists, it has about a thousand associated globular clusters (see Fred Hoyle, The Nature of the Universe, Mentor #125). Some might find it unlikely that Nancy Pelosi could successfully collaborate with Alex Rodriguez, not realizing that the two were secretly married in 1963. Some might wish to point out that Simon Bolivar has been dead for a few centuries. To such naive skeptics I must point out that I have included several intentional falsehoods in my revelations to serve as coded messages to those with eyes to see.

My claims here are of course intentionally absurd (I hasten to say), but some who read them may still find them persuasive. For them, these allegations will explain why they’ve had such trouble finding appropriate work (they own puppies), why their spouses can’t stand them (their spouses’ minds have been colonized by Noam Chomsky and/or The Dixie Chicks), why their television shows keep getting interrupted by incessant appeals for money to help abused puppies (PETA and other such organizations are false flag operations designed to help the conspirators acquire even more puppies to sate their evil lusts), why Tom Cruise hasn’t been defenestrated yet.

My claims are no more absurd or unlikely than those of QAnon. Yet, “A poll released today by the Public Religion Research Institute and the Interfaith Youth Core . . . found that 15 percent of Americans say they think that the levers of power are controlled by a cabal of Satan-worshiping pedophiles, a core belief of QAnon supporters” (https://www.nytimes.com/2021/05/27/us/politics/qanon-republicans-trump.html). Fifteen percent of the current US population means that nearly 50 million of my fellow citizens have found the QAnon conspiracy theory persuasive.

Perhaps I shouldn’t find that fact surprising. Throughout our history, considerable numbers of Americans have believed a variety of ridiculous things, and, in the words of one of Stuart Kaminsky’s characters in his novel She Done Him Wrong, “‘Some people will buy a goat’s ass and stick it on their head if a sweet talker gets his jaw going at them.'” But we eddicated folk are not supposed to fall so easily for unlikely nonsense, are we?

In a recent essay (“Viral Grievance” – https://usrepresented.com/?s=viral+grievance) I found occasion to write, “James Kimmel, Jr., co-director of the Yale Collaborative for Motive Control Studies . . . asserts, citing a number of clinical studies employing positron emission tomography and analysis of gaming behavior . . . .” 

Since I anachronistically believe I ought to know what the hell I’m talking about, I looked up “positron emission tomography.” I found that it was “an imaging technology in which substances containing positron-emitting isotopes are introduced into the body, allowing the precise location of physiological processes by detection of the gamma rays produced by the isotopes.” Little wiser, I went on to discover that “positrons are the antiparticles of electrons” and that “the major difference from electrons is their positive charge.” I discovered a few more facts and definitions. I remained none the wiser.

I knew I’d never understand any of this, or be able to assess the validity of the assertions derived from “positron emission tomography,” unless I were gifted with a brain transplant. My own brain, I’d long ago learned in radio school at Ft. Gordon, Georgia, was completely incapable of grasping or usefully visualizing abstractions such as electrons, let alone their antiparticles. I could read sentences such as “Positrons are formed during decay of nuclides that have an excess of protons in their nucleus compared to the number of neutrons” until the Cubs win their next World Series and still not have the slightest idea what they meant. For all I know, “positron emission tomography” may denote nothing more real than Gleem toothpaste’s “GL-70” (which some wag suggested might actually mean “lark’s vomit”).

I chose to transmit Kimmel’s assertions, even though they were based on research techniques whose validity or reliability I hadn’t a prayer of substantiating, for several reasons. First, Kimmel provided a complete list of the research on which he based his argument, research appearing in publications I believe, from past research of my own, to be reputable. These citations are accompanied by brief but coherent summaries of the research. Second, on his home page Kimmel lists his recent scholarly activities, his publications, and print/media articles about him and his work. He also supplies multiple paths by which he can be contacted directly. These qualities seemed pretty solid evidence that Kimmel wasn’t some lone, random nut peddling an unsupported theory.

Most people would not take even this minimal amount of trouble to decide on the validity of an argument they read on the internet. I quite often wouldn’t, myself, though I hope I’d do so in the case of any argument I intended to accept or pass on to others.

When Casey Stengel wanted to assure some reporter he was telling the truth (he often wasn’t), he’d add to his statement, “You can look it up.” In other words, if it’s in print somewhere, you can believe it. For the first two-thirds of my life, this belief wasn’t entirely absurd. For an argument, a narrative, an assertion to get into print, it generally had to pass through a number of inspections and verifications – by publishers, editors, proofreaders, peer review panels – by people who had had to demonstrate some degree of knowledge of the subject to get and keep their jobs.

A.J. Liebling famously observed that “freedom of the press is guaranteed only to those who own one,” and that fact virtually guaranteed that certain biases would control what the editors and publishers found worthy of publication or comment, and what not. But it was generally possible for a reader to identify those biases and to take them into account. This was true for the publishers of books or the producers of television news as well. Some book publishers, it became obvious from their catalogs, published only books promoting a particular political slant – some right, some left – but a reader could usually identify such biases as likely to color whatever books those publishers put out. Given those qualifications, finding something asserted in print meant there was a reasonable chance it might be true.

With the advent of the internet, it isn’t. You can look it up, all right – look up damn near any question you might have about anything on earth, and chances are you will find at least one answer to that question, usually more than one.

Can you depend on their veracity or accuracy? To answer that, a thought experiment:

Think about the last time you tried to find contact information for someone you knew but whose address/phone number you lacked. Did the information you found – if, eventually, you were able to extract it from the welter of surrounding sales pitches for juicy details of that person’s past – prove accurate? When you called their alleged phone number, did they answer it? Or had it gone out of service years ago, or been given to some other person? Are you aware of all the information about you and your whereabouts that resides on the internet? Do you monitor it for accuracy? Do you change it to reflect new developments in your life? Chances are you don’t, and chances are nobody else does it for you. On the internet, nobody is in charge of content. To paraphrase Oscar Wilde, the Global Village is not some analyst’s metaphor; it is a most depressing and humiliating reality. Or, another way to put it, it is as close to a purely democratic marketplace of ideas as one could imagine.

Neil Postman, in Technopoly, offers another metaphor for the internet: “Indeed, one way of defining a Technopoly is to say that its information immune system is inoperable. Technopoly is a form of cultural AIDS, which I here use as an acronym for Anti-Information Deficiency Syndrome. This is why it is possible to say almost anything without contradiction provided you begin your utterance with the words ‘a study has shown . . . ‘ or ‘Scientists now tell us that’ . . . .  More important, it is why in a Technopoly there can be no transcendent sense of purpose or meaning, no cultural coherence. Information is dangerous when it has no place to go, where there is no theory to which it applies, no pattern in which it fits, where there is no higher purpose that it serves . . . . Information without regulation can be lethal.”

Indeed. A recent headline: “Vaccine skeptic US cardinal on ventilator after Covid diagnosis.” He may have been depending on God to keep him immune, or perhaps he had “done his research.” That’s a phrase I hear more and more often from people who subscribe to various currently circulating notions. It generally means that they’ve hunted around on the internet until they found something that seemed to support whatever position they’d already decided to take. Whatever the position, it’s pretty easy to find someone who’s advocating for it on the net.

Say, for example, you took my claims about a conspiracy of caninophiles seriously, and wished to find more information about this menace on the internet, and googled “PETA puppy killing.” You would shortly find an entry entitled “PETA Kidnaps, Kills Family Pet, Here’s How They Apologized,” which begins thus:

“After PETA kidnapped and killed Maya, the Zarate’s family pet, they were very sorry.

“So to show just how sorry they were, they brought the family a fruit basket. That didn’t sit too well with Wilber Zarate, who bought the cute little Chihuahua as a gift for her daughter. The child lost weight and sunk into a deep depression.

“The dog was kidnapped from the front porch of the family’s Virginia home in October 2014. Footage from the family’s surveillance cameras shows a PETA employee coming onto the property and taking the dog, WND is reporting.

“Court documents said that on the day Maya was taken, the family had gone to the store to purchase her a pillow, but couldn’t find her when they returned, reported WAVY-TV. When Zarate checked his security camera, the video showed a van with “PETA” on the side parked in his driveway. Two women exited the van and one walked up his porch, took Maya, and put her in the back of the van. The dog was put down shortly after that. According to a PETA spokesman, the employee made a ‘tragic mistake’ by euthanizing the pet ‘without permission.’

“This all stems from PETA’s long-standing belief that humans should not own pets for their ‘personal amusement,’ and some in the organization believe the animals are better off dead than kept as pets” (https://thefederalistpapers.org/us/peta-kidnaps-kills-family-pet-heres-how-they-apologized).

So begins this account, by someone styling himself “Bushrod Washington.” (Bushrod Washington was a nephew of George Washington and a Supreme Court justice. Perhaps this one is a descendant or admirer of the original. The website on which his account appears, “thefederalistpapers.org,” identifies him as “a political commentator [who] has over 15 years of journalism experience. He lives on a farm in the Midwest with his wife, 3 kids and 100+ cows, goats and other critters.” He is otherwise invisible on Google.) 

That description could well apply to your drinking uncle, Joe, who swept out the local newspaper office all his life until his wife inherited the family farm and he could retire to bloviate on the family computer. In other words, it’s rather vague, and doesn’t really give you any way to assess the writer’s competencies or experience or credentials.

Readers who responded to this example of Washington’s journalism or political commentary found it persuasive (Ronnie Medaid wrote, “yay google is my world beater assisted me to find this outstanding website !”) and worthy of further dissemination (Tiffiny Weldin wrote, “Good post. I be taught one thing more challenging on completely different blogs everyday. It would always be stimulating to read content from other writers and follow a bit one thing from their store. I’d want to use some with the content material on my weblog whether or not you don’t mind. Naturally I’ll provide you with a hyperlink on your web blog.”).  You have to admire the politesse of “whether or not you don’t mind.”

Carlotta Everley, however, was one of only two respondents who found anything questionable about Washington’s account: “Thank you, I’ve recently been searching for info about this subject for ages and yours is the best I have discovered so far. But, what about the conclusion? Are you sure about the source?”

She makes a good point; the major source appears to be the court filing by the plaintiff’s attorney, which presents a rather incomplete picture of the event. It omits to mention, for instance, that the two PETA employees had been called in by the mobile home park’s owner to help capture wild dogs and feral cats; that the pet chihuahua they picked up had been left unattended and unleashed; and that PETA has stated that “the person responsible” for the dog’s euthanizing has been fired.

The other source Washington mentions is WND (World Net Daily), an internet news and commentary site. A libel suit filed against the site suggests how reliable it might be as a source:

“On September 20, 2000, WND published an article claiming that Clark Jones, a Tennessee car dealer and fund-raiser for then-Vice President Al Gore, had interfered with a criminal investigation, had been a “subject” of a criminal investigation, was listed on law enforcement computers as a “dope dealer,” and implied that he had ties to others involved in alleged criminal activity. In 2001, Jones filed a lawsuit against WND; the reporters, Charles C. Thompson II and Tony Hays; the Center for Public Integrity, which had underwritten Thompson and Hays’ reporting on the article and related ones; and various Tennessee publications and broadcasters who he accused of repeating the claim, claiming libel and defamation. The lawsuit had been scheduled to go to trial in March 2008, but on February 13, 2008, WND announced that a confidential out-of-court settlement had been reached with Jones. A settlement statement jointly drafted by all parties in the lawsuit states in part:

“Discovery has revealed to WorldNetDaily.com that no witness verifies the truth of what the witnesses are reported by authors to have stated. Additionally, no document has been discovered that provides any verification that the statements written were true.

“Factual discovery in the litigation and response from Freedom of Information Act requests to law enforcement agencies confirm Clark Jones’ assertion that his name has never been on law enforcement computers, that he has not been the subject of any criminal investigation nor has he interfered with any investigation as stated in the articles. Discovery has also revealed that the sources named in the publications have stated under oath that statements attributed to them in the articles were either not made by them, were misquoted by the authors, were misconstrued, or the statements were taken out of context” (https://www.sourcewatch.org/index.php?title=WorldNetDaily).

It appears from this that Ms Everley was well advised to wonder about Washington’s  “source.” The “conclusion” – presumably she means “This all stems from PETA’s long-standing belief that humans should not own pets for their ‘personal amusement,’ and some in the organization believe the animals are better off dead than kept as pets” – is also contradicted by PETA’s official position statement: “Contrary to myth, PETA does not want to confiscate animals who are well cared for and ‘set them free.’ What we want is for the population of dogs and cats to be reduced through spaying and neutering and for people to adopt animals (preferably two so that they can keep each other company when their human companions aren’t home) from pounds or animal shelters—never from pet shops or breeders—thereby reducing suffering in the world”(https://www.peta.org/about-peta/why-peta/pets/).

So far as I can find, Bushrod Washington never offered his readers an update on his “story,” but I feel obliged to, so here is The Rest of the Story:

“A trial had been scheduled for September, during which Zarate’s attorneys had planned to question current and former Peta employees about its euthanasia policy.

“The group later said it would pay the family $49,000 and donate $2,000 to a local branch of the Society for the Prevention of Cruelty to Animals (SPCA) to honor Maya. The family had sought up to $7m.

“The family’s attorney, William H. Shewmake, said: ‘The Zarates felt that the settlement reflects the grievous loss of their beloved Maya. And it allows the Zarates to bring some closure to a very painful chapter of their lives. They’re glad the case has been settled.’ Both parties said in a joint statement: ‘Peta again apologizes and expresses its regrets to the Zarate family for the loss of their dog Maya. Mr Zarate acknowledges that this was an unfortunate mistake by Peta and the individuals involved, with no ill will toward the Zarate family.’” (https://www.theguardian.com/us-news/2017/aug/17/peta-sorry-for-taking-girls-dog-putting-it-down) (8/16/2017)

In short, a single unfortunate incident was put forth by Washington as typifying the practice and operating principle of an entire organization, based entirely on the allegations of the injured party. This incident took place nine years ago, yet Washington’s account of it, and his unsupported assertions about it, remain on the internet today. The general principle this illustrates was nicely put in Mark Twain’s Pudd’nhead Wilson: “One of the most striking differences between a cat and a lie is that a cat has only nine lives.” It matters not how often or how thoroughly some statement on the internet has been debunked; unless someone takes the trouble to remove it, it will live on so long as the internet exists. And if someone does take the trouble to remove the original statement, no one on earth will be able to remove all the forwarded copies of it. A lie on the internet will not have nine lives, or any finite number of lives at all. It will be immortal.

Back when computers were a novelty, my community college – and I’m sure countless other schools – felt obliged to create a course called “Computer Literacy,” which taught students such things as where the “on” switch was, and, perhaps, how to negotiate one of the early operating systems such as MS-DOS. After a while, it also introduced them to the internet, and to such arcana as how to reach a website by typing in its URL. The course offered nothing in the way of critical examination of the content to be encountered on the internet.

It seems glaringly obvious to me that our schools, starting at the elementary level, need to create and make mandatory a radically revised version of “Computer Literacy” courses, one that equips all citizens with methods necessary to separate propaganda, manipulation, and lies from useful information. We should have been doing this all along, of course, but the need has become acute now that a lie can fly around the world in seconds. Another observation about lies has also been widely, though falsely, attributed to Mark Twain: “A lie can travel around the world and back again while the truth is lacing up its boots.” Had Twain said it, he would have been thinking of the telegraph. The observation goes back at least as far as Jonathan Swift, who would have been thinking of the speed of sailing ships. Neither could have imagined the viral speed with which information – and misinformation – can encircle today’s globe.

I am about the last person competent to devise the curriculum of the sort of course I have in mind. I use the internet for various purposes, but it is not my world, and I know essentially nothing about how it works. But I do know plenty about how liars work, and I know what anyone who’s survived for eighty-some years in this society knows about what bullshit sounds like. So I can at least offer a few general suggestions about my suggested Internet Self-Defense Course.

If you come upon a statement alleging something is true, the first thing you do is look for supporting evidence. If none is offered, you have no reason to accept the allegation, unless you can corroborate it from your own experience and observation, and even then, you can’t say you’re certain. The sky is blue – go look. Yup: looks blue, even though it isn’t.

If supporting evidence is offered or alleged to exist, what are its sources? Are they believable? Are they known to have specific ideological or political agendas? If the evidence is based on “studies,” “surveys,” or “research,” who paid for them, and what was their methodology?

Many more questions need to be asked before you accept some statement as accurate, but to teach people how to answer even those very basic ones would not be the work of a week or a month of classes. As I tried to demonstrate when talking about my search for the meaning of “positron emission tomography,” even the attempt to answer those basic questions is fraught with difficulty, and usually raises further questions that raise further questions that raise further questions . . . . So my suggested course should offer students some reasonable criteria for reaching a verdict. While the more you know, the more you know you don’t know anything for sure, the fact remains that you have to act, and action requires reaching an assessment of a situation.

For if you don’t act, your puppies will never be safe from Tom Cruise and his kind.

***

Johnny’s Wonton Shopping Trip: A Letter to My Editor

My editor has written me the following: “College is now irrelevant to anyone with a good mind who knows what he wants from life. Higher Education is now just a place for group socialization or for those who need to be given discipline and direction. Digital platforms have democratized knowledge.” It seemed to me I’d heard similar thoughts before, and after a while I remembered where. In an essay I used to teach back toward the turn of the century, “Wouldn’t You Rather Be at Home,” Ellen Ullman wrote:

“What had happened between 1995 . . . and . . . 1998 was the near-complete commercialization of the Web. And that commercialization had proceeded in a very particular and single-minded way: by attempting to isolate the individual within a sea of economic activity. Through a process known as ‘disintermediation,’ producers have worked to remove the expert intermediaries, agents, brokers, middlemen, who until now have influenced our interactions with the commercial world . . . .

[All italics following are mine] “Removal of the intermediary. All those who stand in the middle of a transaction, whether financial or intellectual: out! Brokers and agents and middlemen of every description: good-bye! Travel agents, real-estate agents, insurance agents, stock-brokers, mortgage brokers, consolidators and jobbers – who needs you? . . . . Even the professional handlers of intellectual goods, anyone who sifts through information, books, paintings, knowledge, selecting and summing up: librarians, book reviewers, curators, disc jockeys, teachers, editors, analysts – why trust anyone but yourself to make judgments about what is more or less interesting, valuable, authentic, or worthy of your attention?”

Ullman, herself a software engineer and consultant, and a perceptive and thoughtful observer, failed to anticipate the step that’s followed disintermediation. That step might be called “re-intermediation,” meaning the replacement of those old-fashioned human intermediaries by cookies, algorithms and web pages dominated by the self-worship of their designers and serving the profit motive. She also failed to consider the side effects this substitution might foster. (She did recognize that some side effects would likely occur: “I’ve long believed that the ideas embedded in technology have a way of percolating up and outward into the nontechnical world at large, and that technology is made by people with intentions and, as such, is not neutral.”)

In 1889, the futurist writer Edward Bellamy was quick to imagine the side effects of the new technology of sound recording. In “With the Eyes Shut,” considering the infiltration of the phonograph into daily life, he suggested that the phonograph has “’improved the time’ by invading privacy, spewing propaganda, and substituting generic patter for considered responses. It has become indispensable by making its users expect the constant stimulation of news, information, and sound. It has made the skills of reading and spelling obsolete; even more ominous, it has become a substitute for face-to-face interaction. People could communicate by recorded cylinder, or with automatons standing in for moral and political leaders” (Simon, 266-267).

Bellamy’s vision failed to materialize, at least so far as the phonograph was concerned. People soon proved far more interested in listening to recorded music than recorded words, so their literacy was not obliterated, and since people found it easy to gather around the phonograph, and later the radio, face-to-face interaction didn’t end, either, though it began to move down the long road toward the atomic family and the “interest group.”

In 1909, the British novelist E. M. Forster wrote “The Machine Stops,” in which he imagined a world where people lived in an underground hive, each cell inter-connected by electronic means remarkably like our internet. His view of the side effects of this imagined society was no more sanguine than Bellamy’s.

But these early skeptics didn’t have a chance of being taken seriously, for a massive change in public attitudes toward technological change had already come about: “With the further development of industrial capitalism, Americans celebrated the advance of science and technology with increasing fervor, but they began to detach the idea from the goal of social and political liberation.” Instead they embraced “the now familiar view that innovations in science-based technologies are in themselves a sufficient and reliable basis for progress” (Carr, 160-161).

That has certainly been the case for the digital innovations over the past forty years. Computers and internet access have been embraced without discernible reservations by consumers and by the business, government and education communities. The observable results? Let’s ask Johnny.

Johnny, a college freshperson this year, was born in 2002, so he’s never known a world without cell phones, internet access or ubiquitous digital film effects. If he is typical, he has spent nearly 11 hours of each of his days looking at a screen – computer, smart phone, tablet or television. Nearly 9 of those hours have been devoted to “entertainment media.” If Johnny has been getting his suggested 8 hours of sleep, he’s had about 5 hours out of every day to develop a good mind and figure out what he wants out of life. That’s if he didn’t decide that what he wanted out of life was to get high, play sports, eat and hang out. If he did want to do any or all of those things, he didn’t have much if any time left to think further about “what he wanted from life.”

Those 11 hours of screen time have been exclusive of the hours spent in school. During his school day, has Johnny been studying his native or another language – how to read it, how to use it to express himself, how to critically analyze it? Nope. He’s been looking at more screens. Johnny has graduated from high school with certain abilities: he has learned to choose pre-selected answers to questions he didn’t formulate. He has learned that this ability constitutes “accountability.” His ability to read and interpret those questions and answers marks about the limit of his ability to read or understand anything. He has not been required to read many if any books during his schooling. If his parents are among the two-thirds of Americans who never read books for pleasure, he has not been encouraged by precept or example to read any books outside school. Nevertheless, my editor would have Johnny eschew higher education and strike out on his own, finding what further instruction and knowledge he might feel the need for on the internet.

Now, I will not dispute that the internet contains a nearly overwhelming collection of information. It can all be Johnny’s, if he knows how to find it. And if he can read it. And if he can assemble it into meaningful patterns. Poet and radio commentator Andrei Codrescu, in the early days of the internet, wrote, “An observer in, let’s say, the sixteenth century, would be astonished to see the quantities of sheer information consumed by an average American in an average town on an average day. Our sixteenth-century observer would, at first, faint from the sheer excitement and delight at the volume of knowledge, and then would try to grab as much of it as possible. He or she would, however, be able to grab no more than about five minutes worth from our media before short-circuiting and vanishing in a puff of smoke . . . . Because a sixteenth-century observer, unlike a twentieth-century consumer, would try to make sense of the information by connecting it” (Codrescu, 46).

To illustrate Codrescu’s point, a smattering of headlines from the stories the Google news feed has presented during the past few months. What meaningful patterns emerge?

Parents arrested after 4-year-old boy finds gun, shoots himself

Death’s Door is a must for those looking to scratch the itch of a classic Zelda dungeon-delving

Perfectly preserved 310-million-year-old fossilized brain found

US ranks last in healthcare among 11 wealthiest countries despite spending most

Yes, the villain Starro in ‘The Suicide Squad’ is a vengeful starfish

A potato named Doug may be the largest ever unearthed

Ted Cruz condemns Big Bird for advocating Covid vaccines for kids

Meta Shows Research Towards Consumer Force Feedback Haptic Gloves

We all must go to Peppa Pig World, says UK PM Johnson in speech flap

World’s first living robots can now reproduce, scientists say

Pope Francis warns young people not to be tempted by consumerist sirens.

I suggest that you wouldn’t have to be born in the sixteenth century to be reduced to a puff of smoke if you tried to assemble even those few bits of “news” into some sort of coherent picture of your world. If Johnny seeks to look into the News Behind This News, he will encounter the commercialization of the Web Ullman noted, at a level she could scarcely have imagined.

Let’s say Johnny wants to find more detail behind the Pope’s warning to young people. He clicks on the headline, and after a second’s teaser view of the Story, his screen is covered by a pop-up for some iteration of Fox News: “Be the first to know! We’ll tell you about cool new site features, games, puzzles and more / Enter your email Count Me In!” (in a red button awaiting Johnny’s click). Johnny decides against investing in this enticement and X’s it out to return to the Pope.

The first thing he sees, over the entire top third of the recovered screen, is not the Pope, but a set of photographs advertising OM Premium Quality Caviar. Below that, the headline he first selected appears again, followed immediately by large boxes advertising “Fox News FLASH HEADLINES” (to which Johnny may subscribe) and King Soopers’ offer of “Lay’s Potato Chips Classic / Final Cost $1.88* / *When you buy 3 With Card” (that proviso added in about a 4-point font).

Johnny ignores these intrusive pitches to get to The Story itself, which follows for 3 paragraphs, the first of them reading, “Pope Francis ended his visit to Greece Monday by encouraging its young people to follow their dreams and not be tempted by the consumerist ‘sirens’ of today that promise easy pleasures.” The next two paragraphs contain no further mention of the Pope’s anti-consumerist message, dealing instead with the weather during his departure from Greece and the location of his final visit there.

The Story is then interrupted by a series of small, animated photographic panels showing young women in nearly transparent undergarments made by a firm called Lunya.

Below this appears: POPE FRANCIS VISITS CYPRUS AND URGES PEOPLE TO HEAL DIVISIONS, below which is a 3/4 screen photo captioned “Pope Francis arrives for a meeting with young people . . . in Athens, Greece.” Flanking this along the right one-quarter of the screen is, “More from Fox News / Celebrities with face tattoos / Tucker Carlson: We’re in for a whole new . . . / Allergy sufferers have nearly 40% lower risk of . . . / Former UCF running back Otis Anderson Jr Shot . . . / Kyle Rittenhouse reveals what will become of AR-15 . . . / Alec Baldwin hits back at George Clooney’s respon . . . / Sponsored Stories / Ads by Yahoo / 7 Ways to Retire Comfortably with $500k (Photo of middle-aged white couple who look so happy they must have just ingested a life-threatening drug) / Worst Colleges in America (photo of 3 busty cheerleaders filling out sweaters reading ‘U S Indecipherable Letter’)”. Then, at last, The Story reappears. This time, two complete paragraphs are devoted to the Pope’s warning:

“He echoed a common theme he has raised with young people, encouraging them to stay fast in their faith, even amid doubts, and resist the temptation to pursue materialist goals. He cited Homer’s Odyssey and the temptation posed by the sirens who ‘by their songs enchanted sailors and made them crash against the rocks.’

“‘Today’s sirens want to charm you with seductive and insistent messages that focus on easy gains, the false needs of consumerism, the cult of physical wellness, of entertainment at all costs,’ he said. ‘All these are like fireworks: they flare up for a moment, but then turn to smoke in the air.’”

The Story then veers to a Syrian refugee’s account of his family’s travails reaching Greece after their home was blown up. This paragraph is followed by the headline

POPE FRANCIS BRINGS HOPE TO THE POOR IN ASSISI VISIT

over a full-screen photo captioned, “Pope Francis visits Saint Dionysius School of the Ursuline Sisters in Athens Greece,” to the right of which appears the previously seen Lunya ad with young women in underwear. Then two more paragraphs of The Story containing the Pope’s reflections on the Syrian refugee’s odyssey, which concludes with the Pope’s adjuration to “‘Dream big! And dream together!’” then another identical Lunya ad, another photo of the Pope in Athens, then the final paragraph of The Story:

“Francis is returning to the Vatican with some important pre-Christmas events on his agenda: a scheduled meeting with the members of a French commission that investigated sexual abuse in the French Catholic Church; a scheduled meeting with Canadian indigenous peoples seeking a papal apology for abuses at Catholic-run residential schools; and Francis’ own 85th birthday on Dec. 17.”

This is immediately followed by “Sponsored Stories You Might Like,” which are:

Blonde with large breasts in 2 flanking photos, captioned “We All Had a Crush on Her, Where Is She Now”

Photo of four hotdogs in buns with condiments erupting from them, captioned “Costco Is Dropping These Customer Favorites”

Nubile young woman in yard photo flanked by photo of moose in fenced back yard, captioned “Maybe This Is Why Australia Is Full of Dangers”

Photo of hand holding credit card and two $100 dollar bills, captioned “Do Your Holiday Shopping & Get a $200 Bonus”

Photo of tray of eggs in refrigerator, captioned “Why Europeans Don’t Refrigerate Their Eggs”

Photo of Blonde woman making disappointed moue, captioned “The Cast Finally Admit That The Show Is Fake.”

After a “Start Your Free Trial” offer to stream Fox Nation, there is the “Conversation” section, purporting to contain 379 comments, whose depth is reflected in this early one by someone calling himself “RepubicMatters”:

“If the consumer stops spending his money, the economy will come crushing down unfortunately. Innovation is driven by wonton shopping by the consumer.”

I’m not sure whether my editor considers “a good mind” to be a genetic gift, or, if he does, he thinks it a gift in need of further development, but let’s grant that Johnny has one. I’ll even grant that he “knows what he wants from life” – some people do, after all, seem to discover their lives’ purposes in their early years.

Since Johnny has sought to examine the Pope’s remarks about “consumerist sirens,” we can assume he gives some credence to papal opinions and wants to discover how the Pope has gone about supporting them. What he finds are arguments from simile – the Pope referring to the sirens in The Odyssey, which it is highly unlikely Johnny has read, and to fireworks, which Johnny has probably seen, at least on television. Perhaps the Pope’s reference to the Odyssey piques his interest, and he googles “Sirens.” Here is what he first encounters:

“Top banner: K-12 Pandemic Funding / How should you prepare for reporting? switch:

Duluth Trading company logo – Address, hours In-store shopping/ Curbside pickup XDelivery

Who are the sirens in The Odyssey?

Expert Answers?

Ad: Windigo Logistics Hiring Now

photo of huge empty warehouse

then on to Peak Vista Community Health Centers then

This Day In History

1964 Sam Cooke American singer-songwriter dies

1901 Marconi transmits first radio signal across ocean

and so on with a continuing string of completely unrelated factoids, and, finally,

A:

JENNIFER RODRIGUEZ

CERTIFIED EDUCATOR

Sirens are Greek mythological beings best known for their brief but memorable appearance in Homer’s The Odyssey.

According to Greek lore, sirens have human heads and bird-like bodies. They sing beautifully and use their songs (also sometimes referred to as siren calls) to kill sailors who travel near . . . then a box containing:

Unlock This Answer Now (large font)

Start your 48-hour free trial to unlock this answer and thousands more. Enjoy enotes ad-free and cancel anytime.

Red Box, white letters: Start your 48-hour Free Trial / to the right of which is:

Ask a Question

Box: Enter Your Question for The Odyssey Submit Question buzzer

Popular Questions

What are 3 examples of times when Odysseus demonstrated epic hero/god like qualities in The Odyssey?

Then another This Day in History Box that alternates history factoids with ads for various companies

What are the challenges that Odysseus had to face on his journey home?

Who does Odysseus encounter in the Land Of The Dead in Homer’s Odyssey?

Then across the bottom, 3 individual ads for Detox & Residential Treatment quickly replaced by an Instant Cash Offer on Johnny’s car, The Witcher on Netflix, and a Colorado Springs DUI attorney, replaced in their turn by Duluth Trading Company ads /then Related Questions:

Who are the Laestrygonians in The Odyssey

ad: The Truth about side sleepers quickly replaced by Mens’ Cold Weather Clothing Sheels (They also sell dumbbells, barbells and weights)”

If Johnny is able to pick out the Certified Educator’s explanation of “Sirens” from the welter of unrelated advertising and irrelevant information blobs, will he have come much closer to understanding the Pope’s message? Or will he be left wondering where to find killer singers with birdlike bodies? Perhaps they have been sleeping on their sides too long, or neglecting their barbells.

Are there internet sites of information free of these sorts of irrelevancies and commercial interruptions? There are. Are they likely to appear among the first 100 listings for any Google or other commercial search engine? They are not. How is Johnny to know of their existence, unless his use of the internet has been given some discipline and direction by mentors who know that serious information sites exist, know how to find them, and know what specific sorts of searches they’re good for? And who are themselves governed by values other than pecuniary?

Absent such instruction, Johnny is more likely than not to wind up believing all sorts of cockamamie nonsense simply because he encountered it on the internet. He has little in his own hard drive with which to sift and compare whatever “information” he runs into, since “rote memorization” has been in bad academic odor since before he was born, and since he has not learned to read critically and skeptically, but solely for purposes of regurgitation.

Nor is the internet, his source of information, regulated by any agency devoted to such antiquated values as truth, accuracy, or the good of society. The sole regulators of the internet are the engineers of commerce, and their values are those of what Jules Henry called “pecuniary philosophy,” which he described thus: “The heart of truth in our traditional philosophies was God or His equivalent, such as an identifiable empirical reality. The heart of truth in pecuniary philosophy is contained in the following three postulates:

“Truth is what sells.

Truth is what you want people to believe.

Truth is that which is not legally false” (Henry, 50).

The owners of the corporate behemoth tend to be pretty well along in years, and so must employ members of the younger digital generations to actually operate this gorgeous new engine of commerce. And it has been apparent almost from the birth of the internet that these technologically hip youngsters, the re-intermediators, share certain characteristics – those common to young adolescents. They love to fantasize about super-powers, seemingly human characters immune to the physical laws that limit actual humans. They have learned to command the new digital technologies so that they can make astonishingly convincing moving images of these fantasy creatures. They are enthralled by their own technological expertise, and unconcerned with the purposes to which it is being put. And they hold any other humans not members of their in-group in utter contempt.

Ellen Ullman took note of the latter attitude, meditating on a billboard whose message was “Now the world really does revolve around you”: “Every time I saw it, its message irritated me more. It bothered me the way the ‘My Computer’ icon bothers me on the Windows desktop, baby names like ‘My Yahoo’ and ‘My Snap’; my, my, my; two-year-old talk; infantilizing and condescending” (Ullman, 289).

The internet’s hegemony over today’s communications reminds me of Bellamy’s premature prediction: “It has become indispensable by making its users expect the constant stimulation of news, information, and sound. It has made the skills of reading and spelling obsolete; even more ominous, it has become a substitute for face-to-face interaction.” It would be pretty to think that Johnny might somehow be immune to these effects, but given the amount of time he spends immersed in them, that seems unlikely. And so, Johnny may bravely struggle to hear the Pope’s warnings, but it’s more likely that he will find himself distracted from them by the overwhelming commercial clamor that surrounds, interrupts, and contradicts them. The same will likely be true for his further searches for a meaning beyond wonton shopping. For the internet, the only place he can imagine to conduct his searches, has for its all-pervasive meaning the relentless encouragement of that very wonton shopping.

Nearly one hundred years ago, Matthew Josephson wrote a long analysis of the workings of the first generation of capitalist titans in America, The Robber Barons. Speaking of John D. Rockefeller’s South Improvement Company, he explained the origin of his title: “Entrenched at the ‘narrow’ of the mighty river of petroleum they could no more be dislodged than those other barons who had formerly planted their strong castles along the banks of the Rhine could be dislodged by unarmed peasants and burghers” (Josephson, 120). Our passionate, spread-eagled embrace of the entry of computer networks into every aspect of our lives, commercial, public, and personal, has given the economic forces which have created and learned to exploit those networks a power that would have drawn the astonished envy of Rockefeller, Morgan and Gould. The internet in a couple of short generations has grown its castles at the narrows of every river in the world.

I find it highly doubtful that Johnny is going to find the reliable information and the skills with which to assemble it into a working life plan on the internet. I think he is far more likely to become that wonton consumer – of physical objects, of “ideas,” of amusements – that the Pope warned him about, just to the left of the pretty young lady’s silkily encased crotch.

Works Cited in Order of Citation:

Ellen Ullman, “Wouldn’t You Rather Be at Home?” in New Affinities, Pearson Custom Publishing, 2001

Linda Simon, Dark Light, Harcourt, Inc., 2004

Nicholas Carr, The Glass Cage, Norton, 2014

Andrei Codrescu, “Intelligent Electronics,” in New Affinities, Pearson Custom Publishing, 2001

Jules Henry, Culture Against Man, Random House, 1963

Matthew Josephson, The Robber Barons, Harcourt Brace, 1934

***

Pass the Baby Owls

In the basement of the last Furr’s Cafeteria in America, the heads of Microsoft, Apple, Google and Facebook meet each September 8 with leaders of the two main political parties, presidents of major corporations and advertising agencies, and a rotating cast of foreign enemies and enemy domestic celebrities. Their long-term goal is the eradication of literacy, first in English-speaking countries, eventually throughout the world. After gorging themselves on endangered species, they molest puppies until they turn to formulating the next step in their crusade.

I tend to view conspiracy theories skeptically, this one included. Surely Bill Gates and Mark Zuckerberg can find snacks more appealing than baby owls. Yet evidence for this theory keeps intruding on my attention.

A couple of years ago, Apple CEO Tim Cook was invited to give the commencement speech at Duke University. His address, composed largely of the time-honored shibboleths beloved of corporate commencement speakers down through the ages, was distinguished by being written in paragraphs of one sentence each. Reading it was like watching a military parade of clichés, or reading the latest pop business manual. Midway through, Cook urged the graduates to “dare to think different.”

Did he know he was using an adjective where an adverb was required? Was his grammatical error an effort to connect with his audience, or did it amount to a statement that, as one of the winners in the new digital economy, he was above such piddling considerations as using the proper part of speech? Or was it part of something darker and far more pervasive?

I am cursed with an inability to not-listen, and that inability applies to commercial messages. Two months ago I began jotting down examples of the advertising industry’s assaults on the grammatical concept of parts of speech. A partial list follows:

Safeway delivery is bringing the fresh (adjective as noun)

Change the way you pizza (noun as verb)

However you happy . . . . (adjective as verb)

Wake up to the wow (interjection as noun)

Enjoy the go (verb as noun)

Let’s put smart to work (adjective as noun)

No matter how you family, it’s better (noun as verb)

No one outPizzas the Hut (noun as verb)

Feed your happy (adjective as noun)

Want to brain better? (noun as verb)

Gift like you mean it (noun as verb)

Let’s give this holiday all the merry we’ve got (adjective as noun)

Get our app today and fuel better (noun as verb)

Compared with many other languages, English is pretty lightly inflected, depending more heavily on word order than on varied word endings to create meaning. But the elimination of the concept of parts of speech throws remaining inflections out the window, leaving us with such possible sentences as “Grover coked he conk to being unlow,” or “Foot a mile in she walk residue,” which are a bit less ease to interpreting.

I will grudgingly admit that any English speaker can likely interpret the intended meaning of most of these phrases. Some of them are mere elisions: “bringing the fresh” is, in context, clearly implying “the fresh food.” We use such shorthand all the time. Other phrases require more substantial reconstruction to make any sense. “Feed your happy,” for example, poses the unanswered question, “happy what?” How do you put an abstract adjective such as “smart” to work?

No problem putting dumb to work, as Google’s headline writers demonstrate daily:

Britney Spears Documentary Brings Validity to #FreeBritney Movement, While Putting Misogynistic Media on Blast

There are an insane amount of cool space things happening in 2021

L’Oreal’s New Doodad Sounds Like the Supersmart Skincare Gadget I Need

This Peloton-Beat Saber mashup turns my Oculus Quest into a sweat sponge

Well, I guess if the amount of cool space things is truly insane, then those space things are essentially uncountable, so “amount” is a better choice than “number.” But “amount” is singular and requires a singular verb. So I have to put that headline on blast. Unless one of the cool space things is a New Doodad or a Saber mashup, in which case I’ll just put it on sweat sponge.

Meet Joe Bftsplk

Certain portions of the entertainment industry, notably the thoroughly corporatized rap/hip-hop/whatever-is-the-new-brandlabel division, are encouraging a more personalized approach to language destruction, mandating that their weekly new Megastars accept intentionally misspelled and/or unpronounceable names for themselves:

Megan Thee Stallion Posts Photo of Her Gunshot Wound, Makes New Statement

Gigi Hadid On Her Home Birth With Zayn Malik And How They Plan To Raise Daughter Khai In PA

DABABY ARRESTED IN ROBBERY CASE …Victim Doused With Apple Juice

Timothée Chalamet Photobombed Margot Robbie at the Oscars and It Was Perfectly Adorable

Ty Dolla Sign Recruits a Super Team in Kanye West, Fka Twigs and Skrillex For ‘Ego Death’

Lil Jon Refuses to Join Lil Wayne and Lil Pump in Supporting Trump

Lil Pump threatens to leave the U.S. if President Trump isn’t reelected

Cardi B Responds to Claim She’s in a “Mentally Abusive Relationship” With Offset

Makeup guru, influencer Ethanisupreme dead at 17

That I haven’t the faintest notion of who any of these luminaries might be, of course, is no blast on them. It just shows that I’m an octogenarian who hasn’t bothered to keep up with pop culture. After all, Megan Thee Stallion recently appeared on Time’s list of 100 most influential people. And when thee Time sayz you influential, you gotz to be important. I can perhaps be excused my ignorance of Ethanisupreme, since I’ve never been a big consumer of makeup, and I’ve been sort of busy for the past 17 years. Ethanisupreme we hardly knew ye.

FUBAR for Sure

Computer technicians are extraordinarily fond of acronyms – even fonder, if that’s possible, than the military. I’ve selected a few examples from recent headlines:

NVIDIA launches GeForce RTX 2080 Ti Cyberpunk 2077 Edition – a limited run GPU

BJP Tweets IMF’s April GDP Estimate Which Was Lowered In June

AMD Ryzen 9 5900H delivers +23% single-core performance gain over Ryzen 9 4900H on Geekbench while cocking a snook at the Intel Core i7-10700K

A new system designed by Rockstar devs could improve NPCs in GTA 6

ASUS Z490 Motherboards Tested With Intel Rocket Lake Desktop CPU, Lack PCIe Gen 4.0 Support on M.2 Slots

It is clear even to me that a lack of PCIe Gen 4.0 support is very likely fatal to M.2 Slots and will probably cock a snook at your ASUS every time.

Well, every profession has its jargon, a major function of which is to keep the riff-raff out. Ask your doctor about Abluminabliblamab. He’ll be glad to tell you it’s a zeezobactic inhibitor for people suffering from moderate to severe blastomyelitic poryphyry to improve your NPCs. Then he’ll go off to giggle in the supply closet.           

The Great Leap Forward to 4,000 BC

“Simplifying” the grammar of English and blowing up its hard-won standardized orthography are probably not sufficient to render the language useless and its previous forms incomprehensible. “If a man does away with his traditional way of living and throws away his good customs,” says an African proverb, “he had better first make certain that he has something of value to replace them.”      

Recognizing this, the Japanese contingent of the Furr’s conspirators introduced the emoji. In the words of Wired, emoji are “… more like a primitive language. The tiny, emotive characters—from 😜 to 🎉 to 💩—represent the first language born of the digital world, designed to add emotional nuance to otherwise flat text.” (https://www.wired.com/story/guide-emoji) In just over twenty years, the number of officially recognized emojis has grown to 3,521.

That Wired description is interesting. How is it that a “primitive” language “adds emotional nuance”? What makes a text “flat”? The fact that it is written only in English? Is English by definition without emotional nuance, or only so for people who can scarcely comprehend it? Let’s see. Here are a couple of flat (words only) texts, first as originally written, then given enhanced emotional nuance:

To be, or not to be: that is the question:
Whether ’tis nobler in the mind to suffer
The slings and arrows of outrageous fortune,
Or to take arms against a sea of troubles,
And by opposing end them?

2  🐝 or  🚫 2 🐝 that = ?

Whether = 👑+ in the 🧠 to 🤕

the slings + 🏹s of 😠ous $$$,

Or 2 take 🗡️🔫 💣 vs a 🌊 of 😟s

+ by ⚔️ing 🔚 them?

Of course, poetry may have been designed to supply its own emotional nuance. Maybe philosophical prose will provide a better test:

We hold these truths to be self-evident, that all men are created equal, that they are endowed by their Creator with certain unalienable rights….

🧑‍🤝‍🧑    📃these 🥠 = 💡,

that ♾️ 🚹 = ⚗️ed =,

that 🚹 = 💸 by their 🔱 w/🈯 🤳 ▶️s

This is the direction we’re headed. It will be hard to convince me that this language of crude imagery represents anything but a diminution of our ability to think or communicate clearly. “♾️🚹,” it could be argued, might represent “an infinity of pissoirs” as easily as it represents (in my translation) “all men.” “…these 🥠 =💡” could, with some stretching, be taken to mean “these truths to be self-evident,” but it could just as well be translated “these fortune cookies are the same as incandescent bulbs,” which would be rather far from Jefferson’s intent.

The computer industry is dominated by people operating from the profit motive, and of course the advertising industry has never had an original impulse it wasn’t paid to have by one biz or another. People seeking to profit from selling back to other people what they have removed from their control clearly have no interest in maintaining, let alone increasing, those customers’ abilities to think or communicate clearly. They need idiots, not savants. How else are they going to get people to buy the GeForce RTX 2080 Ti Cyberpunk 2077 Edition when they’ve been perfectly happy with the plain old 2076 Edition – in fact, have just barely learned to use it? No one in power has ever wished to be questioned or second-guessed, and those irritants will inevitably be reduced by reductions in literacy. In the case of government at all levels, such reductions will be advanced under the twin banners of Accountability and Transparency, two abstractions bracingly lacking in definable meaning.

Fortunately, advancing technology will soon make such antiquated concepts as “meaning” as irrelevant as the dodo or the typewriter, as Google headlines assure us:

Elon Musk’s Neuralink implanted a chip into a monkey’s brain and now he ‘can play video games using his mind’

Thought-detection: AI has infiltrated our last bastion of privacy

A wave of startups wants to make brain-computer interfaces accessible without needing surgery. Just strap on the device and think.

Startup wants to upload your brain to the cloud, but has to kill you to do it

Pass the baby owls, would you? 🙂

***

Three Little Words

Starting with my dad, people have been telling me to Stand Up Straight all my life. He told me. The Army told me. My friends told me, and probably a few enemies. No one ever told me how, until last year, when my chiropractor was moved to show me a little exercise in which I stood with my back against a wall, heels pressing against the baseboard, back of my head also pressing against the wall. “When you do that,” he told me, “lift your sternum, and hold it for a count of ten.”

No one had ever told me to lift my sternum, and it took me a few tries to feel what it meant, but it became clear very soon that lifting my sternum was the key to “standing up straight” without the kinds of tension that effort had always produced – the sucked-in gut, the strained-back shoulders. It didn’t take many days of doing that one exercise to make lifting my sternum effortless, and not many more for it to become an unconscious default. Result: I stand up straighter.

Shortly after this discovery, I happened to be listening to one of the World Series pre-game shows, and heard John Smoltz observe that one of the Houston pitchers was very good “when he was pitching from the inside of his knees.” This caught my attention because it was weird, and I’ve always liked weird. I’d never heard such a phrase before, and I had no idea what it might mean.

So I began to experiment with the idea, trying to discover what walking (at 79 years old, I’m not doing much pitching) from the inside of my knees would feel like. Something Shawn Green wrote helped me: “I had never considered that awareness could reside some place other than the head.” I began to learn how to move my awareness to the insides of my knees while I walked. Nothing fancy required – just a matter of concentrating your attention.

After I learned to do this, I found it made a noticeable difference in the way my feet engaged with the ground. My soles were completely touching down with each step. In the past, they’d been coming into contact with much more weight on their outside edges. I’d been aware of that, on account of the annoying pain it caused, but had never figured out a way to correct it. Now, I had, courtesy of John Smoltz.

Not only were my feet connecting more comfortably and efficiently with the ground, but I began to feel the entire motion of walking emanating from my spine, not from my legs. I felt more centered, more in balance than I ever had. The sciatic nerve pain that had been plaguing me diminished or disappeared entirely.

A while later, I happened onto the film March of the Penguins, which set out to answer the question of how Emperor penguins, with their nearly nonexistent legs and big floppy feet, manage to get across a good part of Antarctica to their breeding grounds, and then back to the ocean, year after year. Turns out they can make these daunting trips because they rock from side to side as they walk, which makes their stride about 70 percent more efficient.

Since then, I’ve worked at incorporating that side to side motion in my walk from the inside of my knees, and it surely does give me more stamina and range than I used to have. And after 8 decades of wear and tear, neither stamina nor range increases on its own. So I’m indebted to a fellow the medical profession often calls a quack, a Hall of Fame Pitcher, and some National Geographic filmmakers, and so’s my dog. We thank them in our hearts every day we’re out walking.

And that’s my real point, here. You never know what you need to know until you hear it put in the right words, words that ring the bell of understanding in your head. And those words are not the same for everyone. That’s why writers keep writing, or at least why I do. I’m not expecting to say anything new, for God’s sake. Maybe if I were a physicist I could tell people something new about quarks or photons or some other substance I’d dreamed up, but there’s nothing to say about human life on this planet that hasn’t already been said.

On the other hand, nobody’s said it the way I do (or the way you do, or she does, or he does, or they does). And I might and you might just say something that rings that little bell for someone, that lets them see or hear or feel something they’d been missing without knowing they were missing it.

When I was 8 or 9 years old, someone or other made a hit record out of an old song from the 1930s called “Three Little Words.” Probably needn’t say that the three little words turned out to be “I love you-ou.” But they could as well be “Lift your sternum,” or “inside your knees,” or “Walk the way penguins do.” Well, that last is more than three words, but you get the idea.

Spread the love