Here is a term that has been on my mind a lot, ever since some kids walking down the street, twenty years ago, spotted me and my manually powered reel mower.
“Look at that dude! He’s got one of them throwback lawnmowers!”
That’s a good working definition of an atavism. The etymology given by the OED, as you might guess, is the Latin atavus, either “a great-grandfather’s grandfather” or more generally, “an ancestor.”
For once, the OED’s entry appears really limited, providing no usage examples. It notes resemblance to an ancestor rather than to one’s parents, or the recurrence of a disease common in distant family history, but not in one’s recent ancestors. My favorite print dictionaries, old and new, provide little more. So I will strike out into the atavistic thickets by myself.
I’ve seen our word, as noun and adjective, used both in science and elsewhere to mean a “throwback,” something from an earlier time that has somehow erupted into the present. I write “erupted” because my sense of the term is not an historical or biological survival from an earlier epoch but something that emerges as if new. It calls the mind and eye back to an earlier time. Hence Frank Norris’ description of the titular character in one of my favorite novels, McTeague: “His head was square-cut, angular; the jaw salient, like that of the carnivora.” Norris’ protagonist is a brute, a throwback to some imagined caveman past.
Consider nonhuman examples: I do not mean a perfectly restored 1964 1/2 Ford Mustang but one sold as new, presumably a zero-mile example found improbably on the premises of a Ford factory. Better still, imagine the faces of shocked workers when such a car appeared magically on the assembly line. That dream of car collectors would be in keeping with the biological idea of atavism.
My favorite pop-culture atavism appears at the top of this post.
I have been waiting a long time to use the Mountain Dew “Throwback” logo for something. I drank the stuff in high school. Somehow I lost the taste, but my fondness for Hillbilly kitsch has remained strong.
This blog will continue all summer, so nominate a word by e-mailing me (jessid -at- richmond -dot- edu) or leaving a comment below.
This word seems easy enough. The adjective refers to existence. That is, indeed, the earliest definition in the OED Online: “of or relating to the existence of a thing.” That sense goes back as far as the 17th Century.
Outside of academia, one often encounters the word in the sense of “being a matter of life or death.” I’ve heard North Korean nuclear weapons, unmarked asteroids hurtling by the Earth, and slowly mounting climate change all referred to as “existential threats” to human civilization or even the survival of our species.
If only, however, it were that stark; we would have a very short post indeed this week. But we can blame mid-20th-Century philosophers and writers for making matters existential so complex. Here the OED and other references take us into the realm of existential philosophy, or existentialism. If you have read the works of Sartre or Camus, you may consider it a gloomy school of thought. Read The Stranger, or any of American author Paul Bowles’ austere and beautiful fiction, to encounter the core of existentialism: that humans are alone in an indifferent if not hostile universe. Our actions, while freely chosen on our parts, mean, finally, nothing.
Yet an existentialist philosophy need not be so bleak. I’ve been reading Man’s Search For Meaning by Viktor Frankl, after running across the work as a reference in an article about the value of failure in learning.
Frankl, an Austrian psychiatrist, not only survived Auschwitz and, almost as harrowing, a Bavarian concentration camp in the Second World War’s last months, but practiced medicine in the latter camp. He had little to offer fellow prisoners aside from a few aspirin doled out by the SS and kind words. Despite contracting typhus, Frankl reconstructed a manuscript seized from him at Auschwitz. It contained a new system of psychology that Frankl called logotherapy, an existentialist form of therapy meant to address what he called “the existential vacuum” of modern life, in which cultural traditions have waned and leisure time often results in mere boredom. Frankl’s theory and practice emphasize creating meaning in one’s life and pursuing goals, even in the bleakest situations.
That’s hardly gloomy, yet there too our word of the week speaks to the essentials of human existence.
Science-fiction and fantasy writer Fran Wilde, who works with my students when she’s on campus, once quipped, “Joe, you are a misanthrope in danger of becoming a curmudgeon.”
Fran actually had that backwards, and that says a great deal about how fine a line exists between these words and, perhaps, whom they represent. The Oxford English Dictionary Online only takes the term back to the 16th Century, in the sense of being mean-spirited and mistrustful. The word’s genesis, the OED notes, is unknown.
Like some curmudgeons I have known, then, our word seems to have just shown up to spoil our days. The American Heritage Dictionary also reveals that for two centuries, attempts to find the origin of the word have failed. The term has, moreover, shifted in what it signifies. For a long time, the elusive curmudgeon was often depicted as old, mean, and miserly. Think of Ebenezer Scrooge (a character I portrayed in our 6th Grade Christmas play). Lately the grasping miser seems to have given way to a merely grumpy old geezer, usually male. Thus my Simpsons example.
So short-tempered, mistrustful, grumpy? That’s me, Fran. But a hater of all mankind? Nonsense! That would be someone like Mark Twain late in his life, who wrote in an 1898 notebook entry that “The human race consists of the damned and the ought-to-be damned.” Those are the words of someone who really hates the entire species: a misanthrope. You see it in his later work, especially after A Connecticut Yankee in King Arthur’s Court.
I hope my fate is gentler than that of the hero of that novel or, for that matter, its author. Writing this has me grinning, something curmudgeons rarely do. So perhaps there is hope. Just stay off my lawn this summer!
This blog will continue through the balmy months, so nominate a word by e-mailing me (jessid -at- richmond -dot- edu) or leaving a comment below.
I am known for my hatred of superhero movies and, frankly, the entire genre of the superhero comic book. The plot arcs are so often predictable, the attempts at stirring our emotions so bombastic. I do enjoy the occasional effort such as Kick-Ass that subverts the conventions of the genre, but that sort of film sounds its yawp into the teeth of a hurricane.
Now our superpower-obsessed tastes, not being content with ruining popular cinema, are also dumbing down speech, even student prose.
This morning during my drive to work, I listened to an otherwise talented NPR reporter use the adjective “super” to describe aspects of a refugee simulation under way. Her sloppy use of the term undercut the seriousness of the story: an Iranian immigrant who had fled Iraqi airstrikes in the First Gulf War teaches others how the experience of fleeing one’s home might feel.
The reporter, speaking too fast as so many current NPR staff do, described a life-raft as “super cramped” and at about that point, I wanted to turn off the radio. It’s a lazy word, “super,” that slowly has been creeping into student writing. I plan to add it to my Pet Peeves list at Writer’s Web.
The usage illustrates what Joe Glaser, author of Understanding Style, decries as too much informal diction seeping into formal writing. I have yet to see a student in my “Space Race” First-Year Seminar refer to the Saturn V moon rocket as “super big,” but I await that dark day with each written response.
My hunch is that “super,” like the even worse “totally,” stems from the increased orality and interruptive nature of informal speech. I hear students talk over each other, omitting nuance and forethought. Most of my students and even some of my peers are not doing much serious reading–if any reading at all–beyond what a class assigns. When my students do read, they do not reflect on how a decent author crafts a sentence or uses language in surprising ways.
Thus non-readers are left with a small grab-bag of simple modifiers. “Super” has become the modifier of choice to replace other simple adjectives and adverbs: “very,” “extremely,” “extensively,” and the like.
In my courses, all of them more or less based upon a 100-point scale, I plan to deduct 1 point for “super” used in place of a more descriptive word. And I plan to be super clear about that.
On Feb. 23, my advanced academic writing students from the School of Professional and Continuing Studies confronted their iPods. After blogging as a prewrite and writing their critiques—in this case 600-word reviews of movies or TV episodes—they scaled what they’d written down to 300 words and prepared to tape their voices. They’d already been reading sections of their papers aloud each week in class in order to slow down, hear where they faltered, and find what worked and what needed to be re-structured and re-worded. Now they faced this new prospect. Their voices would be taped and they would listen to the recordings with their classmates. This is what petrified them the most.
One of my students, Janice, in reflecting on that initial fear wrote, “I liked listening … even when [classmates] did not enjoy listening to their own. It seems as if all of us felt the same way, although after the first listen, we seemed to get over the initial feelings of dread at hearing our own voices.”
Another student, Seth, admitted, “when we listened to my recording in class I could not hold my head up, even though I heard the class making positive remarks.”
An interesting thing happened during recording, and then later as they heard each other’s podcasts. Students became more aware of audience and of the effect that awareness had on revision. For Deborah, the rhetorical situation became real. In a reflective cover letter, she confided, “I was horrified at the thought of my fellow students listening to me ramble over grammatical errors and stumbling over misplaced words. I instantly began revising my paper, as well as practicing how I would say the words.”
Seth acknowledged that before this assignment, revision was usually “a one-time read over and making some minor grammar corrections.” When he taped his voice with the iPod, he found that he paused often to make more precise word choices and to vary his sentence length. The biggest change was global: he overhauled the structure, moving paragraphs around to ensure his message was being heard the way he intended.
He noted that “being able to listen to my classmates was a big help. This gave me an opportunity to compare and contrast my recording to theirs. From listening, I noticed they had sufficient details which I lacked in the beginning.”
Another student, Ryan, became “obsessed” with the clarity of his writing. “I spent a good amount of time choosing descriptive words such as ‘greasy’ and ‘rusting,’” he wrote in his letter about the experience.
While not all students would say the assignment was enjoyable, most thought it was worthwhile. My class had the good fortune of being able to visit the Technology Learning Center (TLC) and get help from student consultants with uploading files onto Blackboard. One cable was not the right kind, but other than quickly switching cables, uploading was not an issue. What’s the next pedagogical step for student podcasting? I plan to use it in the summer with my English 202U students for a preliminary assignment on the way to writing a longer, more developed paper on “What Happened to the American Dream?”
This summer I let a fellow writing teacher intimidate me with technology. He handed me an iPod and said, “Pick out a song,” and I was baffled and more than a little embarrassed that I did not know how. “Like this,” he said, spinning the whirligig that I’ve since learned is called a click wheel and selecting a song by Radiohead.
It was enough to provoke me into learning how to use the card-sized plastic-and-metal audio device and to begin to consider applications for my first Advanced Academic Writing class at the School of Professional & Continuing Studies. My suspicion that I was onto something was confirmed by Assistant Professor Kevin Bruny’s presentation at the annual spring faculty meeting on how his human resource management class benefitted from the audio and video capability of iPods.
Since then, I’ve done research and found that Duke University successfully piloted the use of iPods with first-year students in 2004, and that Middlebury College students had “mixed success” using them in the 2005-2006 summer language school: success with “pronunciation and vocabulary studies” and minor problems uploading to the Web.
Crispin Dale of the University of Wolverhampton in the U.K. reported in 2008 on “Podogogy” in the International Journal of Teaching and Learning in Higher Education, in other words, the ways iPods stimulated creativity in learning and teaching college level dance, theater, and music classes (4).
A key feature of the iPod in the college classroom, according to Peter Galuszka in a 2005 article, “Technology’s Latest Wave,” in Black Issues in Higher Education, is its portability. Give students an iPod and they can take lectures with them: in a suitcase, to meetings, or while standing in line at the DMV (in the DMV’s defense, the last time I only had to wait three minutes). Middlebury’s writing program, according to a case study posted on Educause, embraced iPods to record class sessions and post them on a blog. In my SPCS class, I have adult learners, and like me, they seem hesitant to take risks, not just with whirligigs but with their writing.
My first idea, therefore, was to ask them to go all out in critiquing a movie, book, or TV episode they’ve seen, heard, or read lately and record their voices reading these reviews aloud. I assigned their choice of a “rave” or a “slam,” an exercise borrowed from Richard Johnson-Sheehan and Charles Paine’s Writing Today that walks students through arriving at evaluation criteria, a necessary component in writing research papers.
On Feb. 23, I will hold my breath a little as students use a USB cable to upload their .wav files onto Blackboard, and I take comfort in having backup: student technicians to answer questions at the CTLT. The next step will be listening to each other’s podcasts and commenting through discussion threads on tone and word choice. My dream scenario is that my writing students will begin to see a range of what’s possible with persona and language in arguing a point.
Another idea for how to use iPods came to me while talking to Ken Warren, Academic Technology Consultant at the Center for Teaching, Learning, and Technology. Why not use them as mobile learning tools for revision? My students are already reading their work aloud. Each time they bring in an assignment, I ask them to share a selected section.
Hearing where they stop, falter, and self-correct can become an impetus for revision, perhaps more so than written feedback. With iPods, they can record their voices, play them back, and listen to how their writing sounds. Many SPCS students work full-time, and the iPod lets them record, listen, and reflect, no matter where they are in the queue to renew a license and tags.
Are we there yet? Are my students using iPods to revise? I know one of my students has been using hers because last Thursday she informed me she’d misplaced the little white cable that powers it up when the battery dies. I’ll let you know how the rave and slam assignment goes. In the meantime, I’m relying on staff at the CTLT for allaying whirl-and-click trepidations and answering questions, mostly mine.
Dale, Crispin. “iPods and Creativity in Learning and Teaching: An Instructional Perspective.” International Journal of Teaching and Learning in Higher Education 20.1 (2008): 1-9. ProQuest. Web. 12 Feb. 2012.

Galuszka, Peter. “Technology’s Latest Wave.” Black Issues in Higher Education 22.2 (2005): 24-28. Education Research Complete. Web. 12 Feb. 2012.

“Middlebury College Case Study.” Educause. 1999-2012. Web. 13 Feb. 2012.
You might read the title of this posting with an echo of Elmer Fudd expressing his chagrin at one of our beloved tricksters. The relevance of this, if not immediately apparent, will be suggested in a bit.
The Oxford English Dictionary tells us that “wiki” is a Hawaiian word meaning fast, the emphatic form of which is “wikiwiki.” The term was first applied to WikiWikiWeb, a user-edited website created by Ward Cunningham in 1995: the first wiki. Wikipedia, with 9,889,432 views per hour in English alone, has made the form familiar to most folks, but fewer understand what a wiki is or how to use one. Unlike traditional semi-static, gatekeeper websites that require complex software to create and are often controlled by a single person, a wiki can be built online, often for free, and designed so that a limited or unlimited number of people can edit it easily.
I have been using Wikispaces for my writing workshops as well as for process drafting of research essays in my first-year writing courses. All changes are recorded in the “history” tab, so there is a complete record of who wrote what and when, which makes for a more thorough collection of drafts and revisions. Often the pages will have a “discussion” tab that is useful for writing-workshop feedback, posting questions to readers, and so on. But wikis can be used for more than writing courses; they can be convenient and flexible group-collaboration tools for any project, academic or otherwise. Browse the sites for Wetpaint or PB Works and note the varying complexity and features offered, or check WikiMatrix, a handy tool for comparing a number of wikis of your own choosing.
Oh, and about those leaks….
In terms of “wiki-ness,” the Wikileaks website is not really a wiki, since it is not editable by its users, but its revelations certainly introduce an element of speed into the slower pace of partial diplomatic disclosure. Aside from speed, another aspect of wiki-ness manifested by Wikileaks is transparency and accountability – or at least the transparency and accountability of the powerful, since Wikileaks founder Julian Assange is less than transparent himself.
On a wiki, every user and every change are recorded; there is no way to hide, and all information is immediately available to users. By contrast, FOIA information is laborious to obtain and often heavily redacted, which is partly why these leaks evoke mixed feelings for me. Why do we subscribe to transparency so selectively? Isn’t one of our chief criticisms of North Korea its absolute lack of transparency? No doubt these leaks will increase tension, make negotiation more difficult, and possibly risk some lives, but it might also be argued that, in a WMD world, everyone’s life is at risk every day when governments are not transparent.
However, there has been a surprising amount of government and institutional cooperation worldwide, and a massive assault by individual hackers, to shut down or shut out Wikileaks, a fact suggesting that the threat we feel from transparency trumps the value we claim to put on it. Or perhaps it is a measure of our faith in the goodness of what our leaders and institutions do in secret, but this doesn’t seem in keeping with our recent waves of anti-government sentiment, so the motivation of leak opponents is unclear to me. And now, these anti-leak actions have led to “Operation Payback,” a barrage of counter-hacking by supporters of Wikileaks in what may be our first public cyberwar.
One thing is for sure: without that cloak of secrecy, with the knowledge that we were watching, powerful persons and groups at all levels would act quite differently, perhaps with more humility, responsibility and vision. As the ambiguous Wikileaks logo suggests, time is running out for our planet and it may just take some kind of digital trickster to stir things up. Ultimately such transparency may be what assures our survival.
Social media sites like Facebook, Twitter, and MySpace are being featured more regularly in the mainstream media and becoming part of our national conversation, even though 51% of Americans do not use them. However, one group that is using these sites extensively is employers, who pre-screen applicants via social media. This topic came back to my attention recently when a student wrote a journal posting about an NPR feature on the practice and its implications. My student’s response was to consider pulling all of her social media sites down to prevent potential prejudice. And she is not alone in her concern.
Employer snooping is enough of a concern to inspire a social-media deletion site called Web 2.0 Suicide Machine, and Facebook is apparently trying to block it from deleting sites that users want deleted. I guess we all click that “terms of agreement” button without really thinking about it (or reading it!). I doubt we would be so quick to click if the first lines of the agreement read: ANY AND ALL INFORMATION, IMAGES, AUDIO, VIDEO, TEXT OR OTHER COMMUNICATION BECOMES CORPORATE PROPERTY AND MAY NOT EVER BE COMPLETELY EXPUNGED, RETRIEVED OR CONTROLLED IN ANY WAY BY THE USER. Of course, this is generally true for email as well, but that hasn’t bothered us much over the years.
No doubt employers will continue to snoop and surveil, but I wonder about the impact. In a way, such social media are a mildly homogenizing influence already, in spite of their many options, but if we’re all afraid to express ourselves in these limited ways because we might lose out on a job, won’t we become even more homogenized and bland in our timidity? When Huxley wrote Brave New World, one of the purposes for the application of technology was the deliberate homogenization of each class to promote easy management and maximum production. The novel doesn’t reveal the future of that world, but it is clear that submission to such micro-management and identical duplication is subtly conditioned over time. One clue to the future of their world comes in the opening scene of the novel, as a group of Alphas are given a tour of the Hatcheries and Conditioning Center: “A troop of newly arrived students, very young, pink and callow, followed nervously, rather abjectly, at the Director’s heels. Each of them carried a notebook, in which, whenever the great man spoke, he desperately scribbled. Straight from the horse’s mouth. It was a rare privilege.” This does not sound like a group likely to take initiative or arrive at creative solutions to persistent problems – but I’ll bet they would pass an employer pre-screening.
We are doomed in Birkerts’ “Electronic Millennium” unless we adapt to its forms of communication, yet carry with us the Humanities’ irascible and unhip hermeneutics for providing social commentary and critique. Notably, we somehow have to manage this for skim-the-surface students who live in an eternal now of consumerist bliss (or unfulfilled desire).
I nail these 9.5 theses to the digital doors at Wittenberg. Since this is a blog, I won’t make it 95, but that rascal Luther had the luxury of a bookish century to support his spleen.
Get over your fetish for “The Book.” Reading and its habits, not bound volumes, transform our minds. As new forms of communication enhance the reading experience, we should move beyond our walls of books to consider how embedded film, audio, image, and experiential elements enhance new texts. Then we must develop critical methods to teach them. Civilization will not fall if we stop reading Henry James, sad as that would be. It did not end when most educated folk stopped reading Aquinas. If, however, we stop reading thoughtfully, we’re in real trouble.
Embrace Web 2.0 in a thoughtful manner. These tools can further the critical method of the technologically adept humanist. I’ve learned that Twitter provides a painless way to post a link, report progress on a project, and share ideas quickly with those who share my interests. Blogs provide my students with the opportunity to practice in public what they do only for me in their paper journals, as they move from private to public (and ever more formal) discourse at their course wiki-sites.
Refuse the “eternal now” culture and its interruptive technologies. I don’t carry a cell phone. I check mail three times daily so I can focus on the tasks for which I’m paid and evaluated: supporting students, doing research, and teaching well. To what extent do you practice such habits and provide an example to students? They learn, for instance, that I routinely delete e-mails without a subject line 🙂
Seduce others into seeing The Matrix for what it is. We have many tech users but few who consider their practices critically. Ask students in appropriate assignments to log their uses of a particular networked technology. It reveals much about them. I’ve had fewer writers fret about “those addicted to gaming” when they take a long, hard look at how much time they dedicate to Facebook.
Practice techne and episteme. Kudos to Tom Boellstorff in Coming of Age in Second Life for reminding me what these words mean, as he notes that academics live in their heads too often and don’t create enough. For me, techne means making things in Second Life and outside it, by writing for a general readership in our local alternative weekly and other non-academic venues.
Employ “Ordnung” without driving a buggy. Futurist Howard Rheingold found, when doing research for “Look Who’s Talking,” that the Amish have a sophisticated system for deciding which new inventions get sanctioned or prohibited by their bishops. Generally the community uses a new tool for a time, and at each step the members ask whether the tool builds community or pulls it apart.
Dare to reinvent past treasures. Rezzable’s Virtual Tut, my own House of Usher Simulation, and Jane Austen’s (and Seth Grahame-Smith’s) Pride and Prejudice and Zombies point the way to a New Humanities that will move beyond rigor for its own sake to bring playfulness and the ancient sense of “ludus”–school and play–into our classrooms.
Question the paddlers of tomorrow. Textbook publishers, software companies, and some of our colleagues who are early adopters become overly eager and evangelize us about each wondrous new application that awaits. Like some evangelists, some of these paddlers want our money. Others mean well. I listen and apply theses 1-7 in these cases.
Watch South Park or write for the Alphaville Herald. We need to take ourselves less seriously and find social commentary in the lowest of places. Humor is the best medicine to prevent sanctimoniousness.
Thesis 9.5? Add your own in the comments section! “Hush up Iggy” does not count.
In “Into the Electronic Millennium,” a chapter in the very readable and depressing The Gutenberg Elegies, Sven Birkerts laments that our culture of connectedness and instant access destroys something that he–like many Humanities faculty I know on campus–cherishes: the contemplative life as reflected in the slow, thoughtful, and reflective reading of challenging books:
Curricula will be streamlined and simplified, and difficult texts will be pruned and glossed. Fewer and fewer people will be able to contend with the masterworks of literature or ideas. Joyce, Woolf, James, and the rest will go unread, and the civilizing energies of their prose will circulate aimlessly between closed covers.
Enter Twitter, with its 140-character tweets, and you have exhibit A for the decline of civilized life as we know it (or maybe we have exhibit R–the lamentations have been going on for a while).
I set out here not to skewer Birkerts or my cyberphobic colleagues. Instead, while reaching out to an audience that accepts Web 2.0 tools like Twitter, I want to point out the nature of the cultural decay Birkerts catalogs:
Language Erosion: Nuance gets lost as we shorten our prose, substitute little words for big ones, and lose touch with the origins of words and our cultural history.
The Flattening of Historical Perspectives: Neil Postman’s belief that we live in an “and now, this!” culture of consumption and gratification.
The Waning of the Private Self: Expectations of 24/7 access, quick replies, and easy answers at our fingertips lead us to suspect the introspective person, the loner, the dawdler.
And, Professor Birkerts, I agree with you, even as I post a tweet and log on to Second Life.
I too fear a future like that of M.T. Anderson’s Feed, a dark satire of a consumerist culture out of control, where vague catch-alls such as “thing” and “stuff” are about the most complex terms in the language, where the Internet is in our heads and not outside them, and where no one remembers much of anything from before the globe became a deadzone of toxic waste-sites.
My students read less and less for pleasure. Most take the easiest path in their studies and even in crossing campus. They even fight the difficulties of learning the non-intuitive interface of SL. In fact, many of them seem to want an eternal early-June day of temperatures in the mid-80s, low humidity, and someone else to cut the grass they sit on with their friends. In time they may, to borrow another phrase from your book, become “efficient and prosperous information managers living in the shallows of what it means to be human and not knowing the difference.” That is Anderson’s vision of a time just before the Great Collapse of American life.
Twitter alone won’t make that future arrive, especially if we academics appropriate (ah, Marx, thanks for that verb) it for noble ends.
So how do we “Fight the Feed” while using it to keep our cherished ways of learning alive?