Can the University Press Be Saved From Itself?

Two Arabs Reading in a Courtyard, painting by Rudolf Ernst (via Wikimedia Commons).

I’ve been following an online discussion about the relevance and sustainability of university presses (here and here) and whether or not there’s a future for publishers of “books that no one needs to use or wants to read.” Opinions expressed by those laboring within the UP community cite shrinking budgets, the corporatization of the academy, niche markets, and the disruptive onslaught of the Digital Revolution as the leading threats to the traditional UP publishing model. Charles Watkins, director of Purdue University Press, described a more nuanced problem and proffered a solution:

Many university presses, especially smaller ones, did not do themselves a service by attempting to fly beneath the radar at their institutions … . Focusing just on academic disciplines and not serving their university community was not a good strategy. If a university press is subsidized by its parent institution, it should expect to give something tangible back. That can range from explicitly aligning the publishing list with the institution’s disciplinary strengths to providing additional publishing services outside the press’s imprint.

Very diplomatic, but he’s still just dancing around the real issue. In a post at a copyeditors’ virtual water cooler, Tammy Ditmore (a professional editor with considerable academic press experience) pointed out the obvious: The emperor has no clothes!

What [Watkins] doesn’t mention is how the pressure on academics to publish monographs remains as high as ever. Tenure and promotion committees rarely acknowledge changing times, and many give little weight to anything other than scholarly monographs published by the top UPs. … So universities pressure their faculty to create books that no one will read, which puts pressure on libraries to buy books that no one will read, which puts pressure on universities to support UPs to create books that no one will read. It seems like a vicious and pointless cycle that very often does NOT contribute to informed dialogue, which is ostensibly the role of academic publishing.

It’s that old “publish or perish” rubric. In the pursuit of tenure, academic aspirants are required to crank out esoteric monographs that no one outside a small circle of specialists will ever consult. It is a rite of passage that those who came of age with Mr. Chips are loath to surrender: Academics writing to impress other academics in an infinitesimal echo chamber, an exclusive club that disdains anything so unseemly as social media or publishing well-researched, interesting nonfiction aimed at the unwashed masses.

A Broader Mission

In our extended conversation, Ditmore elaborated,

In the nonacademic world, those niche markets get taken care of through self-publishing or tiny niche publishers or even through blogs and electronic discussion lists. Why do the specialized academic niches need to be subsidized so they can produce expensive, hardbound volumes that few people will want to buy? Especially when about three-quarters of [the content of] those expensive, hardbound books is re-hashing all the prior research on an issue to prove the author has read everything else written on the topic, and one-quarter of the book attempts to advance an argument by one turn of the screw?

Why indeed. Times do change, and the university press must change with them. Publishing scholarly monographs has long been the university press’s raison d’être, but what happens when the dead-tree monograph becomes an anachronism—a quaint artifact of the pre-digital world? Just as it no longer makes sense for the doctoral curriculum to be focused solely on preparing PhD candidates for nonexistent tenure-track teaching positions, an overemphasis on the publication of pricey, small-run, hardbound doorstops is unrealistic and misguided.

Here’s a thought: Why not publish books people want to read?

I certainly won’t gainsay the importance of the monograph to the scholar’s professional development, but there’s no reason for it to be a physical volume, or the primary source of the university press’s income. Digital technologies render the publication and distribution of this kind of specialized research and analysis a relatively inexpensive process. Further, open-access, cross-platform publishing encourages scholarly collaboration and ensures that such data will be searchable. Both of these factors promise to boost usage, but even such expanded utility will not generate the revenue stream necessary to keep a university press afloat.

I agree with Watkins’s contention that UPs need to rethink their mandate, but he’s being entirely too timid. I would recommend broadening his parochial concept of “serving their university community … [through] aligning the publishing list with the institution’s disciplinary strengths [and] providing additional publishing services outside the press’s imprint” to a more expansive mission statement—something along the lines of “servicing an eager and receptive global market by producing books its constituents want to buy and read.”

The UP as Trade Publisher

There are more college-educated readers in the population today than ever before, so why not tap this huge potential market? Rather than being content with churning out yet another scholarly monograph on global economics (zzzzzz…), wouldn’t it be more fiscally responsible and creatively rewarding to have a book like Thomas Piketty’s Capital in the Twenty-First Century (a New York Times bestseller) prominently featured on your university press’s website and Facebook page as well?

The university press should be a robust and functional organ in a multifaceted publishing ecosystem, not an insular, adamantine ward of the academy. Remember those hulking console stereos from the ’60s? Oh, they were adequate—if all you wanted to listen to was Mantovani. But as consumers became more sophisticated audiophiles, they replaced those beasts with component systems that enabled the user to mix and match complementary elements to achieve the sound that soothed their soul. The university press needs to adopt that kind of creative flexibility.

Commercially viable titles would help subsidize pure scholarship while building the professor-cum-author’s (and the press’s, and the university’s) cred. We need not throw the baby out with the bathwater—digital monographs (scholarly journals, too, for that matter) can peaceably coexist with stellar trade nonfiction in the university press’s catalog. Generating income to underwrite a sustainable business model that foots the bill for orthodox scholarship while entertaining and enlightening the public-at-large with worthy trade nonfiction sounds like a win-win proposition to me.

Turning Scholars Into Storytellers

But there’s a catch: Producing compelling nonfiction calls for authors who can write well, ably assisted by editors who know what they’re about. The first element in the equation is problematic; PhD programs are not designed to produce skillful communicators. That really needs to change, and I believe it will.

The unvarnished truth is that a PhD sheepskin is no longer a ticket to a cushy tenured faculty berth, so the nature of scholarly exposition must also evolve. An increasing number of universities are retooling their curricula to prepare doctoral candidates for alternative careers in the Real World, where strong communication skills are critical—and this applies to both the arts and the sciences.

And what about the other half of the equation? University presses should be hiring rather than firing editors. Without good editors, the quality of the books they produce will suffer. It’s as simple as that. If you don’t believe me, ask any bestselling author.

If they can’t justify the cost of keeping a full complement of top-flight editors on staff, university presses should cultivate a stable of qualified freelancers. And they shouldn’t cheap out—unpaid grad students, peer reviewers, beta readers, and crowdsourced editing just won’t do. Professional editing is simply a sound business investment. As Tom Wolfe reminded us in The Right Stuff, “No bucks, no Buck Rogers.”

Demoting and digitizing the monograph, turning scholars into masterful storytellers, adding that professional editorial polish, and aggressively marketing the product to a general audience may not single-handedly rescue the university press from oblivion, but it sure can’t hurt.

the DW-P

Aden Nichols is an independent editor and writer. He is available for print and digital projects: books (academic, narrative/creative nonfiction, memoir, speculative/alternate history, etc.), websites/social media, and business communications. Visit his website (www.LittleFireEditorial.com) or email him at: Aden@LittleFireEditorial.com.

Creative Nonfiction: The True Story of a Lone Literary Genre that Rescued Academic Authors from Obscurity, Enlightened the Masses, and Saved the World.


Drieluik met allegorie op het kunstonderwijs, Willem Joseph Laquy, ca. 1770 (Rijksmuseum)

In the orthodox liberal arts community, tenure-track professors are expected to teach courses while burnishing their scholarly bona fides through research and publishing efforts—the familiar “publish or perish” model. The soporific tomes churned out under this rubric are often arcane, heavily footnoted monographs relegated to “assigned reading” status; no one expects them to be bestsellers and they aren’t. With the legitimacy of liberal arts curricula being challenged daily and university presses being warned that they’ll have to start turning a profit or else, it seems obvious that the self-serving approach of scholarship for its own sake must be reconsidered.

Ironically, academic publishing could be its own salvation.

Rather than accepting an outdated publishing model as a necessary evil, scholars are increasingly choosing to write for a broader audience—the general public. They are exploring the potentially lucrative realm of creative nonfiction.

Creative nonfiction (a.k.a. literary nonfiction or narrative nonfiction) is a genre of storytelling that presents actual events in a narrative style using techniques commonly applied to fiction writing (think: In Cold Blood, Angela’s Ashes, and A Midwife’s Tale). From an academic perspective, producing books that people actually enjoy reading yields a cascading torrent of positive outcomes: it helps educators build name recognition and strengthen their personal “brand” (and become better communicators in the process), brings prestige and a much-needed revenue stream to beleaguered university presses, and of course, makes knowledge more accessible to all, rather than rationing it out to the privileged few who can afford to shell out the inflated prices of textbooks and specialist journals. Further, successfully tapping the mainstream market makes a strong argument in favor of building and maintaining robust humanities and social sciences programs in our colleges and universities.

University press acquisition editors who once turned up their noses at such plebeian literary efforts are beginning to see the potential of publishing titles that hold the promise of reaching a huge market (including digital versions for e-readers and tablets). The canny implementation of social media as an effective marketing tool also alters the calculus. Creative nonfiction is the fastest-growing literary market in mainstream publishing, having eclipsed literary fiction. Do I have your attention now?

But before we get too far ahead of ourselves, I must stress that for all these benefits to materialize, scholar-authors have to craft compelling stories that resonate with readers beyond the confines of the classroom, the peer-reviewed journal, and the professional conference.

In a New York Times op-ed, columnist Nicholas Kristof cited Will McCants, a Middle East specialist at the Brookings Institution, in explaining the institutional bias against popular nonfiction writing: “Many academics frown on public pontificating as a frivolous distraction from real research,” McCants said. “This attitude affects tenure decisions. If the sine qua non for academic success is peer-reviewed publications, then academics who ‘waste their time’ writing for the masses will be penalized.”

Writing for a general audience

In recent years, a handful of academics have bucked the establishment with varying degrees of success. Stephen Ambrose began publishing this type of crossover history in the 1960s, but it wasn’t until his 1992 publication of Band of Brothers that he managed to crack the bestseller lists. John Keegan, a military historian of impeccable academic credentials, also penned gritty profiles of war and warriors that found an appreciative audience in the public sphere; his The Face of Battle is considered a classic of the genre. Fellow history professor Michael Howard anointed Keegan “at once the most readable and the most original of living historians.”

Proving that creative nonfiction techniques can breathe new life into a crowded field of historiography, Allen Guelzo recently added Gettysburg: The Last Invasion to the sprawling list of over 6,000 extant titles on the subject. His study, which bagged an impressive array of awards, received glowing reviews: Military History Quarterly called it “a stylish, comprehensive, and entertaining narrative.”

Ultimately, it took someone from outside the academy—a mere journalist—to make academicians really sit up and take notice. Having convinced a small publishing house to take on her lean work of creative nonfiction, Longitude: The True Story of a Lone Genius Who Solved the Greatest Scientific Problem of His Time, veteran science writer Dava Sobel finally let the genie out of the bottle.

Sobel recounted a dramatic human-interest tale in a terse but accessible journalistic style that would make Hemingway proud. Liberated from academic jargon and the intrusion of hundreds of footnotes and references, Longitude proved to be a breath of fresh air for readers interested in the history of science but unwilling to wade into a dry, academic doorstop.

Much to the chagrin (and utter indignation) of the scientific community, Sobel’s little book sold like hotcakes. In fact, the thin volume was snatched up by Penguin and later optioned for a four-part docudrama starring Jeremy Irons. Could a conventional historical monograph have made such a splash?

The ‘Sobel Effect’

The wild success of this unintimidating read drove academicians mad. One wrote a scathing journal article sarcastically titled, “The ‘Sobel Effect’: The amazing tale of how multitudes of popular writers pinched the best stories in the history of science and became rich and famous while historians languished in accustomed poverty and obscurity, and how this transformed the world. A reflection on a publishing phenomenon.” I’d provide a link, but of course the article was published in an exclusive subscription-only peer-reviewed professional journal of which I am not worthy. Just as “Remember the Alamo!” morphed from being a Mexican army taunt into a highly effective Anglo-republican battle cry, “The Sobel Effect” was soon being uttered by publishers to characterize a very desirable attribute indeed.

One academic science blog reflected, “It is not so much the scientists themselves as the science historians who object to this sort of writing. They are left wondering: why it is that they have spent their entire career in science and can barely sell one thousand copies of their book, while Sobel and company (who are mainly journalists and authors) can make the best seller list? Jealousy?”

Some academics dismiss such “simplistic” renderings of complex themes, arguing that serious scholarship cannot be presented to the lay public in a manner that does not compromise the underlying facts. Apparently, über-popular astrophysicist Neil deGrasse Tyson didn’t get the memo.

Over the past decade, a number of historians have established themselves as superstars of the creative nonfiction medium—David McCullough, Doris Kearns Goodwin, and Civil War historian James McPherson, to name a few.

How to do it

In the wake of Dava Sobel’s enormous success, hopeful writers have struggled to replicate the phenomenon by attempting to reduce it to a formula (replete with impossibly long subtitles) with predictable results. There is no template for successfully mining this genre; it’s just so durned difficult to capture lightning in a bottle. Producing exceptional creative nonfiction calls for the scholar’s research chops and the narrative flair of the professional storyteller. Success requires a good deal of talent, deep subject knowledge, expository skill, and a crackerjack editor. In sum, it comes down to a good (true) story, well told.

Writing really meaty, commercially viable creative nonfiction has much in common with crafting a bestselling novel. Sadly, nonfiction writers—many of whom are recovering academics—are driving themselves to distraction trying to wrestle their thesis, dissertation, or pet research subject into an engaging narrative, because despite years of formal education, they were never taught to tell a good story.

Tips to get you started

When writing creative nonfiction, you must fight the urge to descend into “Great Man” hagiography, dumb down the facts, or attempt to add color to the story when the stark reality is more compelling than the gilded lily could ever be.

Key traits of the creative nonfiction genre include

  • Appropriate POV—Exercise your creativity: the author can be an objective observer, a subjective witness, or even a participant in the action. Creative nonfiction is an ideal vehicle for memoir or relating the story of an “invisible” or disenfranchised person or group.
  • Narrative style—Creative nonfiction lends itself to clear, simple, descriptive language mercifully bereft of academic jargon and erudition. The prose serves the story, rather than being an impediment to it.
  • Character development, motivation, and pacing are key elements of creative nonfiction writing. This is where the storyteller’s art comes into play.
  • Flexibility of form—No need to follow a prescriptive structural model; rather, adapt form to content. Creative nonfiction can take the form of a book, essay, journal article, blog post, etc.
  • Above all, maintenance of authenticity—History is subjective (read my post about this here), so it follows that creative nonfiction is equally a product of the storyteller’s interpretation of the “facts.” Your truth will always be more fascinating than fiction, so keep it real.

the DW-P


Can History Be True?

“Time dissipates to shining ether the solid angularity of facts.” – Ralph Waldo Emerson

“What is history but a fable agreed upon?” This pithy maxim is generally credited to Napoleon Bonaparte, a man who had plenty of experience manipulating the historical record. Ironically, I have not been able to satisfactorily link this quotation directly to the Little Corporal—the earliest reference I can find is Ralph Waldo Emerson citing it in his famous essay, “History” (1837).

And therein lies the rub: This dubious attribution has been repeated often enough for nearly two centuries to gain credence (a Google search returned 1,140,000 hits). And as repetition leads to consensus, consensus rationalizes validation. All the more so in the Information Age, in which an anonymous “editor” can submit material to Wikipedia, which (universal warnings notwithstanding) has become the go-to reference for Everyman. So hearsay becomes fact by default. Never mind that even if such a declaration were traceable to its original source, its meaning is contingent upon its context.

History is a malleable commodity, indeed. So much for Ranke’s objective historicism!

History is more than a series of data points

Still, the thrust of this aphorism should not be dismissed out of hand. Despite the well-meaning efforts of cliometricians and practitioners of the new social history to infuse the study of the past with the scientific certitude of Big Data and sociological methodology, history obstinately refuses to be reduced to mathematical formulæ and statistical tabulations. “The operations of life, whether private or publick admit no such laws,” counseled Samuel Johnson. “The caprices of voluntary agents laugh at calculation.”

Historical evidence takes many forms: from intimate personal correspondence to authoritative institutional documents; from anecdotal tribal traditions to carefully collected and curated oral histories; from graphic images on hillsides, standing stones, cave walls, illuminated manuscripts, and websites to sound recordings on a variety of media; and of course, there are those three-dimensional artifacts… . Each with its own hidden agenda. Collectively this body of evidence is capable of yielding some understanding of the essence of an individual subject or group at a specific moment in time, but none of its component parts are value-neutral.

The relativist would say that every scrap of evidence collected by even the most fastidious historian, regardless of provenance, is subject to interpretation—the highly subjective filter of human agency. If that is so, history will always be more art than science, a unique species of literature, or as Emerson framed it, “There is properly no history; only biography.” Whatever your philosophy, it’s hard to argue with E. H. Carr’s commonsense pronouncement: “The function of the historian is neither to love the past nor to emancipate himself from the past, but to master and understand it as the key to the understanding of the present.” The writing of history is truly an interdisciplinary form of composition.

Putting the “human” in humanities

History is the record of an event or events instigated, experienced, related, and recorded by people. Some were lettered, others illiterate; some were eyewitnesses, others had their backs turned at the crucial moment; some were well-intentioned reporters, while others were just looking for a free beer. No social forces—be they economic, political, religious, technological, ideological, or military—can exist without the involvement of human beings; flesh and blood, gristle and bone. And the relationship is a symbiotic one: Real people, from serf to lord, are the fulcrum upon which these inanimate forces exert their powerful influence and vice-versa.

Commenting on the profound value of Sir Walter Scott’s historical novels, Thomas Carlyle reminded us, “the bygone ages of the world were actually filled by living men, not by protocols, state-papers, controversies and abstractions of men. Not abstractions were they, not diagrams and theorems; but men, in buff or other coats and breeches, with colour in their cheeks, with passions in their stomach, and the idioms, features and vitalities of very men.” (I trust we can forgive Carlyle his quaint chauvinism—we are all a product of our times.)

But academic history is presumed to be nonfiction (even if it falls within the genre of creative, or narrative, nonfiction), so to maintain a sense of verisimilitude it must be predicated on thorough research and data collection in as many of the evidentiary fields as possible. Clio (whom Herbert Butterfield affectionately called “that old reprobate”) must be courted with deference and respect. Yet when the research phase is complete, these cumulative facts reveal nothing in and of themselves; the past is unconcerned with the present.

A historian must weigh every word, every fragmentary artifact, and paint a convincing portrait of her subject based on the subjective selection, arrangement, and interpretation of these data. Any randomly chosen group of impeccably credentialed, conscientious scholars can analyze an identical assemblage of primary data and produce wildly divergent readings of the same historical “truth.” None are necessarily right or wrong—they just reflect different points of view, as in Kurosawa’s Rashomon. The problem is articulately expressed by comedian Steven Wright when he deadpans, “How do you know if it’s bad jazz?”

This is not meant to be a deconstructionist diatribe; rather, I am suggesting that as historians, we belong to an exclusive club. We are the progeny of the bards and the shanachie—keepers of the flame. We are the storytellers…

the DW-P


Technology & Culture Update 5/25/13

Technology and culture embodied in art: Since we’ve still got a few days of Bike Month left, I thought I’d share some velo-centric goodness with y’all. To kick things off, get a load of photographer Todd McLellan’s wild photo of a dissected vintage road bike. This image, taken from the artist’s “Disassembly Series,” is just one of many quotidian items rendered as objets d’art that McLellan says “have, are, or will be in our everyday lives.” The complete study is now available as a coffee table book called Things Come Apart.

Bike helmets work! Well, there’s a shock. I’ve addressed this issue before, and I’m gobsmacked that it takes a well-funded scientific study to conclude that you’ll protect your eggshell-like skull by wearing a brain bucket. I’m equally appalled when I see a cyclist riding sans helmet—a transgression occasionally compounded by a helmet dangling from the handlebars. D’oh!

Some folks believe that commuting by bike is dangerous and are petrified of experiencing a Close Encounter of the Automobile Kind, but that seemingly rational fear has been proven fallacious. Still, when New York City announced its plan to launch a bike-share program, skeptics insisted that it would be unsafe, due to the automotive congestion (and the notorious recklessness of the cabbies of Gotham).

Mayor Bloomberg caved: apparently it is perfectly sensible to legislate the volume of sugary drinks New Yorkers can consume to protect them from diabetes, but protecting his constituents’ heads from brain damage would be compromising their personal freedom. Go figure.

A recent piece on NPR reinforced the conclusion that cycle vs. automobile collisions are rare, but cycling crashes (with other bikes, pedestrians, or potholes) are in fact quite common. In any case, a helmet will protect your noggin. It’s just—sorry—a no-brainer. And counterintuitively, the report concludes, “the more people bike, the safer it may become.” Just wear yer dang helmet, people…

Silent spring of (18)62: You might think we’ve pretty much squeezed all the life out of the Civil War, but as Spielberg’s biopic Lincoln revealed, there are always new perspectives to be illuminated. As a Civil War historian myself, I was fascinated to learn that two academics have discovered another way to put old wine in new bottles. Timothy Silver and Judkin Browning, professors at Appalachian State University, received a $100,000 research fellowship to co-author an environmental history of the Late Unpleasantness.

The peripatetic migration of men and animals during the war years was largely contingent upon weather patterns, and the environmental impact of those movements on the local populace and the nation-at-large has yet to be the subject of academic scrutiny. For example, Silver believes that weather, rather than strategy or tactics, resulted in the termination of McClellan’s “On to Richmond” campaign. The environmental historian speculates, “If it hadn’t rained and the war had ended with McClellan taking Richmond in 1862, there would have been no Emancipation Proclamation,” and therefore, no fodder for another Spielberg epic. Interesting theory, but there are a couple of pretty big “ifs” in there.

Metonymic magic: “me·ton·y·my (noun) : a figure of speech consisting of the use of the name of one thing for that of another of which it is an attribute or with which it is associated.” So says Merriam-Webster. While the media is all atwitter with the announcement of the billion-dollar deal involving the acquisition of Tumblr by Yahoo! (who concocts these silly names?), I was, perversely perhaps, more entertained by James Fallows’s treatise on this obscure linguistic construct.

Fallows shares his readers’ comments regarding the subtleties that escaped elucidation in the dictionary definition. There are some colorful examples given to illustrate the point, my favorite being, “Calling [Karl] Rove ‘Turd Blossom’ is metaphor – he’s not actually a flower. Calling him ‘the Brain’ or ‘Bush’s Brain’ is metonymy – he is famous for his use of his brain.” To put a finer point on it, I suspect this particular metonym was a play on the German epithet, Himmlers Hirn heisst Heydrich (abbreviated as “HHhH”), which translates to: “Himmler’s brain is called Heydrich.” (Incidentally, there’s a wonderful novel by the same name—check it out).

This may seem like so much pedantry to the average reader, but you’re not “average,” are you? Language matters. The proper use of our rather rich language is what separates the men from the boys in the world of intelligent, clear messaging (it’s just a figure of speech, so please don’t label me a “sexist pig”—that would be a metaphor, not a metonym).

the DW-P


Technology & Culture Update 4/5/13

Trinity College Dublin recently posted individual hi-def images of every page (all 667 of ’em) of the justly famous illuminated manuscript known as the Book of Kells. What a wondrous orgy of color, calligraphy and ornamental design! The circa eighth-century masterpiece recently served as the inspiration for the highly acclaimed animated film, The Secret of Kells, which was nominated for an Academy Award for Best Animated Feature in 2010. This is an outstanding example of what can be accomplished when technology influences culture in a good way.

Like the country itself, our language is seasoned with the polyglot contributions of an array of cultures. And that doesn’t even include the home-grown slang that is uniquely American. So how do you find the perfect word when you’re nowhere near your reference shelf or computer? Thesaurus Rex for iOS to the rescue! More than a static e-book, T-Rex is an iPhone app that engages the power of digital technology to help you refine your searches. According to its developers’ marketing hyperbole, “Thesaurus Rex has revolutionized that ‘list of synonyms’ into a dynamic experience that sorts and filters words by their senses, relevance, complexity, and length.” I plan to give it a test drive; I welcome every tool that helps me write better.

As the academy struggles with the changing definitions of scholarly publishing in a digital world, Nature magazine offers a special issue devoted exclusively to the subject. Not surprisingly, the Open Access movement is an overarching theme: from OA’s influence on publishing costs and copyright issues to the explosion of shady operators using bogus journals to fleece unwitting scholars. There’s also a piece about the awesome Digital Public Library of America (DPLA) initiative — about which more below.

The DPLA is envisioned to be “an open, distributed network of comprehensive online resources that would draw on the nation’s living heritage from libraries, universities, archives, and museums in order to educate, inform, and empower everyone in the current and future generations.” Think of it as the great Library of Alexandria rising Phoenix-like from its own ashes. You can read an excellent backgrounder on the project here.

And I’ll take this opportunity to note that my friend and colleague Dan Cohen has been tapped to take the helm as the inaugural executive director of the DPLA, so the program’s in very good hands. Dan was instrumental in the development of the Roy Rosenzweig Center for History and New Media (CHNM) at George Mason University, which serves as a polestar of the digital humanities movement. There’s already lots of interesting stuff at the DPLA website (so go have a look!), but the official launch is scheduled for April 18. This is history in the making, kidz — the DPLA will be the virtual house that we built.

Hands-free books? Publishing pundit Nathan Bransford philosophizes about how Google’s “Project Glass” might affect our reading habits. However, the cutting-edge specs are already being cloned in China, and an American firm (Vergence Labs) is offering its own iteration of the technology under the moniker of “Epiphany Eyewear.” Vergence claims its geeky-looking frames are a match for Google’s “smart glasses.” And the beat goes on…

In celebration of National Poetry Month — you knew it was National Poetry Month, right? — we’d like to draw your attention to a couple of unique genres of that literary medium. The first involves creating poetry by stacking up books (the physical, dead-tree kind) and reading the titles as verse. It’s all the rage on Pinterest and Tumblr. Go ahead, give it a try! In a somewhat higher-tech (though equally arbitrary) approach, techno-geeksters Sampsa Nuotio and Raisa Omaheimo harness the autocomplete feature in Google search to generate “Google Poetics.” You can see the results posted on their Tumblr page. Yes, you can join in the fun, and fear naught, the Mighty Google won’t pull the plug on this project.

The embarrassment of riches that is the mass of information easily accessed on teh webz offers the temptation to indulge in sloppy scholarship and cut ‘n paste research methods. But beware: failure to attribute sources can ruin your weekend. Benjamin A. Neil, a legal affairs prof (truth!) at Towson University, was busted for serial plagiarism and felt obliged to resign his position as head of the local school system’s ethics panel as a result. Wise move, Ben. A master of understatement, Neil defended his cadged scholarship saying, “I don’t think I’ve done anything wrong. The issue seems to be that I didn’t put things in quotes.” D’oh! Consider this a cautionary tale, boys and girls. Purloin, publish and perish.

And while we’re on the subject: Mark Liberman (contributor at Language Log) commented on a blog post by John McIntyre, who was riffing on Roy Peter Clark’s blog post, which in turn cites Richard Posner’s Little Book of Plagiarism about a particularly abstruse aspect of literary replication Posner calls “self-plagiarism.” Whew! Now you can add the Digital Warrior-Poet to that list of breadcrumbs. And if you’re not seeing tracers yet (gotta love those psychedelics), note that there is a “National Summit on Plagiarism and Fabrication” going on at the American Copy Editors Society conference in St. Louis as I upload this post. Is it just me, or does the blog format tend to produce things that resemble the cover of Pink Floyd’s classic album Ummagumma?

Finally, I’d like to note that the humanities lost a staunch evangelist this week with the passing of Roger Ebert. His fearlessness and accessible style brought film criticism out of the realm of literary snootiness and into our everyday lives. He taught us how to appreciate the intricacies of the cinematic medium and he did it with grace, humor and goodwill. In a time when we could really use a few more heroes, we are all the more conscious of our profound loss. Roger has taken a “leave of presence,” as he put it, and we will miss his wit and humanity. His passing stands as a gentle reminder to us all to embrace this day, this moment.

the DW-P


Why Read Moby-Dick? Indeed.

London’s Crystal Palace: A whale of an exhibition.

 

Walter Isaacson’s biography of Steve Jobs is a penetrating profile of an obsessive-compulsive visionary (or flaming asshole, if you had the ill fortune to suffer his storied wrath and frequent temper tantrums). Reading about Jobs’s larger-than-life persona stirred long-dormant images of Captain Ahab and his equally obsessed creator, Herman Melville. But obsession is where the similarity ends: Where Jobs was committed to stripping his creations down to their very essence (“Simplicity is the ultimate sophistication”), Melville had more in common with Bill Gates and Microsoft, determined to encumber a bloated product/manuscript with every bell and whistle possible.

Coincidentally, Moby-Dick; or, The Whale has enjoyed a resurgence of interest in the humanities of late. I was bemused by Moby-Dick in Pictures: One Drawing for Every Page by Matt Kish (which, I confess, looked like a portfolio of absent-minded doodles on scratch paper to my plebeian eye) and entertained by Nathaniel Philbrick’s passionate defense of Melville’s epic teasingly titled Why Read Moby-Dick?

Why indeed. Philbrick anoints Moby-Dick as “the greatest American novel ever written” and rhapsodizes over the “magisterial power” of Melville’s prose. Beyond the actual plot, he claims the tale offers a trenchant allegory of mid-nineteenth century America. According to Philbrick, Moby-Dick is a cultural icon that is “as close to our American bible as we have.” I guess he likes it.

At the risk of blaspheming American literary scripture, I offer the following counterpoint to Philbrick’s gushing exegesis.

“A strange sort of a book”

After cranking out several banal sea yarns in the narrative genre which he himself termed “romance of adventure,” Melville made a critical error in judgment by deciding it was time to produce his literary legacy, his masterpiece. He also needed to pay the rent.

His neighbor and confidant, Nathaniel Hawthorne, saw a diamond in the rough in Melville and was shocked to discover that Melville had never read Shakespeare (or most other literary masterworks). Hawthorne plied his colleague/protégé with the great books of English literature. Melville gratefully absorbed them and (consciously or otherwise) incorporated a variety of literary styles in his magnum opus. Hawthorne believed his friend was gifted but green; writing in 1850, he noted that Melville’s novel, Mardi (which immediately preceded Moby-Dick), was “so good that one scarcely pardons the writer for not having brooded long over it, so as to make it a great deal better.” Melville “brooded long” over Moby-Dick, but in the end, the sprawling epic could have been “a great deal better” had a good editor intervened.

As the head of the household, Herman Melville was torn between attempting to churn out a bestseller and crafting a literary masterpiece—he desperately desired to accomplish both, but didn’t know how to go about it. Midway through the process he shared his frustration with his revered mentor: “What I feel most moved to write, that is banned, —it will not pay. Yet, altogether, write the other way I cannot.” As the manuscript took shape, he forewarned Richard Henry Dana, “It will be a strange sort of a book, … I fear.”

When it finally appeared, the work confounded the critics. Evert Duyckinck, a close friend of the author and leading light of the New York literary publishing scene, labeled Moby-Dick “an intellectual chowder,” and Joseph Conrad later called it “a rather strained rhapsody with whaling for a subject and not a single sincere line in the three vols of it.” A generation hence, Bernard DeVoto extrapolated, “Moby-Dick has, as fiction, no structure whatever. Its lines of force mercilessly intercept one another. Its improvisations are commoner and falser than those in Huck Finn. It does not suffer from burlesque (exuberant humor had no place in Melville’s nature) but its verbal humor is sometimes more vicariously humiliating than such passages as Huck’s discussion of kings … . And, though Melville could write great prose, his book frequently escapes into a passionately swooning rhetoric that is unconscious burlesque. He was no surer than Mark, he was in fact less sure, of the true object of his book, and much less sure of the technical instruments necessary to achieve it.”

“Wantonly eccentric”

The book’s schizophrenic cosmic dance is a disjointed romp through a litany of diverse voices: from romantic narrative to moralistic parable; from Elizabethan soliloquy to Calvinist sermon; from a satire on legal discourse to a parody of naturalist erudition—there’s even a deranged comic opera sequence worthy of a Gilbert and Sullivan–Tom Waits collaboration. Melville’s ponderous prose is rendered more obtuse by his peripatetic linguistic gymnastics that wander aimlessly through the rolling seascape of the novel. Witness: “That certain sultanism of his brain, which had otherwise in a good degree remained unmanifested; through those forms that sultanism became incarnate in an irresistible dictatorship.” Say wha?

Many early reviewers shared their disdain for Melville’s discordant attempt at mastering the novel form, complaining that “all the regular rules of narrative or story are spurned and set at defiance.” Still others were shocked by his abuse of style; one literary scribe noted how Ahab “raves by the hour in a lingo borrowed from Rabelais, Carlyle, Emerson, newspapers transcendental and transatlantic, and the magnificent proems of our Christmas pantomimes.” The London Literary Gazette called Melville’s prose “wantonly eccentric” and “outrageously bombastic,” while another (more discreet) British critic pronounced, “Mr. Melville is endowed with a fatal facility for the writing of rhapsodies.”

In all fairness, at this point in his career, Herman Melville was a young man and a relatively immature author. He was just beginning to plumb the depths of his soul for the meaning of life—in short, he was dazed and confused. “And so, through all the thick mists of the dim doubts in my mind, divine intuitions now and then shoot, enkindling my fog with a heavenly ray,” reveals Ishmael/Melville. “And for this I thank God; for all have doubts; many deny; but doubts or denials, few along with them have intuitions. Doubts of all things earthly, and intuitions of some things heavenly; this combination makes neither believer nor infidel, but makes a man who regards them both with equal eye.”

Hawthorne kenned Melville’s metaphysical struggle, noting that his friend could “neither believe, nor be comfortable in his unbelief.”

“A final hash”

The protracted section dealing with the nature of the whale and the whaling industry (known in scholarly circles as the “cetological center”) could certainly stand on its own as a worthy contribution to the corpus of natural history (not surprising, coming from an author for whom “a whale-ship was my Yale College and my Harvard”), but it also effectively fractures the narrative arc. This perplexing stylistic dichotomy results in Moby-Dick being a book-within-a-book; indeed, Duyckinck felt that there might even be “three books in Moby Dick rolled into one,” and he enumerated them: a transcendental, soul-searching romance (with a healthy dose of Faustian melodrama thrown in for good measure); “a thorough exhaustive account … of the great Sperm Whale”; and a “moralizing, half essay, half rhapsody, in which much refinement and subtlety, and no little poetical feeling, are mingled with quaint conceit and extravagant daring speculation.” And as a rousing sea-faring adventure saga, Moby-Dick puts Captain Horatio Hornblower in the shade.

Considering its stubborn refusal to be classified, perhaps we shouldn’t label Moby-Dick a novel at all. Addressing this theory, one modern scholar offered, “Moby-Dick both in its quest plot and in its plot of cetological inquiry manages to refine the basic interests of an adventure narrative into what can only be called an epistemological suspense.” Melville’s masterpiece would very likely be the lone entry in this new literary genre. The author was painfully aware of his story’s multiple personality disorder; he declared resignedly to Hawthorne, “the product is a final hash.”

In addition to being a befuddling admixture of literary styles, the text is simply too long. An early British reviewer suggested that the book “might very conceivably have been comprised in half of these interminable volumes.” Duyckinck, too, weighed in on the subject: “The intense Captain Ahab is too long drawn out … . If we had as much of Hamlet or Macbeth as Mr. Melville gives us of Ahab, we should be tired even of their sublime company.”

Melville’s obsessive proclivities are well documented, and according to one biographer, his correspondence during the critical period in which he expanded and largely rewrote the manuscript is peppered with the mantra, “I can’t stop yet.” A reviewer echoed this declaration: “… once embarked on a flourishing topic he knows not when or how to stop.” More characteristic of his weakness for declamatory rhetoric, Melville (in the persona of his alter ego Ishmael) feverishly spouts, “Unconsciously my chirography expands into placard capitals. Give me a condor’s quill! Give me Vesuvius’ crater for an inkstand! Friends, hold my arms!” One wag of a critic lamented, “Oh that his friends had obeyed that summons!”

That he was capable of producing remarkable prose is not in question. Consider this passage from the chapter entitled “The Symphony”:

Hither, and thither, on high, glided the snow-white wings of small, unspeckled birds; these were the gentle thoughts of the feminine air; but to and fro in the deeps, far down in the bottomless blue, rushed mighty leviathans, sword-fish, and sharks; and these were the strong, troubled, murderous thinkings of the masculine sea.

Yet Melville also had an unsettling penchant for the overuse and misuse of punctuation—particularly the semicolon, of which he was inordinately fond. Couple this quirk with the truly abominable prose he was occasionally capable of spewing, and you are presented with something like this:

Look! here, far water-locked; beyond all hum of human weal or woe; in these most candid and impartial seas; where to traditions no rocks furnish tablets; where for long Chinese ages, the billows have still rolled on speechless and unspoken to, as stars that shine upon the Niger’s unknown source; here, too, life dies sunwards full of faith; but see! no sooner dead, than death whirls round the corpse, and it heads some other way.

Does a novel have to fit a rigid, predetermined structure to be great (or even good)? Should we conclude that a painting can only rise to the level of great art if the artist renders a balanced composition with controlled brush strokes and slavishly adheres to the tenets of an established school? Picasso certainly didn’t think so (and neither do I).

To use a more modern literary analogy, I must admit that while I was initially put off by J.P. Donleavy’s execrable English, I found that as I became drawn into the zany realm of post-World War II bohemian Dublin, I began to feel that the author’s linguistic indiscretions mirrored his characters’ wild behavior. The author’s quirky prose perfectly complements the social milieu he was sketching. But in Donleavy’s case, this literary device was employed quite intentionally and adds a certain piquant veracity to his work. Not so, Melville and his whale story. One must know the rules before breaking them.

Moby-Dick as Allegory

Melville’s masterpiece was composed at the apogee of the Industrial Revolution. It was an epoch of social upheaval and change, a period in which artisans were being methodically and inexorably supplanted by machines in the name of progress. Given this profound sea change, Moby-Dick is to mid-nineteenth century American literature what London’s Crystal Palace exhibition is to the Machine Age—a harbinger of things to come.

The Crystal Palace (which opened its doors in 1851, the same year as the publication of Moby-Dick) was a hulking monolith whose færy-castle aspect held the promise of unimagined wonders within. Its glass skin allowed shafts of filtered light to penetrate the depths of its cavernous interior where seemingly endless “pavilions,” each with its own distinct personality, invited exploration.

Prince Albert grandiosely viewed this inaugural “world’s fair” as an opportunity to promote international peace and goodwill (ahem, and commercial intercourse), but the public wasn’t so philosophical about it. To the typical attendee, the scale of the structure itself was imposing and not a little intimidating, and once inside, there was a mind-boggling array of widgets and gizmos to fill one with awe—a day at the Crystal Palace was likely to wear one to a frazzle.

Yet despite the fact that this massive commingling of fine art and the “useful arts” was assembled under one roof, purportedly with a noble common theme, one would be hard-pressed to find the thread of continuity between the Koh-i-noor diamond and Colt’s revolving pistols. To the working-class folk who attended in droves, it was a palace of the possible, a testament to the cultural and industrial superiority of the British Empire. The Crystal Palace was many things to many people, but at its core it was a celebration of “industry” in the full Victorian sense of the term.

Moby-Dick, too, is many things to many people—both literal and allegorical. But when you strip away the layers of meaning applied after the fact by generations of literature professors, “the greatest American novel ever written” is ultimately little more than a reflection of the angst of a tormented soul attempting to deal with the ephemeral aspects of his spiritual and aesthetic growth while being buffeted about by the pitch and yaw of the artist trying to survive in a crass, unforgiving commercial world. “I am so pulled hither and thither by circumstances,” Melville confided to Hawthorne in a blue funk. “The calm, the coolness, the silent grass-growing mood in which a man ought always to compose, —that, I fear, can seldom be mine. Dollars damn me; and the malicious Devil is forever grinning in upon me, holding the door ajar.”

Herman Melville produced better prose when he wasn’t trying so hard.

the DW-P
