Taste and Quality

Obliquely, this is about a current debate within science fiction. However, the lineaments of the argument pertain to literature as a whole.  I offer no solutions or answers here, only questions and a few observations.  Make of it what you will.

Reading experience is a personal thing. What one gets out of a novel or story is like what one gets out of any experience, and being required to defend one’s preferences is a dubious demand that ultimately runs aground on the shoals of taste.  I once attended a course on wine, and the presenter put it this way: “How do you know you’re drinking a good wine? Because you like it.”  Obviously, this is too blanket a statement to be completely true, but he made his point.  If you’re enjoying something, it is no one’s place to tell you you’re wrong to do so based on presumed “objective” criteria.  That $200.00 bottle of Sassicaia may fail to stack up against the $20.00 Coppola Claret as far as your own palate is concerned, and no one can tell you your judgment is wrong based on the completely personal metric of “I like it/I don’t like it.”

However, that doesn’t mean standards of quality are arbitrary or that differences are indeterminate.  Such are the vagaries and abilities of human discernment that we can tell when something is “better” or at least of high quality even when we personally may not like it.

For instance, I can tell that Jonathan Franzen is a very good writer even though I have less than no interest in reading his fiction.  I can see that Moby-Dick is a Great Novel even while it tends to bore me.  I acknowledge the towering pre-eminence of Henry James and find him an unpalatable drudge at the same time.

On the other end of the spectrum, I can see how Dan Brown is a propulsive and compelling story-teller even while I find him intellectually vacuous and æsthetically tedious.

My own personal list of what may be described as guilty pleasures includes Ian Fleming, Edgar Rice Burroughs (but only the John Carter novels; never could get into Tarzan), and a score of others over the years who caught my attention, appealed for a time, and have since fallen by the wayside, leaving me with fond memories and no desire to revisit.  A lot of the old Ace Doubles were made up of short novels of dubious merit that were nevertheless great fun for a teenager on a lonely afternoon.

I would never consider them Great Art.

Taste is the final arbiter.  But using it to determine quality—rather than allowing quality to determine taste—is doomed because taste changes.  Works you might strenuously defend at one time in your life can over time suffer as your taste and discernment evolve.  It’s sad in one way because it would be a fine thing to be able to summon up the same reactions experienced on one of those lonely afternoons, aged 16, and poring through the deathless excitement of a pulp adventure you might, given your enthusiasm, mistake for Great Writing.

I try always to make a distinction between things I like and things I think are Good.  Often they’re the same thing, but not always, and like other judgments humans make, the two tend to become confused with each other.  Hence, debate over merit can take on the aspects of an argument on that day at the base of the Tower of Babel when people stopped understanding each other.

But if that’s all true, then how do we ever figure out which standards are valid and which bogus?  I mean, if it’s ALL subjective, how can any measure of quality ever rise to set the bar?

Fortunately, while personal experience is significant, collective experience also pertains. History, if you will, has taught us, and because art is as much a conversation as a statement we learn what works best and creates the most powerful effects over time. Having Something To Say that does not desiccate over time is a good place to start, which is why Homer still speaks to us 2500 years after his first utterances.  We derive our ability to discern qualities from our culture, which includes those around us informing our daily experiences.  In terms of literature, the feedback that goes into developing our personal values is a bit more specific and focused, but we have inexhaustible examples and a wealth of possible instruction.  We do not develop our tastes in a vacuum.

Honest disagreement over the specific qualities of certain works is part of the process by which our tastes develop. I might make a claim for Borges being the finest example of the short story and you might counter with de Maupassant—or Alice Munro. Nothing is being denigrated in this. The conversation will likely be edifying.

That’s a conversation, though.  When it comes to granting awards, other factors intrude, and suddenly, instead of exemplary comparisons, we have competition, and that can be a degrading affair unless standards are clear and processes fairly established.  In a competition, unlike in a conversation, quality necessarily takes a back seat to simple preference.

Or not so simple, perhaps. Because any competition is going to assume at least a minimum of quality that may be universally acknowledged. So we’re right back to trying to make objective determinations of what constitutes quality.

If it seems that this could turn circular, well, obviously. But I would suggest it only becomes so when an unadmitted partisanship becomes a key factor in the process.

This can be anything, from personal acquaintance with the artist to political factors having nothing to do with the work in hand. Being unadmitted, perhaps even unrecognized, such considerations can be impossible to filter out, and for others very difficult to argue against. They can become a slow poison destroying the value of the awards. Partisanship—the kind that is not simple advocacy on behalf of a favored artist but is instead ideologically based, more against certain things rather than for something—can deafen, blind, reduce our sensibilities to a muted insistence on a certain kind of sensation that can be serviced by nothing else. It can render judgment problematic because it requires factors be met having little to do with the work.

Paradoxically, art movements, which are by definition partisan, have spurred innovation if only by reaction and have added to the wealth of æsthetic discourse. One can claim that such movements are destructive and indeed most seem to be by intent. Iconoclasm thrives on destroying that which is accepted as a standard and the most vital movements have been born of the urge to tilt at windmills, to try to bring down the perceived giants.  We gauge the success of such movements by remembering them and seeing how their influence survives in contemporary terms.

Those which did not influence or survive are legion. Perhaps the kindest thing to be said of most of them is that they lacked any solid grasp of their own intent. Many, it seems, misunderstood the very purpose of art or, worse, lacked any comprehension of truth and meaning. More likely, they failed to distinguish between genuine art and base propaganda.

How to tell the difference between something with real merit and something which is merely self-serving?  All heuristics are suspect, but a clear signal that something other than pure artistic intent is at play is the advent of the Manifesto.  Most are hopelessly locked in their time, and the most innocent of them are cries against constraint.  But often there’s an embarrassing vulgarity to them, a demand for attention, an insistence that the work being pushed by the manifesto has merit if only people would see it.

Not all manifestos are signs of artistic vacuity, but those that front for worthwhile work usually fade quickly from service, supplanted by the work itself, and are soon forgotten.  Mercifully.  We are then left with the work, which is its own best advocate.  In hindsight it could be argued that such work would have emerged from the froth all on its own, without the need of a “movement” to advance its cause.  Unfortunately, art requires advocates, beginning with the simplest form of a purchase.  In crowded fields overfull of example, the likelihood of a lone artist succeeding on his or her own, without advocacy, is slim.

Advocacy for an individual artist, by a cadre of supporters, can make or break a career.  And this would of course be a natural development of widespread appreciation.  It’s organic.

Advocacy for a perceived type of art begins to suffer from the introduction of agendas having less to do with the artists than with a commitment to the aforementioned windmill-tilting.

The next phase is advocacy of a proscriptive nature: sorting out what belongs and what doesn’t, measuring according to a prescribed set of protocols, a phase that has little to do with individual works and much to do with the æsthetic and political prejudices of the movement.  The quality of a given work is less important at this stage than whether it “fits” the parameters set by the movement’s architects.  Taste plays a smaller and smaller role as the movement meets opposition or fails to advance its agenda.  With the demotion of taste comes the desiccation of quality.  The evocative ability of art, its facility to communicate things outside the confines of the manifesto-driven movement, eventually becomes a kind of enemy.  We’re into the realm of cookie-cutter art: paint-by-numbers, template-driven approaches.  Themes are no longer explored but enforced, preferred message becomes inextricable from execution, and the essential worth of art is lost through disregard of anything that might challenge the prejudice of the movement.

This is a self-immolating process.  Such movements burn out from eventual lack of both material and artists, because the winnowing becomes obsessional, and soon no one is doing “pure” work according to the demands of the arbiters of group taste.

As it should be.  Anything worthwhile created during the life of the movement ends up salvaged and repurposed by other artists.  The dross is soon forgotten.  The concerns of these groups become the subject of art history discussions.  The dismissal of particular works because “well, he’s a Marxist” or “she was only an apologist for capitalism”—factors which, if the chief feature of a given work, might very well render it ephemeral, but which in many instances have little to do with content—prompts head-scratching and amusement long after the fury of controversy around them has faded.

Given this, it may seem only reasonable that an artist have nothing to do with a movement.  The work is what matters, not the fashions surrounding it.  Done well and honestly, it will succeed or fail on its own, or so we assume.

But that depends on those ineffable and impossible-to-codify realities of quality and taste.  Certainly on the part of the artist but also, and critically, on the part of the audience.

Here I enter an area difficult to designate.  The instant one demands a concrete description of what constitutes quality, the very point of the question is lost.  Again, we have heuristics bolstered by example.  Why, for instance, is Moby-Dick now regarded as a work of genius, by some even as the great American novel, when in its day it sold so poorly and its author died in almost complete obscurity?  Have we become smarter, more perceptive? Has our taste changed?  What is it about that novel which caused a later generation than Melville’s contemporaries to so thoroughly rehabilitate and resurrect it?  Conversely, why is someone like Jacqueline Susann virtually unremarked today after having been a huge presence five decades ago?

I have gone on at some length without bringing up many examples, because taste and quality are so difficult to assess.  What one “likes” and what one may regard as “good” are often two different things, as I said before, and the difference has as much to do with our expectations on a given day of the week as with anything deeply considered and well examined. My purpose in raising these questions—and that’s what I’ve been doing—has to do with a current struggle centering on the validity of awards as signs of intrinsic worth.

The best that can be said of awards as guideposts to quality is that if a group of people, presumably in possession of unique perspectives and tastes, can agree upon a given work as worthy of special note, then it is likely a sign that the work so judged possesses what we call Quality.  In other words, it is an excellent, indeed exceptional, example of its form.  I’ve served on a committee for a major award, and over the course of months the conversations among the judges proved educational for all of us, eventually shedding the chaff and leaving a handful of works that represented what we considered the best that year of the kind of work we sought to honor.

I never once found us engaged in a conversation about the politics of the work.  Not once.

Nor did we ever have a discussion about the need to advance the cause of a particular type of work.  Arguments over form were entirely about how the choice of one over another served the work in question.  When we were finished, it never occurred to me that a set of honest judges would engage in either of those topics as a valid metric for determining a “winner.”  No one said, “Well, it’s space opera and space opera has gotten too many awards (or not enough),” and no one said, “The socialism in this work is not something I can support (or, conversely, because of the political content the faults of the work should be overlooked for the good of the cause).”  Those kinds of conversations never happened.  It was the work—did the prose support the premise, did the characters feel real, did the plot unfold logically, were we moved by the story of these people?

Consensus emerged.  It was not prescribed.

This is not to say other metrics have no value, but they can be the basis of their own awards.  (The Prometheus Award is candidly given to work of a political viewpoint, libertarianism.  It would be absurd for a group to try to hijack it based on the argument that socialism is underrepresented by it.)  But even then, there is this knotty question of quality.

Here’s the thorny question for advocates of predetermined viewpoints: if an artist does the work honestly, truthfully, it is likely that the confines of manifesto-driven movements will become oppressive, and that artist will eventually do work that no longer fits within those limits.  To complain that the resulting work is “bad” because it no longer adheres to the expectations of that group is as wrongheaded as declaring a work “good” because it does toe the proper line.

Because that line has nothing to do with quality.  It may go to taste.  It certainly has little to do with truth.

Future Historicity

History, as a discipline, seems to improve the further away from events one moves. Close up, it’s “current events” rather than “history.”  At some point, the possibility of objective analysis emerges and thoughtful critiques may be written.

John Lukacs, Emeritus Professor of History at Chestnut Hill College, understands this and at the outset of his new study, A Short History of the Twentieth Century, allows for the improbability of what he has attempted:

Our historical knowledge, like nearly every kind of human knowledge, is personal and participatory, since the knower and the known, while not identical, are not and cannot be entirely separate.

He then proceeds to give an overview of the twentieth century as someone—though he never claims this—living a century or more further on might.  He steps back as much as possible and looks at the period under examination—he asserts that the 20th Century ran from 1914 to 1989—as a whole, the way we might now look at, say, the 14th Century or the 12th and so on.  The virtue of our distance from these times is our perspective—the luxury of seeing how disparate elements interacted even as the players on the ground could not see them, how decisions taken in one year affected outcomes thirty, forty, even eighty years down the road.  We can then bring an analysis and understanding of trends, group dynamics, political movements, demographics—all that goes into what we term culture or civilization—to the problem of understanding what happened and why.

Obviously, for those of us living through history, such perspective is rare if not impossible.

Yet Lukacs has done an admirable job.  He shows how the outbreak and subsequent end of World War I set the stage for the collapse of the Soviet Empire in 1989, the two events he chooses as the bookends of the century.  He steps back and looks at the social and political changes as the result of economic factors largely invisible to those living through those times, and how the ideologies that seemed so very important at every turn were more or less byproducts of larger, less definable components.

It is inevitable that the reader will argue with Lukacs.  His reductions—and expansions—often run counter to what may be cherished beliefs in the right or wrong of this or that.  But that, it seems, is exactly what he intends.  This is not a history chock-full of the kind of detail used to defend positions—Left, Right, East, West, and so on—and it is often deliberately spare.  Rather, this is a broad outline with telling opinions and the kind of assertions one might otherwise not question in a history of some century long past.  It is intended, I think, to spur discussion.

We need discussion.  In many ways, we are trapped in the machineries constructed to deal with the problems of the last century, and the machinery keeps grinding even though the problems have changed.  Pulling back—or even out of—the in situ reactivity seems necessary if we are to stop running in the current Red Queen’s Race.

To be sure, Lukacs makes a few observations to set one’s back teeth on edge.  For instance, he dismisses the post-World War II women’s consciousness and equality movements as byproducts of purely economic conditions and the mass movement of the middle class to the suburbs.  He has almost nothing good to say about any president of the period but Franklin Roosevelt.

He is, certainly, highly critical of the major policy responses throughout the century, but explains them as the consequence of ignorance, which is probably true enough.  The people at the time simply did not know what they needed to know to do otherwise.

As I say, there is ample here with which to argue.

But it is a good place to start such debates, and it is debate—discussion, interchange, conversation—that seems the ultimate goal of this very well-written assay.  As long as it remains debate, this is a worthy place to begin.

He provides one very useful definition, not unique to Lukacs by any means, yet one of those difficult-to-parse distinctions for most people, and a source of profound misunderstandings.  He makes clear the difference between nations and states.  They are not the same thing, though they usually overlap.  States, he shows, are artificial constructs with borders, governmental apparatus, policies.  Nations, however, are simply Peoples.  Hence Hitler was able to command the German nation even though he was an Austrian citizen.  Austria, like Germany, was merely a state.  The German People constituted the nation.

Lukacs—valuably—shows the consequences of confusing the two, something which began with Wilson and has tragically rumbled through even to this day.  States rarely impose a national identity; they rely on one already extant—though often largely unrealized.  And when things go wrong between states, quite often it is because one or the other has negotiated national issues with the wrong party.

Which leads to an intriguing speculation as to why nativist sympathies have such a difficult time taking root in this country.  Americans do not, by this definition, comprise a Nation.  A country, a state, a polity, certainly.  But not really a Nation.

And yet we often act as if we were.

Questions.  Discussion.  Dialogue.  This is the utility and virtue of this slim volume.

End Times

The Sixties.

Depending on what your major concerns are, that period means different things.  For many people, it was revolution, civil rights, the peace movement.  For many others, it was music.

For Michael Walker, it was evidently the latter.  In his new book, What You Want Is In The Limo, he chronicles what he considers the End of the Sixties through the 1973 tours of three major rock groups—The Who, Led Zeppelin, and Alice Cooper.

His claim, as summarized in the interview linked above, is that after Woodstock, the music industry realized how much money could be made with this noisy kid stuff (which by Woodstock it no longer was—kid stuff, that is) and started investing heavily, expanding the concert scene, turning it from a “cottage industry” into the mega-million-dollar monster it has become.  1973, according to Walker, is the year all this peaked for the kind of music that had dominated The Sixties, made the turn into rock star megalomania, and ushered in the excesses of the later Seventies and the crash-and-burn wasteland of the Punk and New Wave eras (with a brief foray into Disco and cocaine before the final meltdown).

The bands he chose are emblematic, certainly, but of the end of the Sixties?  I agree with him that 1973 is the year the Sixties ended, but the music aspect, as always, was merely a reflection, not a cause.  What happened in 1973 that brought it all to an ignominious close was this: Vietnam ended.

(Yes, I know we weren’t out until 1975, but in 1972 Nixon went to China, which resulted in the shut-down of the South China rail line by which Russia had been supplying North Vietnam, and in 1973 the draft ended, effectively deflating a goodly amount of the rage over the war.  The next year and a half were wind-down.)

Walker’s analysis of the cultural differences before and after 1973 is solid, but while the money was certainly a factor, a bigger one was exhaustion.  After a decade of upheaval over civil rights and the war in Vietnam, people were tired.  Vietnam ended and everyone went home.  Time to party.  Up to that point, the music—the important music, the music of heft and substance—was in solidarity with the social movements, and protest was a major component of the elixir.  Concerts were occasions for coming together in a common aesthetic, the sounds that distinguished Woodstock acting as a kind of ur-conscious bubble, binding people together in common cause.

Once the primary issues seemed settled, the music was just music for many people, and the aspects which seemed to have informed the popularity of groups like Cream or the Stones or the Doors lost touch with the zeitgeist.  What had begun as an industry of one-hit wonders returned to that ethic and pseudo-revolutionary music began to be produced to feed the remaining nostalgia.

(Consider, for example, a group like Chicago, which began as a socially-conscious, committed-to-revolution act—they even made a statement to that effect on the inside cover of their second album—and yet by 1975 were cashing in on power ballads and love songs, leaving the heavily experimental compositions of their first three albums behind and eschewing their counter-culture sensibilities.)

To my mind the album that truly signified the end of that whole era was The Moody Blues’ Seventh Sojourn, which was elegiac from beginning to end.  The last cut, “I’m Just A Singer (In A Rock And Roll Band),” was a rejection of the guru mantle bestowed on many groups and performers during the Sixties.  With that recording, the era was—for me—over.

Also for me, Alice Cooper never signified anything beyond the circus act he was.  Solid tunes, an edgy stage act, and all the raw on-the-road excess that was seen by many to characterize supergroups, but most of Cooper’s music was vacuous pop-smithing.  The Who and Led Zeppelin were something else and both of them signify much more in artistic terms.  Overreach.

But interestingly enough, different kinds of overreach.  Walker talks of the self-indulgence of 45-minute solos in the case of Zeppelin, but this was nothing new—Cream had set the standard for seemingly endless solos back in 1966, and Country Joe McDonald produced an album in the Nineties with extended compositions and solos.  Quadrophenia was The Who’s last “great” album, according to Walker, and I tend to agree, but two kinds of exhaustion are at work in these two examples.  Zeppelin exhausted themselves in the tours and the 110% performances.  The Who exhausted the form in which they worked.  After Quadrophenia, all they could do was return to a formula that had worked well before, but which now gained them no ground in terms of artistic achievement.  As artistic statement—as an example of how far they could push the idiom—that album was a high watermark that still stands.  But the later Who Are You is possibly their best-crafted work after Who’s Next.  “Greatness”—whatever that means in this context—had not abandoned them.  But the audience had changed.  Their later albums were money-makers with the occasional flash of brilliance.  They were feeding the pop machine while trying to compose on the edge, a skill few manage consistently for any length of time.

“Excess” is an interesting term as well.  Excess in what?  The combination of social movement with compositional daring had a moment in time.  When that time passed, two audiences parted company: those who wanted to party (often nostalgically) and those who were truly enamored of music as pure form.  They looked across the divide at each other, and the accusation of excess was aimed by each at different things.  The one disdained the social excess of the other while the latter loathed the musical excess of the former.  People gleefully embracing Journey, disco, punk, and a gradually resurgent country-western genre thought the experimental explorations of the post-Sixties “art rock” scene were self-indulgent, elitist, and unlistenable.  People flocking to Yes and Emerson, Lake & Palmer concerts, cuing up Genesis and UK on their turntables (and retroactively filling out their classical collections), found the whole disco scene and designer-drug culture grotesque.  Yet in many ways they had begun as the same social group, before the End of the Sixties.

The glue that had bound them together evaporated with the end of the political and social issues that had produced the counterculture and its attendant musical reflection in the first place.  Without that glue, diaspora.

And the forms keep breaking down into smaller and smaller categories, which is in its own way a kind of excess.  The excess of pointless selectiveness.

Is the Novel Still Dying?

In 1955, Norman Mailer was declaring the death of the novel. A bit more than a decade later, it was John Barth’s turn.  There has since been a string of writers of a certain sort who clang the alarm and declare the imminent demise of the novel, the latest being a selection of former enfants terribles like Jonathan Franzen and David Foster Wallace.

Philip Roth did so a few years back, adding that reading is declining in America.  The irony is that he made such claims at a time when polls suggested exactly the opposite: more people were reading books in 2005 (as a percentage of the adult population) than ever before.  In my capacity as one-time president of the Missouri Center for the Book I was happily able to tell a group of bright adolescents that reading among their demographic had, for the first time since such things had been tracked, gone sharply up in 2007.

And yet in a recent piece in the Atlantic, we see a rogues’ gallery of prominent littérateurs making the claim again that the novel is dying, the art of letters is fading, and we are all of us doomed.

Say what you will about statistics, such a chasm between fact and the claims of those one might expect to know better has rarely been greater.  The Atlantic article goes on to point out that these are all White Males who seem to be overlooking the product of everyone but other White Males.  To a large extent this is true, but it is also partly deceptive.  I seriously doubt that, if directly challenged, any of them would say works by Margaret Atwood or Elizabeth Strout fall short of any of the requirements for vital, relevant fiction at novel length.  I doubt any of them would gainsay Toni Morrison, Mat Johnson, or David Anthony Durham.

But they might turn up an elitist lip at Octavia Butler, Samuel R. Delany, Tananarive Due, Nalo Hopkinson, Walter Mosley, or, for that matter, Dennis Lehane, William Gibson, and Neal Stephenson (just to throw some White Males into the mix for comparison).  Why?


The declaration back in the 1950s that “the novel is dead” might make more sense if we capitalize The Novel.  “The Novel”—the all-encompassing, universal work that attempts to make definitive observations and pronouncements about The Human Condition has been dead since it was born, but because publishing was once constrained by technology and distribution to publishing a relative handful of works in a given year compared to today, it seemed possible to write the Big Definitive Book.  You know, The Novel.

Since the Fifties, it has become less and less possible to do so, at least in any self-conscious way.  For one thing, the Fifties saw the birth of the cheap paperback, which changed the game for many writers working in the salt mines of the genres.  The explosion of inexpensive titles that filled the demand for pleasurable reading (as opposed to “serious” reading) augured the day when genre would muscle The Novel completely onto the sidelines and eventually create a situation in which the most recent work by any self-consciously “literary” author had to compete one-on-one with the most recent work by the hot new science fiction or mystery author.

(We recognize today that Raymond Chandler was a wonderful writer, an artist, “despite” his choice of detective fiction.  No one would argue that Ursula K. Le Guin is a pulp writer because most of her work has been science fiction or fantasy.  But it is also true that the literary world tries to coopt such writers by remaking them into “serious” authors who “happened” to be writing in genre, trying ardently to hold back the idea that genre can ever be the artistic equivalent of literary fiction.)

The Novel is possible only in a homogenized culture.  Its heyday would have been when anything other than the dominant (white, male-centric, protestant) cultural model was unapologetically dismissed as inferior.  As such, The Novel was as much a meme supporting that culture as any kind of commentary upon it, and a method of maintaining a set of standards reassuring the keepers of the flame that they had a right to be snobs.

Very few of Those Novels, I think, survived the test of time.

And yet we have, always, a cadre of authors who very much want to write The Novel and when it turns out they can’t, rather than acknowledge that the form itself is too irrelevant to sustain its conceits at the level they imagine for it, they blame the reading public for bad taste.

If the function of fiction (one of its functions, a meta-function, if you will) is to tell us who we are today, then just looking around it would seem apparent that the most relevant fiction today is science fiction.  When this claim was made back in the Sixties, those doing what they regarded as serious literature laughed.  But in a world that has been qualitatively as well as quantitatively changed by technologies stemming from scientific endeavors hardly imagined back then, it gets harder to laugh this off.  (Alvin Toffler, in his controversial book Future Shock, argued that science fiction would become more and more important because it taught “the anticipation of change” and buffered its devotees from the syndrome he described, future shock.)

Does this mean everyone should stop writing anything else and just do science fiction?  Of course not.  Science fiction is not The Novel.  But it is a sign of where relevance might be found.  Society is not homogeneous (it never was, but there was a time we could pretend it was) and the fragmentation of fiction into genre is a reflection that all the various groups comprising society see the world in different ways, ways which often converge and coalesce, but which nevertheless retain distinctive perspectives and concerns.

A novel about an upper middle class white family disagreeing over Thanksgiving Dinner is not likely to overwhelm the demand for fiction that speaks to people who do not experience that as a significant aspect of their lives.

A similar argument can be made for the continual popularity and growing sophistication of the crime novel.  Genre conventions become important in direct proportion to the recognition of how social justice functions, especially in a world with fracturing and proliferating expectations.

Novel writing is alive and well and very healthy, thank you very much, gentlemen.  It just doesn’t happen to be going where certain self-selected arbiters of literary relevance think it should be going.  If they find contemporary literary fiction boring, the complaint should be aimed at the choice of topic or the lack of perception on the part of the writer, not at any kind of creeping morbidity in the fiction scene.

Besides, exactly what is literary fiction?  A combination of craft, salient observation, artistic integrity, and a capacity to capture truth as it reveals itself in story?  As a description, that will do.

But then what in that demands that the work eschew all attributes that might be seen as genre markers?

What this really comes down to, I suspect, is a desire on the part of certain writers to be some day named in the same breath with their idols, most of whom one assumes are long dead and basically 19th Century novelists.  Criticizing the audiences for not appreciating what they’re trying to offer is not likely to garner that recognition.

On the other hand, most of those writers—I’m thinking Dickens, Dumas, Hugo, Hardy, and the like—weren’t boring.  And some of the others—Sabatini, Conan Doyle, Wells—wrote what would be regarded today as genre.

To be fair, it may well be that writers today find it increasingly difficult to address the moving target that is modern culture.  It is difficult to write coherently about a continually fragmenting and dissolving landscape.  The speed of change keeps going up.  If such change were just novelty, and therefore essentially meaningless, then it might not be so hard, but people are being forced into new constellations of relationships and required to reassess standards almost continually, with information coming to them faster and faster, sometimes so thickly it is difficult to discern shape or detail.  The task of making pertinent and lasting observations about such a kaleidoscopic view is daunting.

To do it well also requires that that world be better understood almost down to its blueprints, which are also being redrafted all the time.

That, however, would seem to me to be nothing but opportunity to write good fiction.

But it won’t be The Novel.

Rules of the Road

Books have speed limits.

Some you can breeze through, a quick run along a sunny straightaway, windows open and wind in your ears.  Others demand that you slow down, pay attention, move with care.

For the slow-but-dedicated reader, if there is a special plea or prayer upon picking up a new, dense book, it should be “Please don’t waste my time.”  I’m one of those readers who feels a compulsion to finish what I start.  I seem constitutionally incapable of just putting a book aside part way through and never coming back to it.

Oh, I’ve done it!  Years later, though, if I stumble across that book, ignored in a box or on the shelf of someone to whom I’ve loaned it, I experience a moment of guilt, a regret that I somehow betrayed it, and that it may take umbrage for having been dallied with and left for another.   “So…a restless spirit haunts over every book, till dust or worms have seized upon it, which to some may happen in a few days, but to others later…” according to Jonathan Swift.  Death to a book is to be ignored.

But we are mortal and have only so much time.  I have a crabbed admiration for people who can decide within ten pages that what follows is not worth their time and can put it aside without a twinge of conscience.  (Crabbed because another part of me keeps wondering what they’re missing.)

“Please don’t waste my time.”  For me, part of the problem is an inability to know if a book will be a waste of time.  Possibly some paragraph or a chapter or even a line from a character will make the whole thing worthwhile.

I read in anticipation.  This is true of all books.  I’m looking for something.  Surprisingly, I find it more often than not, and in this I have to count myself fortunate.  I have read books not worth my time and most of them I have forgotten.  But something of them lingers and sometimes I recognize them again, at least as a type, and before I make the mistake of reading the first chapter (if I get past page 30 I’m trapped, I must go on) I avoid what I sense is coming.

My own discipline notwithstanding, that is my main rule of this particular road: Don’t waste my time.  I can always be proven wrong, but there are certain books I’m not interested in reading.  Certain kinds of books I should say.  I don’t want to hurt their feelings, I don’t want to feel betrayed.  Books are like people—some are compatible, others should be avoided.

I’ll likely never read another Stephenie Meyer book.  Ever.  (I read The Host—don’t worry, I was paid to, for a professional review—and I found it exceptionally dull, too long for its weight, derivative, and a cheat.  That’s the worst reaction I’ve had to a novel in decades.)  On the other hand, I will likely read everything Iain M. Banks publishes.

I recently read Blue Highways by William Least Heat-Moon and it’s the perfect example of a book with a speed limit.  Slow down, pass with care, deer crossing ahead.  A paradox, because it is a broad, multi-laned road with no posted limits.  But zipping through it would be to miss everything.  Autobiographical in the sense that Heat-Moon is the narrator and the trip recorded was his hegira around the continental United States—but not in the sense that it is about him.  While it is impossible that his own self and life could be kept entirely out, it’s about the road and its impact on the traveler he was.  The main character is the journey.  The sights along the way demand attention.  You do not speed-read a book like this, which is reflected in the stated purpose of his travels on the back roads, state highways, and some seldom-used tracks in parts of the country most of us have no idea exist.  If you want to go fast, take the superhighways.  And see nothing.

But if you’re interested in landscape, in impressions of setting on character, on the topography of perception and topology of awareness…

That’s the kind of book I intend to talk about here.

Another rule of the road for me is going to be the advocacy of writers and of local bookstores.  Writers do what they do because they—most of them—love it.  It can be a difficult relationship, to be sure, but the deeper the love the better the result.  Which by extension invites the further relationship with the reader.  Where you first meet is actually important.  Buying from big chains is like engaging in a series of one-night stands.  Buy your books locally, get to know your bookseller, and by extension support the writer.

More on that later.

As to the kinds of books I love to read and which I’ll write about here, well…I said I am a slow reader.  Once, back in high school, I took a speed-reading course.  By the time I graduated high school I was reading about 2500 words a minute.  I could go through an average-sized book in an evening if I wanted.  I read a lot of books that way.

What I did not have was very much fun.  I slowed down intentionally.  It occasionally takes me an inordinate length of time to read a book.  I get through about 70 to 80 a year, cover to cover, and I’m usually reading 3 or 4 simultaneously.

Which means I am not “current.”  The last book I finished was Gore Vidal’s Lincoln, which was published back in 1984.  I finished it just before the New Year.  It, too, demanded careful reading and I took my time.  Point being, I am irretrievably “behind” in my reading and will likely remain so—largely without regret or much feeling that it’s causing me any harm.

But I do read new books now and then and I may read more for the purposes of this column.

Another caveat:  I make a distinction between “like” and “good.”  There are plenty of books we read that we like but which, by any metric of craft or art, are not especially good.  I read Edgar Rice Burroughs’s John Carter of Mars novels when I was a kid and I really, really liked them.  From time to time, I pick one up again and I find I still like them, but I can’t say they’re very good.  On the other hand, I read James Joyce’s Ulysses and have no hesitancy at all declaring it to be not only a good book but a great one—but I didn’t particularly like it.  I will do my best to state my prejudice in this regard when it seems relevant.

To be sure, if I write about it here, it meant something to me.  It had an impact.  It was a worthwhile journey.  But none of us ever really go down the same road, even if we get on at the same ramp.  My journey won’t be the same as anyone else’s.

And isn’t that the very best thing about good books?

A Bit of Comfort and Caution

This is my new blog.  I’ve attached a link to my old one, the Distal Muse, so you can get over here.

Why a second blog?  Simple.  I’d like to put my reviews and literary opinions here, apart from the Muse, which has become more of a hodgepodge of commentary over the years.  A site dedicated just to my readings and my thoughts on literature might find a welcome audience among those who couldn’t care less what my politics are or my opinions on music or film or people in general or things as they are.

For now, this is a placemarker.  I have work to do in my office and it might be a week or two before I start posting.  In the meantime, you can still check out the Distal Muse.  I will be back.  I’ll try to do this up proper, a real honest-to-goodness lit’rary stopover.