Cinema Versus ‘Theme Parks’

“I don’t see them. I tried, you know? But that’s not cinema.  Honestly, the closest I can think of them, as well made as they are, with actors doing the best they can under the circumstances, is theme parks. It isn’t the cinema of human beings trying to convey emotional, psychological experiences to another human being.”

Martin Scorsese said that in an interview about Marvel superhero movies. The observation has sparked some controversy. A lot of people heard him trashing their favorite form of movie; others—including Francis Ford Coppola—found resonance with his statements.

The part of his statement I disagree with is the part that I hear every time someone from the literary world suggests science fiction is not “real” literature—because it doesn’t deal with humans experiencing authentic emotions in a meaningful context. In its own way, Mr. Scorsese has recast the classic dismissal of science fiction and fantasy with regard to film.

To which I would say, “Care to justify that in terms of cinema as a whole?” It can be argued, I think, that the gangster film on which Scorsese made his reputation is not a milieu about ordinary people having emotional experiences in common with their audience, but about a distinct subset of humanity that distorts itself into an extreme condition to pit itself against the world. Their experiences are, by definition, at least in cinema, going to be over-the-top, magnified, and at odds with the common. The backgrounds are likewise going to be exaggerated and often surreal, set-pieces to support encounters of violence and passions pared down by adrenaline to caricatures of ordinary daily experience. They “entertain” for precisely those factors that for two hours remove us from our mundane lives and give us entrée into lives we will (hopefully) never take part in. The point of them is to allow a vicarious experience completely out of the ordinary.

They are anchored to us by asking “How would we react in the same circumstances?” and honestly following the thread of answers to what connects these people to us.

But the characters themselves, while often despicable, are extraordinary.

As are the characters of the gunslinger, the private detective, the cop, the soldier, the knight, the barbarian, etc.

It is their extraordinariness that attracts us, holds our attention, and carries us along through unlikely adventures to, one hopes, a satisfying and cathartic conclusion.

How is that any different than what we see in Captain America? Iron Man? Thor?

Oh, they come from the worlds of science and fantasy and wield unusual abilities.

So, once again, because they appear to us in the context of science fictional settings and offer challenges outside historical experience, they are not legitimate cinema…

To an extent, Scorsese has a point. They do offer “theme park” rides. It takes a rather extraordinary film like Winter Soldier or, stepping to a different franchise, Wonder Woman to see the genuine human story beneath the glossy, glitzy, hyper-realized settings, but it’s there. And for those films that fail to deliver that human element, well, it’s not that they aren’t cinema, they’re just bad cinema.

But “cinema” has always indulged the exotic, the novel, the visually unique to achieve what may be argued to be its primary advantage as a medium. The full embrace of the exotic cannot be used to reclassify certain films as “not cinema” because they utilize exactly that potential.

No, this is another version of reacting against a genre because you don’t get it. It’s the reason several excellent SF films failed for years to find notice with the Academy: they were that “spacey kid stuff.” Now good SF is finally being recognized by the Academy, leaving the position of poorly regarded, déclassé genre in need of a new resident, and in this instance Messrs. Scorsese and Coppola elect the big superhero franchises.

Let’s face it—there have always been superhero films. Dirty Harry is a species of superhero, as are Jason Bourne and James Bond. Chuck Norris and Steven Seagal have made their share of superhero films. And when you think about it, just about any Western where the hero faces impossible odds and wins is a superhero film. One could go down the list and find just cause to name any number of historical or quasi-historical epics as members of that club. Robin Hood is a superhero. The Lone Gunman story is a species of superhero film. And these all draw from various mythologies that are readily accessible as superhero stories. Hercules, Cuchulain, Gilgamesh, Samson…

Of course these films are cinema. Just as science fiction is literature.

You just have to speak the language.

The Downside of Expanded Participation?

It occurred to me the other day that there is a serious problem with the way audiences and films interact these days. It’s a relatively new problem, one that has grown up with social media, but it has roots in an older aspect of film production, namely the test screening. The idea being that before a general release, a film is shown to select audiences to gauge reactions and tweak the final cut before it is set free into the zeitgeist.  There’s logic to it, certainly, but I’ve always been uncomfortable with it because it’s an attempt to anticipate what should be an honest reaction to a work of art.  I try to imagine Rembrandt showing a painting to a client halfway or two-thirds finished and, depending on the reaction, going back to change it to conform to some inarticulate quibble on the part of someone who has no idea what should be on the canvas. Art, to a large extent, is a gamble, and test screenings are the equivalent of loading the dice or counting cards.

It’s understandable, of course, because a movie is not a painting done by one person, but a hugely expensive collaborative work with investors and questions of market share. But it still bothers me. (What if a test audience had complained that Bogey didn’t get Bergman at the end of Casablanca and the studio went back to change it to suit?)

Today there’s another phenomenon, related to the test audience but even more invasively surreal: pre-assessment by fans ahead of release. Sometimes years ahead.

This obsessive speculation has evolved into a form of semi-creative wheel-spinning that mimics a huge test audience, the key difference being that it is “testing” work not yet done. Fanfic seems to be part of this, but only as a minor and apparently undervalued aspect. We have a large, active community engaged in predetermining what will, should, ought not, and might happen in forthcoming movies, one large enough and active enough that I believe it has affected how those movies are made, possibly unconsciously. The feedback loop is pernicious. The vindictiveness of this anticipatory audience can be so severe as to affect decisions that have yet to be taken up.

The most visible way this has manifested—and this varies from franchise to franchise—is in the “look” of new films, especially in the effects, but also in the selection of cast, location, and choreography. Whether intentionally or not, filmmakers pump things into their next productions in an attempt to meet the expectations of this hypercritical superorganism.

This organism constructs alternate narratives, raises possible plot lines, critiques character development, and then, when the finished product fails on some level, engages in the kind of evisceration that cannot but give the creators pause to rethink, check themselves, question (often pointlessly) every choice made to that time.

I’m not sure this process happens at any conscious level, but it seems to me that the Doc Smith approach of bigger, splashier, louder, stranger films, at least in the Marvel and DC universes, and to a lesser extent in related products like Valerian or any given Bruce Willis vehicle of late, is a response to this incessant viral nattering. The anticipatory critical response must get through and affect the people in the main office.

Television has suffered less of this, it seems; at least in terms of story, the series exhibit less of the crippling second-guessing the motion pictures display.

Before all this near-instantaneous data back-and-forth, studios produced movies; people may have known they were being made, but little else got out to the general public until the trailers announcing upcoming releases. Based on those, you went or didn’t, and the movie was what it was, and you either liked it or didn’t. We were not treated to weekly box-office reports on news broadcasts. The films, with few exceptions, had a two-week first-release run at the front-line theaters, then moved down the hierarchy for one- or two-week engagements at smaller chains until they ended up at a tiny local theater, after which they vanished until popping up on TV at some point. You then went to the next and the next and the next. Choice was addressed by the fact that at any one time there might be a dozen new movies coming to the theaters in a month. The film was what the producers made it. It was offered, you saw it, you took your response home, and that was it.

A lot of the product was mediocre, but often reliably entertaining, and for the most part was made in a way that studios were not threatened with bankruptcy if they failed.  The really great ones came back from time to time or enjoyed extended runs in the theaters.

Fandom evolved, and as the age of the internet dawned, the cable industry grew, and the on-demand availability of movies was met by videotapes (later DVDs) and now streaming services, the products remained in front of self-selected audiences all the time.

This has changed the way these films are made. Not altogether to the bad, I hasten to add. I believe we’re passing through a kind of golden age of high quality films and certainly exceptional television.

But the budgets, the tendency to ignore better stories that lack the kind of epic myth-stuff of the major franchises, the endless bickering online and subsequently in conversations everywhere, and now this absurd war on what is, for want of a better term, SJW content…

I can’t help it. Grow up. So Doctor Who is a woman. Big deal. The character does not belong to you. Instead of chafing that some reification of idealized masculinity is being threatened, try just going with it and see where it takes you. That’s the whole purpose of storytelling! To be remade by narrative and offered a new view! To be challenged out of your day-to-day baseline assumptions!

Star Wars has been ruined by all the SJW crap! Really?

While I can see that discussion groups and this expanded dialogue can be fun and instructive, I think an unintended consequence has been to grant certain (very loud) people a sense of ownership over what is not theirs. The cacophony of anticipatory disappointment actually has a dampening effect on those of us who would simply like to be surprised and delighted all on our own. There is utility in silence, purpose in the vacuum, a vacuum to be filled by a new film. Box office is (or can be) detrimentally affected by the chattering carping of disillusioned fan critics who are terrified of James Bond becoming black, of Thor being turned into a woman, of the Doctor showing us how gender prejudice applies in our own lives.

I’ve been disappointed with new manifestations of favorite characters in the past, don’t get me wrong. My response has been to turn to something else. Those characters don’t belong to me, I don’t have a right to expect their creators to do what I think they should, and I recognize that probably a whole lot of people are just fine with a new direction. Otherwise sales figures would push them to change it again. It’s the pettiest of sour grapes to try to preload a rejection in advance of actually seeing the product.

I have no numbers to back up my impression, but I think it worth considering that the “my life will end unless the next movie comes out exactly the way I want it” school of anticipatory criticism is having a distorting effect over time, both on the product and on the ability of audiences to simply encounter something “clean” and take a personal and unmitigated response away from it.

Just a thought.

Purity In Fiction (or, Jonathan Franzen’s Latest Attempt At…Something)

The most recent entry in the annals of attempted applied snobbery came recently from Jonathan Franzen, who, while certainly a gifted prose stylist, seems bent on making himself into the grumpiest white literary snob on the planet.

Disclaimer: I have read Mr. Franzen’s essays.  I have tried to read his fiction, but quite honestly found nothing much of particular interest. A cross, perhaps, between Dickens and Roth, with leanings toward Russo and Gardner. I admit to having been seriously put off by his antics back when Oprah Winfrey tried to draft him into her popular reading group series.

I also admit that I’ve never been quite sure what to make of all that. Till now.

He has offered Ten Rules For Novelists. By the tenth you realize you are being lectured by someone who wishes to be regarded one way, suspects he may be regarded another way, but is afraid he is not being regarded at all, at least not as any kind of exemplar or Wise Head With Priceless Advice. His “rules” suffer from the curse of the “lit’rary.” Distilled, it would seem he’s telling us that “if you don’t write like me, or try to write like me, you’re wasting your time and destroying the culture simultaneously.”

Others have weighed in on the problematic nature of these. It may seem self-serving to tilt at lawn ornaments with pretensions to windmill-ness, but frankly, I already know I’m not paid much attention to and nothing I say here will do anything for a career I do not have.

I am, however, much irked by this kind of thing. It’s disingenuous in its effect if not its intent (how would I know how much of this he really believes?). By that I mean, it is not a set of rules to help aspiring writers, it is a set of reasons for not being a writer. Latent within these is the unspoken belief that, whoever you are, You Are Not Worthy.

Having said that, there are a couple of these I sort of agree with. Not, mind you, as proscriptions, but as matters of personal taste. Number 4, for instance: Write in third person unless a really distinctive first-person voice offers itself irresistibly.

I default to third-person because I tend to be a bit put off by first-person. To me, First Person is terribly artificial. No one goes through life narrating what they do. (Granted, all tenses, with regard to How People Actually Live, are artificial. Telling a story is an artifice, a Made Thing. Any “naturalness” to it all is part of the Art. It’s a seduction, convincing a reader to subsume his/her consciousness to the dictates of the narrative so that it feels natural.) Yes, once in a while, a story requires a different voice, even a different tense, because the writer is trying for a different effect. You make these choices for effect. You want the reader to go to a certain place in a specific way.

So while I agree with Rule # 4, I think Franzen phrased it in a way that tries to make it seem less of a matter of technique and more something that emerges from the zeitgeist. In other words, it’s a dodgy, deceptive way to say it.

There are too many little aphorisms and unexamined heuristics connected to writing that, if taken at face value, deter rather than aid the aspiring writer. We do not need more of them. For instance, Rule 1:  The reader is a friend, not an adversary, not a spectator.

Yeah…so? How is this a rule? An observation, yes, but in what way does this constitute necessary advice? And frankly, it’s not always true, nor is it even internally true. A “reader” is a stranger you hope to make a friend, of sorts, but they need convincing. Especially if you intend telling them hard truths, which seems to be what Mr. Franzen’s literary aim is. They will be, however briefly, a kind of adversary. And let’s face it, all art is initially a spectacle—requiring an audience, which is comprised of spectators. Many will stay for one game and never come back. They are not your friends. But they watched. As they read, they may shift often between these three conditions, and the adroit writer may wish them to do exactly that, because each state allows for different effects, which transfer aesthesis in different ways.  (And, really, James Joyce treated his potential audience not only as adversaries but occasionally as an angry mob with pitchforks—and by so doing created manifold aesthetic effects that are essential to the ongoing value of his works.)

Rather than go through them all, let me take the three “rules” I find most egregious. Numbers 2, 5, and 8.

Rule # 2: Fiction that isn’t an author’s personal adventure into the frightening or the unknown isn’t worth writing for anything but money.

I actually know what he’s trying to say here, but he said it in such a way as to betray the aristocratic self-image he wishes to convey, and he ends up not only doing a disservice to a great deal of fictive output but also reifying the academic nonsense about the nature of actually Writing For A Living.

What this is, firstly, is a variation on Write What You Know, one of those aforementioned aphorisms that are less than useless. It seems to mean write only what you yourself have gone out into the world and experienced first-hand, and even then be careful, because you probably don’t know it as well as you think you do, and in that case do another story about a writer suffering the self-doubt of the underappreciated. (Rules 5 and 8 underscore this, by the way.)

Secondly, it’s essentially claiming that writing, true writing, the pure quill, as it were, can only be done by the Elect. It’s a priesthood and defined by suffering and, often, by accidental success. I find it remarkable how many times we have been treated to lectures about the sordidness of writing for money from writers who have a Lot Of It. In other words, they are successful enough that they are offered platforms from which to tell the rest of us that we should just give it up.

Writing is, perhaps, a calling of sorts, but in its commission it is a craft, and if one intends to do it as a vocation—which, in this instance, means having the opportunity to do it as one’s primary activity—then you do it for pay. Otherwise, two things—you starve, or no one ever hears of you because you choose not to starve and take a job that prevents you from writing all the time. (I can hear the rejoinder—“well, if no one wants to buy your work in sufficient quantity, then it must be inferior”—which both ignores the realities of the market and exhibits hypocrisy at the same time.)

Most of us never get the opportunity to make this our living.  We get paid poorly, distributed badly, and rarely get recognized outside our own little patch. To have someone whose books regularly debut on best seller lists tell us that writing for money is somehow disreputable and sullies our work is the height of snobbery.

Rule # 5: When information becomes free and universally accessible, voluminous research for a novel is devalued along with it.

What does that actually mean? When I do a great deal of research to make my work sing with verisimilitude and I find that my readers know enough to appreciate what I have done, it increases the value of that work.  Again, this is snobbery, based on the assumption that the True Novelist has the time and resources to do something the rest of us can’t do.

The only thing that a novelist can do that the hoi polloi can’t is tell a story that moves people. They can know or have access to everything the writer knows and has learned and yet the one thing they will still not be able to do is tell that writer’s story.*

But that rule offers a glimpse into the requirements of the priesthood. When you can go to the library and look up the secret handshake of the order, the value of joining that order—or, more pertinently, living in awe of that order—diminishes.

But people still might go to the temple for the pleasure of the spectacle. So make it good spectacle.

Rule # 5 is a bizarre kind of anti-intellectual classist elitism.  And a rule for what?  Hiding information from people so you can look more impressive?

Rule # 8:  It’s doubtful that anyone with an Internet connection at his workplace is writing good fiction.

This is a kind of corollary to #5 and suffers the same flawed reasoning.

This is of a species with the whole “the novel is dead” nonsense someone brings up every so often. The last time it came up, it was obvious that the Novel being obited was the Great American Novel Written By A White Male.  It ignored women, nonwhite writers, and genre.

(Oh, genre! My ghod, what a smear upon the face of Great Literary Art!)

I said above that I have read Mr. Franzen’s essays.  I have dipped into his fiction. He is quite a good writer. I concede he can write a scene and turn out a fine sentence. In his fiction, he writes about things in a way that I can find no traction. He might be saying some things I would be moved by, but his approach leaves me cold. For this reader, he commits the one unforgivable sin—he is uninteresting.

He also seems to lead with an expectation that he will be disappointed. In us, in the universe, in himself. His essays exhibit a glumness that becomes, after a while, a drag on my psyche.

These rules suggest an answer.  He seems really to believe he should be regarded in ways that he fears he is not—and probably isn’t. The nonsense with Oprah led me to see him as pretentious and these rules have convinced me. The regard of the general public moved on in the latter half of the 20th Century as the balkanization of fiction categories multiplied and the position of Great Writer as Conscience of the Culture sort of dissipated.

But that doesn’t mean regard for novels has diminished, nor does it mean the value of those novels has lessened; it only means that no one group can dictate the Standard Model of Significant Fiction anymore. The podium has, in fact, expanded, and the work that constitutes what is most worthy now includes things the Pure Writer seems to feel are beneath them.

____________________________________________________________________________________

*I just realized, rereading this (12/24/18), that this implicitly conflates “information” with Truth, which is a complete misapprehension of the nature of what a writer does. We do not merely convey information (the thing which, if everyone has access to it, becomes devalued) but process our encounter with reality and reconfigure it into some kind of truth-telling, which is, while perhaps dependent on information, not the same thing as simple information.

 

 

Music and Popular Trends

Anyone who knows me for any length of time eventually learns of my sometimes intransigent tastes in music. (Not only music, but whereas other art forms prompt conversations about form and substance that remain largely theoretical, analytical, and impersonal, when it comes to music, especially popular music, things can get a bit touchy.)  I have a minor musical background, I play (or play AT) keyboard and guitar, and in my youth I had fantasies of being a rock star.

I grew up with a wide range of influences, although in the end it was a pretty static assembly. My parents had about fifty records. A wide mix, ranging from Strauss waltzes and Grieg, to Chet Atkins, Bobby Darin, movie soundtracks, Peggy Lee, one odd Tennessee Ernie Ford record, and Les Paul. A few other oddments, including some Gershwin, Nat King Cole, and a couple of jazz records I do not remember clearly. But there were also music programs on television then, the like of which we rarely see anymore, and I was raised with a huge variety. My father was, in his engineering way, a stickler for technique.

That last is important.

When I came of an age to start finding my own music, it led me into some strange byways. When everyone else was going insane with the Beatles, I was listening to Walter Carlos. When the Rolling Stones were the rebellion of choice, I’d stumbled on The Nice. And finally, when I had a budget, the first albums I purchased were Santana, Chicago, and…

Yes.

Fishing in the waters of new rock music, I had no idea who was in, who was out, what the roots of some of this music might be.  I only knew what caught my attention and made me feel good. I heard a Yes tune late one night on our local independent FM station and I never got over it.

Before I understood there were divisions and lines being drawn between various musical styles, I had a very eclectic collection anchored by what became known as Progressive Rock. Along with James Taylor, America, Cream, and the other assorted sonic adventures, you would find in my collection, by the mid-Seventies, not only every Yes album then available but also ELP, Jethro Tull, Genesis, Renaissance, and a smattering of others, all of whom by the end of the decade were being heaped with derision in the music press and by a growing crowd of discontents who pursued Punk, Disco, or New Wave, declaring that Prog was pretentious snob music.

I never heard anything but grandeur and emotional transcendence.  Later, after the ash had settled and the music scene had burned to the ground and been rebuilt in dozens of smaller abodes, I realized that what I was hearing in Prog was a modern attempt to capture what one heard in Beethoven or Tchaikovsky or Sibelius. I wholly approved.

But it became a sore point over time when the inevitable conversations about “good” music became more and more balkanized over what I eventually decided was a kind of reverse snobbishness, if not outright anti-intellectual protest against sophistication, skill, and imagination. I heard the same kinds of criticisms from people who took regular potshots at science fiction.

But till now I never paid that much attention to the history. The What Happened aspect.

David Weigel’s new book, The Show That Never Ends, is a solid history of a form that, most people forget, dominated popular music for almost a decade.  Emerson, Lake, & Palmer were at one time the biggest act on the planet.  Yes, which almost never broke the Top 40 charts, filled arenas.

And then, seemingly overnight, everyone was listening to The Cars, The Sex Pistols, The Police, almost anything Other Than music with the kind of intricacy usually associated with either jazz or classical.

So what did happen?

Weigel writes unsentimentally but with sympathy about how a combination of audience exhaustion, music industry economics, and ultimately the self-destruction of some of the key artists and their own creative exhaustion led to a melange of unsatisfactory products. Self-indulgence, musically and otherwise, introversion, and the jangling disconnect between market demands and pursuit of vision ended up making a bit of a mess that resulted in…

Well, oddly, not the end of progressive rock, because it is still with us, even in new manifestations, and many of the mainstays of the first wave progressives are now respected elder statesmen whose contributions to music are finally being acknowledged.  It is obvious in hindsight that many of the bands who pushed Prog aside in the Eighties and Nineties could not have done the kind of music they did without the tools invented by those Old Pretentious Guys.

When it comes to that music, Weigel displays an admirable understanding of it as composition.  He talks about the construction of these works and what set them apart theoretically from other forms.  It is a pleasure to read his descriptions of how many of the pieces that form the bedrock of progressive rock came about and what makes them fascinating to listen to.

One element of the “downfall” of Prog that Weigel does not touch on, though it is there in the narrative if you care to tease it out, is the unsustainability of one of the effects of some of these acts. Look at Yes, look at early Genesis, look even at ELP, and part of the glamor, the attraction, was that they had built a world. It was almost a literary act, the creation of a whole suite of aesthetic components that offered the illusion that one could enter into it, as if into Narnia, and live there. For a few hours on a concert night, the illusion could be powerful, and the dedicated album art and the philosophizing one read in interviews all added to the illusion.

But in the end it was not really possible, and in the morning there was the real world, and disappointment gradually encroached.  It wasn’t “just” a good concert, but a promise that could not be fulfilled.

For some, maybe many. You had in the end to be an “insider” to get it and finally the huge soap bubble simply could not be sustained.

Ultimately, though, this was the kind of stretching that popular music needed even if the beneficiaries of it did not continue to write and play in that idiom, and as pure music some of it is, indeed, transcendent.

Now that so many of these folks are beginning to pass from the scene, revisiting their contributions, reassessing their output as music rather than as some kind of cultural statement, would seem in order. Weigel’s book would be a good place to start.

 

2017

Looking at my list, I read, cover-to-cover, 51 books in 2017. That doesn’t seem like much to me, but knowing there are people, even people I know, who haven’t read one book in that time, it’s probably in the top something-or-other of national averages. At 63, I’m not sure I care anymore. It never was about quantity, as I’ve told myself time and again, but there are so many good books and I want to read them all!

We have engaged another study group this year. Rousseau. When we agreed to join, we thought we were doing just one of his works, his Second Discourse on Inequality. Come to find out, our guiding light wants to cover all the major Rousseau. Next up is Emile. I haven’t read Emile since high school. I remember little about it, other than it served to enrich a later reading of C. J. Cherryh’s Cyteen. Very well. Rousseau it is.

But in 2017, I felt torn between two kinds of reading. Reading to escape—because, really, look at the world—and reading to understand how to deal with reality.

A third category was the books for my science fiction group at Left Bank Books. Twelve titles, mostly selected by me, but voted on now by the whole group. My intention in this group is to read a mix of new(ish) and classic. This year we’ll be doing our first nonfiction title.

It’s given me a chance to reread some of my favorites. In almost every instance, I’ve found a practically new novel. For instance, Delany’s Trouble On Triton. I no longer recall clearly how I felt about it when I read it back in the Seventies, but this time through it was fascinating in that Delany opted to tell the story through the eyes of a person incapable of any kind of satisfaction in what in many ways is practically a paradise (never mind the little war going on). He wrote it as a kind of response to Le Guin’s The Dispossessed and it works quite well as that, flipping the relationship on its head at almost every point. Bron Helstrom is not a misunderstood everyman in a society unwilling to accommodate his uniqueness. Rather, he is a perpetually ill-fitting square peg that refuses, constitutionally, to be accommodated, even by a society that has no qualms trying to help him “fit.”

We also read Joan Vinge’s magisterial Snow Queen. I confess this was the first time I managed to get all the way through. While some of it is a bit dated, I found it a magnificent piece of world-building.

Then there was Ninefox Gambit by Yoon Ha Lee.  On the surface, this is an intense military SF novel, but it works with religious motifs, time and calendars, and the whole notion of long games and ghosts. The details of the world-building are impressive and the motivations for the conflicts are unusual to say the least. There is an element of Mayan cosmology buried beneath the surface, transformed into the kind of Otherness that gives the best science fiction its appeal.

Borne by Jeff VanderMeer is an odd novel. Compared to some of his work, which I find utterly unclassifiable (in the best sense), Borne is much more accessible, even though it presents as bizarre a world as any VanderMeer has ever offered. I came to the conclusion that this is a take on Alice Through The Looking Glass, done with an utterly different set of cultural expectations.

We read Keith Roberts’ Pavane, Chabon’s The Yiddish Policemen’s Union, The Long Way To A Small, Angry Planet by Becky Chambers, Autonomous by Annalee Newitz, Use Of Weapons by Iain M. Banks, Too Like The Lightning by Ada Palmer, and Tim Powers’ The Anubis Gates, all of which generated excellent discussion.

Along with the other newer SF I read this past year, I conclude that the genre has never been healthier, more fascinating and inventive. The quality of imagination and craft have combined to produce some of the best work ever done.

Likewise in science writing, as exemplified by Carlo Rovelli, whose Reality Is Not What It Seems impressed me with its clarity and concision. (I’d been wondering what happened to string theory the last few years and this book told me.)

The Book That Changed America by Randall Fuller is more history than science. It details the initial encounter with Darwin in America and examines its impact. Both its initial welcome by people who saw in it a sound argument against slavery and then its later rejection as the assault on biblical fealty it became.

Siddhartha Mukherjee’s The Gene is likewise marvelously accessible, a history and examination of the Thing That Makes Us Us.

In the same vein, but much more exhaustive in its historicity, was David Wootton’s The Invention of Science, a chronicle of how science came into being. He demonstrates that it was not the revelation popular myth suggests, but a long accumulation of understandings that depended utterly on the social context in which they emerged. (For instance, we have no science as practice without the printing press.) Reviewing first appearances of words and concepts, the book shows how culture had to change before the possibility of the modern concept of science could even begin to take hold. Just realizing that prior to Columbus there was no common understanding of the concept of “discovery” is eye-opening in itself.

Just as enlightening were Charles C. Mann’s pair of New World histories, 1491 and 1493, which together tear away most of the preconceptions about the world Europe destroyed when it crossed the Atlantic.

I read some excellent novels outside genre—Jacqueline Winspear’s well-done Maisie Dobbs series (three of them), The Hot Country by Robert Olen Butler, Kerry Greenwood’s Cocaine Blues, the first Miss Fisher mystery, and travel writer William Least Heat-Moon’s first foray into fiction, Celestial Mechanics. But primarily, I read nonfiction and SF. It was that kind of a year.

As a bookseller, I noticed a sharp rise in the purchase of science books. Overall book sales were generally higher than in past years. People are reading things which seem to offer a more solid grasp of reality. Perhaps this is in reaction to the state of the world or just the state of the country. People seem to want To Know, in very concrete ways, if their selections are indicative. I see less escapism, and when I do see it, it is not the sort that would leave the psyche unchanged.

I already have half a dozen titles read for this year. It will be interesting to see how all this evolves by December.

Good reading to you all.

Taste and Quality

Obliquely, this is about a current debate within science fiction. However, the lineaments of the argument pertain to literature as a whole.  I offer no solutions or answers here, only questions and a few observations.  Make of it what you will.

Reading experience is a personal thing. What one gets out of a novel or story is like what one gets out of any experience and being required to defend preferences is a dubious demand that ultimately runs aground on the shoals of taste.  I once attended a course on wine and the presenter put it this way: “How do you know you’re drinking a good wine? Because you like it.”  Obviously, this is too blanket a statement to be completely true, but he made his point.  If you’re enjoying something it is no one’s place to tell you you’re wrong to do so based on presumed “objective” criteria.  That $200.00 bottle of Sassicaia may fail to stack up against the $20.00 Coppola Claret as far as your own palate is concerned and no one can tell you your judgment is wrong based on the completely personal metric of “I like it/I don’t like it.”

However, that doesn’t mean standards of quality are arbitrary or that differences are indeterminate.  Such are the vagaries and abilities of human discernment that we can tell when something is “better” or at least of high quality even when we personally may not like it.

For instance, I can tell that Jonathan Franzen is a very good writer even though I have less than no interest in reading his fiction.  I can see that Moby-Dick is a Great Novel even while it tends to bore me.  I acknowledge the towering pre-eminence of Henry James and find him an unpalatable drudge at the same time.

On the other end of the spectrum, I can see how Dan Brown is a propulsive and compelling story-teller even while I find him intellectually vacuous and æsthetically tedious.

My own personal list of what may be described as guilty pleasures includes Ian Fleming, Edgar Rice Burroughs (but only the John Carter novels; never could get into Tarzan), and a score of others over the years who caught my attention, appealed for a time, and have since fallen by the wayside, leaving me with fond memories and no desire to revisit.  A lot of the old Ace Doubles were made up of short novels of dubious merit that were nevertheless great fun for a teenager on a lonely afternoon.

I would never consider them Great Art.

Taste is the final arbiter.  But using it to determine quality—rather than allowing quality to determine taste—is doomed because taste changes.  Works you might strenuously defend at one time in your life can over time suffer as your taste and discernment evolve.  It’s sad in one way because it would be a fine thing to be able to summon up the same reactions experienced on one of those lonely afternoons, aged 16, and poring through the deathless excitement of a pulp adventure you might, given your enthusiasm, mistake for Great Writing.

I try always to make a distinction between things I like and things I think are Good. Often they’re the same thing, but not always, and, like other judgments humans make, they tend to become confused with each other. Hence, debate over merit can take on the aspects of an argument on that day at the base of the Tower of Babel when people stopped understanding each other.

But if that’s all true, then how do we ever figure out which standards are valid and which bogus?  I mean, if it’s ALL subjective, how can any measure of quality ever rise to set the bar?

Fortunately, while personal experience is significant, collective experience also pertains. History, if you will, has taught us, and because art is as much a conversation as a statement we learn what works best and creates the most powerful effects over time. Having Something To Say that does not desiccate over time is a good place to start, which is why Homer still speaks to us 2500 years after his first utterances.  We derive our ability to discern qualities from our culture, which includes those around us informing our daily experiences.  In terms of literature, the feedback that goes into developing our personal values is a bit more specific and focused, but we have inexhaustible examples and a wealth of possible instruction.  We do not develop our tastes in a vacuum.

Honest disagreement over the specific qualities of certain works is part of the process by which our tastes develop. I might make a claim for Borges being the finest example of the short story and you might counter with de Maupassant—or Alice Munro. Nothing is being denigrated in this. The conversation will likely be edifying.

That’s a conversation, though. When it comes to granting awards, other factors intrude, and suddenly instead of exemplary comparisons, now we have competition, and that can be a degrading affair unless standards are clear and processes fairly established. Unlike in a conversation, however, quality necessarily takes a back seat to simple preference.

Or not so simple, perhaps. Because any competition is going to assume at least a minimum of quality that may be universally acknowledged. So we’re right back to trying to make objective determinations of what constitutes quality.

If it seems that this could turn circular, well, obviously. But I would suggest it only becomes so when an unadmitted partisanship becomes a key factor in the process.

This can be anything, from personal acquaintance with the artist to political factors having nothing to do with the work in hand. Being unadmitted, perhaps even unrecognized, such considerations can be impossible to filter out, and for others very difficult to argue against. They can become a slow poison destroying the value of the awards. Partisanship—the kind that is not simple advocacy on behalf of a favored artist but is instead ideologically based, more against certain things rather than for something—can deafen, blind, reduce our sensibilities to a muted insistence on a certain kind of sensation that can be serviced by nothing else. It can render judgment problematic because it requires factors be met having little to do with the work.

Paradoxically, art movements, which are by definition partisan, have spurred innovation if only by reaction and have added to the wealth of æsthetic discourse. One can claim that such movements are destructive and indeed most seem to be by intent. Iconoclasm thrives on destroying that which is accepted as a standard and the most vital movements have been born of the urge to tilt at windmills, to try to bring down the perceived giants.  We gauge the success of such movements by remembering them and seeing how their influence survives in contemporary terms.

Those which did not influence or survive are legion. Perhaps the kindest thing to be said of most of them is that they lacked any solid grasp of their own intent. Many, it seems, misunderstood the very purpose of art or, worse, lacked any comprehension of truth and meaning. More likely, they failed to distinguish between genuine art and base propaganda.

How to tell the difference between something with real merit and something which is merely self-serving? All heuristics are suspect, but a clear signal that something other than pure artistic intent is at play is the advent of the Manifesto. Most are hopelessly locked in their time, and the most innocent of them are cries against constraint. But often there’s an embarrassing vulgarity to them, a demand for attention, an insistence that the work being pushed by the manifesto has merit if only people would see it.

Not all manifestos are signs of artistic vacuity, but those that front for worthwhile work usually fade quickly from service, supplanted by the work itself, and are soon forgotten.  Mercifully.  We are then left with the work, which is its own best advocate.  In hindsight it could be argued that such work would have emerged from the froth all on its own, without the need of a “movement” to advance its cause.  Unfortunately, art requires advocates, beginning with the simplest form of a purchase.  In crowded fields overfull of example, the likelihood of a lone artist succeeding on his or her own, without advocacy, is slim.

Advocacy for an individual artist, by a cadre of supporters, can make or break a career.  And this would of course be a natural development of widespread appreciation.  It’s organic.

Advocacy for a perceived type of art begins to suffer from the introduction of agendas having less to do with the artists than with a commitment to the aforementioned windmill-tilting.

The next phase is advocacy of a proscriptive nature—sorting out what belongs and doesn’t belong, measuring according to a prescribed set of protocols. It has little to do with individual works and much to do with the æsthetic and political prejudices of the movement. The quality of a given work is less important at this stage than whether it “fits” the parameters set by the movement’s architects. Taste plays a smaller and smaller role as the movement meets opposition or fails to advance its agenda. With the demotion of taste comes the desiccation of quality. The evocative ability of art, its facility to communicate things outside the confines of the manifesto-driven movement, eventually becomes a kind of enemy. We’re into the realm of cookie-cutter art, paint-by-numbers approaches, template-driven. Themes are no longer explored but enforced, preferred message becomes inextricable from execution, and the essential worth of art is lost through disregard of anything that might challenge the prejudice of the movement.

This is a self-immolating process.  Such movements burn out from eventual lack of both material and artists, because the winnowing becomes obsessional, and soon no one is doing “pure” work according to the demands of the arbiters of group taste.

As it should be. Anything worthwhile created during the life of the movement ends up salvaged and repurposed by other artists. The dross is soon forgotten. The concerns of these groups become the subject of art history discussions. The dismissal of particular works because “well, he’s a Marxist” or “she was only an apologist for capitalism”—factors which, were they the chief feature of a given work, might very well render it ephemeral, but which in many instances have little to do with content—prompts head-scratching and amusement well after the fury of the controversy around them has faded.

Given this, it may seem only reasonable that an artist have nothing to do with a movement.  The work is what matters, not the fashions surrounding it.  Done well and honestly, it will succeed or fail on its own, or so we assume.

But that depends on those ineffable and impossible-to-codify realities of quality and taste.  Certainly on the part of the artist but also, and critically, on the part of the audience.

Here I enter an area difficult to designate. The instant one demands a concrete description of what constitutes quality, the very point of the question is lost. Again, we have heuristics bolstered by example. Why, for instance, is Moby-Dick now regarded as a work of genius, by some even as the great American novel, when in its day it sold so poorly and its author died in near-complete obscurity? Have we become smarter, more perceptive? Has our taste changed? What is it about that novel which caused a later generation than Melville’s contemporaries to so thoroughly rehabilitate and resurrect it? Conversely, why is someone like Jacqueline Susann virtually unremarked today after having been a huge presence five decades ago?

I have gone on at some length without bringing up many examples, because taste and quality are so difficult to assess. What one “likes” and what one may regard as “good” are often two different things, as I said before, and they have as much to do with our expectations on a given day of the week as with anything deeply considered and well examined. My purpose in raising these questions—and that’s what I’ve been doing—has to do with a current struggle centering on the validity of awards as signs of intrinsic worth.

The best that can be said of awards as guideposts to quality is that if a group of people, presumably in possession of unique perspectives and tastes, can agree upon a given work as worthy of special note, then it is likely a sign that the work so judged possesses what we call Quality. In other words, it is an excellent, indeed exceptional, example of its form. I’ve served on a committee for a major award, and over the course of months the conversations among the judges proved educational for all of us and eventually shed the chaff, leaving a handful of works under consideration that represented what we considered examples of the best that year of the kind of work we sought to award.

I never once found us engaged in a conversation about the politics of the work.  Not once.

Nor did we ever have a discussion about the need to advance the cause of a particular type of work.  Arguments over form were entirely about how the choice of one over another served the work in question.  When we were finished, it never occurred to me that a set of honest judges would engage in either of those topics as a valid metric for determining a “winner.”  No one said, “Well it’s space opera and space opera has gotten too many awards (or not enough)” and no one said, “The socialism in this work is not something I can support (or, conversely, because of the political content the faults of the work should be overlooked for the good of the cause).”  Those kinds of conversations never happened.  It was the work—did the prose support the premise, did the characters feel real, did the plot unfold logically, were we moved by the story of these people.

Consensus emerged.  It was not prescribed.

This is not to say other metrics have no value, but they can be the basis of their own awards.  (The Prometheus Award is candidly given to work of a political viewpoint, libertarianism.  It would be absurd for a group to try to hijack it based on the argument that socialism is underrepresented by it.)  But even then, there is this knotty question of quality.

Here’s the thorny question for advocates of predetermined viewpoints: if an artist does the work honestly, truthfully, it is likely that the confines of manifesto-driven movements will become oppressive and that artist will do work that, eventually, no longer fits within those limits. To complain that the resulting work is “bad” because it no longer adheres to the expectations of that group is as wrongheaded as declaring a work “good” because it does toe the proper line.

Because that line has nothing to do with quality.  It may go to taste.  It certainly has little to do with truth.

Future Historicity

History, as a discipline, seems to improve the further away from events one moves. Close up, it’s “current events” rather than “history.”  At some point, the possibility of objective analysis emerges and thoughtful critiques may be written.

John Lukacs, Emeritus Professor of History at Chestnut Hill College, understands this and at the outset of his new study, A Short History of the Twentieth Century, allows for the improbability of what he has attempted:

Our historical knowledge, like nearly every kind of human knowledge, is personal and participatory, since the knower and the known, while not identical, are not and cannot be entirely separate.

He then proceeds to give an overview of the twentieth century as someone—though he never claims this—living a century or more further on might. He steps back as much as possible and looks at the period under examination—he asserts that the 20th Century ran from 1914 to 1989—as a whole, the way we might now look at, say, the 14th Century or the 12th, and so on. The virtue of our distance from these times is our perspective—the luxury of seeing how disparate elements interacted even as the players on the ground could not see them, how decisions taken in one year affected outcomes thirty, forty, even eighty years down the road. We can then bring an analysis and understanding of trends, group dynamics, political movements, demographics, all that goes into what we term culture or civilization, to the problem of understanding what happened and why.

Obviously, for those of us living through history, such perspective is rare if not impossible.

Yet Lukacs has done an admirable job. He shows how the outbreak and subsequent end of World War I set the stage for the collapse of the Soviet Empire in 1989, the two events he chooses as the bookends of the century. He steps back and looks at the social and political changes as the result of economic factors largely invisible to those living through those times, and how the ideologies that seemed so very important at every turn were more or less byproducts of larger, less definable components.

It is inevitable that the reader will argue with Lukacs. His reductions—and expansions—often run counter to what may be cherished beliefs in the right or wrong of this or that. But that, it seems, is exactly what he intends. This is not a history chock-full of the kind of detail used in defending positions—Left, Right, East, West, etc.—and it is often stingy with detail. Rather, this is a broad outline with telling opinions and the kind of assertions one might otherwise not question in a history of some century long past. It is intended, I think, to spur discussion.

We need discussion.  In many ways, we are trapped in the machineries constructed to deal with the problems of this century, and the machinery keeps grinding even though the problems have changed.  Pulling back—or even out of—the in situ reactivity seems necessary if we are to stop running in the current Red Queen’s Race.

To be sure, Lukacs makes a few observations to set the back teeth on edge. For instance, he dismisses the post-World War II women’s consciousness and equality movements as byproducts of purely economic conditions and the mass movement of the middle class to the suburbs. He has almost nothing good to say about any president of the period but Franklin Roosevelt.

He is, certainly, highly critical of the major policy responses throughout the century, but explains them as the consequence of ignorance, which is probably true enough.  The people at the time simply did not know what they needed to know to do otherwise.

As I say, there is ample here with which to argue.

But it is a good place to start such debates, and it is debate—discussion, interchange, conversation—that seems the ultimate goal of this very well-written assay.  As long as it is  debate, this could be a worthy place to begin.

He provides one very useful definition, which is not unique to Lukacs by any means, yet remains one of those difficult-to-parse distinctions for most people and leads to profound misunderstandings.  He makes clear the difference between nations and states.  They are not the same thing, though they are usually coincidentally overlapped.  States, he shows, are artificial constructs with borders, governmental apparatus, policies.  Nations, however, are simple Peoples.  Hence Hitler was able to command the German nation even though he was an Austrian citizen.  Austria, like Germany, was merely a state.  The German People constituted the nation.

Lukacs—valuably—shows the consequences of confusing the two, something which began with Wilson and has tragically rumbled through even to this day. States rarely impose a national identity; they always rely on one already extant—though often largely unrealized. And when things go wrong between states, quite often it is because one or the other has negotiated national issues with the wrong party.

Which leads to an intriguing speculation—the fact that nativist sympathies really do have a difficult time taking root in this country.  Americans do not, by this definition, comprise a Nation.  A country, a state, a polity, certainly.  But not really a Nation.

And yet we often act as if we were.

Questions.  Discussion.  Dialogue.  This is the utility and virtue of this slim volume.

End Times

The Sixties.

Depending on what your major concerns are, that period means different things.  For many people, it was revolution, civil rights, the peace movement.  For many others, it was music.

For Michael Walker, it was evidently the latter.  In his new book, What You Want Is In The Limo,  he chronicles what he considers the End of the Sixties through the 1973 tours of three major rock groups—The Who, Led Zeppelin, and Alice Cooper.

His claim, as summarized in an interview about the book, is that after Woodstock, the music industry realized how much money could be made with this noisy kid stuff (which by Woodstock it no longer was—kid stuff, that is) and started investing heavily, expanding the concert scene, turning it from a “cottage industry” into the mega-million-dollar monster it has become. 1973, according to Walker, is the year all this peaked for the kind of music that had dominated The Sixties, made the turn into rock star megalomania, and ushered in the excesses of the later Seventies and the crash-and-burn wasteland of the Punk and New Wave eras (with a brief foray into Disco and cocaine before the final meltdown).

The bands he chose are emblematic, certainly, but of the end of the Sixties?  I agree with him that 1973 is the year the Sixties ended, but the music aspect, as always, was merely a reflection, not a cause.  What happened in 1973 that brought it all to an ignominious close was this: Vietnam ended.

(Yes, I know we weren’t out until 1975, but in 1972 Nixon went to China, which resulted in the shut-down of the South China rail line by which Russia had been supplying North Vietnam, and in 1973 the draft ended, effectively deflating a goodly amount of the rage over the war.  The next year and a half was wind-down.)

Walker’s analysis of the cultural differences before and after 1973 is solid, but while the money was certainly a factor, a bigger one is exhaustion.  After a decade of upheaval over civil rights and the war in Vietnam, people were tired.  Vietnam ended and everyone went home.  Time to party.  Up to that point, the music—the important music, the music of heft and substance—was in solidarity with the social movements, and protest was a major component of the elixir.  Concerts were occasions for coming together in a common aesthetic, the sounds that distinguished Woodstock acting as a kind of ur-conscious bubble, binding people together in common cause.

Once the primary issues seemed settled, the music was just music for many people, and the aspects which seemed to have informed the popularity of groups like Cream or the Stones or the Doors lost touch with the zeitgeist.  What had begun as an industry of one-hit wonders returned to that ethic, and pseudo-revolutionary music began to be produced to feed the remaining nostalgia.

(Consider, for example, a group like Chicago, which began as a socially conscious, committed-to-revolution act—they even made a statement to that effect on the inside cover of their second album—and yet by 1975 were cashing in on power ballads and love songs, leaving the heavily experimental compositions of their first three albums behind and eschewing their counter-culture sensibilities.)

To my mind the album that truly signified the end of that whole era was The Moody Blues’ Seventh Sojourn, which was elegiac from beginning to end.  The last cut, I’m Just A Singer In A Rock’n’Roll Band, was a rejection of the mantle of guru bestowed on many groups and performers during the Sixties.  With that recording, the era was—for me—over.

Also for me, Alice Cooper never signified anything beyond the circus act he was.  Solid tunes, an edgy stage act, and all the raw on-the-road excess that was seen by many to characterize supergroups—but most of Cooper’s music was vacuous pop-smithing.  The Who and Led Zeppelin were something else, and both of them signified much more in artistic terms.  Overreach.

But interestingly enough, different kinds of overreach.  Walker talks of the self-indulgence of 45-minute solos in the case of Zeppelin, but this was nothing new—Cream had set the standard for seemingly endless solos back in 1966, and Country Joe McDonald produced an album in the Nineties with extended compositions and solos.  Quadrophenia was The Who’s last “great” album, according to Walker, and I tend to agree, but two kinds of exhaustion are at work in these two examples.  Zeppelin exhausted themselves in the tours and the 110% performances.  The Who exhausted the form in which they worked.  After Quadrophenia, all they could do was return to a formula that had worked well before, but which now gained them no ground in terms of artistic achievement.  As artistic statement—as an example of how far they could push the idiom—that album was a high watermark that still stands.  But the later Who Are You? is possibly their best-crafted work after Who’s Next.  “Greatness”—whatever that means in this context—had not abandoned them.  But the audience had changed.  Their later albums were money-makers with the occasional flash of brilliance.  They were feeding the pop machine while trying to compose on the edge, a skill few manage consistently for any length of time.

“Excess” is an interesting term as well.  Excess in what?  The combination of social movement with compositional daring had a moment in time.  When that time passed, two audiences parted company.  Those who wanted to party (often nostalgically) and those who were truly enamored of music as pure form.  They looked across the divide at each other and the accusation of excess was aimed by each at different things.  The one disdained the musical excess of the other, while the latter loathed the social excess of the former.  People gleefully embracing Journey, disco, punk, and a gradually resurgent country-western genre thought the experimental explorations of the post-Sixties “art rock” scene were self-indulgent, elitist, and unlistenable.  People flocking to Yes and Emerson, Lake & Palmer concerts, cuing up Genesis and UK on their turntables (and retroactively filling out their classical collections), found the whole disco scene and designer-drug culture grotesque.  Yet in many ways they had begun as the same social group, before the End of the Sixties.

The glue that had bound them together evaporated with the end of the political and social issues that had produced the counterculture and its attendant musical reflection in the first place.  Without that glue, diaspora.

And the forms keep breaking down into smaller and smaller categories, which is in its own way a kind of excess.  The excess of pointless selectiveness.

Is the Novel Still Dying?

In 1955, Norman Mailer was declaring the death of the novel. A bit more than a decade later, it was John Barth’s turn.  There has now been a string of writers of a certain sort who clang the alarm and declare the imminent demise of the novel, the latest being a selection of former enfants terribles like Jonathan Franzen and David Foster Wallace.

Philip Roth did so a few years back, adding that reading is declining in America.  The irony of this is that he made such claims at a time when polls suggested exactly the opposite, as more people were reading books in 2005 (as a percentage of the adult population) than ever before.  In my capacity as one-time president of the Missouri Center for the Book I was happily able to address a group of bright adolescents with the fact that reading among their demographic had, for the first time since such things had been tracked, gone precipitously up in 2007.

And yet in a recent piece in the Atlantic, we see a rogues’ gallery of prominent littérateurs making the claim again that the novel is dying and the art of letters is fading and we are all of us doomed.

Say what you will about statistics, such a chasm between fact and the claims of those one might expect to know has rarely been greater.  The Atlantic article goes on to point out that these are all White Males who seem to be overlooking the product of everyone but other White Males.  To a large extent, this is true, but it is also partly deceptive.  I seriously doubt, if directly challenged, any of them would say works by Margaret Atwood or Elizabeth Strout fall short of any of the requirements for vital, relevant fiction at novel length.  I doubt any of them would gainsay Toni Morrison, Mat Johnson, or David Anthony Durham.

But they might turn up an elitist lip at Octavia Butler, Samuel R. Delany, Tananarive Due, Nalo Hopkinson, Walter Mosley, or, for that matter, Dennis Lehane, William Gibson, and Neal Stephenson (just to throw some White Males into the mix as comparison).  Why?

Genre.

The declaration back in the 1950s that “the novel is dead” might make more sense if we capitalize The Novel.  “The Novel”—the all-encompassing, universal work that attempts to make definitive observations and pronouncements about The Human Condition—has been dead since it was born, but because publishers were once constrained by technology and distribution to a relative handful of works in a given year compared to today, it seemed possible to write the Big Definitive Book.  You know, The Novel.

Since the Fifties, it has become less and less possible to do so, at least in any self-conscious way.  For one thing, the Fifties saw the birth of the cheap paperback, which changed the game for many writers working in the salt mines of the genres.  The explosion of inexpensive titles that filled the demand for pleasurable reading (as opposed to “serious” reading) augured the day when genre would muscle The Novel completely onto the sidelines and eventually create a situation in which the most recent work by any self-consciously “literary” author had to compete one-on-one with the most recent work by the hot new science fiction or mystery author.

(We recognize today that Raymond Chandler was a wonderful writer, an artist, “despite” his choice of detective fiction.  No one would argue that Ursula K. Le Guin is a pulp writer because most of her work has been science fiction or fantasy.  But it is also true that the literary world tries to coopt such writers by remaking them into “serious” authors who “happened” to be writing in genre, trying ardently to hold back the idea that genre can ever be the artistic equivalent of literary fiction.)

The Novel is possible only in a homogenized culture.  Its heyday would have been when anything other than the dominant (white, male-centric, protestant) cultural model was unapologetically dismissed as inferior.  As such, The Novel was as much a meme supporting that culture as any kind of commentary upon it, and a method of maintaining a set of standards reassuring the keepers of the flame that they had a right to be snobs.

Very few of Those Novels, I think, survived the test of time.

And yet we have, always, a cadre of authors who very much want to write The Novel and when it turns out they can’t, rather than acknowledge that the form itself is too irrelevant to sustain its conceits at the level they imagine for it, they blame the reading public for bad taste.

If the function of fiction (one of its functions, a meta-function, if you will) is to tell us who we are today, then just looking around it would seem apparent that the most relevant fiction today is science fiction.  When this claim was made back in the Sixties, those doing what they regarded as serious literature laughed.  But in a world that has been qualitatively as well as quantitatively changed by technologies stemming from scientific endeavors hardly imagined back then, it gets harder to laugh this off.  (Alvin Toffler, in his controversial book Future Shock, argued that science fiction would become more and more important because it taught “the anticipation of change” and buffered its devotees from the syndrome he described, future shock.)

Does this mean everyone should stop writing anything else and just do science fiction?  Of course not.  Science fiction is not The Novel.  But it is a sign of where relevance might be found.  Society is not homogeneous (it never was, but there was a time we could pretend it was) and the fragmentation of fiction into genre is a reflection that all the various groups comprising society see the world in different ways, ways which often converge and coalesce, but which nevertheless retain distinctive perspectives and concerns.

A novel about an upper middle class white family disagreeing over Thanksgiving Dinner is not likely to overwhelm the demand for fiction that speaks to people who do not experience that as a significant aspect of their lives.

A similar argument can be made for the continual popularity and growing sophistication of the crime novel.  Genre conventions become important in direct proportion to the recognition of how social justice functions, especially in a world with fracturing and proliferating expectations.

Novel writing is alive and well and very healthy, thank you very much, gentlemen.  It just doesn’t happen to be going where certain self-selected arbiters of literary relevance think it should be going.  If they find contemporary literary fiction boring, the complaint should be aimed at the choice of topic or the lack of perception on the part of the writer, not at any kind of creeping morbidity in the fiction scene.

Besides, exactly what is literary fiction?  A combination of craft, salient observation, artistic integrity, and a capacity to capture truth as it reveals itself in story?  As a description, that will do.

But then what in that demands that the work eschew all attributes that might be seen as genre markers?

What this really comes down to, I suspect, is a desire on the part of certain writers to be some day named in the same breath with their idols, most of whom one assumes are long dead and basically 19th Century novelists.  Criticizing the audiences for not appreciating what they’re trying to offer is not likely to garner that recognition.

On the other hand, most of those writers—I’m thinking Dickens, Dumas, Hugo, Hardy, and the like—weren’t boring.  And some of the others—Sabatini, Conan Doyle, Wells—wrote what would be regarded today as genre.

To be fair, it may well be that writers today find it increasingly difficult to address the moving target that is modern culture.  It is difficult to write coherently about a continually fragmenting and dissolving landscape.  The speed of change keeps going up.  If such change were just novelty, and therefore essentially meaningless, then it might not be so hard, but people are being forced into new constellations of relationships and required to reassess standards almost continually, with information coming to them faster and faster, sometimes so thickly it is difficult to discern shape or detail.  The task of making pertinent and lasting observations about such a kaleidoscopic view is daunting.

To do it well also requires that that world be better understood almost down to its blueprints, which are also being redrafted all the time.

That, however, would seem to me to be nothing but opportunity to write good fiction.

But it won’t be The Novel.

____________________________________________________________________

Addendum:  When I posted this, I was challenged about my claim that Mailer said any such thing.  Some suggested Philip Roth, others went back even further, but as it turns out, I have been unable to track down who said exactly what and when.  Yet this is a stray bit of myth that refuses to die.  Someone at some time said (or quoted someone saying, or paraphrased something) that the Novel Is Dying, and the claim persists.  It has become its own thing, and finding who did—or did not—say it may be problematic at best.  It is nonetheless one of those things that seems accepted in certain circles.  It would be helpful if someone could pin it down, one way or the other.

Rules of the Road

Books have speed limits.

Some you can breeze through, a quick run along a sunny straightaway, windows open and wind in your ears.  Others demand that you slow down, pay attention, move with care.

For the slow-but-dedicated reader, if there is a special plea or prayer upon picking up a new, dense book, it should be “Please don’t waste my time.”  I’m one of those readers who feels a compulsion to finish what I start.  I seem constitutionally incapable of just putting a book aside part way through and never coming back to it.

Oh, I’ve done it!  Years later, though, if I stumble across that book, ignored in a box or on the shelf of someone to whom I’ve loaned it, I experience a moment of guilt, a regret that I somehow betrayed it, and that it may take umbrage for having been dallied with and left for another.   “So…a restless spirit haunts over every book, till dust or worms have seized upon it, which to some may happen in a few days, but to others later…” according to Jonathan Swift.  Death to a book is to be ignored.

But we are mortal and have only so much time.  I have a crabbed admiration for people who can decide within ten pages that what follows is not worth their time and can put it aside without a twinge of conscience.  (Crabbed because another part of me keeps wondering what they’re missing.)

“Please don’t waste my time.”  For me, part of the problem is an inability to know if a book will be a waste of time.  Possibly some paragraph or a chapter or even a line from a character will make the whole thing worthwhile.

I read in anticipation.  This is true of all books.  I’m looking for something.  Surprisingly, I find it more often than not, and in this I have to count myself fortunate.  I have read books not worth my time and most of them I have forgotten.  But something of them lingers and sometimes I recognize them again, at least as a type, and before I make the mistake of reading the first chapter (if I get past page 30 I’m trapped, I must go on) I avoid what I sense is coming.

My own discipline notwithstanding, that is my main rule of this particular road: Don’t waste my time.  I can always be proven wrong, but there are certain books I’m not interested in reading.  Certain kinds of books I should say.  I don’t want to hurt their feelings, I don’t want to feel betrayed.  Books are like people—some are compatible, others should be avoided.

I’ll likely never read another Stephenie Meyer book.  Ever.  (I read The Host—don’t worry, I was paid to, for a professional review—and I found it exceptionally dull, too long for its weight, derivative, and a cheat.  That’s the worst reaction I’ve had to a novel in decades.)  On the other hand, I will likely read everything Iain M. Banks publishes.

I recently read Blue Highways by William Least Heat Moon and it’s the perfect example of a book with a speed limit.  Slow down, pass with care, deer crossing ahead.  A paradox, because it is a broad, multi-laned road with no posted limits.  But zipping through it would be to miss everything.  Autobiographical in the sense that Heat Moon is the narrator and the trip recorded was his hegira around the continental United States—but not in the sense that it is about him.  While it is impossible that his own self and life could be kept entirely out, it’s about the road and its impact on the traveler he was.  The main character is the journey.  The sights along the way demand attention.  You do not speed read a book like this, which is reflected in the stated purpose of his travels on the back roads, state highways, and some seldom-used tracks in parts of the country most of us have no idea exist.  If you want to go fast, take the superhighways.  And see nothing.

But if you’re interested in landscape, in impressions of setting on character, on the topography of perception and topology of awareness…

That’s the kind of book I intend to talk about here.

Another rule of the road for me is going to be the advocacy of writers and of local bookstores.  Writers do what they do because they—most of them—love it.  It can be a difficult relationship, to be sure, but the deeper the love the better the result.  Which by extension invites the further relationship with the reader.  Where you first meet is actually important.  Buying from big chains is like engaging a series of one-night-stands.  Buy your books locally, get to know your bookseller, and by extension support the writer.

More on that later.

As to the kinds of books I love to read and which I’ll write about here, well…I said I am a slow reader.  Once, back in high school, I took a speed-reading course.  By the time I graduated, I was reading about 2,500 words a minute.  I could go through an average-sized book in an evening if I wanted.  I read a lot of books that way.

What I did not have was very much fun.  I slowed down intentionally.  It occasionally takes me an inordinate length of time to read a book.  I get through about 70 to 80 a year, cover to cover, and I’m usually reading 3 or 4 simultaneously.

Which means I am not “current.”  The last book I finished was Gore Vidal’s Lincoln, which was published back in 1984.  I finished it just before the New Year.  It, too, demanded careful reading and I took my time.  Point being, I am irretrievably “behind” in my reading and will likely remain so—largely without regret or much feeling that it’s causing me any harm.

But I do read new books now and then and I may read more for the purposes of this column.

Another caveat:  I make a distinction between “like” and “good.”  There are plenty of books we read that we like but which, by any metric of craft or art, are not especially good.  I read Edgar Rice Burroughs’s John Carter of Mars novels when I was a kid and I really, really liked them.  From time to time, I pick one up again and I find I still like them, but I can’t say they’re very good.  On the other hand, I read James Joyce’s Ulysses and have no hesitancy at all declaring it to be not only a good book but a great one—but I didn’t particularly like it.  I will do my best to state my prejudice in this regard when it seems relevant.

To be sure, if I write about it here, it meant something to me.  It had an impact.  It was a worthwhile journey.  But none of us ever really go down the same road, even if we get on at the same ramp.  My journey won’t be the same as anyone else’s.

And isn’t that the very best thing about good books?