An Observation: The Personal and the Poetic

Formative influences can be doggedly resistant to reassessment. There’s some accuracy in suggesting that they should be left alone. But once the idea occurs, leaving it alone can become a species of stubbornness rather than any kind of self-nurture. 

Still, care is required, especially in an age in which so much information, much of it of only marginal relevance to the main subject, is available and forms the basis of a kind of revisionism that too often only serves to widen the gulf between beginnings and the present. Analyzing a body of work in the light of personal revelations is tricky. Certainly there’s a connection, but how and with what effect is a subtler question than the ready dismissals of previously seminal work in the light of a creator’s shortcomings allow.

On those rare occasions my opinion about this is solicited, I say that “If you find someone whose work you really like, then go read it all, see it all, hear it all before you find out one personal detail about them. What you later learn about the artist may alter everything, but you should give the work a fair encounter.” Obviously there are exceptions, but few I’ve found that cannot be deduced from the work itself. Deciding in advance that X is a bigot of some sort may be accurate and fair, but even a catastrophe of a human being is capable of producing worthwhile art. (Ezra Pound is still regarded as a poetic genius despite the fact that he was an apologist for fascists. I assume the fascism does not manifest in the work in any deleterious way—I wouldn’t know, I’m simply basing this on the reputation, both of the man and the work.)

On the other hand, I always found something off-putting in D.H. Lawrence’s treatment of women. In its day, perhaps, it seemed radical and somewhat enlightened, but despite the beauty of much of his writing, it somehow struck an off note. Later, when I learned about his life, some of this made sense. But had I known about him beforehand, I might never have read the work. Worse, I might have dismissed it as not worthwhile in a more general sense. As it is, my understanding of the work is enriched by the later knowledge in a way that does not bleed the work of its artistic value.

We can go down the list. Great artists with personal characters problematic at best who nonetheless produced amazing work the world would be less for ignoring because…

The quasi-academic practice of reanalyzing such works in light of current standards of behavior only to relegate such artists to a suspect file can do damage in a different way. Among the various problems is the conclusion that an artist cannot be more than his or her personal limitations. That, somehow, a given artist cannot be “trusted” once such personal scandals are revealed.

Trusted how?

This can be particularly difficult in our own personal relation to, say, first influences.

I credit Isaac Asimov with the work that set me on a path to being a writer. Of late, his personal tendencies to be a, hmm, “dirty old man” have cast a pall over his reputation. Fair enough. He wasn’t an exemplary human being. His habit of forcing himself—publicly—on unwilling women with uninvited kisses is cringe-worthy. This is the hallmark of someone who in many ways was still an adolescent, albeit one with a sense of privilege born of reputation.

But what does that have to do with the Foundation Trilogy?

I read Foundation and Empire when I was 13. Because of the nature of where I got my books then (Luekens Drug Store, from a spinner rack just inside the door), I got what was available. I had no idea about ordering or anything, I just perused the rack and bought what looked cool. (This was the same place I got my comics.) So the second book in the series was the first one I saw. It surprises me now that I fell into it so easily, but then, when I later learned that these three books are really just compilations of short stories and novelettes, it made sense. I didn’t have to read them in order, though that helped.

There was something vast and impressive on the page, the scope he conveyed in a few paragraphs, and the epic importance of what was happening. This connected with my young imagination in ways that are difficult to convey, other than by pointing out that first encounters that become touchstones seem to carry with them a universal sense of vitality and significance against which everything else is diminished. (I find the same issue when discussing with anyone under, say, 45 the impact that the original Star Trek had on us.) All I remember afterward was how badly I felt the urge to create something that did the same thing. Later I realized that this meant writing.

Soon after, I discovered I, Robot and then the rest of Asimov’s novels and short stories.

His treatment of women was, in retrospect, prepubescent. Virtually blank slates. There were women. Sometimes men married them. (He managed Arkady as well as he did by sticking to her youth sans sexuality. Which made her like Nancy Drew or a Bobbsey Twin. Unsatisfying for a more mature reader, but nothing terrible.) The closest he came to maturity in fiction was in The Gods Themselves, but that is a curious case, and nothing much is actually there. It might be argued that his lack of female characters as characters who are women is pathetic, but I see it as someone who knew virtually nothing about women avoiding the topic lest he make a fool of himself. (He did anyway, as in The Stars Like Dust, but this is a matter of complete cluelessness, not a manifestation of hidden perversity.) Much of science fiction published in the 1940s and 50s is like this. Many factors played a role, not least of which was editorial expectation. The general expectation of women’s “place” was pervasive and retrograde and awaited the social revolutions yet to come before people raised to not notice would become aware. Two magazines, The Magazine of Fantasy and Science Fiction and Galaxy, were launched partly on the grounds of writers feeling constrained by such innate prudery, but even in these, examples of what we might consider responsible views of women and relationships were the exceptions. Mostly, it was a vast unexplored sea that awaited writers with the chops to deal with the subject more fully. And a publishing environment that allowed for it.

My point being, his later personal proclivities, unpleasant as they were, seem not to imbue his fiction at all, except as an absence.

There are many writers (and painters and musicians and actors, etc.) whom I doubtless would dislike personally and some of whom I would have serious problems with, whose work I nonetheless have enjoyed and value. I do not believe we are reducible to single traits. When engaged in an act of creation, my past certainly comes into play, but the requirements of the work put me in a mode outside of my daily tactics. I give the work authority over my private foibles. It may not always work, but I hope (and believe) that the result defies analysis by biographical specificity.

In other words, the work is a thing unto itself. It may be flawed, it may fail, and certainly some of those failures may be traceable to personal aspects of the way I see the world, but the work remains its own thing, to be judged by its own content. This is a standard of apprehension that, for me, is only fair, and seeks to avoid a priori condemnation based on similar personal aspects of a given viewer/reader/listener. The work is the work. 

Exposure to honest work done by flawed people is one way to learn to recognize propaganda, which is dishonest work done for flawed reasons. If we do not learn the difference, then art has failed us.

Clear-Eyed And Informed

One of the quickest ways to end conversations in casual social gatherings is to contradict someone expounding on myth, hearsay, and bad history. You’ve been there; we all have. Someone at some point starts holding forth on some chestnut of popular apprehension and repeats a story that has suffered the manifold revisions of a game of telephone that render the story factless, in service to a line of self-aggrandizing chest-beating at the expense of the truth. Stories many of us take for granted in the first place and, because we’ve never heard or bothered to find out the real story, assume to be accurate. We grow accustomed to thinking about these stories this way and then, when it might matter in ways we never anticipated, we don’t know that they have prepared the ground for us to swallow bigger misconstruals and even outright lies.

Into this, occasionally, steps someone who knows better and points out the flaws in the presentation. A curious thing usually happens. Either the conversation turns away from that topic or everyone gets angry. Not at the one disseminating the broken narrative but at the lone voice that contradicts the nonsense. Such reactions lead eventually to one of two interpretations—either that people in general don’t care what the facts are or, relatedly, that they like the error-laden chestnut more than the reality.

The reception of the corrective information can have a chilling effect on the one offering the facts. No one likes to be ostracized.

It can be puzzling. What is it about these skewed narratives that people prefer? Well, almost always there is something about them that makes people feel good about themselves—about their patriotism, their beliefs, their affiliations, but mostly about their ignorance. Once we leave school, most of us feel we are done with homework, which no one really liked anyway, and the idea that we may be less knowledgeable than perhaps we need to be just suggests that we need to do more homework. I suspect there’s an unexamined aspect of psychology that says that to be an Adult is to already have all the skills and knowledge we need. More study seems justifiable only if it leads to higher income. Even then, excuses can be made to avoid it.

But the reality is we need always to know more, especially about stories we think we know. The how, why, and wherefore of our history feeds into present issues in ways that, if we are ignorant, can lead to political and social traps.

When reading a first-class historian like Jill Lepore, one becomes aware of how tangled those webs into which we might fall can be. For those of us who may delight in being that one person at the party who will speak fact to ignorance, her books are a delight.

Her latest, The Deadline, is a collection of essays designed to counter the shallow, poorly understood history that underlies so many of the canards foisted upon us daily as truth. They are also a pleasure to read.

In several books, Lepore has displayed an approach to her subjects that bypasses the various filters with which we view our history, opening side entrances into the underlying realities of which modern myths are formed. She examines the cultural touchstones by which we navigate the pathways of our presumably common identity.

Here, we find a range of essays that cover most of American history, topical subjects, thorny personal issues, memoir, and observations about the nature of knowing—or not knowing—what’s going on. Quite a few pieces are about the business of news itself, covering processes and personalities, and giving us a glimpse of how what we think we know too often comes to us “prepared” so that a particular message is put forth, even while it is possible to find out what the other facts are. On that point, we learn that there is nothing new about “fake news” other than the delivery vectors (and perhaps the speed with which it comes at us), but that even when such distortions seem impossible to counter, somehow we seem not to be fooled for long. That may be changing, though, and Lepore gives us her perspective on that as well.

Essentially, Lepore gives us a clear-eyed view of ourselves and our proclivities, often with the unpleasant but unsurprising conclusion that if we are fooled, it’s because we wish to be. But really there is no excuse for blindly reacting to hormone-spiking jabs at our panic buttons. We just need to know a little better.

As I say, Jill Lepore has become one of my favorite historians. She has a quirky set of interests (she did a marvelous book about the creator of Wonder Woman as well as penning one of the most interesting histories of the United States I’ve read in a long time) and this allows her to approach even the most convoluted subjects in ways that consistently illuminate. Along the way, she lets us know that one of the best ways to not be fooled is to refuse to accept the soundbite, the meme, or the two-minute report as the end of the story. While each may well contain a grain of truth, we have to understand that it’s only a grain and all that went into it is so much more interesting, richer, and liberating.

Cinema Versus ‘Theme Parks’

“I don’t see them. I tried, you know? But that’s not cinema.  Honestly, the closest I can think of them, as well made as they are, with actors doing the best they can under the circumstances, is theme parks. It isn’t the cinema of human beings trying to convey emotional, psychological experiences to another human being.”

Martin Scorsese said that in an interview about Marvel superhero movies. The observation has sparked some controversy. A lot of people heard him trashing their favorite form of movie; others—including Francis Ford Coppola—found resonance with his statements.

The part of his statement I disagree with is the part I hear every time someone from the literary world suggests science fiction is not “real” literature—because it doesn’t deal with humans experiencing authentic emotions in a meaningful context. In his own way, Mr. Scorsese has recast the classic dismissal of science fiction and fantasy with regard to film.

To which I would say, “Care to justify that in terms of cinema as a whole?” It can be argued, I think, that the gangster film on which Scorsese made his reputation is not a milieu of ordinary people having emotional experiences in common with their audience, but of a distinct subset of humanity that distorts itself into an extreme condition to pit itself against the world. Their experiences are by definition, at least in cinema, going to be over-the-top, magnified, and at odds with the common. The backgrounds are likewise going to be exaggerated and often surreal, set-pieces to support encounters of violence and passions pared down by adrenaline to caricatures of ordinary daily experience. They “entertain” for precisely those factors that for two hours remove us from our mundane lives and give us entrée into lives we will (hopefully) never take part in. The point of them is to allow a vicarious experience completely out of the ordinary.

They are anchored to us by asking “How would we react in the same circumstances?” and honestly following the thread of answers to what connects these people to us.

But the characters themselves, while often despicable, are extraordinary.

As are the characters of the gunslinger, the private detective, the cop, the soldier, the knight, the barbarian, etc.

It is their extraordinariness that attracts us, holds our attention, and carries us along through unlikely adventures to, one hopes, a satisfying and cathartic conclusion.

How is that any different than what we see in Captain America? Iron Man? Thor?

Oh, they come from the worlds of science and fantasy and wield unusual abilities.

So, once again, because they appear to us in the context of science fictional settings and offer challenges outside historical experience, they are not legitimate cinema…

To an extent, Scorsese has a point. They do offer “theme park” rides. It takes a rather extraordinary film like Winter Soldier or, stepping to a different franchise, Wonder Woman to see the genuine human story beneath the glossy, glitzy, hyper-realized settings, but it’s there. And for those films that fail to deliver that human element, well, it’s not that they aren’t cinema, they’re just bad cinema.

But “cinema” has always indulged the exotic, the novel, the visually unique to achieve what may be argued to be its primary advantage as a medium. The full embrace of the exotic cannot be used to reclassify certain films as “not cinema” because they utilize exactly that potential.

No, this is another version of reacting against a genre because you don’t get it. It’s the reason several excellent SF films failed for years to find notice with the Academy—they were that “spacey kid stuff.” Now good SF is finally being recognized by the Academy, leaving the position of poorly-regarded déclassé genre in need of a new resident, and in this instance Messrs. Scorsese and Coppola elect the big superhero franchises.

Let’s face it—there have always been superhero films. Dirty Harry is a species of superhero, as are Jason Bourne and James Bond. Chuck Norris and Steven Seagal have made their share of superhero films. And when you think about it, just about any Western where the hero faces impossible odds and wins is a superhero film. One could go down the list and find just cause to name any number of historical or quasi-historical epics as members of that club. Robin Hood is a superhero. The Lone Gunman story is a species of superhero film. And these all draw from various mythologies that are readily accessible as superhero stories. Hercules, Cuchulain, Gilgamesh, Samson…

Of course these films are cinema. Just as science fiction is literature.

You just have to speak the language.

The Downside of Expanded Participation?

It occurred to me the other day that there is a serious problem with the way audiences and films interact these days. It’s a relatively new problem, one that has grown up with social media, but it has roots in an older aspect of film production, namely the test screening. The idea being that before a general release, a film is shown to select audiences to gauge reactions and tweak the final cut before it is set free into the zeitgeist.  There’s logic to it, certainly, but I’ve always been uncomfortable with it because it’s an attempt to anticipate what should be an honest reaction to a work of art.  I try to imagine Rembrandt showing a painting to a client halfway or two-thirds finished and, depending on the reaction, going back to change it to conform to some inarticulate quibble on the part of someone who has no idea what should be on the canvas. Art, to a large extent, is a gamble, and test screenings are the equivalent of loading the dice or counting cards.

It’s understandable, of course, because a movie is not a painting done by one person, but a hugely expensive collaborative work with investors and questions of market share. But it still bothers me. (What if a test audience had complained that Bogey didn’t get Bergman at the end of Casablanca and the studio went back to change it to suit?)

Today there’s another phenomenon, related to the test audience but even more invasively surreal: the pre-assessment by fans ahead of release. Sometimes years ahead.

This obsessive speculation has evolved into a form of semi-creative wheel-spinning that mimics a huge test audience, the key difference being that it is “testing” work not yet done. Fanfic seems to be part of this, but only as a minor, and apparently undervalued, aspect. We have a large, active community engaged in predetermining what will, should, ought not, and might happen in forthcoming movies. Large enough and active enough that I believe it has affected how those movies are made, possibly unconsciously. The feedback loop is pernicious. The vindictiveness of this test audience can also be so severe as to impact decisions that have yet to be taken up.

The most visible way this has manifested—and this varies from franchise to franchise—is in the “look” of new films, especially in the effects, but also in the selection of cast, location, and choreography. Whether intentionally or not, filmmakers pump things into the next productions in an attempt to meet the expectations of this hypercritical superorganism.

This organism constructs alternate narratives, raises possible plot lines, critiques character development, and then, when the finished product fails on some level, engages in the kind of evisceration that cannot but give the creators pause to rethink, check themselves, question (often pointlessly) every choice made to that time.

I’m not sure this process happens at any conscious level, but it seems that the Doc Smith approach to bigger, splashier, louder, stranger films, at least in the Marvel and DC universes, and to a lesser extent in related products like Valerian or any given Bruce Willis vehicle of late, is a response to this incessant viral nattering. The anticipatory critical response must get through and affect the people in the main office.

Television seems to have suffered less of this because, at least in terms of story, these series are subject to less of the kind of crippling second-guessing the motion pictures display.

Before all this near-instantaneous data back-and-forth, studios produced movies; people may have known they were being made, but little else got out to the general public until the trailers announcing upcoming releases. Based on those, you went or didn’t, and the movie was what it was, and you either liked it or didn’t. We were not treated to weekly box-office reports on news broadcasts. The films, with few exceptions, had a two-week first-release run at the front-line theaters, then moved down the hierarchy for one- or two-week engagements at smaller chains until they ended up at a tiny local theater, after which they vanished until popping up on TV at some point. You then went to the next and the next and the next. Choice was addressed by the fact that at any one time there might be a dozen new movies coming to the theaters a month. The film was what the producers made it. It was offered, you saw it, you took your response home, and that was it.

A lot of the product was mediocre, but often reliably entertaining, and for the most part was made in a way that studios were not threatened with bankruptcy if they failed.  The really great ones came back from time to time or enjoyed extended runs in the theaters.

Fandom evolved, and when the age of the internet dawned, the cable industry grew, and the on-demand availability of movies was met by videotapes (later DVDs) and now streaming services, the products remained in front of self-selected audiences all the time.

This has changed the way these films are made. Not altogether to the bad, I hasten to add. I believe we’re passing through a kind of golden age of high quality films and certainly exceptional television.

But the budgets, the tendency to ignore better stories that lack the kind of epic myth-stuff of the major franchises, the endless bickering online and subsequently in conversations everywhere, and now this absurd war on what is, for want of a better term, SJW content…

I can’t help it. Grow up. So Doctor Who is a woman. Big deal. The character does not belong to you. Instead of chafing that some reification of idealized masculinity is being threatened, try just going with it and see where it takes you. That’s the whole purpose of storytelling! To be remade by narrative and offered a new view! To be challenged out of your day-to-day baseline assumptions!

Star Wars has been ruined by all the SJW crap! Really?

While I can see that discussion groups and this expanded dialogue can be fun and instructive, I think an unintended consequence has been to grant certain (very loud) people a sense of ownership over what is not theirs. The cacophony of anticipatory disappointment actually has a dampening effect on those of us who would simply like to be surprised and delighted all on our own. There is utility in silence, purpose in the vacuum, a vacuum to be filled by a new film. Box office is (or can be) detrimentally affected by the chattering carping of disillusioned fan critics who are terrified of James Bond becoming black, of Thor being turned into a woman, of the Doctor showing us how gender prejudice applies in our own lives.

I’ve been disappointed with new manifestations of favorite characters in the past, don’t get me wrong. My response has been to turn to something else. Those characters don’t belong to me, I don’t have a right to expect their creators to do what I think they should, and I recognize that probably a whole lot of people are just fine with a new direction. Otherwise sales figures would push them to change it again. It’s the pettiest of sour grapes to try to preload a rejection in advance of actually seeing the product.

I have no numbers to back up my impression, but I think it worth considering that the “my life will end unless the next movie comes out exactly the way I want it” school of anticipatory criticism is having a distorting effect over time, both on the product and on the ability of audiences to simply encounter something “clean” and take a personal and unmitigated response away from it.

Just a thought.

Purity In Fiction (or, Jonathan Franzen’s Latest Attempt At…Something)

The most recent entry in the annals of attempted applied snobbery came recently from Jonathan Franzen, who, while certainly a gifted prose stylist, seems bent on making himself into the grumpiest white literary snob on the planet.

Disclaimer: I have read Mr. Franzen’s essays.  I have tried to read his fiction, but quite honestly found nothing much of particular interest. A cross, perhaps, between Dickens and Roth, with leanings toward Russo and Gardner. I admit to having been seriously put off by his antics back when Oprah Winfrey tried to draft him into her popular reading group series.

I also admit that I’ve never been quite sure what to make of all that. Till now.

He has offered Ten Rules For Novelists. By the tenth you realize you are being lectured by someone who wishes to be regarded one way, suspects he may be regarded another way, but is afraid he is not being regarded at all, at least not as any kind of exemplar or Wise Head With Priceless Advice. His “rules” suffer from the curse of the “lit’rary.” Distilled, it would seem he’s telling us that “if you don’t write like me, or try to write like me, you’re wasting your time and destroying the culture simultaneously.”

Others have weighed in on the problematic nature of these. It may seem self-serving to tilt at lawn ornaments with pretensions to windmill-ness, but frankly, I already know I’m not paid much attention to and nothing I say here will do anything for a career I do not have.

I am, however, much irked by this kind of thing. It’s disingenuous in its effect if not its intent (how would I know how much of this he really believes?). By that I mean, it is not a set of rules to help aspiring writers; it is a set of reasons for not being a writer. Latent within these is the unspoken belief that, whoever you are, You Are Not Worthy.

Having said that, there are a couple of these I sort of agree with. Not, mind you, as proscriptions, but as matters of personal taste. Number 4, for instance: Write in third person unless a really distinctive first-person voice offers itself irresistibly.

I default to third-person because I tend to be a bit put off by first-person. To me, First Person is terribly artificial. No one goes through life narrating what they do. (Granted, all tenses, with regard to How People Actually Live, are artificial. Telling a story is an artifice, a Made Thing. Any “naturalness” to it all is part of the Art. It’s a seduction, convincing a reader to subsume his/her consciousness to the dictates of the narrative so that it feels natural.) Yes, once in a while, a story requires a different voice, even a different tense, because the writer is trying for a different effect. You make these choices for effect. You want the reader to go to a certain place in a specific way.

So while I agree with Rule # 4, I think Franzen phrased it in a way that tries to make it seem less of a matter of technique and more something that emerges from the zeitgeist. In other words, it’s a dodgy, deceptive way to say it.

There are too many little aphorisms and unexamined heuristics connected to writing that, if taken at face value, deter rather than aid the aspiring writer. We do not need more of them. For instance, Rule 1:  The reader is a friend, not an adversary, not a spectator.

Yeah…so? How is this a rule? An observation, yes, but in what way does this constitute necessary advice? And frankly, it’s not always true, nor is it even internally true. A “reader” is a stranger you hope to make a friend, of sorts, but they need convincing. Especially if you intend telling them hard truths, which seems to be what Mr. Franzen’s literary aim is. They will be, however briefly, a kind of adversary. And let’s face it, all art is initially a spectacle—requiring an audience, which is composed of spectators. Many will stay for one game and never come back. They are not your friends. But they watched. As they read, they may shift often between these three conditions, and the adroit writer may wish them to do exactly that, because each state allows for different effects, which transfer aesthesis in different ways. (And, really, James Joyce treated his potential audience not only as adversaries but occasionally as an angry mob with pitchforks—and by so doing created manifold aesthetic effects that are essential to the ongoing value of his works.)

Rather than go through them all, let me take the three “rules” I find most egregious. Numbers 2, 5, and 8.

Rule # 2: Fiction that isn’t an author’s personal adventure into the frightening or the unknown isn’t worth writing for anything but money.

I actually know what he’s trying to say here, but he said it in such a way as to betray the aristocratic self-image he wishes to convey, and he ends up not only doing a disservice to a great deal of fictive output but also reifying the academic nonsense about the nature of actually Writing For A Living.

What this is, firstly, is a variation on Write What You Know, one of those aforementioned aphorisms that are less than useless. It seems to mean: write only what you yourself have gone out into the world and experienced first-hand, and even then be careful because you probably don’t know it as well as you think you do, and in that case do another story about a writer suffering the self-doubt of the underappreciated. (Rules 5 and 8 underscore this, by the way.)

Secondly, it’s essentially claiming that writing, true writing, the pure quill, as it were, can only be done by the Elect. It’s a priesthood and defined by suffering and, often, by accidental success. I find it remarkable how many times we have been treated to lectures about the sordidness of writing for money from writers who have a Lot Of It. In other words, they are successful enough that they are offered platforms from which to tell the rest of us that we should just give it up.

Writing is, perhaps, a calling of sorts, but in its commission it is a craft, and if one intends to do it as a vocation—which, in this instance, means having the opportunity to do it as one’s primary activity—then you do it for pay. Otherwise, two things—you starve, or no one ever hears of you because you choose not to starve and take a job that prevents you from writing all the time. (I can hear the rejoinder—“well, if no one wants to buy your work in sufficient quantity, then it must be inferior”—which both ignores the realities of the market and exhibits hypocrisy at the same time.)

Most of us never get the opportunity to make this our living.  We get paid poorly, distributed badly, and rarely get recognized outside our own little patch. To have someone whose books regularly debut on best seller lists tell us that writing for money is somehow disreputable and sullies our work is the height of snobbery.

Rule # 5: When information becomes free and universally accessible, voluminous research for a novel is devalued along with it.

What does that actually mean? When I do a great deal of research to make my work sing with verisimilitude and I find that my readers know enough to appreciate what I have done, it increases the value of that work.  Again, this is snobbery, based on the assumption that the True Novelist has the time and resources to do something the rest of us can’t do.

The only thing that a novelist can do that the hoi polloi can’t is tell a story that moves people. They can know or have access to everything the writer knows and has learned and yet the one thing they will still not be able to do is tell that writer’s story.*

But that rule offers a glimpse into the requirements of the priesthood. When you can go to the library and look up the secret handshake of the order, the value of joining that order—or, more pertinently, living in awe of that order—diminishes.

But people still might go to the temple for the pleasure of the spectacle. So make it good spectacle.

Rule # 5 is a bizarre kind of anti-intellectual classist elitism.  And a rule for what?  Hiding information from people so you can look more impressive?

Rule # 8:  It’s doubtful that anyone with an Internet connection at his workplace is writing good fiction.

This is a kind of corollary to #5 and suffers the same flawed reasoning.

This is of a species with the whole “the novel is dead” nonsense someone brings up every so often. The last time it came up, it was obvious that the Novel being obited was the Great American Novel Written By A White Male.  It ignored women, nonwhite writers, and genre.

(Oh, genre! My ghod, what a smear upon the face of Great Literary Art!)

I said above that I have read Mr. Franzen’s essays. I have dipped into his fiction. He is quite a good writer. I concede he can write a scene and turn out a fine sentence. In his fiction, though, he writes about things in a way in which I can find no traction. He might be saying some things I would be moved by, but his approach leaves me cold. For this reader, he commits the one unforgivable sin—he is uninteresting.

He also seems to lead with an expectation that he will be disappointed. In us, in the universe, in himself. His essays exhibit a glumness that becomes, after a while, a drag on my psyche.

These rules suggest an answer.  He seems really to believe he should be regarded in ways that he fears he is not—and probably isn’t. The nonsense with Oprah led me to see him as pretentious and these rules have convinced me. The regard of the general public moved on in the latter half of the 20th Century as the balkanization of fiction categories multiplied and the position of Great Writer as Conscience of the Culture sort of dissipated.

But that doesn’t mean regard for novels diminished, nor does it mean the value of those novels has lessened; it only means that no one group can dictate the Standard Model of Significant Fiction anymore. The podium has, in fact, expanded, and the work that constitutes what is most worthy now includes things the Pure Writer seems to feel are beneath them.

____________________________________________________________________________________

*I just realized, rereading this (12/24/18), that this implicitly conflates “information” with Truth, which is a complete misapprehension of the nature of what a writer does. We do not merely convey information (the thing which, if everyone has access to it, becomes devalued) but process our encounter with reality and reconfigure it into some kind of truth-telling, which is, while perhaps dependent on information, not the same thing as simple information.

 

 

Music and Popular Trends

Anyone who knows me for any length of time eventually learns of my sometimes intransigent tastes in music. (Not only music, but whereas other art forms prompt conversations about form and substance that remain largely theoretical, analytical, and impersonal, when it comes to music, especially popular music, things can get a bit touchy.)  I have a minor musical background, I play (or play AT) keyboard and guitar, and in my youth I had fantasies of being a rock star.

I grew up with a wide range of influences, although in the end it was a pretty static assembly. My parents had about fifty records. A wide mix, ranging from Strauss waltzes and Grieg to Chet Atkins, Bobby Darin, movie soundtracks, Peggy Lee, one odd Tennessee Ernie Ford record, and Les Paul. A few other oddments, including some Gershwin, Nat King Cole, and a couple of jazz records I do not remember clearly. But there were also music programs on television then, the like of which we rarely see anymore, and I was raised with a huge variety. My father was, in his engineering way, a stickler for technique.

That last is important.

When I came of an age to start finding my own music, it led me into some strange byways. When everyone else was going insane with the Beatles, I was listening to Walter Carlos. When the Rolling Stones were the rebellion of choice, I’d stumbled on The Nice. And finally, when I had a budget, the first albums I purchased were Santana, Chicago, and…

Yes.

Fishing in the waters of new rock music, I had no idea who was in, who was out, what the roots of some of this music might be.  I only knew what caught my attention and made me feel good. I heard a Yes tune late one night on our local independent FM station and I never got over it.

Before I understood there were divisions and lines being drawn between various musical styles, I had a very eclectic collection anchored by what became known as Progressive Rock. Along with James Taylor, America, Cream, and the other assorted sonic adventures, you would find in my collection by the mid-Seventies not only every Yes album then available but also ELP, Jethro Tull, Genesis, Renaissance, and a smattering of others, all of whom by the end of the decade were being heaped with derision in the music press and by a growing crowd of discontents who pursued Punk, Disco, or New Wave, declaring that Prog was pretentious snob music.

I never heard anything but grandeur and emotional transcendence.  Later, after the ash had settled and the music scene had burned to the ground and been rebuilt in dozens of smaller abodes, I realized that what I was hearing in Prog was a modern attempt to capture what one heard in Beethoven or Tchaikovsky or Sibelius. I wholly approved.

But it became a sore point over time when the inevitable conversations about “good” music became more and more balkanized over what I eventually decided was a kind of reverse snobbishness, if not outright anti-intellectual protest against sophistication, skill, and imagination. I heard the same kinds of criticisms from people who took regular potshots at science fiction.

But till now I never paid that much attention to the history. The What Happened aspect.

David Weigel’s new book, The Show That Never Ends, is a solid history of a form that, most people forget, dominated popular music for almost a decade.  Emerson, Lake, & Palmer were at one time the biggest act on the planet.  Yes, which almost never broke the Top 40 charts, filled arenas.

And then, seemingly overnight, everyone was listening to The Cars, The Sex Pistols, The Police, almost anything Other Than music with the kind of intricacy usually associated with either jazz or classical.

So what did happen?

Weigel writes unsentimentally but with sympathy about how a combination of audience exhaustion, music industry economics, and ultimately the self-destruction of some of the key artists and their own creative exhaustion led to a melange of unsatisfactory products. Self-indulgence, musically and otherwise, introversion, and the jangling disconnect between market demands and pursuit of vision ended up making a bit of a mess that resulted in…

Well, oddly, not the end of progressive rock, because it is still with us, even in new manifestations, and many of the mainstays of the first wave progressives are now respected elder statesmen whose contributions to music are finally being acknowledged.  It is obvious in hindsight that many of the bands who pushed Prog aside in the Eighties and Nineties could not have done the kind of music they did without the tools invented by those Old Pretentious Guys.

When it comes to that music, Weigel displays an admirable understanding of it as composition.  He talks about the construction of these works and what set them apart theoretically from other forms.  It is a pleasure to read his descriptions of how many of the pieces that form the bedrock of progressive rock came about and what makes them fascinating to listen to.

One element of the “downfall” of Prog that Weigel does not touch on, though it is there in the narrative if you care to tease it out, is the unsustainability of one of the effects of some of these acts. Look at Yes, look at early Genesis, look even at ELP, and part of the glamor, the attraction, was that they had built a world. It was almost a literary act, the creation of a whole suite of aesthetic components that offered the illusion that one could enter into it, as if into Narnia, and live there. For a few hours on a concert night, the illusion could be powerful, and the dedicated album art and the philosophizing one read in interviews all added to the illusion.

But in the end it was not really possible, and in the morning there was the real world, and disappointment gradually encroached.  It wasn’t “just” a good concert, but a promise that could not be fulfilled.

For some, maybe many. You had in the end to be an “insider” to get it and finally the huge soap bubble simply could not be sustained.

Ultimately, though, this was the kind of stretching that popular music needed even if the beneficiaries of it did not continue to write and play in that idiom, and as pure music some of it is, indeed, transcendent.

Now that so many of these folks are beginning to pass from the scene, revisiting their contributions, reassessing their output as music rather than as some kind of cultural statement, would seem in order. Weigel’s book would be a good place to start.

 

2017

Looking at my list, I read, cover-to-cover, 51 books in 2017. That doesn’t seem like much to me, but knowing there are people, even people I know, who haven’t read one book in that time, it’s probably in the top something-or-other of national averages. At 63, I’m not sure I care anymore. It never was about quantity, as I’ve told myself time and again, but there are so many good books and I want to read them all!

We have engaged another study group this year. Rousseau. When we agreed to join, we thought we were doing just one of his works, his Second Discourse on Inequality. Come to find out, our guiding light wants to cover all the major Rousseau. Next up is Emile. I haven’t read Emile since high school. I remember little about it, other than it served to enrich a later reading of C. J. Cherryh’s Cyteen. Very well. Rousseau it is.

But in 2017, I felt torn between two kinds of reading. Reading to escape—because, really, look at the world—and reading to understand how to deal with reality.

A third category was the books for my science fiction group at Left Bank Books. Twelve titles, mostly selected by me, but voted on now by the whole group. My intention in this group is to read a mix of new(ish) and classic. This year we’ll be doing our first nonfiction title.

It’s given me a chance to reread some of my favorites. In almost every instance, I’ve found a practically new novel. For instance, Delany’s Trouble On Triton. I no longer recall clearly how I felt about it when I read it back in the Seventies, but this time through it was fascinating in that Delany opted to tell the story through the eyes of a person incapable of any kind of satisfaction in what in many ways is practically a paradise (never mind the little war going on). He wrote it as a kind of response to Le Guin’s The Dispossessed and it works quite well as that, flipping the relationship on its head at almost every point. Bron Helstrom is not a misunderstood everyman in a society unwilling to accommodate his uniqueness. Rather, he is a perpetually ill-fitting square peg that refuses, constitutionally, to be accommodated, even by a society that has no qualms trying to help him “fit.”

We also read Joan Vinge’s magisterial Snow Queen. I confess this was the first time I managed to get all the way through. While some of it is a bit dated, I found it a magnificent piece of world-building.

Then there was Ninefox Gambit by Yoon Ha Lee.  On the surface, this is an intense military SF novel, but it works with religious motifs, time and calendars, and the whole notion of long games and ghosts. The details of the world-building are impressive and the motivations for the conflicts are unusual to say the least. There is an element of Mayan cosmology buried beneath the surface, transformed into the kind of Otherness that gives the best science fiction its appeal.

Borne by Jeff VanderMeer is an odd novel. Compared to some of his work, which I find utterly unclassifiable (in the best sense), Borne is much more accessible, even though it presents as bizarre a world as any VanderMeer has ever offered. I came to the conclusion that this is a take on Alice Through The Looking Glass, done with an utterly different set of cultural expectations.

We read Keith Roberts’ Pavane, Chabon’s The Yiddish Policemen’s Union, The Long Way To A Small Angry Planet by Becky Chambers, Autonomous by Annalee Newitz, Use Of Weapons by Iain M. Banks, Too Like The Lightning by Ada Palmer, and Tim Powers’ The Anubis Gates, all of which generated excellent discussion.

Along with the other newer SF I read this past year, I conclude that the genre has never been healthier, more fascinating and inventive. The quality of imagination and craft have combined to produce some of the best work ever done.

Likewise in science writing, as exemplified by Carlo Rovelli, whose Reality Is Not What It Seems impressed me with its clarity and concision. (I’d been wondering what happened to string theory the last few years and this book told me.)

The Book That Changed America by Randall Fuller is more history than science. It details the initial encounter with Darwin in America and examines its impact: first its welcome by people who saw in it a sound argument against slavery, and then its later rejection as the assault on biblical fealty it became.

Siddhartha Mukherjee’s The Gene is likewise marvelously accessible, a history and examination of the Thing That Makes Us Us.

In the same vein, but much more exhaustive in its historicity, was David Wootton’s The Invention of Science, a chronicle of how science came into being. He demonstrates that it was not the revelation popular myth suggests, but a long accumulation of understandings that depended utterly on the social context in which they emerged. (For instance, we have no science as practice without the printing press.) Reviewing first appearances of words and concepts, the book shows how culture had to change before the possibility of the modern concept of science could even begin to take hold. Consider just one example: prior to Columbus there was no common understanding of the concept of “discovery.”

Just as enlightening were Charles C. Mann’s pair of New World histories, 1491 and 1493, which together tear away most of the preconceptions about the world Europe destroyed when it crossed the Atlantic.

I read some excellent novels outside genre—Jacqueline Winspear’s well-done Maisie Dobbs series (three of them), The Hot Country by Robert Olen Butler, Kerry Greenwood’s Cocaine Blues, the first Miss Fisher mystery, and travel writer William Least Heat-Moon’s first foray into fiction, Celestial Mechanics. But primarily, I read nonfiction and SF. It was that kind of a year.

As a bookseller, I noticed a sharp rise in the purchase of science books. Overall book sales were generally higher than in past years. People are reading things which seem to offer a more solid grasp of reality. Perhaps this is in reaction to the state of the world or just the state of the country. People seem to want To Know, in very concrete ways, if their selections are indicative. I see less escapism, and when I do see it, it is not the sort that would leave the psyche unchanged.

I already have half a dozen titles read for this year. It will be interesting to see how all this evolves by December.

Good reading to you all.

Taste and Quality

Obliquely, this is about a current debate within science fiction. However, the lineaments of the argument pertain to literature as a whole.  I offer no solutions or answers here, only questions and a few observations.  Make of it what you will.

Reading experience is a personal thing. What one gets out of a novel or story is like what one gets out of any experience and being required to defend preferences is a dubious demand that ultimately runs aground on the shoals of taste.  I once attended a course on wine and the presenter put it this way: “How do you know you’re drinking a good wine? Because you like it.”  Obviously, this is too blanket a statement to be completely true, but he made his point.  If you’re enjoying something it is no one’s place to tell you you’re wrong to do so based on presumed “objective” criteria.  That $200.00 bottle of Sassicaia may fail to stack up against the $20.00 Coppola Claret as far as your own palate is concerned and no one can tell you your judgment is wrong based on the completely personal metric of “I like it/I don’t like it.”

However, that doesn’t mean standards of quality are arbitrary or that differences are indeterminate.  Such are the vagaries and abilities of human discernment that we can tell when something is “better” or at least of high quality even when we personally may not like it.

For instance, I can tell that Jonathan Franzen is a very good writer even though I have less than no interest in reading his fiction.  I can see that Moby-Dick is a Great Novel even while it tends to bore me.  I acknowledge the towering pre-eminence of Henry James and find him an unpalatable drudge at the same time.

On the other end of the spectrum, I can see how Dan Brown is a propulsive and compelling story-teller even while I find him intellectually vacuous and æsthetically tedious.

My own personal list of what may be described as guilty pleasures includes Ian Fleming, Edgar Rice Burroughs (but only the John Carter novels; never could get into Tarzan), and a score of others over the years who caught my attention, appealed for a time, and have since fallen by the wayside, leaving me with fond memories and no desire to revisit.  A lot of the old Ace Doubles were made up of short novels of dubious merit that were nevertheless great fun for a teenager on a lonely afternoon.

I would never consider them Great Art.

Taste is the final arbiter.  But using it to determine quality—rather than allowing quality to determine taste—is doomed because taste changes.  Works you might strenuously defend at one time in your life can over time suffer as your taste and discernment evolve.  It’s sad in one way because it would be a fine thing to be able to summon up the same reactions experienced on one of those lonely afternoons, aged 16, and poring through the deathless excitement of a pulp adventure you might, given your enthusiasm, mistake for Great Writing.

I try always to make a distinction between things I like and things I think are Good. Often they’re the same thing, but not always, and, like other judgments humans make, the two tend to become confused with each other. Hence, debate over merit can take on the aspects of an argument on that day at the base of the Tower of Babel when people stopped understanding each other.

But if that’s all true, then how do we ever figure out which standards are valid and which bogus?  I mean, if it’s ALL subjective, how can any measure of quality ever rise to set the bar?

Fortunately, while personal experience is significant, collective experience also pertains. History, if you will, has taught us, and because art is as much a conversation as a statement we learn what works best and creates the most powerful effects over time. Having Something To Say that does not desiccate over time is a good place to start, which is why Homer still speaks to us 2500 years after his first utterances.  We derive our ability to discern qualities from our culture, which includes those around us informing our daily experiences.  In terms of literature, the feedback that goes into developing our personal values is a bit more specific and focused, but we have inexhaustible examples and a wealth of possible instruction.  We do not develop our tastes in a vacuum.

Honest disagreement over the specific qualities of certain works is part of the process by which our tastes develop. I might make a claim for Borges being the finest example of the short story and you might counter with de Maupassant—or Alice Munro. Nothing is being denigrated in this. The conversation will likely be edifying.

That’s a conversation, though. When it comes to granting awards, other factors intrude, and suddenly, instead of exemplary comparisons, we have competition, and that can be a degrading affair unless standards are clear and processes fairly established. Unlike in a conversation, however, quality necessarily takes a back seat to simple preference.

Or not so simple, perhaps. Because any competition is going to assume at least a minimum of quality that may be universally acknowledged. So we’re right back to trying to make objective determinations of what constitutes quality.

If it seems that this could turn circular, well, obviously. But I would suggest it only becomes so when an unadmitted partisanship becomes a key factor in the process.

This can be anything, from personal acquaintance with the artist to political factors having nothing to do with the work in hand. Being unadmitted, perhaps even unrecognized, such considerations can be impossible to filter out, and for others very difficult to argue against. They can become a slow poison destroying the value of the awards. Partisanship—the kind that is not simple advocacy on behalf of a favored artist but is instead ideologically based, more against certain things rather than for something—can deafen, blind, reduce our sensibilities to a muted insistence on a certain kind of sensation that can be serviced by nothing else. It can render judgment problematic because it requires factors be met having little to do with the work.

Paradoxically, art movements, which are by definition partisan, have spurred innovation if only by reaction and have added to the wealth of æsthetic discourse. One can claim that such movements are destructive and indeed most seem to be by intent. Iconoclasm thrives on destroying that which is accepted as a standard and the most vital movements have been born of the urge to tilt at windmills, to try to bring down the perceived giants.  We gauge the success of such movements by remembering them and seeing how their influence survives in contemporary terms.

Those which did not influence or survive are legion. Perhaps the kindest thing to be said of most of them was that they lacked any solid grasp of their own intent. Many, it seems, misunderstood the very purpose of art or, worse, lacked any comprehension of truth and meaning. More likely, they failed to distinguish between genuine art and base propaganda.

How to tell the difference between something with real merit and something which is merely self-serving? All heuristics are suspect, but a clear signal that something other than pure artistic intent is at play is the advent of the Manifesto. Most are hopelessly locked in their time, and the most innocent of them are cries against constraint. But often there’s an embarrassing vulgarity to them, a demand for attention, an insistence that the work being pushed by the manifesto has merit if only people would see it.

Not all manifestos are signs of artistic vacuity, but those that front for worthwhile work usually fade quickly from service, supplanted by the work itself, and are soon forgotten.  Mercifully.  We are then left with the work, which is its own best advocate.  In hindsight it could be argued that such work would have emerged from the froth all on its own, without the need of a “movement” to advance its cause.  Unfortunately, art requires advocates, beginning with the simplest form of a purchase.  In crowded fields overfull of example, the likelihood of a lone artist succeeding on his or her own, without advocacy, is slim.

Advocacy for an individual artist, by a cadre of supporters, can make or break a career.  And this would of course be a natural development of widespread appreciation.  It’s organic.

Advocacy for a perceived type of art begins to suffer from the introduction of agendas having less to do with the artists than with a commitment to the aforementioned windmill-tilting.

The next phase is advocacy of a proscriptive nature—sorting out what belongs and what doesn’t, measuring according to a prescribed set of protocols—which has little to do with individual works and much to do with the æsthetic and political prejudices of the movement.  The quality of a given work is less important at this stage than whether it “fits” the parameters set by the movement’s architects.  Taste plays a smaller and smaller role as the movement meets opposition or fails to advance its agenda.  With the demotion of taste comes the desiccation of quality.  The evocative ability of art, its facility to communicate things outside the confines of the manifesto-driven movement, eventually becomes a kind of enemy.  We’re into the realm of cookie-cutter art: paint-by-numbers approaches, template-driven work.  Themes are no longer explored but enforced, preferred message becomes inextricable from execution, and the essential worth of art is lost through disregard of anything that might challenge the prejudice of the movement.

This is a self-immolating process.  Such movements burn out from eventual lack of both material and artists, because the winnowing becomes obsessional, and soon no one is doing “pure” work according to the demands of the arbiters of group taste.

As it should be.  Anything worthwhile created during the life of the movement ends up salvaged and repurposed by other artists.  The dross is soon forgotten.  The concerns of these groups become the subject of art history discussions.  The dismissal of particular works because “well, he’s a Marxist” or “she was only an apologist for capitalism”—factors which, were they the chief feature of a given work, might very well render it ephemeral, but which in many instances have little to do with content—prompts head-scratching and amusement long after the fury of controversy around them has faded.

Given this, it may seem only reasonable that an artist have nothing to do with a movement.  The work is what matters, not the fashions surrounding it.  Done well and honestly, it will succeed or fail on its own, or so we assume.

But that depends on those ineffable and impossible-to-codify realities of quality and taste.  Certainly on the part of the artist but also, and critically, on the part of the audience.

Here I enter an area difficult to designate.  The instant one demands a concrete description of what constitutes quality, the very point of the question is lost.  Again, we have heuristics bolstered by example.  Why, for instance, is Moby-Dick now regarded as a work of genius, by some even as the great American novel, when in its day it sold so poorly and its author died in almost complete obscurity?  Have we become smarter, more perceptive?  Has our taste changed?  What is it about that novel which caused a later generation than Melville’s contemporaries to so thoroughly rehabilitate and resurrect it?  Conversely, why is someone like Jacqueline Susann virtually unremarked today after having been a huge presence five decades ago?

I have gone on at some length without bringing up many examples, because taste and quality are so difficult to assess.  What one “likes” and what one may regard as “good” are often two different things, as I said before, and have as much to do with our expectations on a given day of the week as with anything deeply considered and well examined.  My purpose in raising these questions—and that’s what I’ve been doing—has to do with a current struggle centering on the validity of awards as signs of intrinsic worth.

The best that can be said of awards as guideposts to quality is that if a group of people, presumably in possession of unique perspectives and tastes, can agree upon a given work as worthy of special note, then it is likely a sign that the work so judged possesses what we call Quality.  In other words, it is an excellent, indeed exceptional, example of its form.  I’ve served on a committee for a major award, and over the course of months the conversations among the judges proved educational for all of us, eventually shedding the chaff and leaving a handful of works under consideration that represented what we considered the best that year of the kind of work we sought to award.

I never once found us engaged in a conversation about the politics of the work.  Not once.

Nor did we ever have a discussion about the need to advance the cause of a particular type of work.  Arguments over form were entirely about how the choice of one over another served the work in question.  When we were finished, it never occurred to me that a set of honest judges would engage in either of those topics as a valid metric for determining a “winner.”  No one said, “Well it’s space opera and space opera has gotten too many awards (or not enough)” and no one said, “The socialism in this work is not something I can support (or, conversely, because of the political content the faults of the work should be overlooked for the good of the cause).”  Those kinds of conversations never happened.  It was the work—did the prose support the premise, did the characters feel real, did the plot unfold logically, were we moved by the story of these people.

Consensus emerged.  It was not prescribed.

This is not to say other metrics have no value, but they can be the basis of their own awards.  (The Prometheus Award is candidly given to work of a political viewpoint, libertarianism.  It would be absurd for a group to try to hijack it based on the argument that socialism is underrepresented by it.)  But even then, there is this knotty question of quality.

Here’s the thorny question for advocates of predetermined viewpoints: if an artist does the work honestly, truthfully, it is likely that the confines of manifesto-driven movements will become oppressive and that artist will do work that, eventually, no longer fits within those limits.  To complain that the resulting work is “bad” because it no longer adheres to the expectations of that group is as wrongheaded as declaring a work “good” because it does toe the proper line.

Because that line has nothing to do with quality.  It may go to taste.  It certainly has little to do with truth.

Future Historicity

History, as a discipline, seems to improve the further away from events one moves. Close up, it’s “current events” rather than “history.”  At some point, the possibility of objective analysis emerges and thoughtful critiques may be written.

John Lukacs, Emeritus Professor of History at Chestnut Hill College, understands this and at the outset of his new study, A Short History of the Twentieth Century, allows for the improbability of what he has attempted:

Our historical knowledge, like nearly every kind of human knowledge, is personal and participatory, since the knower and the known, while not identical, are not and cannot be entirely separate.

He then proceeds to give an overview of the twentieth century as someone—though he never claims this—living a century or more further on might.  He steps back as much as possible and looks at the period under examination—he asserts that the 20th Century ran from 1914 to 1989—as a whole, the way we might now look at, say, the 14th Century or the 12th and so on.  The virtue of our distance from these times is our perspective—the luxury of seeing how disparate elements interacted even as the players on the ground could not see them, how decisions taken in one year affected outcomes thirty, forty, even eighty years down the road.  We can then bring an analysis and understanding of trends, group dynamics, political movements, demographics—all that goes into what we term culture or civilization—to the problem of understanding what happened and why.

Obviously, for those of us living through history, such perspective is rare if not impossible.

Yet Lukacs has done an admirable job.  He shows how the outbreak and subsequent end of World War I set the stage for the collapse of the Soviet Empire in 1989, the two events he chooses as the bookends of the century.  He steps back and looks at the social and political changes as the result of economic factors largely invisible to those living through those times, and how the ideologies that seemed so very important at every turn were more or less byproducts of larger, less definable components.

It is inevitable that the reader will argue with Lukacs.  His reductions—and expansions—often run counter to what may be cherished beliefs in the right or wrong of this or that.  But that, it seems, is exactly what he intends.  This is not a history chock full of the kind of detail used in defending positions—Left, Right, East, West, etc.—and it is often stingy with detail.  Rather, this is a broad outline with telling opinions and the kind of assertions one might otherwise not question in a history of some century long past.  It is intended, I think, to spur discussion.

We need discussion.  In many ways, we are trapped in the machineries constructed to deal with the problems of that century, and the machinery keeps grinding even though the problems have changed.  Pulling back—or even out of—the in situ reactivity seems necessary if we are to stop running in the current Red Queen’s Race.

To be sure, Lukacs makes a few observations to set one’s back teeth on edge.  For instance, he dismisses the post-World War II women’s consciousness and equality movements as byproducts of purely economic conditions and the mass movement of the middle class to the suburbs.  He has almost nothing good to say about any president of the period but Franklin Roosevelt.

He is, certainly, highly critical of the major policy responses throughout the century, but explains them as the consequence of ignorance, which is probably true enough.  The people at the time simply did not know what they needed to know to do otherwise.

As I say, there is ample here with which to argue.

But it is a good place to start such debates, and it is debate—discussion, interchange, conversation—that seems the ultimate goal of this very well-written assay.  As long as it remains debate, this could be a worthy place to begin.

He provides one very useful definition, which is not unique to Lukacs by any means, yet remains one of those difficult-to-parse distinctions for most people and leads to profound misunderstandings.  He makes clear the difference between nations and states.  They are not the same thing, though they usually overlap.  States, he shows, are artificial constructs with borders, governmental apparatus, policies.  Nations, however, are simply Peoples.  Hence Hitler was able to command the German nation even though he was an Austrian citizen.  Austria, like Germany, was merely a state.  The German People constituted the nation.

Lukacs—valuably—shows the consequences of confusing the two, something which began with Wilson and has tragically rumbled through even to this day.  States rarely impose a national identity; they rely on one already extant—though often largely unrealized.  And when things go wrong between states, quite often it is because one or the other has negotiated national issues with the wrong party.

Which leads to an intriguing speculation about why nativist sympathies have such a difficult time taking root in this country.  Americans do not, by this definition, comprise a Nation.  A country, a state, a polity, certainly.  But not really a Nation.

And yet we often act as if we were.

Questions.  Discussion.  Dialogue.  This is the utility and virtue of this slim volume.

End Times

The Sixties.

Depending on what your major concerns are, that period means different things.  For many people, it was revolution, civil rights, the peace movement.  For many others, it was music.

For Michael Walker, it was evidently the latter.  In his new book, What You Want Is In The Limo, he chronicles what he considers the End of the Sixties through the 1973 tours of three major rock groups—The Who, Led Zeppelin, and Alice Cooper.

His claim, as summarized in the interview linked above, is that after Woodstock, the music industry realized how much money could be made with this noisy kid stuff (which by Woodstock it no longer was—kid stuff, that is) and started investing heavily, expanding the concert scene, turning it from a “cottage industry” into the mega-million-dollar monster it has become.  1973, according to Walker, is the year all this peaked for the kind of music that had dominated The Sixties, made the turn into rock star megalomania, and ushered in the excesses of the later Seventies and the crash-and-burn wasteland of the Punk and New Wave eras (with a brief foray into Disco and cocaine before the final meltdown).

The bands he chose are emblematic, certainly, but of the end of the Sixties?  I agree with him that 1973 is the year the Sixties ended, but the music aspect, as always, was merely a reflection, not a cause.  What happened in 1973 that brought it all to an ignominious close was this: Vietnam ended.

(Yes, I know we weren’t out until 1975, but in 1972 Nixon went to China, which resulted in the shut-down of the South China rail line by which Russia had been supplying North Vietnam, and in 1973 the draft ended, effectively deflating a goodly amount of the rage over the war.  The next year and a half were wind-down.)

Walker’s analysis of the cultural differences before and after 1973 is solid, but while the money was certainly a factor, a bigger one is exhaustion.  After a decade of upheaval over civil rights and the war in Vietnam, people were tired.  Vietnam ended and everyone went home.  Time to party.  Up to that point, the music—the important music, the music of heft and substance—was in solidarity with the social movements, and protest was a major component of the elixir.  Concerts were occasions for coming together in a common aesthetic, the sounds that distinguished Woodstock acting as a kind of ur-conscious bubble, binding people together in common cause.

Once the primary issues seemed settled, the music was just music for many people, and the aspects which seemed to have informed the popularity of groups like Cream or the Stones or the Doors lost touch with the zeitgeist.  What had begun as an industry of one-hit wonders returned to that ethic and pseudo-revolutionary music began to be produced to feed the remaining nostalgia.

(Consider, for example, a group like Chicago, which began as a socially conscious, committed-to-revolution act—they even made a statement to that effect on the inside cover of their second album—and yet by 1975 were cashing in on power ballads and love songs, leaving the heavily experimental compositions of their first three albums behind and eschewing their counter-culture sensibilities.)

To my mind the album that truly signified the end of that whole era was The Moody Blues’ Seventh Sojourn, which was elegiac from beginning to end.  The last cut, I’m Just A Singer In A Rock’n’Roll Band, was a rejection of the mantle of guru bestowed on many groups and performers during the Sixties.  With that recording, the era was—for me—over.

Also for me, Alice Cooper never signified anything beyond the circus act he was.  Solid tunes, an edgy stage act, and all the raw on-the-road excess that was seen by many to characterize supergroups, but most of Cooper’s music was vacuous pop-smithing.  The Who and Led Zeppelin were something else and both of them signify much more in artistic terms.  Overreach.

But interestingly enough, different kinds of overreach.  Walker talks of the self-indulgence of 45-minute solos in the case of Zeppelin, but this was nothing new—Cream had set the standard for seemingly endless solos back in 1966, and Country Joe McDonald produced an album in the Nineties with extended compositions and solos.  Quadrophenia was The Who’s last “great” album, according to Walker, and I tend to agree, but two kinds of exhaustion are at work in these two examples.  Zeppelin exhausted themselves in the tours and the 110% performances.  The Who exhausted the form in which they worked.  After Quadrophenia, all they could do was return to a formula that had worked well before, but which now gained them no ground in terms of artistic achievement.  As artistic statement—as an example of how far they could push the idiom—that album was a high-water mark that still stands.  But the later Who Are You? is possibly their best-crafted work after Who’s Next.  “Greatness”—whatever that means in this context—had not abandoned them.  But the audience had changed.  Their later albums were money-makers with the occasional flash of brilliance.  They were feeding the pop machine while trying to compose on the edge, a skill few manage consistently for any length of time.

“Excess” is an interesting term as well.  Excess in what?  The combination of social movement with compositional daring had a moment in time.  When that time passed, two audiences parted company: those who wanted to party (often nostalgically) and those who were truly enamored of music as pure form.  They looked across the divide at each other, and the accusation of excess was aimed by each at different things.  The one disdained the social excess of the other while the latter loathed the musical excess of the former.  People gleefully embracing Journey, disco, punk, and a gradually resurgent country-western genre thought the experimental explorations of the post-Sixties “art rock” scene were self-indulgent, elitist, and unlistenable.  People flocking to Yes and Emerson, Lake & Palmer concerts, cueing up Genesis and UK on their turntables (and retroactively filling out their classical collections), found the whole disco scene and designer-drug culture grotesque.  Yet in many ways they had begun as the same social group, before the End of the Sixties.

The glue that had bound them together evaporated with the end of the political and social issues that had produced the counterculture and its attendant musical reflection in the first place.  Without that glue, diaspora.

And the forms keep breaking down into smaller and smaller categories, which is in its own way a kind of excess.  The excess of pointless selectiveness.