Taste and Quality

Obliquely, this is about a current debate within science fiction. However, the lineaments of the argument pertain to literature as a whole.  I offer no solutions or answers here, only questions and a few observations.  Make of it what you will.

Reading experience is a personal thing. What one gets out of a novel or story is like what one gets out of any experience, and being required to defend preferences is a dubious demand that ultimately runs aground on the shoals of taste.  I once attended a course on wine and the presenter put it this way: “How do you know you’re drinking a good wine? Because you like it.”  Obviously, this is too blanket a statement to be completely true, but he made his point.  If you’re enjoying something, it is no one’s place to tell you you’re wrong to do so based on presumed “objective” criteria.  That $200.00 bottle of Sassicaia may fail to stack up against the $20.00 Coppola Claret as far as your own palate is concerned, and no one can tell you your judgment is wrong based on the completely personal metric of “I like it/I don’t like it.”

However, that doesn’t mean standards of quality are arbitrary or that differences are indeterminate.  Such are the vagaries and abilities of human discernment that we can tell when something is “better” or at least of high quality even when we personally may not like it.

For instance, I can tell that Jonathan Franzen is a very good writer even though I have less than no interest in reading his fiction.  I can see that Moby-Dick is a Great Novel even while it tends to bore me.  I acknowledge the towering pre-eminence of Henry James and find him an unpalatable drudge at the same time.

On the other end of the spectrum, I can see how Dan Brown is a propulsive and compelling story-teller even while I find him intellectually vacuous and æsthetically tedious.

My own personal list of what may be described as guilty pleasures includes Ian Fleming, Edgar Rice Burroughs (but only the John Carter novels; never could get into Tarzan), and a score of others over the years who caught my attention, appealed for a time, and have since fallen by the wayside, leaving me with fond memories and no desire to revisit.  A lot of the old Ace Doubles were made up of short novels of dubious merit that were nevertheless great fun for a teenager on a lonely afternoon.

I would never consider them Great Art.

Taste is the final arbiter.  But using it to determine quality—rather than allowing quality to determine taste—is doomed because taste changes.  Works you might strenuously defend at one time in your life can over time suffer as your taste and discernment evolve.  It’s sad in one way because it would be a fine thing to be able to summon up the same reactions experienced on one of those lonely afternoons, aged 16, and poring through the deathless excitement of a pulp adventure you might, given your enthusiasm, mistake for Great Writing.

I try always to make a distinction between things I like and things I think are Good.  Often they’re the same thing, but not always, and, like other judgments humans make, the two tend to become confused with each other.  Hence, debate over merit can take on the aspects of an argument on that day at the base of the Tower of Babel when people stopped understanding each other.

But if that’s all true, then how do we ever figure out which standards are valid and which bogus?  I mean, if it’s ALL subjective, how can any measure of quality ever rise to set the bar?

Fortunately, while personal experience is significant, collective experience also pertains. History, if you will, has taught us, and because art is as much a conversation as a statement we learn what works best and creates the most powerful effects over time. Having Something To Say that does not desiccate over time is a good place to start, which is why Homer still speaks to us 2500 years after his first utterances.  We derive our ability to discern qualities from our culture, which includes those around us informing our daily experiences.  In terms of literature, the feedback that goes into developing our personal values is a bit more specific and focused, but we have inexhaustible examples and a wealth of possible instruction.  We do not develop our tastes in a vacuum.

Honest disagreement over the specific qualities of certain works is part of the process by which our tastes develop. I might make a claim for Borges being the finest example of the short story and you might counter with de Maupassant—or Alice Munro. Nothing is being denigrated in this. The conversation will likely be edifying.

That’s a conversation, though.  When it comes to granting awards, other factors intrude, and suddenly, instead of exemplary comparisons, we have competition, and that can be a degrading affair unless standards are clear and processes fairly established.  Unlike in a conversation, however, quality necessarily takes a back seat to simple preference.

Or not so simple, perhaps. Because any competition is going to assume at least a minimum of quality that may be universally acknowledged. So we’re right back to trying to make objective determinations of what constitutes quality.

If it seems that this could turn circular, well, obviously. But I would suggest it only becomes so when an unadmitted partisanship becomes a key factor in the process.

This can be anything, from personal acquaintance with the artist to political factors having nothing to do with the work in hand. Being unadmitted, perhaps even unrecognized, such considerations can be impossible to filter out, and for others very difficult to argue against. They can become a slow poison destroying the value of the awards. Partisanship—the kind that is not simple advocacy on behalf of a favored artist but is instead ideologically based, more against certain things than for something—can deafen, blind, reduce our sensibilities to a muted insistence on a certain kind of sensation that can be serviced by nothing else. It can render judgment problematic because it requires that factors be met which have little to do with the work.

Paradoxically, art movements, which are by definition partisan, have spurred innovation if only by reaction and have added to the wealth of æsthetic discourse. One can claim that such movements are destructive and indeed most seem to be by intent. Iconoclasm thrives on destroying that which is accepted as a standard and the most vital movements have been born of the urge to tilt at windmills, to try to bring down the perceived giants.  We gauge the success of such movements by remembering them and seeing how their influence survives in contemporary terms.

Those which did not influence or survive are legion. Perhaps the kindest thing to be said of most of them is that they lacked any solid grasp of their own intent. Many, it seems, misunderstood the very purpose of art or, worse, lacked any comprehension of truth and meaning. More likely, they failed to distinguish between genuine art and base propaganda.

How to tell the difference between something with real merit and something which is merely self-serving?  All heuristics are suspect, but a clear signal that something other than pure artistic intent is at play is the advent of the Manifesto.  Most are hopelessly locked in their time and the most innocent of them are cries against constraint.  But often there’s an embarrassing vulgarity to them, a demand for attention, an insistence that the work being pushed by the manifesto has merit if only people would see it.

Not all manifestos are signs of artistic vacuity, but those that front for worthwhile work usually fade quickly from service, supplanted by the work itself, and are soon forgotten.  Mercifully.  We are then left with the work, which is its own best advocate.  In hindsight it could be argued that such work would have emerged from the froth all on its own, without the need of a “movement” to advance its cause.  Unfortunately, art requires advocates, beginning with the simplest form of a purchase.  In crowded fields overfull of example, the likelihood of a lone artist succeeding on his or her own, without advocacy, is slim.

Advocacy for an individual artist, by a cadre of supporters, can make or break a career.  And this would of course be a natural development of widespread appreciation.  It’s organic.

Advocacy for a perceived type of art begins to suffer from the introduction of agendas having less to do with the artists than with a commitment to the aforementioned windmill-tilting.

The next phase is advocacy of a proscriptive nature—sorting out what belongs and what doesn’t, measuring according to a prescribed set of protocols, an approach that has little to do with individual works and much to do with the æsthetic and political prejudices of the movement.  The quality of a given work is less important at this stage than whether it “fits” the parameters set by the movement’s architects.  Taste plays a smaller and smaller role as the movement meets opposition or fails to advance its agenda. With the demotion of taste comes the desiccation of quality.  The evocative ability of art, its facility to communicate things outside the confines of the manifesto-driven movement, eventually becomes a kind of enemy.  We’re into the realm of cookie-cutter art, paint-by-numbers approaches, template-driven work.  Themes are no longer explored but enforced, the preferred message becomes inextricable from execution, and the essential worth of art is lost through disregard of anything that might challenge the prejudice of the movement.

This is a self-immolating process.  Such movements burn out from eventual lack of both material and artists, because the winnowing becomes obsessional, and soon no one is doing “pure” work according to the demands of the arbiters of group taste.

As it should be.  Anything worthwhile created during the life of the movement ends up salvaged and repurposed by other artists.  The dross is soon forgotten.  The concerns of these groups become the subject of art history discussions.  The dismissal of particular works because “well, he’s a Marxist” or “she was only an apologist for capitalism”—factors which, if the chief feature of a given work, might very well render it ephemeral, but which in many instances have little to do with its content—prompts head-scratching and amusement long after the fury of the controversy around them has faded.

Given this, it may seem only reasonable that an artist have nothing to do with a movement.  The work is what matters, not the fashions surrounding it.  Done well and honestly, it will succeed or fail on its own, or so we assume.

But that depends on those ineffable and impossible-to-codify realities of quality and taste.  Certainly on the part of the artist but also, and critically, on the part of the audience.

Here I enter an area difficult to designate.  The instant one demands a concrete description of what constitutes quality, the very point of the question is lost.  Again, we have heuristics bolstered by example.  Why, for instance, is Moby-Dick now regarded as a work of genius, by some even as the great American novel, when in its day it sold so poorly and its author died in almost complete obscurity?  Have we become smarter, more perceptive? Has our taste changed?  What is it about that novel which caused a later generation than Melville’s contemporaries to so thoroughly rehabilitate and resurrect it?  Conversely, why is someone like Jacqueline Susann virtually unremarked today after having been a huge presence five decades ago?

I have gone on at some length without bringing up many examples, because taste and quality are so difficult to assess.  What one “likes” and what one may regard as “good” are often two different things, as I said before, and have as much to do with our expectations on a given day of the week as with anything deeply considered and well examined. My purpose in raising these questions—and that’s what I’ve been doing—has to do with a current struggle centering on the validity of awards as signs of intrinsic worth.

The best that can be said of awards as guideposts to quality is that if a group of people, presumably in possession of unique perspectives and tastes, can agree upon a given work as worthy of special note, then it is likely a sign that the work so judged possesses what we call Quality.  In other words, it is an excellent, indeed exceptional, example of its form.  I’ve served on a committee for a major award, and over the course of months the conversations among the judges proved educational for all of us, eventually shedding the chaff and leaving a handful of works under consideration that represented what we considered the best examples that year of the kind of work we sought to award.

I never once found us engaged in a conversation about the politics of the work.  Not once.

Nor did we ever have a discussion about the need to advance the cause of a particular type of work.  Arguments over form were entirely about how the choice of one over another served the work in question.  When we were finished, it never occurred to me that a set of honest judges would engage in either of those topics as a valid metric for determining a “winner.”  No one said, “Well, it’s space opera and space opera has gotten too many awards (or not enough)” and no one said, “The socialism in this work is not something I can support (or, conversely, because of the political content the faults of the work should be overlooked for the good of the cause).”  Those kinds of conversations never happened.  It was the work—did the prose support the premise, did the characters feel real, did the plot unfold logically, were we moved by the story of these people?

Consensus emerged.  It was not prescribed.

This is not to say other metrics have no value, but they can be the basis of their own awards.  (The Prometheus Award is candidly given to work of a political viewpoint, libertarianism.  It would be absurd for a group to try to hijack it based on the argument that socialism is underrepresented by it.)  But even then, there is this knotty question of quality.

Here’s the thorny question for advocates of predetermined viewpoints: if an artist does the work honestly, truthfully, it is likely that the confines of manifesto-driven movements will become oppressive and that artist will do work that, eventually, no longer fits within those limits.  To complain that the resulting work is “bad” because it no longer adheres to the expectations of that group is as wrongheaded as declaring a work “good” because it does toe the proper line.

Because that line has nothing to do with quality.  It may go to taste.  It certainly has little to do with truth.

On Heinlein and Expectations

William Patterson Jr. finished and delivered the second volume of his copious biography of Robert A. Heinlein not long before he passed away from a heart attack.  He was too young.  Having read his opus, I suspect he may well have had another book about Heinlein in him, one which we will now never see.

I base that on the fact that while volume 2—The Man Who Learned Better: 1948 to 1988—is filled with the minutiae of a crowded life, there seems to be little in-depth analysis and assessment of Heinlein’s work.  Given the few and scattered remarks about the shortcomings of other books of criticism published during Heinlein’s lifetime, one might reasonably expect such an assessment from a writer of evident skill and insight.  It is not out of the realm of probability that he intended such analyses for a third volume devoted exclusively to that assessment.

To be sure, there are brief critical passages about several of the books that are useful.  (Detailing the travails of writing a given work, while fascinating to anyone interested in Heinlein’s life, is no substitute for a thorough study of the work in question.  This is not intended as a criticism of what is in the book, only that the wealth of information spurs a desire for more, especially when presented with tantalizing explanations of some problematic works that alter past perceptions.)  For instance, in discussing one of Heinlein’s most poorly understood later-period novels, I Will Fear No Evil, Patterson reveals that Heinlein’s ambition in writing it was a response to postmodernism, apparently taking as inspiration John Barth’s Giles Goat-Boy and work by Philip Roth.  If true—and I have no reason to doubt him, as Heinlein himself discussed this in his own correspondence—this casts a very different light on what has become the Heinlein novel even ardent fans seem to dislike, often hate.

Although Heinlein rarely discussed his process with the story that became I Will Fear No Evil, …[i]t was as if he was working on crafting a New Wave kind of story that worked as story—the kind of thing for fiction that Frank Lloyd Wright had done with the Bauhaus when he designed Fallingwater in 1935…

He had Nabokov on his mind as well as the New Wave movement (this would have been right in the middle of it) and postmodernism, as well as reacting against the enshrinement going on in fandom of Campbellian Golden Age conventions.  He wanted to shake everyone up.

If in fact that was the nature of the work, it becomes clear why the book seemed to have no “natural” audience and served to confuse people more than reinforce Heinlein’s reputation as the “dean of space age fiction.”  The core readership of science fiction—fandom—would have loathed the postmodernist ambiguities while mainstream critics still treated science fiction as a fad and a not very good one at that.  Had someone told the New York Times reviewers that the book was a postmodern allegory, they would have (perhaps silently) laughed in dismay.

At this point a deeper analysis of the book might have been in order.

But Patterson was not doing literary analysis; he was chronicling a fascinating life.

Heinlein has long been the largest head on the Mount Rushmore of science fiction.  The myths about him, from his first sale to his unhindered success to his idolization of redheads to his supposed fascism, have stood in for any real knowledge about him, seasoned here and there with personal anecdotes.  In fact, Heinlein was almost pathologically private and resented anyone poking into his personal life.  He had a public persona, which he apparently enjoyed using, based on certain aspects of his character which those who saw only that took to be the whole man.  In later years his critics viewed him as hopelessly anachronistic, conservative to the point of feudalistic, a reactionary, and, despite sales figures, marginal to the field.  The service Patterson has done, besides the obvious demythologizing (especially in the first volume), is the extensive contextualizing of the man, the filling in of events, and the examination of how surfaces hide as much as reflect what lies behind what the public sees.

Heinlein was nothing if not experimental.  Often, because he was conducting his experiments at the times he did, the experiments were misperceived and misunderstood.  One can sympathize with his repeated desire not to have his work “analyzed” in an academic sense because he felt it would rob readers of seeing for themselves.  He likely disliked the idea of seeing his own motives and character analyzed through the lens of his work, something which happens often, especially in academic works.  He did not wish to be “psychologized” by people who may well not “get” what he was trying to do in the first place.

He was very much about control in this regard.

As in much of the rest of his life.  His detractors occasionally riff on the idea that he was in some ways a fraud, that his desire for control was only to mask a deep sense of incompetence or even incomprehension.  This is an unfortunately shallow reading.  Consider: Heinlein’s one ambition as a youth was to have a Navy career.  He worked himself into physical breakdown to get through Annapolis only to find out a short time into what he thought would be a lifetime calling that his own health was sabotaging him.  He had to leave the Navy because his body failed him.  The one thing he truly wanted to do was denied him.

Some people might give up and sell siding for the rest of their lives.  Heinlein tried many things.  He ran for political office, he tried mining, pursued his education, finally coming to writing.  Even after early success at that, he continued trying to serve his country and ran a research lab.

That he may have felt some ambivalence about the thing that eventually became his most successful endeavor might be understood given all this.  Rather than hiding incompetence, it is perhaps more accurate to say that he lived with continued fear that some new malady or accident might put an end to this as well.  It is not inconceivable that he expected, however minutely, that the bottom would fall out in the next step or two.  Reading about the speed with which he turned out clearly superior novels, it is not hard to imagine a nagging imp of doubt that he might not be able to do this next week for reasons completely out of his control.

Misrepresentation and fraud have nothing to do with this.

What is most interesting in all this is seeing the bell curve of influence with each new book.  Heinlein’s work was audacious when written, groundbreaking when published, influential throughout the period when other writers reacted to it, and then reassigned as exemplary of some shortcoming on the author’s part as the culture caught up with it and passed it by.  In hindsight, the flaws are myriad, some profound, but I can think of no other science fiction writer to suffer such extremes of regard, especially within their lifetime.

What becomes apparent in reading the 1000 plus pages of Patterson’s work is that the one thing Heinlein intended with each book was to start a discussion.  What so many seem to have taken as pronouncements from on high, Heinlein intended as the opening gambit in a long conversation.  Instead of engaging in the argument, too many people made him their personal guru, something he consistently rejected, and when they realized finally that some of the things Heinlein said were problematic or downright inflammatory, they turned on him.  He wanted to be Socrates, not Aristotle as remade by the Church.  He wanted people to disagree, to engage.

How else to explain the wild variations of philosophy between works like Starship Troopers and Stranger In A Strange Land, Beyond This Horizon and Farnham’s Freehold, Methuselah’s Children and The Moon Is A Harsh Mistress?

On the other hand, he seemed often to work in a vacuum of his own making.  He bridled at the confines of expected SF forms, yet he did not avail himself of relationships with the mainstream literary establishment he longed to be part of.  He wanted to write work that transcended genre boundaries—and read extensively outside the field—and yet he rarely seemed to engage in the cultural discourse going on outside the SF “ghetto.”  He and Virginia, his third wife, were usually politically isolated, even while trying to fully interact with the ongoing political dynamic.  Heinlein’s politics were more of the “curse on both your houses” variety than anything categorizably useful.  He claimed affinity with libertarianism, yet had no real respect for much that passed for political philosophy under that banner.  Neither fish nor fowl, it came to others to try to define him, and he gave them little assistance.  The country moved in directions with which he disagreed, but his reactions gave no support to others who thought the same way and wanted to do this or that to change it.  He lived by a definition of liberal that was being quickly left behind by those working under that label.  His consistent message through his fiction was “Think for yourself” and yet it came across more and more as “if you don’t think like me you’re an idiot.”  Those looking for ready-made answers in his work could only see the latter.

Narratively, volume 2 is packed too tightly to be as good a read as the first book.  No doubt this is a result of trying to keep it usefully in hand in combination with the increased wealth of information available about this forty-year period.  But it nevertheless offers a fascinating look at a genuine iconoclast within his context, and for that it is a very worthy book.

Finally, as much as detractors would like to make Heinlein an irrelevancy, the very obsessiveness with which many of them attend his deconstruction suggests that while one may disagree over him profoundly, he is not easily ignored or dismissed.  Whatever else, he did succeed in getting a conversation going.  Sometimes it’s actually about what he considered important.

Inside Outside: Two Views of Science Fiction

Histories and analyses of science fiction are often fragmentary. Like histories of rock’n’roll, there are just too many different facets to be meaningfully comprehensive. That is not to say there aren’t excellent works that manage to deal with essential elements of science fiction, only that inevitably something will be left out or overlooked or, now and then, misunderstood.

I recently read two books about the subject that represent the poles of such analyses—those done from the inside and those done from the outside—and between them a clarity emerges about the fundamental misunderstandings that abound about the nature of science fiction.

Brian W. Aldiss’s almost majestic Billion Year Spree was published in 1973, a good year to attempt an overview like this, which covers precursor works as well as tracing the development of the specific qualities of the genre through the 19th Century and then treating the major corpus of what we have come to recognize as science fiction from the 20th Century. Aldiss is very smart, very savvy, and his wit is equal to his intelligence in putting things in perspective. It is in this book that Aldiss presented the idea that Mary Shelley’s Frankenstein is the first genuine science fiction novel. Most dedicated readers of science fiction may be acquainted with this proposition, which has gone viral within the field, but may not have read Aldiss’s arguments in support. They are worth the time.

The second book is very recent: Margaret Atwood’s In Other Worlds, which does not purport to be an overview like Aldiss’s work. Instead it is a very personal history with opinions and judgments. It covers Atwood’s association with science fiction and showcases her take on it as a genre. In some ways it resembles a memoir. On the question of what the first SF work was, Atwood is much less rigorous and far more concerned with SF as myth than Aldiss, so we find allusions to Gilgamesh and several other works along the way, none of which she specifically names as the primogenitor.

Which makes perfect sense by the end of the book because—and she pretends to nothing else—she doesn’t know. She doesn’t seem to know what science fiction is as practiced by those who work mainly within the field, nor does she seem to understand the nature of the particular pleasure of SF for the dedicated fan. And as I say, she never claims to.

This would normally not even be an issue but for the fact that Atwood has been committing science fiction for some time now. But it’s not her primary interest, as represented by a long and successful career writing and publishing what is generally regarded as mainstream literary fiction and commentary upon it. It’s not her sandbox, even though she is clearly attracted to it and likes to come over and play.

The different focus of her appreciation of science fiction highlights aspects of the long-running and disputatious relationship between the so-called literary establishment and the déclassé realms of genre fiction. Especially after having read Aldiss on science fiction, the bases of mutual incomprehension across the fictive divide become clearer.

Aldiss establishes his premises early:

No true understanding of science fiction is possible until its origin and development are understood. In this respect, almost everyone who has written on science fiction has been (I believe) in error—for reasons of aggrandisement or ignorance. To speak of science fiction as beginning with the plays of Aristophanes or some Mycenean fragment concerning a flight to the Sun on a goose’s back is to confuse the central function of the genre; to speak of it as beginning in a pulp magazine in 1926 is equally misleading.

In chapter one he then sets out his operating definition:

Science fiction is the search for a definition of man and his status in the universe which will stand in our advanced but confused state of knowledge (science), and is characteristically cast in the Gothic or post-Gothic mould.

Contrast this to Atwood’s opening stab at definitions:

Much depends on your nomenclatural allegiances, or else on your system of literary taxonomy…I realized that I couldn’t make a stand at the answer because I didn’t really grasp what the term science fiction means anymore. Is this term a corral with real fences or is it merely a shelving aid, there to help workers in bookstores place the book in a semi-accurate or at least lucrative way?
…sci fic includes, as a matter of course, spaceships and Mad Scientists, and Experiments Gone Awfully Wrong…

Then later, this:

In a public discussion with Ursula K. Le Guin in the fall of 2010…I found that what she means by “science fiction” is speculative fiction about things that really could happen, whereas things that really could not happen she classifies under “fantasy.”
…In short, what Le Guin means by “science fiction” is what I mean by “speculative fiction,” and what she means by “fantasy” would include some of what I mean by “science fiction.”

There are harbingers in this which emerge meaningfully later in the book.

My own definition of science fiction is less specific than Aldiss’s and far more rigorous than Atwood’s—science fiction is at heart epistemological fiction: it is concerned with how knowledge (and subsequently technology) forces change on humans. You might argue that any good spy novel would meet that criterion, and certainly many spy novels (and movies) contain large dollops of science fiction, but only as collateral concerns. The change in a spy novel is earnestly resisted and often successfully so—the status quo is all important. Science fiction usually starts with (the authorial) belief that any status quo is an illusion and goes from there. Again, any surrealist novel might meet that definition, but I said epistemological, which is the tell-tale, because we’re talking about knowledge and knowing and acting, which is a communal experience, across society. And so the Federation of Star Trek qualifies as an epistemological proposition while the Isle of Avalon does not. And of course the second important condition—force—is essential in this regard. If there is a classical myth at the heart of SF it is Pandora’s Box. Open that lid—which is an act of will—and then deal with the consequences of uncontrollable environmental change.

I take it as read that there are other definitions of science fiction. This one is mine. It has the virtue of being completely independent of tropes—those spaceships and Mad Scientists of which Atwood speaks. Which brings something like Hermann Hesse’s Magister Ludi into the fold quite plausibly while leaving something like Allen Drury’s Throne of Saturn out.

Aldiss proceeds in chapter one to make his case for Frankenstein and he does so adroitly. For SF to be true to itself, a change must be apparent that can be prompted and shaped no other way than by the conceit of the SFnal idea. Dr. Frankenstein has learned how to reanimate dead tissue. The change this causes in him is to be faced quite unmetaphorically with the responsibility of being a god.

What separates this effectively from a straightforward horror novel is the utter humanity of Victor Frankenstein and the absence of any hint of either the divine or the demonic. What unfolds is a human drama anyone would face under similar circumstances. Frankenstein is not “mad” but becomes so. The Creature is not supernatural, it’s a construct. The questions of soul and moral responsibility permeate the drama—unresolved and unresolvable. Frankenstein has made a change in the world and has to figure out how to deal with it. He fails, but it’s the wrestling with it that brings the book into the fold of science fiction, because the change is both external and personal and depicted as humanly possible.

The rest of the novel is a Gothic—namely, it partakes of the tropes that define the Gothic: lonely castles, empty landscapes, isolation, darkness, and a kind of vastness that seems ponderously empty (but may not be). In that respect, Aldiss is correct about SF being in the tradition of the Gothic. It deals with vastness, isolation, the alien as landscape—and moral conundrum.

Atwood seems to think it’s all about utopias, which is why she seems unable to locate a definable beginning to the genre. There is a palpable reluctance throughout her book to deal with the subject directly, in a way that addresses the particular history of the stories that comprise the principal body of what we call science fiction, as if by searching around the perimeter she might find the point where it can all be subsumed into the larger, primary literary history of the last couple of millennia.

Aldiss talks throughout Billion Year Spree about the writers who informed the genre ever since it split off into its own distinct digs in 1926 with the founding of Amazing Stories by Hugo Gernsback, whom Atwood barely mentions in passing. In Aldiss we have complete discussion of Gernsback, of Edgar Rice Burroughs, of E.E. “Doc” Smith, Leigh Brackett, A.E. van Vogt, Heinlein, Clarke, Asimov—names which are oddly absent from the Atwood even though it is hardly possible to discuss SF meaningfully in their absence.

The writers both of them do cover are Aldous Huxley, Jonathan Swift, and George Orwell. Aldiss talks about them as what they are—literary writers who found useful tools in the SF toolbox, but who in most ways barely acknowledged the existence of the genre. (In Swift’s case, obviously so, since the genre did not exist in his day. But this itself is telling, since Swift is excluded by Aldiss as a precursor SF writer while Atwood sees him as primary.) Aldiss is remarking on how the same observations led writers of quite different dispositions to do work recognizable to the main body of SF in its own day. To be sure, such writers are often used by the genre in a kind of reflexive self-defense, as if to say “See, serious writers do it, too!” But while Aldiss shows how these are basically one-offs, Atwood seems to think these writers represent the central goal of the genre—that all SF writers might be aspiring to the level of Huxley and Orwell. Perhaps in matters of craft and even art, but not necessarily in terms of theme or subject.

Atwood begins the biographical parts of her association with the genre in an understandable but curious place—in comics. (She also read H. Rider Haggard as a child, which left a distinct impression on her.) The trouble seems to be that she did not move from comics to the major magazines, and so what she shows is an attempt to make whole the literary connections between the superhero motifs of the 30s and 40s and classical myth. A valid and fruitful analysis, certainly, but it leaves one of the principal distinguishing features of the science fiction of the same period unaddressed—technology. Greek myths care not a fig for how Zeus generates his lightning bolts. They are supernatural, beyond such understanding, as befits the divine. Science fiction is all over those bolts and how they are made—and, consequently, why.

I would argue that while he did not create the first SF, Homer gave us the first SF character in Odysseus. In his own way, he was a technophile and a geek. He did not believe the gods were utterly inscrutable and unchallengeable and spent the length of the Odyssey figuring out how to beat them. He was a clever man, a man of reason, who clearly believed there was something to be understood about everything.

The mistake many literary critics make in their regard toward science fiction is in consistently assuming SF is all about its gadgets—i.e. its tropes—when it is really about the people who make them, understand them, use them, and all those who are changed by them.

Aldiss clearly understands this. He rarely argues for less science and tech, only for better human depictions. Because SF is about the world those tools are allowing us to make.

The question that springs to mind while reading Atwood’s examination is whether or not she ever read anything “of the canon,” so to speak—like Sturgeon or Herbert or Niven or Brin or Cherryh or even Butler—or if, having read it, she simply found it not worth discussing in the same breath as her token SF writer, Le Guin, and the others she selects to dissect, like Marge Piercy. Even in the case of Piercy, the work she chooses to examine is the one that can be read differently, Woman On The Edge Of Time, rather than the less ambiguous He, She, and It. In the closing paragraph of her examination on Piercy’s time travel-cum-woman-under-pressure novel, Atwood says:

Woman On The Edge Of Time is like a long inner dialogue in which Piercy answers her own questions about how a revised American society would work. The curious thing about serious utopias, as opposed to the satirical or entertainment variety, is that their authors never seem to write more than one of them; perhaps because they are products, finally, of the moral rather than the literary sense.

Even in praise, there seems to be a reservation about the work in question. Not literary, then, but a moral work. In this regard, Aldiss would seem to agree with her:

The great utopias have better claim to our attention, for utopianism or its opposite, dystopianism, is present in every vision of the future—there is little point in inventing a future state unless it provides a contrast with our present one. This is not to claim that the great utopias are science fiction. Their intentions are moral or political…
The idea of utopianists, like our town-planners, is to produce something that is orderly and functions well.

One of the chief drawbacks of utopias is this achievement of function. Basically, the whole point of them is to end history. They are “nowhere” because once attained there is theoretically no further need for people to change. In fact, they must not change, lest they destroy the perfection. As Aldiss goes on to say:

The trouble with utopias is that they are too orderly. They rule out the irrational in man, and the irrational is the great discovery of the last hundred years. They may be fantasy, but they reject fantasy as part of man—and this is a criticism that applies to most of the eighteenth-century literature…

Given this, one wonders what it is that Atwood is attempting in implicitly—and sometimes explicitly—treating SF as utopianism without a nod toward the thing at its core, namely the embrace of inexorable change. Because change is the driving fascination in science fiction and for it to have any valence in the imagination or utility in its constructs, it must present as something other than metaphor. Let me give you two quotes from a pair of SF writers, one of whom seems to be Atwood’s choice of exceptional ability:

Science fiction is a tool to help you think; and like anything that really helps you think, by definition it doesn’t do the thinking for you. It’s a tool to help you think about the present—a present that is always changing, a present in which change itself assures there is always a range of options for actions, actions presupposing different commitments, different beliefs, different efforts (of different qualities, different quantities) different conflicts, different processes, different joys. It doesn’t tell you what’s going to happen tomorrow. It presents alternative possible images of futures, and presents them in a way that allows you to question them as you read along in an interesting, moving, and exciting story.
Samuel R. Delany, The Necessity of Tomorrows

If science fiction has a major gift to offer literature, I think it is just this: the capacity to face an open universe. Physically open, psychically open. No doors shut.
What science, from physics to astronomy to history and psychology, has given us is the open universe: a cosmos that is not a simple, fixed hierarchy but an immensely complex process in time. All the doors stand open, from the prehuman past through the incredible present to the terrible and hopeful future. All connections are possible. All alternatives are thinkable. It is not a comfortable, reassuring place. It’s a very large house, a very drafty house. But it’s the house we live in…and science fiction seems to be the modern literary art which is capable of living in that huge and drafty house, and feeling at home there, and playing games up and down the stairs, from basement to attic.
Ursula K. Le Guin, Escape Routes

Taken together, these point to the disconnect with traditional literary forms, traditional literary expectations. Science fiction contains utopias, certainly (and dystopias, clearly) but it is not in the main about them. Nor is it about some desired escape from the present into an alternative world that may offer some kind of release for a mind at odds with itself, which seems to be the basis of so much neurotic fiction. The focus is on the wrong point here. It is about living in a changed milieu.

The problem with utopias was summed up concisely by Virginia Woolf: “There are no Mrs. Browns in Utopia.” As with all superlatives, counterexamples can be found, but in the main this is a self-consistent criticism of the form, which Atwood seems intent on using as her functional definition of science fiction. There is no room for ordinary people in Thomas More’s Utopia—if they are ordinary, they aren’t people, they’re memes. If they aren’t ordinary, Utopia doesn’t stand a chance of surviving.

And most ordinary people, when you get down to it, are not ordinary.

Which seems to be the major concern of most literary fiction—ordinary people. Which, by a tortuous logic of taxonomic reassessment, means, since Atwood seems to believe SF is principally utopian, that science fiction cannot deal with ordinary people and therefore, though she does not come right out and say this, cannot be considered relevant to mainstream literary concerns.

Welcome back to the ghetto.

In a blatantly dismissive review of Atwood’s own Oryx and Crake, Sven Birkerts asserted that SF can never be [true] literature because it “privileges premise over character.” In other words, the world at hand is more important than the people in it—which, of course, would make it utopian.

Henry James famously claimed “Landscape is character.” (Of course, he then criticized H.G. Wells for dealing more with “things” than characters—in other words, his landscapes.)

Birkerts and Atwood are on the same page, it seems, though Atwood is striving to come to terms with a form she clearly likes, even while misapprehending it. Perhaps had she found a stack of Astounding Stories instead of H. Rider Haggard and comics in the attic as a child she might have understood where the divergence happened and SF split off from two millennia of myth-driven fantasy. Novelty can overwhelm truth-seeking and a great deal of SF falls into the pit of self-involved gizmo geekery, but at those times when the work rises out of that pit to deal with the future and science and their immanence within the human soul it is unfair to not see its true worth. It’s like comparing Sherlock Holmes to the Hardy Boys and dismissing Holmes because he comes from the same stock.

It’s interesting that Atwood chooses Marge Piercy’s Woman On The Edge Of Time as her example, because Piercy worked a further subversion, perhaps unwittingly so, in the scenario she examines. Connie is regarded by everyone around her as insane. But she knows she isn’t; she is dealing with a real situation, the future. But the world she lives in, the given world, her context, insists on denying the reality of that future and treating her involvement with it as symptom rather than legitimate experience. The parallel to the way in which the science fiction writer and his or her work is treated by those who see themselves as the keepers of context is remarkable. This is a metaphor which Atwood overlooks. The question of whether Piercy is writing what Atwood thinks she is, or has understood the nature of the form she’s indulging in, is open.

The misunderstanding is simple but with complex consequences. Most genre fiction—mystery, western, war, spies, even romance—takes advantage of altered context to set mood or establish a range of possible action. Done well, these shifts target different thematic concerns and aim at specific moral (or telic) points. But in all but science fiction (and to a lesser extent the related genre of fantasy) the context would seem to be more attitudinal than material. The exception is the western, but we tend to treat the context of the western as “our” world insofar as it is historical and therefore, legitimately or not, we see it as familiar. The differences fade into the background and the metaphor runs out of our sight, almost as window dressing.

Science fiction dramatically reverses this relationship.

Which makes it a very uncomfortable place, especially for the writer who has spent his or her career writing from character rather than from landscape through character. Instead of seeing the world as a consequence of character, in science fiction the world is a character and must be dealt with concretely, as if to say “Here’s your new reality (context), now learn to live in it.”

It is precisely that discomfort that is the drug of choice for the reader of SF.

Attempts to corral it into a more familiar tradition run up against what must often seem like a perverse and intractable exoticism on the part of the writers.

Of the two books at hand, the Aldiss is the more taxonomically useful as well as æsthetically relevant. Aldiss, after all, is a science fiction writer. He has lived within the genre, knows it to its marrow, and, while critical of its excesses and irrelevancies, clearly loves it for itself, redheaded stepchild though it may be to others.

Which is not to say the Atwood is a failure. She is just as clearly fond of science fiction and has done considerable grappling with its conventions and conceits. But for her, it feels as if SF was an important love affair that lasted a summer or a year and then ended, leaving her with good memories and an impression of something missed, a road not taken. Nothing she regrets, but it might have been nice for it to have lasted longer. She doesn’t know it the way Aldiss does, but she doesn’t fear it the way some of her colleagues have in the past and may still. So while her observations may seem incidental, there’s worthy insight, if only of the tourist variety. Taken together, the two books give one a view of SF both from the inside and from the outside, and the distinctions are telling.

Way back in my youth, when rock’n’roll had muscled its way into the serious attention of people who, not too many years earlier, had derided it as loud, obnoxious “kids’ stuff,” I found an album by Andre Kostelanetz, who led an orchestra that specialized in symphonic renditions of popular music. He would take Sinatra or Como or Crosby or film themes or light jazz and turn them into quasi-classical pieces. This album was his take on the band Chicago. I remember listening to it bemused. It was interesting and it was “accurate” but it lacked some vitality that I at first couldn’t define. But then I realized that he had stripped everything out of it that said “rock’n’roll” and all that remained was the melody, the chord changes, and the form, but none of the guts. He’d taken music that could, in its original form, get you churned up, excited, and agitated in a particular way and converted it into something palatable for the inspection of people who did not understand rock music but may have been curious about it. Unfortunately, he missed the point and the result was “interesting.”

I often feel that way about attempts at science fiction by people who do not understand it.

More important, however, is the dialogue between those who get it and those who don’t, and in this respect Atwood has written a very useful book with considerable care and insight. It is, ultimately, less about science fiction than about her attempts to alchemically transform it into something familiar to her own early impressions of magical and dissociative fictive experiences. This is underscored by the Aldiss, which is about the heart and soul of science fiction. Reading them in tandem clarifies the ongoing misapprehensions and perhaps shows us how and why SF seems to be infecting much of today’s literary fiction. There must be a good reason why someone like Atwood now writes it, even if she doesn’t seem entirely to embrace it for itself.


Mixed Signals

I listen to music every day. Intentionally.  I choose something to set my internal harmonic brainscape and listen.  It was a difficult and startling revelation to me back in my youth to realize many people don’t. That is, even when they have music playing, they don’t listen.  For many, it’s wallpaper, and this just struck me as sad.

But it explained what I thought of then as the execrable taste a lot of my acquaintances seemed to display in music.  I have never cared for so-called Top 40 tunes, with rare exception, because in my experience such songs were either the least interesting pieces on their respective albums or they were the zenith of a mediocre musical imagination.  Boring.  Listen to them three or four times and their content is exhausted.

I also used to have an absolutely absurd prejudice that if I could manage to play it myself, on guitar or keyboard, with only a few practices, it was just too insignificant.  This was ridiculous, but I’d been raised to appreciate technical difficulty as a sign of quality in most things.  It took a long time for me to overcome this notion and I still have not completely.

For good or ill, though, it informs my taste to this day, and in the presence of the technically superb I am seduced.  I have found technically accomplished work that was simply not as good as its polish, but only rarely have I found sloppy work so much better than its presentation that the sloppiness didn’t matter.  Technical ability, precision of execution, polish…these are not simply ancillary qualities.  The guitarist may know all the notes of the Bach piece, but if the timing is wrong, the chording inaccurate, the strings squeaking constantly, it will be a thoroughly unenjoyable performance.  Likewise, if the guitarist has composed a beautiful new piece but then can’t perform it as imagined…who will ever know how beautiful it is?

Ultimately, technical sloppiness gets in the way of the work.  The better the technique, the clearer the art shows through.

Which brings me to what I wanted to talk about here.

The other day I sat down with two works that for whatever reason seemed to counterpoint each other.  Put it down to my peculiar æsthetic, as I doubt anyone else would consider them complementary.  And perhaps they aren’t, but they shared a common quality, the one I’ve been going on about—technical superiority.

Ansel Adams is a byword for precision in art, especially photographic art.  His images are studies in excellence, from their composition to their presentation.  There is a fine-tuned carefulness in many of them, if not all, that has set the standard for decades.  I have a number of his monographs on my shelf and I have been an admirer and follower since I was a boy.  His set of instructional books, the Basic Photo series, was among the first I read when becoming a photographer myself.  Every year I hang a new Ansel Adams calendar in my office.  I have a biography of him, one signed volume of his Yosemite images, and I find myself constantly drawn to his work.  These photographs are replenishing.

So when a new collection came out this past year—400 Photographs—it was a given that I would acquire it.  (I do not have all his books—there’s a heavy rotation of repeats strewn throughout his œuvre.)  I had it for some weeks before I found time to sit down and really go through it.  When I did I was surprised.

The collection is broken down into periods, beginning with some of his earliest images made when he was a boy, reprinted directly from the scrapbooks in which they were pasted, all the way up to the very early 1970s when he, according to the commentary, stopped making “important” photographs and devoted his time to the darkroom.  Gathered are most if not all his iconic images, many that will be familiar to those who have more than a passing acquaintance with his work…

…but also a number of relatively unknown photographs, peppered throughout, many of which show less than absolute control on Adams’ part.  They do not come up to par.  In some of them, the composition is slightly “off” or the tonal range is not fully captured.

Which is not to say they are not beautiful.  Adams at his worst is equal to most others at their best.  But historically it’s interesting and instructive to see the “not quites” and the “almost theres” among the otherwise perfect works we have all come to expect.  Rather than detract, however, these works actually enhance the overall impact of the collection, because there is variation, there is evidence of “better,” there is obvious progression.  The commentary between the periods by Andrea Stillman is concise, spare, and informative as to the distinctions in evidence.  This is a chronicle of an artist’s evolution.

Looking at an Ansel Adams photograph, one sometimes feels that the very air was different around him, that light passed from landscape to film plane through a more pristine medium, that nature itself stood still for a few moments longer so the image could be recorded with absolute fidelity in a way given to no other photographer.

As I went through the images, I listened to a new album.  New to me, at least, and in fact it was released this past year.  Levin Minnemann Rudess.

Who?

Of the three, two had been known to me before this year.  Tony Levin is a bassist of extraordinary range and ability.  Besides his own work, he seemed for a time the player the serious groups called in when their regular bassist was unavailable.  Which means he played bass for Pink Floyd in the wake of Roger Waters’ exit.  He played bass for Yes, Dire Straits, Alice Cooper, Warren Zevon, and even Paul Simon and Buddy Rich.

He was also one of the most prominent members of King Crimson during one of its best periods.  He is a session player in constant demand and his ability seems chameleonic.  He can play anything in almost any style.  He is one of those musicians who always works, is always in demand.

Given his associations, sometimes it is a surprise to hear his own work, which can either be described as a distillation of all his influences or as a complete departure from them.  Such would seem to be the case here.

Jordan Rudess plays keyboards and came out of the progressive schools of Keith Emerson, Rick Wakeman, UK, and others, although the first band with which he was associated was the Dixie Dregs. He later joined Dream Theater, but like Levin has been a much-in-demand session player whose name I’ve seen pop up many times since the early ’90s.

Marco Minnemann, then, was the only name with which I was unfamiliar, but that’s changing.  As a drummer, he’s played with former members of UK—Eddie Jobson and Terry Bozzio—and has been doing session work with metal groups.  I learned of him just this past year in association with guitarist Guthrie Govan, with whom he and bassist Bryan Beller have formed a trio, The Aristocrats.  He seems committed to that unit, so I believe the album I’m discussing may be a one-off, an experiment for these three musicians.  He is an explosively complex, solid drummer.

What does this have to do with Ansel Adams?

Not much other than what I began with—precision.  There is an overwhelming technical precision here that, for the duration of my study of the Adams book, formed a complementary experience of sharp-edged landscapes and absolute control.  The LMR album is largely instrumental (which has slotted it into my writing queue) but fits no particular genre exactly.  Jazz?  Sure.  Metal?  Somewhat.  Fusion, certainly, but fusion of what?  Rudess’s runs evoke classical associations, but no single track is identifiable with a particular Great Composer.  This is experimental work, theory-in-practice, done at a high level of musicianship and compositional daring.  An aural high-wire act that is constructing the landscape as it records it.

As I said earlier, it happens more often than not that technical prowess can substitute for significant content.  “Too many notes” can mask an absence of substance.  Too fine a presentation can distract from the fact that an image contains nothing worthwhile.

But when substance and technique are combined at a stratospheric level of ability, when performance melds precision and depth, then we have something truly special.

All I needed that afternoon was a fine wine to complete the immersive experience.

Quantum Branching…As Literature Embraces Science Fiction, the Past is Again and Again

Kate Atkinson’s latest novel, Life After Life, is a remarkable achievement.  Its several hundred pages of exquisitely controlled prose contain the story of Ursula Todd, who is, in the course of the story, born again and again and again.  Each life, some so very brief, ends in a tragic death, accidental, malevolent, heroic, painful, and each time she starts over, comes to the point where that mistake was made, and now it is sidestepped, turned away, avoided.  She lives multiple times, each one different, and yet she remains herself.

The novel opens with a shocking scene—Ursula, a young woman living in Berlin, enters a café wherein she finds Adolf Hitler, surrounded by sycophants, enjoying his celebrity.  She pulls a pistol and takes aim…

Then she is born.

It is 1910, in the English countryside, and snowing heavily.  The scene is reminiscent of Dickens.  She is born.  First she dies from strangulation, the umbilical cord wrapped around her, with no one around who knows what to do.  Then in the next life that obstacle is overcome.  And so it goes, as she ages, staggers through one life after another, growing a little older each time, her family battered by one damn thing after another.  Ursula herself, a middle child, watches as much as participates in the homely evolution of this middle-class English family, and we are treated to an almost microscopic study of its composition—its hypocrisies, its crises, its successes, its failures.

Ursula endures.  As her name almost punningly suggests, she Bears Death, over and over.  She never quite remembers, though.  She has intense feelings of déjà vu, she knows such and such should be avoided, this and that must be manipulated, but she never quite knows why.  At times she comes perilously close to recognition, but like so much in life, her actions are driven more by ideas that seemed good at the time than by any deeper understanding.

Contrary to the rigor of traditional time travel, the past here does change, but then this is not a time travel novel, at least not in any traditional sense.  You might almost say it’s a reincarnation story, but it’s not that, either, because Ursula never comes back as anyone other than herself.  At one point in the novel, time is described not as circular but as a palimpsest—layers, one atop another, compiling.  The result is a portrait, more complete than most, not of a life lived but of life as potential.  But for this or that, there wandered the future.  It is a portrait of possibility.

The big events of history are not changed, though.  Nothing Ursula does in her manifold existences alters the inevitability of WWII or Hitler or the Spanish Flu or any of the mammoth occurrences that dominate each and every life she experiences.

What she does change is herself.  And, by extension, her family, although all of them remain persistently themselves throughout.  It is only the consequences of their self-expression that become shaped and altered.

We see who the genuine heroes are, who the fools, the cowards, the victims and the victors; where in one life none of this might emerge clearly, in the repeated dramas with their minor changes character comes inexorably to the fore.

Atkinson does not explain how any of this happens.  It’s not important, because she isn’t doing the kind of fiction we might encounter as straight-up science fiction, where the machinery matters.  She’s examining ramifications of the personal in a world that is in constant flux on the day-to-day level even as the accumulation of all that movement builds a kind of monolithic structure against which our only real choice is to choose what to do today.  Consequently, we have one of the most successful co-options of a science-fiction-like conceit into a literary project in recent memory.

On a perhaps obvious level, isn’t this exactly what writers do?  Reimagine the personal histories of their characters in order to reveal possibility?

Light Fallen

I’ve read three books in tandem which are connected by subtle yet strong filaments.  Choosing which one to begin with has been a bit vexatious, but in the end I’ve decided to do them in order of reading.

The first is an older book, handed me by a friend who thought I would find it very much worth my while.  I did, though possibly not for the reasons he may have thought I would.  But it grounds a topic over which we’ve been engaged in occasionally vigorous debate for some time and adds a layer to it which I had not expected.

William Irwin Thompson’s The Time Falling Bodies Take To Light is about myth.  It is also about history.  It is also about grinding axes and challenging paradigms.  The subtitle declares: Mythology, Sexuality & the Origins of Culture.  This is a lot to cover in a mere 270-some pages, but Mr. Thompson tackles his subject with vigor and wrestles it almost into submission.

His thesis is twofold.  The first is that Myth is not something dead and in the past but a living thing, an aggregate form of vital memes, if you will, which recover any lost force by their simple evocation, even when evoked as satire or in order to be dismissed.  Paying attention to myth, even as a laboratory study, brings it into play and informs our daily lives.

Which means that myth is not confined to a period.  It is ever-present, timeless, and most subtle in its influence.

His other thesis, which goes hand in hand with the first, is that culture as we know it is derived entirely from the tension within us concerning sex.  Not sex as biology, although that is inextricably part of it, but sex as identifier and motivator.  The argument we’ve been having, apparently since desire first took on mythic power within us, over what sex means, how it should be engaged, and where it takes us has determined the shapes of our various cultural institutions, pursuits, and explications.

It all went somehow terribly wrong, however, when sex was conjoined with religious tropism and Homo sapiens sapiens shifted from a goddess-centered basis to a god-centered one and elevated the male above the female.  The result has been the segregation of the female, the isolation of the feminine, and the restriction of intracultural movement based on the necessity to maintain what amounts to a master-slave paradigm in male-female relationships.

Throughout all this “fallen” power play, ancient myths concerning origins and the latent meanings of mutual apprehensions between men and women (and misapprehensions) have continued to inform the dialogue, often twisted into contortions barely recognizable one generation to the next but still in force.

There is much here to consider.  Thompson suggests the rise of the great monotheisms is a direct result of a kind of cultural lobotomy in which the Father-God figure must be made to account for All, subjugating if not eliminating the female force necessary for even simple continuation.  The necessity of women to propagate the species, in this view, is accommodated with reluctance and they are, as they have been, shoved into cramped confines and designated foul and evil and unclean in their turn, even as they are still desired.  The desire transforms the real into the ideal and takes on the aspects of a former goddess worship still latent in mythic tropes.

Certainly there is obvious force to this view.

The book is marred by two problems.  I mentioned the grinding of axes.  Time was originally published in 1981, and running through it, mostly in the first third but sprinkled throughout, is an unmasked loathing of evolutionary psychology and sociobiology.  He takes especial aim at E.O. Wilson for promulgating certain reductive explanations of prehistoric cultural evolution based wholly on biological determinants.  Thompson’s prejudice is clear: he wants even early Homo sapiens to be special in its cultural manifestations, and he derides attempts at exclusively materialist explanations.  One hopes the fact that E.O. Wilson himself has since moved away from those earlier “purely” biological considerations would result in an updating.

But interestingly, part of Thompson’s rejection of such early modeling comes from an apparent belief in Race Memory.  Not, as I might find plausible, race memory as deeply entrenched memes, but apparently as some undiscovered aspect of our genome.  He never quite comes out and claims that such race memory is encoded in our DNA, but he leaves little room for alternative views.

Hence, he asserts, the genuine power of myth, since it is carried not only culturally, but quasi-biologically, as race memory.  Which we ignore at our peril.

He does not once mention Joseph Campbell, whose work on the power of myth I think goes farther than most in explicating how myth informs our lives, how myth is essentially meaning encoded in ideas carried in the fabric of civilization.  He does, however, credit Marija Gimbutas, whose work on goddess cultures extending back before the rise of Sumer and the constellation of civilizations commonly recognized as the “birth” of civilization was attacked with serious allegations of fraud in order to undermine her legitimacy and negate her thesis that early civilizations were certainly more gender equal, if not outright female dominated.  (Just a comment on the so-called “birth” of civilization: it has long been remarked that ancient Sumeria appeared to “come out of nowhere”, a full-blown culture with art and some form of science.  But clearly common sense would tell us that such a “birth” had to be preceded by a long pregnancy, one which must have contained all the components of what emerged.  The “coming out of nowhere” trope, which sounds impressive on its face, would seem to be the cultural equivalent of the virgin birth myth that has informed so many civilizations and myth cycles since…)

My complaint, if there is any, is that he undervalues the work of geneticists, biologists, and sociometricians, seeking apparently to find a causation that cannot be reduced to a series of pragmatic choices taken in a dramatically changing ecosystem or evolutionary responses to local conditions.  Fair enough, and as far as it goes, I agree.  Imagination, wherever and whenever it sprang into being, fits badly into the kind of steady-state hypothesizing of the harder sciences when it comes to how human society has evolved.  But to dismiss them as irrelevant in the face of an unverifiable and untestable proposition like Race Memory is to indulge in much the same kind of reductionist polemic that has handed us the autocratic theologies of “recorded history.”

Once Thompson moves out of the speculative field of, say, 8,000 B.C.E. and older and into the period wherein we have records, his attack on cherished paradigms acquires heft and momentum and the charm of the outsider.  (His mention, however, of Erich von Daniken threatens to undo the quite solid examination of the nature of “ancient” civilizations.)  It is easy enough to see, if we choose to step out of our own prejudices, how the march of civilization has been one of privileging male concerns and desires over the female and diminishing any attempt at egalitarianism in the name of power acquisition.  The justification of the powerful is and probably has always been that they are powerful, and therefore it is “natural” that they command.  Alternative scenarios suffer derision or oxygen deprivation until a civilization is old enough that the initial thrill and charm of conquest and dominance fades and more abstruse concerns acquire potency.

But the value of The Time Falling Bodies Take To Light may be in its relentless evocation of institutional religion as a negation of the spiritual, as if to say that since we gave up any kind of natural and sane attitude toward sexuality and ignored the latent meaning in our mythologies, we have been engaged in an ongoing and ever more destructive program to capture god in a bottle and settle once and for all what it is we are and should be.  When one looks around at the religious contention today, it is difficult if not impossible to say it is not all about men being in charge and women being property.  Here and there, from time to time, we hear a faint voice of reason crying out that this is a truly stupid thing to kill each other over.