Traditions and New Eyes

I recently finished rereading a book from last year, preparing to read the sequel. I should cop to the fact that my reading has rarely been what you might call “timely” and I’ve gotten worse over the last several years.  When I wrote reviews for actual pay this was not as much a problem, because I had to read current material.  But left to my own devices, I pick and choose from my to-be-read pile at random, pretty much the way I buy books to begin with.  So I might read an old Agatha Christie concurrently with a newer physics tome by Kip Thorne after having finished a very new history of the sinking of the Lusitania, then pick up a newish novel while at the same time rereading some Ted Sturgeon… So it goes.

So I am very much “behind” almost all the time.  I served as a judge for the PKD Award one year and managed to read an unbelievable number of recently-published SF novels.  I can commit and stay the course when required. But in general my reading keeps me in a kind of ever-imminent nontime in terms of how I encounter works.  I don’t sort by decade when I start reading, not unless something in the text forces me to recognize it.  So to me, it is not at all odd to see James Blish and Iain M. Banks and C.J. Cherryh and Ann Leckie as in some sense contemporaneous.

So when I encounter a novel like Charles E. Gannon’s Fire With Fire I have no trouble—in fact, take some delight—in seeing it as part of a continuous thread that connects Doc Smith to Poul Anderson to C.J. Cherryh to any number of others who over the past 70+ years have mined the fields of alien encounter/politicomilitary SF.  And when I say I found the closest affinity with Poul Anderson at the height of his Terran Empire/Flandry period, that is, for me, high praise.

I loved Dominic Flandry.  Not so much the character, though there is that, but the milieu Anderson created.  One of the appealing aspects of his future history, especially those stories, was the authentic “lived in” feel he achieved, rarely duplicated by his peers, and seldom realized to good effect now.  Gannon does this.

The story in Fire With Fire is nothing new.  Earth has begun to settle other worlds around other stars and it’s only a matter of time before we encounter other space-faring civilizations.  In fact, we have, only it isn’t public knowledge, and in some instances it’s not something the discoverers even want noticed.  While Anderson had the Cold War to work with, Gannon has the transnational world, with all its disquieting ambiguities over what constitutes nations and how they differ from corporations and the undeniable motivation of profit in almost all human endeavors, leading to an ever-shifting array of allies and enemies in arrangements not always easy to define much less see.  He takes us through all this quite handily.  It’s not so much that he knows the pitfalls of human civilizations as that he recognizes that the field is nothing but pitfalls.

“All that is necessary for evil to succeed is that good men do nothing.”  Burke’s dictum plays in the background throughout as good guys dupe both bad guys and good guys, people are moved around and used like game pieces, power is unleashed—or not—based on calculi often having little to nothing to do with ethics and morality.  This is politics writ large and individuals learn to surf the swells or drown.

Into which is tossed Caine Riordan, an investigative journalist who is also a good man.  He is unfortunately snatched out of his life through a security mishap, placed in cryogenic suspension, and awakened 14 years later with a hundred or so hours of missing memory dogging him through the rest of the book, memories having to do with the two men in whose thrall he now seems to be: Nolan Corcoran, retired Admiral, and Richard Downing, former SAS and often reluctant aide to Admiral Corcoran.  Reluctant not in being unwilling to serve, but about some of their methods.  They run a secret organization designed to prepare for exosapient first contact.  It practically doesn’t exist, sort of in the way gravity under certain conditions doesn’t exist, and now Caine has become their tool.

Without going into details, which are a major aspect of this novel, suffice it to say it is about that first contact and the political ramifications thereof. This is not a new idea and much of the book may, to some, feel like ground well trod, but there is ample pleasure to be had in the trek over familiar ground seen through fresh eyes.  What is done better here than the usual is the economic and political backgrounding and the debates over the impact of first contact.  Furthermore, the seemingly impossible disaffection among the various political entities comprising the world we know is displayed to lend a plangent note of nailbiting despair to the very idea that we might pull ourselves together sufficiently for anything remotely resembling a world government.

To be sure, Gannon adroitly addresses the hoary old notion that when we meet the aliens they themselves will already have worked all this out long since and be in a position to pass elder judgment on our upstart species.  They haven’t.  They have a (barely) workable framework among themselves, but the introduction of another new race into their club proves to be an opportunity for old issues to be forged into new knives.

Gannon handles all this well.  He clearly has a grasp of how politics works and has imaginatively extended that knowledge to how nonhuman species might showcase their own realpolitik. He has a flair for detail.  He handles description very well, sets scenes effectively, and even manages to disguise his infodumps as conversations we want to hear.  Most of the time, it has the pleasurable feel of listening to a good musician groove on an extended improvisation.  Throughout we feel simpatico with Caine and the people he cares for and the situation is certainly compelling.

For me, this was a walk down a street I haven’t visited in some time.  I read this novel with a considerable sense of nostalgia.  It is part of a tradition.  A well-executed piece of an ongoing examination of issues we, as SF fans, presumably hope one day to see in reality.  We keep turning this particular Rubik’s Cube over in our collective hands, finding variations and new combinations, looking for the right face with which to walk into that future confrontation.  One may be forgiven if this particular form of the contemplation seems so often to turn on the prospect of war.  After all, aren’t we supposed to be past all that by the time we develop star travel and put down roots elsewhere?

There are two (at least) answers to that.  The first, quite cynically, is “Why would we be?”  Granted that most wars have at least something to do with resources.  One side wants what the other side has, and you can do the research and find cause to argue that even the most gloriously honor-driven wars had deep economic aspects to them.  Certainly the conduct of all wars has deep economic consequences.  But while that is true and might be argued for most wars, it is also true that many wars need not have been fought as there were other means of securing those resources.  But that didn’t matter.  It was the war, for someone, that mattered more even than the well-being of the people, the polity.  That ill-defined and in retrospect absurd thing Glory is a very real ambition down through history.  Wars get fought as much for that as for anything else.  Suddenly having all your resource needs met would do nothing to dampen that.  In fact, it might exacerbate the Napoleonic impulse in some instances.

Because the reality is that Going There will do that.  Not survey missions, no, but if we assume the level of technology and capacity that allows for colonies on worlds in other solar systems, then we can assume a post-scarcity economy.  It’s the only way it makes sense.  We will not solve economic problems with an interstellar empire; the empire will be the result of those solutions.

So that leaves us with the second reason we may still face war.  No less cynical but more intractable. Racism.  Not the kind of small-minded nonsense we deal with in terms of skin color and language, but the real deal—wholly different biologies confronting each other over the question of intelligence and legal rights and the desirability of association.  Deeper even than that is the history and tradition brought to the question by two civilizations with absolutely nothing in common, having developed in isolation more profound than any we might imagine on the face of the Earth.

Not that either of these is inevitable, and it may well be that sophistication of technology and its responsible use breeds requisite tolerances.  But this is, as likely as it sounds philosophically, not a given, any more than war with aliens is inevitable. So we talk about it, in the pages of fictions with long traditions. There are certainly other possibilities, other scenarios, and there are other writers dealing with those.  Gannon is dealing with this one.

And doing so with a thick cord of optimism that raises this above the level of the usual “Deltoid Phorce” clone in the tradition of Tom Clancy or some other purveyor of gadget-driven war porn.  Gannon asks some questions in the course of this novel which keep it from descending to the level of the field-manual-with-body-count-in-technicolor.  This is more like what Poul Anderson would have written, with no easy answers, and heroes who are not unalloyed icons.

It’s worth your time.

Nicely done, Mr. Gannon.

Taste and Quality

Obliquely, this is about a current debate within science fiction. However, the lineaments of the argument pertain to literature as a whole.  I offer no solutions or answers here, only questions and a few observations.  Make of it what you will.

Reading experience is a personal thing. What one gets out of a novel or story is like what one gets out of any experience and being required to defend preferences is a dubious demand that ultimately runs aground on the shoals of taste.  I once attended a course on wine and the presenter put it this way: “How do you know you’re drinking a good wine? Because you like it.”  Obviously, this is too blanket a statement to be completely true, but he made his point.  If you’re enjoying something it is no one’s place to tell you you’re wrong to do so based on presumed “objective” criteria.  That $200.00 bottle of Sassicaia may fail to stack up against the $20.00 Coppola Claret as far as your own palate is concerned and no one can tell you your judgment is wrong based on the completely personal metric of “I like it/I don’t like it.”

However, that doesn’t mean standards of quality are arbitrary or that differences are indeterminate.  Such are the vagaries and abilities of human discernment that we can tell when something is “better” or at least of high quality even when we personally may not like it.

For instance, I can tell that Jonathan Franzen is a very good writer even though I have less than no interest in reading his fiction.  I can see that Moby-Dick is a Great Novel even while it tends to bore me.  I acknowledge the towering pre-eminence of Henry James and find him an unpalatable drudge at the same time.

On the other end of the spectrum, I can see how Dan Brown is a propulsive and compelling story-teller even while I find him intellectually vacuous and æsthetically tedious.

My own personal list of what may be described as guilty pleasures includes Ian Fleming, Edgar Rice Burroughs (but only the John Carter novels; never could get into Tarzan), and a score of others over the years who caught my attention, appealed for a time, and have since fallen by the wayside, leaving me with fond memories and no desire to revisit.  A lot of the old Ace Doubles were made up of short novels of dubious merit that were nevertheless great fun for a teenager on a lonely afternoon.

I would never consider them Great Art.

Taste is the final arbiter.  But using it to determine quality—rather than allowing quality to determine taste—is doomed because taste changes.  Works you might strenuously defend at one time in your life can over time suffer as your taste and discernment evolve.  It’s sad in one way because it would be a fine thing to be able to summon up the same reactions experienced on one of those lonely afternoons, aged 16, and poring through the deathless excitement of a pulp adventure you might, given your enthusiasm, mistake for Great Writing.

I try always to make a distinction between things I like and things I think are Good.  Often they’re the same thing, but not always, and like other judgments humans make, the two tend to become confused with each other.  Hence, debate over merit can take on the aspects of an argument on that day at the base of the Tower of Babel when people stopped understanding each other.

But if that’s all true, then how do we ever figure out which standards are valid and which bogus?  I mean, if it’s ALL subjective, how can any measure of quality ever rise to set the bar?

Fortunately, while personal experience is significant, collective experience also pertains. History, if you will, has taught us, and because art is as much a conversation as a statement we learn what works best and creates the most powerful effects over time. Having Something To Say that does not desiccate over time is a good place to start, which is why Homer still speaks to us 2500 years after his first utterances.  We derive our ability to discern qualities from our culture, which includes those around us informing our daily experiences.  In terms of literature, the feedback that goes into developing our personal values is a bit more specific and focused, but we have inexhaustible examples and a wealth of possible instruction.  We do not develop our tastes in a vacuum.

Honest disagreement over the specific qualities of certain works is part of the process by which our tastes develop. I might make a claim for Borges being the finest example of the short story and you might counter with de Maupassant—or Alice Munro. Nothing is being denigrated in this. The conversation will likely be edifying.

That’s a conversation, though.  When it comes to granting awards, other factors intrude, and suddenly, instead of exemplary comparisons, we have competition, and that can be a degrading affair unless standards are clear and processes fairly established.  In a competition, unlike a conversation, quality necessarily takes a back seat to simple preference.

Or not so simple, perhaps. Because any competition is going to assume at least a minimum of quality that may be universally acknowledged. So we’re right back to trying to make objective determinations of what constitutes quality.

If it seems that this could turn circular, well, obviously. But I would suggest it only becomes so when an unadmitted partisanship becomes a key factor in the process.

This can be anything, from personal acquaintance with the artist to political factors having nothing to do with the work in hand. Being unadmitted, perhaps even unrecognized, such considerations can be impossible to filter out, and for others very difficult to argue against. They can become a slow poison destroying the value of the awards. Partisanship—the kind that is not simple advocacy on behalf of a favored artist but is instead ideologically based, more against certain things rather than for something—can deafen, blind, reduce our sensibilities to a muted insistence on a certain kind of sensation that can be serviced by nothing else. It can render judgment problematic because it requires factors be met having little to do with the work.

Paradoxically, art movements, which are by definition partisan, have spurred innovation if only by reaction and have added to the wealth of æsthetic discourse. One can claim that such movements are destructive and indeed most seem to be by intent. Iconoclasm thrives on destroying that which is accepted as a standard and the most vital movements have been born of the urge to tilt at windmills, to try to bring down the perceived giants.  We gauge the success of such movements by remembering them and seeing how their influence survives in contemporary terms.

Those which did not influence or survive are legion. Perhaps the kindest thing to be said of most of them is that they lacked any solid grasp of their own intent. Many, it seems, misunderstood the very purpose of art or, worse, lacked any comprehension of truth and meaning. More likely, they failed to distinguish between genuine art and base propaganda.

How to tell the difference between something with real merit and something which is merely self-serving?  All heuristics are suspect, but a clear signal that something other than pure artistic intent is at play is the advent of the Manifesto.  Most are hopelessly locked in their time and the most innocent of them are cries against constraint.  But often there’s an embarrassing vulgarity to them, a demand for attention, an insistence that the work being pushed by the manifesto has merit if only people would see it.

Not all manifestos are signs of artistic vacuity, but those that front for worthwhile work usually fade quickly from service, supplanted by the work itself, and are soon forgotten.  Mercifully.  We are then left with the work, which is its own best advocate.  In hindsight it could be argued that such work would have emerged from the froth all on its own, without the need of a “movement” to advance its cause.  Unfortunately, art requires advocates, beginning with the simplest form of a purchase.  In crowded fields overfull of example, the likelihood of a lone artist succeeding on his or her own, without advocacy, is slim.

Advocacy for an individual artist, by a cadre of supporters, can make or break a career.  And this would of course be a natural development of widespread appreciation.  It’s organic.

Advocacy for a perceived type of art begins to suffer from the introduction of agendas having less to do with the artists than with a commitment to the aforementioned windmill-tilting.

The next phase is advocacy of a proscriptive nature—sorting out what belongs and doesn’t belong, measuring according to a prescribed set of protocols—which has little to do with individual works and much to do with the æsthetic and political prejudices of the movement.  The quality of a given work is less important at this stage than whether it “fits” the parameters set by the movement’s architects.  Taste plays a smaller and smaller role as the movement meets opposition or fails to advance its agenda. With the demotion of taste comes the desiccation of quality.  The evocative ability of art, its facility to communicate things outside the confines of the manifesto-driven movement, eventually becomes a kind of enemy.  We’re into the realm of cookie-cutter art, paint-by-numbers approaches, template-driven work.  Themes are no longer explored but enforced, preferred message becomes inextricable from execution, and the essential worth of art is lost through disregard of anything that might challenge the prejudice of the movement.

This is a self-immolating process.  Such movements burn out from eventual lack of both material and artists, because the winnowing becomes obsessional, and soon no one is doing “pure” work according to the demands of the arbiters of group taste.

As it should be.  Anything worthwhile created during the life of the movement ends up salvaged and repurposed by other artists.  The dross is soon forgotten.  The concerns of these groups become the subject of art history discussions.  The dismissal of particular works because “well, he’s a Marxist” or “she was only an apologist for capitalism”—factors which, if the chief feature of a given work, might very well render it ephemeral, but which in many instances have little to do with content—prompts head-scratching and amusement well after the fury of the controversy around them has faded.

Given this, it may seem only reasonable that an artist have nothing to do with a movement.  The work is what matters, not the fashions surrounding it.  Done well and honestly, it will succeed or fail on its own, or so we assume.

But that depends on those ineffable and impossible-to-codify realities of quality and taste.  Certainly on the part of the artist but also, and critically, on the part of the audience.

Here I enter an area difficult to designate.  The instant one demands a concrete description of what constitutes quality, the very point of the question is lost.  Again, we have heuristics bolstered by example.  Why, for instance, is Moby-Dick now regarded as a work of genius, by some even as the great American novel, when in its day it sold so poorly and its author died in near-complete obscurity?  Have we become smarter, more perceptive? Has our taste changed?  What is it about that novel which caused a later generation than Melville’s contemporaries to so thoroughly rehabilitate and resurrect it?  Conversely, why is someone like Jacqueline Susann virtually unremarked today after having been a huge presence five decades ago?

I have gone on at some length without bringing up many examples, because taste and quality are so difficult to assess.  What one “likes” and what one may regard as “good” are often two different things, as I said before, and have as much to do with our expectations on a given day of the week as with anything deeply considered and well examined. My purpose in raising these questions—and that’s what I’ve been doing—has to do with a current struggle centering on the validity of awards as signs of intrinsic worth.

The best that can be said of awards as guideposts to quality is that if a group of people, presumably in possession of unique perspectives and tastes, can agree upon a given work as worthy of special note, then it is likely a sign that the work so judged possesses what we call Quality.  In other words, it is an excellent, indeed exceptional, example of its form.  I’ve served on a committee for a major award, and over the course of months the conversations among the judges proved educational for all of us, eventually shedding the chaff and leaving a handful of works under consideration that represented what we considered examples of the best that year of the kind of work we sought to award.

I never once found us engaged in a conversation about the politics of the work.  Not once.

Nor did we ever have a discussion about the need to advance the cause of a particular type of work.  Arguments over form were entirely about how the choice of one over another served the work in question.  When we were finished, it never occurred to me that a set of honest judges would engage in either of those topics as a valid metric for determining a “winner.”  No one said, “Well it’s space opera and space opera has gotten too many awards (or not enough)” and no one said, “The socialism in this work is not something I can support (or, conversely, because of the political content the faults of the work should be overlooked for the good of the cause).”  Those kinds of conversations never happened.  It was the work—did the prose support the premise, did the characters feel real, did the plot unfold logically, were we moved by the story of these people.

Consensus emerged.  It was not prescribed.

This is not to say other metrics have no value, but they can be the basis of their own awards.  (The Prometheus Award is candidly given to work of a political viewpoint, libertarianism.  It would be absurd for a group to try to hijack it based on the argument that socialism is underrepresented by it.)  But even then, there is this knotty question of quality.

Here’s the thorny question for advocates of predetermined viewpoints: if an artist does the work honestly, truthfully, it is likely that the confines of manifesto-driven movements will become oppressive and that artist will do work that, eventually, no longer fits within those limits.  To complain that the resulting work is “bad” because it no longer adheres to the expectations of that group is as wrongheaded as declaring a work “good” because it does toe the proper line.

Because that line has nothing to do with quality.  It may go to taste.  It certainly has little to do with truth.

Time and Motion

William Gibson is, if nothing else, a careful writer.  You can feel it in the progress of any one of his novels and in the short stories.  Careful in his choice of topic, placement of characters, deployment of dialogue, style.  He sets each sentence in place with a jeweler’s eye to best effect.  The results often seem spare, even when they are not, and have invited comparisons to noir writers, minimalists, modernists.  Entering upon a Gibson novel is a step across a deceptively simple threshold into a finely-detailed maze that suggests multiple paths but inevitably leads to a conclusion that, in hindsight, was already determined had we but noticed just how sophisticated a writer it is with whom we’re dealing.

His last set of novels, the Bigend Trilogy, was not even science fiction, though it felt like it.  The application of a science-fictional perception of how the world works produced a dazzling bit of dissonance in which the ground itself became familiar through alienation.  He does that, shows us something we should be utterly familiar with as if it were an alien artifact.  As a result, the shock of recognition at the end contains a thick cord of nostalgia and a sense of loss mingled with new discovery.  The chief discovery, of course, is the realization of just how close we are to what we think of as The Future.  Through this effect, he renders the future as both less alien and stranger at the same time.

Which is something he indulges fully in the opening chapters of his new novel, The Peripheral.

Author William Gibson. (by Michael O'Shea)

For a while you don’t know that the two points of view are not in the same world.  It’s a masterpiece of misdirection achieved through the intermediary of a game.

Flynne Fisher’s brother is ex-special-ops military, living in an old Airstream in a town in the middle of a mid-21st-century rural America that is clearly struggling with the unstable economy.  To make extra money, he often moonlights as a beta tester on new games.  The novel opens when he brings Flynne in to sub for him one night while he goes off to confront a radical religious group he hates, known as Luke 4:5.  (The verse reads: Then leading him to a height, the devil showed him in a moment of time all the kingdoms of the world.  Even here, Gibson is playing at metaphors pertinent to the novel in its entirety.)  Flynne used to do this sort of work herself but quit when the games became more and more violent.  He assures her this isn’t like that; she’ll be running a security drone of some kind, keeping paparazzi away from a high-rise luxury apartment.  He’ll pay her well, as he’s being likewise well-paid.  Just one night, maybe two.  She agrees.

The simulation seems to take place in a city she sort of recognizes, which may be London, but it’s all different from the London she knows.  It’s as her brother claimed, flying interference, until the second night, when the woman living there is murdered most horrifically and Flynne is a witness.  Thinking it’s still a game, she wants nothing more to do with it.

Meanwhile, Wilf Netherton, a publicist living in London, is working with a performance artist who has been tasked as a negotiator to a colony of self-modified humans living on an artificial island of reformed debris.  Wilf’s job is to keep her on task, which can be very difficult as she is very much a rebel and can go in unexpected directions without any warning.  As she confronts those with whom she is supposed to negotiate, something goes wrong and she ends up killing the leader.  Another murder.

Netherton’s associate, an operative in government intelligence, must divorce herself from the fiasco and cut ties with Netherton.  He goes to ground with a friend of his, a member of a powerful family of Russian descent, who has a unique hobby—he operates a “stub” in history.

At this point we realize that Flynne and Netherton are not simply divided by class and place but by time itself.  Netherton’s London is 70 years in Flynne’s future and is the London wherein Flynne witnessed the murder of the woman, who turns out to be the sister of the performance artist who just committed a second murder.  For her part, Flynne is in their past, a past Netherton’s friend has been playing with via a form of time travel that is based on the transfer of information.

And we are now fully in the grip of one of the cleverest time travel stories in recent memory.  Nothing physical travels, only information.  Gibson has taken a page from Benford’s classic Timescape and wrought changes upon it.  Flynne and Netherton “meet” once a police inspector of Netherton’s time becomes involved and starts running the stub Netherton’s friend has set up.  She needs a witness to the murder before she can act.  Flynne is that witness.  What follows is a well-imagined set of antagonistic countermeasures that affect both worlds economically.

And that may be one of the most interesting subtexts.  Flynne finds herself the titular head of the American branch of a corporation which till then only existed as a device to explain the game she thought she was beta testing.  As such, she becomes enormously wealthy out of necessity—she is under attack by the forces allied to the murderer in the future.  Politicians and corporations change hands, the economy is distorted, the world is severed from its previous course, and everything is changed.

Gibson is indulging one of his favorite ideas, that information is possibly the most potent force.  Data has consequences.

Flynne is one of Gibson’s best creations since Molly Millions.  Smart, gutsy, practical, and loyal to family and friends, she adapts quickly to the staggering reality into which she and hers have stumbled.  She manages in both time zones admirably but not implausibly.  As counterpart, Netherton is an interesting case study of a man who hates the times in which he lives, is far too intelligent to ignore it, and subsequently suffers a number of self-destructive flaws which he gradually comes to terms with as his interactions with Flynne progress.

At the heart of the novel is a question of causality, certainly, but also one of responsibility.  The pivotal point in history that separates Flynne’s world from Netherton’s is an event euphemistically called The Jackpot.  It’s a joke, of course, and a twisted one at that, as it was only a jackpot for a few who survived and became, ultimately, even wealthier than they had been.  The label refers to a collection of factors leading to the deaths of billions and the loss of an entire era due to humanity’s inability to stop itself from doing all the things that guaranteed such an outcome.  It’s a cynical insight and not a particularly difficult one to achieve, but Gibson, as usual, portrays it with a dry assessment of how it will actually play out and how it will look to those who come after.  His conclusion seems to be, “Well, we really aren’t all in this together.”

The apparent simplicity of the narrative is another mask for the games Gibson plays.  It doesn’t feel like a profound or dense work.  Only afterward, in the assessment phase, do we begin to understand how much he says, how solid are his insights, and how rich are his conceits.  Gibson creates a surface over which the reader may glide easily.  But it’s a transparent surface and when you look down, there, below you, is a chasm of meaning, awaiting inspection, offered in a moment of time.

Future Historicity

History, as a discipline, seems to improve the further away from events one moves. Close up, it’s “current events” rather than “history.”  At some point, the possibility of objective analysis emerges and thoughtful critiques may be written.

John Lukacs, Emeritus Professor of History at Chestnut Hill College, understands this and at the outset of his new study, A Short History of the Twentieth Century, allows for the improbability of what he has attempted:

Our historical knowledge, like nearly every kind of human knowledge, is personal and participatory, since the knower and the known, while not identical, are not and cannot be entirely separate.

He then proceeds to give an overview of the twentieth century as someone—though he never claims this—living a century or more further on might.  He steps back as much as possible and looks at the period under examination—he asserts that the 20th Century ran from 1914 to 1989—as a whole, the way we might now look at, say, the 14th Century or the 12th and so on.  The virtue of our distance from these times is our perspective—the luxury of seeing how disparate elements interacted even as the players on the ground could not see them, how decisions taken in one year affected outcomes thirty, forty, even eighty years down the road.  We can then bring an analysis and understanding of trends, group dynamics, political movements, demographics, all that goes into what we term culture or civilization, to the problem of understanding what happened and why.

Obviously, for those of us living through history, such perspective is rare if not impossible.

Yet Lukacs has done an admirable job.  He shows how the outbreak and subsequent end of World War I set the stage for the collapse of the Soviet Empire in 1989, the two events he chooses as the bookends of the century.  He steps back and looks at the social and political changes as the result of economic factors largely invisible to those living through those times, and at how the ideologies that seemed so very important at every turn were more or less byproducts of larger, less definable components.

It is inevitable that the reader will argue with Lukacs.  His reductions—and expansions—often run counter to what may be cherished beliefs in the right or wrong of this or that.  But that, it seems, is exactly what he intends.  This is not a history chock full of the kind of detail used to defend positions—Left, Right, East, West, etc.—indeed, it is often stingy with detail.  Rather, this is a broad outline with telling opinions and the kind of assertions one might otherwise not question in a history of some century long past.  It is intended, I think, to spur discussion.

We need discussion.  In many ways, we are trapped in the machineries constructed to deal with the problems of the last century, and the machinery keeps grinding even though the problems have changed.  Pulling back from—or even out of—the in situ reactivity seems necessary if we are to stop running in the current Red Queen’s Race.

To be sure, Lukacs makes a few observations to set the back teeth on edge.  For instance, he dismisses the post-World War II women’s consciousness and equality movements as byproducts of purely economic conditions and the mass movement of the middle class to the suburbs.  He has almost nothing good to say about any president of the period but Franklin Roosevelt.

He is, certainly, highly critical of the major policy responses throughout the century, but explains them as the consequence of ignorance, which is probably true enough.  The people at the time simply did not know what they needed to know to do otherwise.

As I say, there is ample here with which to argue.

But it is a good place to start such debates, and it is debate—discussion, interchange, conversation—that seems the ultimate goal of this very well-written assay.  As long as it remains debate, this could be a worthy place to begin.

He provides one very useful definition, which is not unique to Lukacs by any means, yet remains one of those difficult-to-parse distinctions for most people and leads to profound misunderstandings.  He makes clear the difference between nations and states.  They are not the same thing, though they usually overlap.  States, he shows, are artificial constructs with borders, governmental apparatus, policies.  Nations, however, are simply Peoples.  Hence Hitler was able to command the German nation even though he was an Austrian citizen.  Austria, like Germany, was merely a state.  The German People constituted the nation.

Lukacs—valuably—shows the consequences of confusing the two, something which began with Wilson and has tragically rumbled through even to this day.  States rarely impose a national identity; they rely on one already extant—though often largely unrealized.  And when things go wrong between states, quite often it is because one or the other has negotiated national issues with the wrong party.

Which leads to an intriguing speculation—that nativist sympathies really do have a difficult time taking root in this country.  Americans do not, by this definition, comprise a Nation.  A country, a state, a polity, certainly.  But not really a Nation.

And yet we often act as if we were.

Questions.  Discussion.  Dialogue.  This is the utility and virtue of this slim volume.

Greatless Illusion

The third book I read recently which resonated thematically with the previous two is one I have come somewhat late to given my inclinations.  But a new paperback edition was recently released and I considered buying it.  I hesitated as I was uncertain whether anything new or substantively unique was contained therein to make it worth having on my shelf.  I have other books along similar lines and while I am fond of the author, it seemed unlikely this book would offer anything not already covered.

Christopher Hitchens was a journalist and essayist who became one of our best commentators on current events, politics, and related subjects.  Even when I disagreed with him, I found his arguments cogent and insightful and never less than solidly grounded in available fact.

So when he published a book of his views on religion, it seemed a natural addition to my library, yet I missed it when it first came out.  Instead, I read Richard Dawkins’ The God Delusion, which I found useful and well-reasoned, but pretty much a sermon to one who needed no convincing.  Such books are useful for the examples they offer to underpin their arguments.

Such is the case with God Is Not Great: How Religion Poisons Everything.  Hitchens’ extensive travels and his experiences in the face of conflict between opposing groups, often ideologically-driven, promised a surfeit of example and he did not fail to provide amply.

The title is a challenge, a gauntlet thrown at the feet of those with whom Hitchens had sizeable bones to pick.  In the years since its initial publication it has acquired a reputation, developed a set of expectations, and become something of a cause célèbre, sufficient for people to take sides without having read it.  I found myself approaching the book with a set of expectations of my own and, with mild surprise, had those expectations undermined.

Yes, the book is a statement about the nature of religion as an abusive ideology—regardless of denomination, sect, or theological origin—and offers a full range of examples of how conflicts, both between people and peoples, are generally made worse by, or more often than not occur because of, religious infusions into the situation.  It is in many ways a depressing catalog of misuse, misinterpretation, misstatement, misunderstanding, and sometimes misanthropy born out of religious conviction.  Hitchens analyzes the sources of these problems, charts some of the history, and gives us modern-day examples.

But he tempers much of this by drawing a distinction between individuals and ideologies.

He also opens with the statement that, in his opinion, we shall never be rid of religion.  This is quite unlike people like Dawkins, who actually seem to feel humankind can be educated out of any need for religion.  Hitchens understood human nature too well to have any hope that this was possible.

He does allow that possibly religion allows some good people to be better, but he does not believe religion makes anyone not already so inclined good.

By the end of the book, there will likely be two reactions.  One, possibly the more common, will be to dismiss much of his argument as one-sided.  “He overlooks all the good that has been done.”  It is interesting to me that such special pleading only ever gets applied consistently when religion is at issue.  In so much else, one or two missteps and trust is gone, but not so in religion, wherein an arena is offered in which not only mistakes but serious abuse can occur time and time again and yet the driving doctrine is never called into question.  The other reaction will be to embrace the serious critique on offer, even the condemnations, and pay no attention to the quite sincere attempt to examine human nature in the grip of what can only be described as a pathology.

Because while Hitchens was a self-proclaimed atheist, he does take pains to point out that he is not talking about any sort of actual god in this book, only the god at the heart of human-made religions.  For some this may be a distinction without a difference, but for the thoughtful reader it is a telling distinction.  At the end of it all, Hitchens sees all—all—manifestations of gods through the terms of their religions as artifices.  And he wonders then why people continue to inflict upon themselves and each other straitjackets of behavior and ideology that, pushed to one extreme or another, seem always to result in some sort of harm, not only for the people who do not believe a given trope but for the believers themselves.

We are, being story-obsessed, caught in the amber of our narratives.  Per Mr. Thompson’s analysis of myth, we are never free of those stories—even their evocation for the purposes of ridicule brings us fully within them and determines the ground upon which we move.  The intractable differences over unprovable and ultimately unsubstantiated assumptions of religious dictate, per the history chronicled around the life of Roger Williams, have left us upon a field of direst struggle with our fellows, whose lack of belief is often perceived as a direct threat to a salvation we are unwilling ourselves to examine and question as valid, resulting in abuse and death borne out of tortured constructs of love.  Christopher Hitchens put together a bestiary of precedent demonstrating that treating as real the often inarticulate longings to be “right” in the sight of a god we ourselves have invented too often leads to heartache, madness, and butchery.

The sanest religionists, it would seem by this testament, are those with the lightest affiliation, the flimsiest of dedications to doctrine.  They are the ones who can step back when the call to massacre the infidel goes out.

All of which is ultimately problematic due simply to the inexplicable nature of religion’s appeal to so many.

But it is, to my mind, an insincere devotee who will not, in order to fairly assess the thing itself, look at all that has been wrought in the name of a stated belief.  Insincere and ultimately dangerous, especially when what under any other circumstance is completely wrong can be justified by that which is supposed to redeem us.

Monstrous Partiality

In keeping with the previous review, we turn now to a more modern myth, specifically that of our nation’s founding.  More specifically, one component which has from time to time erupted into controversy and distorted the civil landscape by its insistence on truth and right.

But first, a question:  did you know that once upon a time, in Massachusetts, it was illegal to live alone?

There was a law requiring all men and women to abide with families—either their own or others—and that no one, man or woman, was permitted to build a house and inhabit it by themselves.

John M. Barry details this and much more about early America which, to my knowledge, never makes it into history classes, at least not in primary or secondary schools, in his excellent book  Roger Williams and the Creation of the American Soul: Church, State, and the Birth of Liberty.


Discussion of the Founding—and most particularly the Founding Fathers—centers upon the Revolutionary Era collection of savants who shaped what became the United States.  It is sometimes easy to forget that Europeans had been on these shores, attempting settlements, for almost two centuries by then.  It’s as if that period, encapsulated as it is in quaint myths of Puritans, Pocahontas, Squanto, John Smith, and Plymouth Rock, occupies a kind of nontime, a pre-political period of social innocence in which Individuals, whose personalities loom large yet isolated, like Greek Gods, prepared the landscape for our later emergence as a nation.  My own history classes, I recall, did little to connect the English Civil War to the Puritan settlements and even less to connect the major convulsions in English jurisprudence of that period to the evolution of political ideas we tend to take for granted today.  In fact, it seems pains are taken to sever those very connections, as if to say that once here, on North American soil, what happened in Europe was inconsequential to our national mythos.

That illusion is shattered by Barry in this biography of not only one of the most overlooked and misunderstood Founders but of that entire morass of religious and political struggle which resulted in the beginnings of our modern understanding of the wall of separation between church and state.  More, he makes it viscerally real why that wall not only came into being but had to be.

If you learned about Roger Williams at all in high school, probably the extent of it was “Roger Williams was a Puritan who established the colony that became Rhode Island.  He contributed to the discussion over individual liberty.”  Or something like that.  While true, it grossly undervalues what Williams actually did and how important he was to everything that followed.

In a way, it’s understandable why this is the case.  Williams occupies a time in our history that is both chaotic and morally ambiguous.  We like to think differently of those who settled here than they actually were, and any deeper examination of that period threatens to open a fractal abyss of soul searching that might cast a shadow over the period we prefer to exalt.

But the seeds of Williams’ contribution were sown in the intellectual soil which to this day has produced a troubling crop of discontent between two different conceptions of what America is.

The Puritans (whom we often refer to as The Pilgrims) were religious malcontents who opposed the English church.  They had good reason to do so.  King James I (1566 – 1625) and then his son, Charles I (1600 – 1649), remade the Church of England into a political institution of unprecedented intrusive power, establishing it as the sole legitimate church in England and gradually driving out, delegitimizing, and anathematizing any and all deviant sects—including and often most especially the Puritans.  Loyalty oaths included mandatory attendance at Anglican services and the adoption of the Book of Common Prayer.  The reason this was such a big deal at the time was because England had become a Protestant nation under Queen Elizabeth I and everything James and Charles were doing smacked of Catholicism (or Romishness), which the majority of common folk had rejected, and not without cause.  The history of the religious whipsaw England endured in these years is a blood-soaked one.  How people prayed, whether or not they could read the Bible themselves, and their private affiliations to their religious conceptions became the stuff of vicious street politics and uglier national power plays.

So when we hear that the Pilgrims came to America in order to worship as they saw fit, we sympathize.  Naturally, we feel, everyone should be allowed to worship in their own way.  We have internalized the idea of private worship and the liberty of conscience—an idea that had no currency among the Puritans.

The Puritans were no more tolerant than the high church bishops enforcing Anglican conformity in England.  They thought—they believed—their view of christian worship was right and they had come to the New World to build their version of perfection.  A survey of the laws and practices of those early colonies gives us a picture of ideological gulags where deviation was treated as a dire threat, a disease, which sometimes required the amputation of the infected individual: banishment.

Hence the law forbidding anyone from living alone.  It was thought that in isolation, apart from people who could keep watch over you and each other, the mind’s natural proclivity to question would create nonconformity.

Conformity is sometimes a dirty word today.  We pursue it but we reserve the right to distance ourselves from what we perceive as intrusiveness in the name of conformity.  Among the Puritans, conformity was essential to bring closer the day of Jesus’ return.  Everyone had to be on the same page for that to occur.

(Which gave them a lot of work to do.  Not only did they have to establish absolute conformism among themselves, but they would at some point have to go back to England and overthrow the established—i.e. the King’s—order and convert their fellow Britons, and then invade the Continent and overthrow Catholicism, and all the while they had to go out into the wilderness of North America and convert all the Indians…but first things first, they needs must become One People within their own community—something they were finding increasingly difficult to do.)

Into this environment came Roger Williams and his family.  Williams was a Puritan.  But he also had a background as apprentice to one of the most formidable jurists in English history, Sir Edward Coke, the man who ultimately curtailed the power of the king and established the primacy of Parliament.  Coke was no Puritan—it’s a question whether he was anything in terms of religious affiliation beyond a christian—but he was one of the sharpest minds and most consistent political theorists of his day.  He brought Williams into the fray, where the boy saw first-hand how power actually worked.  He saw kings behave pettily, injustices imposed out of avarice, vice, and vengeance in the name of nobly-stated principles.  And, most importantly, he saw how the church was corrupted by direct involvement in state matters.

This is a crucial point of difference between Williams and later thinkers on this issue.  Williams was a devout christian.  What he objected to was the way politics poisoned the purity that was possible in religious observance.  He wanted a wall of separation in order to keep the state out of the church, not the other way around.  But eventually he came to see that the two, mingled for any reason, were ultimately destructive to each other.

Williams was an up-and-coming mover among the Puritans, but the situation for him and many others became untenable and he decamped to America in 1631, where he was warmly received by the governor of Massachusetts, John Winthrop.  In fact, he was eagerly expected by the whole established Puritan community—his reputation was that great—and was immediately offered a post.

Which he turned down.

Already he was thinking hard about what he had witnessed and learned and soon enough he came into conflict with the Puritan regime over matters of personal conscience.

What he codified eloquently was his observation that the worst abuse of religiously-informed politics (or politically motivated religion) was the inability of people to be objective.  A “monstrous partiality” inevitably emerged to distort reason in the name of sectarian partisanship, and this was destructive to communities, to conscience, to liberty.

For their part, the Puritans heard this as a trumpet call to anarchy.

The Massachusetts Puritans came very close to killing Williams.  He was forced to flee his home in the midst of a snowstorm while still recovering from a serious illness.  He was succored by the Indian friends he had made, primarily because he was one of the very few Europeans who had bothered to learn their language.  They gave him land, which eventually became Providence Plantation, and he attracted misfits from all over.  Naturally, Massachusetts saw this as a danger to their entire program.  If there was a place where nonconformity could flourish, what then became of their City on the Hill and the advent toward which they most fervently worked?

The next several years saw Williams travel back and forth across the Atlantic to secure the charter for his colony.  He knew Cromwell and the others and wrote his most famous book, The Bloody Tenent of Persecution, for Cause of Conscience, in 1644, right before returning to America to shepherd his new colony.  In this book, for the first time, the argument for a firm wall of separation is clearly stated.  It is the cornerstone upon which the later generation of Founders built and upon which today rests the history of religious freedom we take as a natural right.

But the struggle was anything but civil and the abuses to which Williams responded in his call for a “Liberty of conscience” are not the general picture we have of the quaint Pilgrims.

Barry sets this history out in vivid prose, backed by extensively sourced research, and grounds the story in terms we can easily understand as applicable to our current dilemma.  One may wonder why Williams is not more widely known, why his contributions are obscured in the shadow of what came later.  Rhode Island was the first colony with a constitution that did not mention god, and it existed for over fifty years before a church was built in Providence.

Williams himself was not a tolerant man.  He loathed Baptists and positively hated Quakers.  But he valued his principles more.  Perhaps he saw in his own intolerance the very reason for adoption of what then was not merely radical but revolutionary.

Great Blunders, Great Wars

High school history provides us with the basics of World War I and does so by making it appear that something akin to an earthquake happened.  Archduke Ferdinand of the Austro-Hungarian Empire is assassinated in Sarajevo, and a month later Germany invades France, triggering a catastrophic series of treaty-obligated interventions by Russia, England, and so forth.  Simple.

Except, what?  Why would Germany do that when the heir to a throne not theirs was shot by a lone assassin in a city within Austria-Hungary’s own territory?

The connective tissue was always missing.  Something (mumble mutter) to do with Serbia and Austria blaming them for the murder (by an independent terrorist!) and Russia insisting Austria leave Serbia alone, Germany insisting Russia leave Austria alone, France insisting Germany leave Russia alone, and England insisting everyone leave Belgium alone (Belgium? How did Belgium get into this…?), and suddenly you have the international equivalent of a schoolyard pile-on.

Many books have been written attempting to explain the complicated set of relations between the so-called Great Powers and how they all triggered each others’ worst responses in what amounted to a game of chicken.  But that high school myth persists, that WWI happened almost out of the blue.

Sean McMeekin has produced a worthy examination of the month between the fateful assassination and the opening of hostilities on August 4th, 1914.  In July 1914:  Countdown To War he takes pains to show how all this transpired.  It happened quickly, to be sure, as international interactions go, but it was neither unexpected nor inevitable.  The major element, besides considerable attention to a chronology which he lays out with admirable clarity, is what so often is left out of history courses—personality.

McMeekin’s portraits of the players—Kaiser Wilhelm II, his chancellor, Bethmann, the Austrian foreign minister Berchtold, army chief of staff Conrad, Russia’s Sazonov, Tsar Nicholas II, Foreign Minister Sir Edward Grey of Britain, and all the rest—open the curtains on how the fatal mix of personalities led to the catastrophe that reshaped Europe so much that in many ways we are still sorting through the rubble.

It begins with the ongoing hatred among the hawks in Austria toward Serbia.  Start with that and the long history behind it, and we begin to see that nothing was really a surprise other than the fact that it actually happened.  The first blunder was the connivance of the Austrians to obtain German backing for a punitive action against Serbia for sponsoring the assassination of the archduke—an archduke, by the way, who was unpopular in his own family and whose loss as a successor to the throne was something of a relief to the Emperor.  Given that, the next series of events—diplomatic wrangling, lying, obfuscation, and, above all, haste—makes sense.  Insane sense, but sense nevertheless.

And because McMeekin is dealing handily with the personalities of all these people, questions of reason, caution, experience, and the deliberative conservatism one might expect from old established states become moot as we watch them all jockeying for position to prove points, gain support, establish—or in the case of Austria, re-establish—reputations.

Reading this, one is put in mind of the rush to war in Iraq in 2003, under conditions wherein insufficient information, curtailment of debate, and a drive to act overrode all other considerations.  Hindsight is frustrating.

McMeekin’s concluding chapter, wherein he discusses responsibility and offers a variety of arguments over inevitabilities, is more than just a summation.  Rather it is a sobering analysis of the fragility of circumstance and the importance of character, which so many of us would like to pretend doesn’t matter.

A Country Of Distant Voices

In the opening scene of his new novel, And the Mountains Echoed,  Khaled Hosseini shows an Afghan father telling his children a story. The story is about life’s fragility in the face of an unpredictable and unnegotiable universe, the loss of children, and the tenacity of memory. In the story, the father is offered a choice—having a lost child returned to him to live a life he knows will never be more than difficult, often harsh, or leaving the child in the relative paradise to which it had been spirited while the father is granted the gift of forgetfulness, so he might return home with no memory of loss. The father chooses the latter.

But all around him, when he returns, memory remains, part of the landscape, continually troubling his life with fleeting moments of doubt about something he cannot name.

He has left his memories far away, in the mountains.  But the mountains are always there.

So, it turns out, are the memories, recognized or not. The real mountains in the novel are the tectonic accumulations of intersecting lives, which in some ways seem to have no real point to their connections, but over time—generations, really—build into massively instantiating forms, repositories of meaning.  Some of these characters climb over them, others live at their roots, still others move away from them, trying to lessen their dominance. But every word they speak echoes back laden with the textures of their beginnings.

The novel begins with the story of Abdullah and Pari, brother and sister who share a deep bond.  Pari collects bird feathers in a tin box, feathers Abdullah helps gather for her, and their playground is the village of Shadbagh.  Life is crushingly hard for their parents.  The father is a laborer.  His first wife, Abdullah’s and Pari’s mother, died giving birth to Pari.  His second wife has given him another son, Iqbal.  Her brother, Nabi, lives in Kabul, the personal servant to a man of wealth who is married to a woman more at home in Parisian society than in her native Afghanistan.  The necessities and desires of these people bring them into association with each other in the most unexpected way, resulting in the separation of Abdullah and Pari.

Thus the series of separations which are the echoes of the novel.

Pari is taken into her new home, much too young for the memories of her time spent with Abdullah to be retained in other than a lifelong sense of hollowness.  The woman who becomes her mother is a poet, herself severed from the connections to home and family that might supply a sense of welcome in the world through which she moves.  Talented, beautiful, she is nevertheless a refugee even in her own country.  When her husband suffers a stroke, she takes the opportunity to flee, back to Paris.  With Pari, who over time forgets almost everything and is left with a persistent feeling of separation she cannot quite explain or ignore.

The trajectories all these people follow seem at a glance to have little to do with each other, even though certain events lie at the start of their paths.  Their lives settle into orbits that are tethered by those events, and no matter how far they go or where they settle in, a constellation forms of which each of them represents the rough boundary of a country that, while it seems to have no place on any map, claims them as native.  The echoes from that initiating event form the borders.

Which makes And the Mountains Echoed an exploration of that country, through the eyes of its unwitting inhabitants, all of whom, regardless of their point of origin, are native to a specific topography, bound by common experiences—of loss, abandonment, and escape.  Hosseini takes us on an expedition through a place whose only maps are in the psyches of its residents.  Along the way he works a variation on the old aphorism “You can never go home” by showing that, in profound ways, we never leave it.

On another level, there is a very real country at the center of these explorations. Hosseini is writing, as always, about Afghanistan—its wonders, its tragedies, its costs, and its possibilities.  It is, he seems to tell us, a land of incredible potential, but to date the only possibility to realize it is for those with the talents and will to leave it, go where their particular gifts—themselves—can manifest, beyond the overwhelming gravity of a past that too often has no history of a future, no memory of what could be different that is not bound up in forgetting.

Like the story Abdullah’s and Pari’s father tells at the start.

At least one of his characters recognizes the innate conflict:

It saddens me because of what it reveals to me about Mama’s own neediness, her own anxiety, her fear of loneliness, her dread of being stranded, abandoned.  And what does it say about me that I know this about my mother, that I know precisely what she needs and yet how deliberately and unswervingly I have denied her, taking care to keep an ocean, a continent—or, preferably, both—between us for the better part of three decades?

Hosseini writes with an unflinching clarity of what Afghanistan is, tempered by hope for what it could be.  It is not that this potential Afghanistan does not exist—it does, he shows us, just not there.  It exists in the imaginations of those who have left and are nevertheless citizens of the country of their heart. That country is nascent in the echoes that will some day return from their journey.

Primary Influences

Reading and writing are inextricably linked, but it’s a lopsided relationship.  One can be a voracious reader without ever feeling the need to write, but being a writer by necessity demands voracious reading.  There are some who seem to believe they can write without having to read extensively (or at all!), but I imagine this is a self-correcting delusion.  It may be a more obvious problem in this age of self-publishing ease, when one’s shortcomings can make unfortunate and sometimes widespread public displays, but the simple absence of any kind of artistic æsthetic on which to base the work is fatal to the endeavor.

Besides, what would be the point other than a profound narcissism?  Part of the fantasy of “being a writer” is to join a fraternity whose past membership has provided the delight you hope to offer, a delight you have presumably found in reading.

I imagine that for some writers, the desire grows gradually, a cumulative response emerging after many books.  Specific texts are less important than the experience itself.  For others, there’s a turning point, a moment when the reading experience in a given work sparks the “I want to do this!” response that grows, if nurtured, into a lifelong obsession.

I can pinpoint my own turning point.

foundation covers

Isaac Asimov’s Foundation and Empire was the book that decided me.  I bought it at the corner drug store in 1968.  Mr. Leukens had a spinner rack from which I’d been obtaining paperbacks for almost a year by then.  I can say quite honestly and without embarrassment that it was the cover that caught my attention.  That Don Punchatz rendering radiated “significance” in a way other covers failed to achieve.

I’d been reading science fiction in one form or another for as long as I could remember.  Comic books, mostly, but once I’d obtained my library card, the occasional SF novel came home with me.  A lot of them seemed…well, stodgy compared to the movies.  I admit to being disappointed with science fiction that was set in more or less the present day.  I was a kid, after all; I was after the gosh wow! more than the cerebral pleasures that are the chief attribute of the form, at least in those days.  I wanted Forbidden Planet and John Carter, not stuff stuck on Earth.

Asimov I knew from another novel, Pebble In The Sky, which I had read earlier in the year.  I still wasn’t connecting authors with preferred experiences, at least not as a guide to find more of the same.  Partly this was because I had no reliable way of getting more by a given author.  Leukens Pharmacy was my primary source and the fact is he had no control over what ended up in that spinner rack.  It was hit or miss.

That month, the only one of the trilogy available was the second volume.  (I didn’t even know what “trilogies” were yet.)

Gradually, I came to regard Avon as the imprint that provided me with the kind of material I most wanted.  Along with the Foundation books, I got a lot of Robert Silverberg, B.N. Ball, James Blish, and later they published the Science Fiction Hall of Fame collections.  Their books had a particular “feel” and quality that seemed lacking in (or at least different from) other imprints.  (So in a peculiar way I was initially more aware of publishers and editors than authors.)

Asimov sold the first Foundation story to John W. Campbell in 1941 and went on to write all the stories that comprised these three books by the early 1950s.  I read them out of order.  The middle book first, then the first one, finally, after months of searching, the last one.  The covers above are from a slightly later edition, but basically the same ones I eagerly sought and devoured.

They were everything, at the time, that I wanted from science fiction.

But what was that?

I was 13, almost 14.  My reading had been chaotic though wide and I had a smattering of history (not nearly enough to form any cogent opinions of events) and I had the sense that a lot of fiction, especially in the movies, was disconnected from all that went before whenever the events of the story took place.  Right off the bat, Asimov offered a simple, elegant way to imply a concrete history through the epigraphs of his fictional Encyclopedia Galactica (an obvious but nevertheless effective play on the Encyclopædia Britannica and Americana).  That “scholarship” existed on which the chronicler of these wholly fictional and fantastic events could draw provided a basis of “authenticity” that completely sucked this reader in.

What followed was a self-consciously analytical treatment of the way history might work.  The premise is Cartesian—if one knows enough about enough, then one can make reliable predictions.  The sheer control offered by Seldon was profoundly seductive.

And then, of course, there was the Empire, spanning the entire galaxy, thousands of worlds, a massive civilization bound together by hyperdrive and the Imperial center on Trantor.  Trantor itself was such a startling idea, an entire planet completely covered by a single city.

Gaal Dornick’s arrival on Trantor, on later reflection, was the arrival of any young man from a more rural part of America to New York via Grand Central Station, and the awe of such a massive construct.  (Samuel R. Delany rather elegantly recapitulated this in the opening scenes of his Atlantis: Three Tales with the actual New York.)  In a way, Dornick’s reaction is very like the reaction of a new reader who suddenly “gets” it.

Considerations of cost and the unlikelihood of achieving any fraction of the kind of homogeneity, political or otherwise, never entered into it.  Asimov had loosely based his Galactic Empire on the Roman Empire and that itself was a highly improbable collection of provinces under a single banner.  If you could accept the one (which had actually existed) you could accept the other, especially since as the story opens the Empire is beginning to crumble.  By this device, Asimov acknowledged the latent impossibility of a “galactic empire” by letting us watch its demise from sheer social and political entropy.

New things are born from the ruins of the old, and the rest of the series is about these new things.  What I found so appealing was the inherent historicity of the Foundation stories.

Of course, the idea of mathematically predicting future events with the kind of precision suggested in these stories is fantastic at best.  The notion behind it is not entirely fanciful; there is something to the dynamics of large groups in motion that lends itself to patterning.  Asimov simply worked a variation on actuarial math and raised it to dizzying heights.  It is a criticism of which he was well aware, one I already agreed with since I’d begun with the middle volume—the one in which The Mule appears to completely overturn everything Seldon had constructed.  The fey element, the unpredictable, the unaccountable.  Asimov subverted his own premise.

But that opened the narrative up to a more sinister thread, one which has also been geared into history: the secret society, the hidden group which from time to time people believe to be the real rulers.  In this, Asimov was still playing with the plausibilities of accepted historical narrative.

It was easy then to accept that Asimov was writing about the collapse of the Roman Empire—and the perfectly agreeable desire to shorten the inevitable “dark age” following the fall of such a huge and apparently monolithic construct.  But as one grows older and continues the kind of necessarily broad and voracious reading essential to being a writer of any worth, such simple comparisons erode.  The falls of empires probably always follow certain patterns, but in the details they differ.  I now suspect Asimov, if he was being intentional in his subtexts at all, was writing about the vanity of empire rather than of any particular one, and the costs of such things to those who become dependent.  Asimov was a refugee, born in Russia.  Perhaps too young to remember anything of his early childhood there, no doubt he heard the stories, and of course there was World War One, the first death blow of a European Order that went back a millennium at least.  By the time Hitler was trying to establish a new Roman Empire (at least in terms of territory if not intent), it was obvious that the old regimes were done for, and the future was about to be in the hands of the bureaucrats, apparatchiks, and opportunists in a way never before seen.  In such a world, the idea of preservation itself might be seen as the only worthwhile enterprise—the preservation of knowledge, which would make Seldon’s Encyclopedists the first moral actors in a post-Imperial age.

I think Asimov was writing about the world he lived in rather than either the Roman Empire (or Republic) or the Galactic Empire.  Naturally, insofar as science fiction is always really about the present, viewed through the distorting lens of a future tense.  But more than that, because he was establishing priorities.  Empires rise and fall—the Foundation itself becomes an empire (much as America did after WWII, if not in fact at least in influence) and all empires become pieces on a larger chess board in a game played by those behind the scenes—but what matters is the continuity of knowledge and access to it for all those people who must survive the changes in political fashion.

I couldn’t possibly have recognized all this when I first read these books.  Some of my peers, and certainly many of the adults around me then, dismissed them as they did all SF as “mere” entertainment, idle speculation, and, at worst, a waste of time.  But for me, what may or may not have been latent in the text was sufficiently present to inspire.  The seriousness with which Asimov approached his subject was very different in tone and effect from, say, Doc Smith.  Insofar as I have ever been scholarly, the Foundation series spoke to me on that level, and triggered the response that led me to start writing my own stories.

It’s telling that in Asimov’s autobiography, In Memory Yet Green, he claims that he had no idea what he intended to do after writing and selling that first Foundation story.  But he had put a hook at the end of it which demanded a second story, thinking himself clever that he had in some way trapped Campbell into having to buy the sequel in order to answer the question, without quite realizing that he then had to deliver.  He goes on to claim that he never could work from an outline, not then and not later.  Maybe not on paper, but there was an outline in his head somewhere that provided a reliable template.

Of all the SF I read back then, I find few I can reread with any pleasure.  This is one of them.  It still enthralls me.  I can still see the vast deeps between the stars and the terrible force of history unfolding and enfolding across time the matrices in which we nevertheless decide for ourselves what we want and struggle to accomplish.

That, at least, is my story.

Culture’s End (The Ends of Culture)

Once in a while, work comes along that, while not doing anything apparently new, turns a settled form inside out and frees possibilities.   In writing, this generally means that, in the wake of such work, the things it is possible to say and the ways in which they are said broaden.  Branchings occur, reactions, new growth, inspiration ripples along.

Iain M. Banks triggered—at least for me—a renewal of an old science fiction mainstay, the Space Opera.  Practically from the beginning of the modern form in the 1920s, interstellar adventures have been woven into the DNA of the genre, replete with strange planets, exotic aliens, and occasional examinations of political systems, albeit not on a very sophisticated level.  Everything from the Roman Empire to a kind of United Nations model has been used, sometimes to unintentionally silly effect.  Given the suppositions on hand, it is not a small task to plausibly imagine such a universe.  Some of the best works have ignored the details, lest unwanted hilarity result, suspension of disbelief sabotaged by, of all things, the wallpaper.

Space Opera lost some of its cachet in the Seventies in the wake of Star Trek, which combined much of the long history of the form in a single popular television show, and made it difficult to write anything that didn’t look like Star Trek.  In written SF, Space Opera receded in prominence.  Then in the early Eighties, with Neuromancer by William Gibson, Cyberpunk muscled its way into prominence and one of those moments of expansion occurred.  For the next two decades, it seemed, reaction to Cyberpunk dominated the field.

But in 1987 a novel was published in England (a year later in America) that signaled the coming resurgence of good ol’ fashioned Space Opera.

Consider Phlebas was a thick, densely-detailed, elegantly-penned adventure that seemed to have come from the mind of a literary writer who had no real idea there had ever been such a thing as Space Opera.  But that was impossible, since it handled the conventions of the form with such grace and sympathy as to suggest a lifelong devotee.  Iain Banks simply didn’t write from a traditional æsthetic, even when it seemed he did.

One of the most interesting choices he made in the novel was putting his major invention—the Culture—in both a background position and as an antagonist.  One might be forgiven if, from reading just this book, one thought the Culture was a throw-away idea, never to appear again.  Because the other civilizations depicted, several of which are at war, are so vividly and thoroughly imagined that any one or five of them might have served as the solid foundation for a series of breathtaking novels.

To be clear, what the Culture subsequently became, in novel after novel (and a handful of short stories) was not a hero’s preserve.  The Culture seems often like the Good Guy, but just as often they are a meddlesome, arrogant, dangerous collection of diplomatic bullies.  What Banks constructed with the Culture is a kind of Swiss Army Knife of an interstellar empire.  It is what it needs to be in any given circumstance.  And like any real government, expedience is its chief operating mode.

But.  And this is a large exception.  Because the Culture actually has no material needs—it is what we’ve come to term a “post scarcity civilization”—its political motivations are a bit more abstract.  The Culture has a moral compass, one which it seems to ignore as often as it follows, and has, in complete contradiction to the famous and also often ignored Prime Directive of Star Trek, no compunction about interfering with another civilization at all.  In this way, Banks created the perfect sociopolitical tool to examine what might be termed Moral Expedience.

Rather than confirm the essential uselessness of Space Opera, Banks made it relevant by making cases for right action within a vast and complicated set of interlocking political, social, and ethical systems.  Philosophy 101, in many cases, but deftly handled and often pointedly specific in its potential relevancies.

By further expanding the players to include wholly autonomous machine intelligences—ships that owned themselves and acted according to their own interests, AI advisers, habitats both awake and involved—he opened the dialogue on the question of rights as a, if you’ll forgive the seeming contradiction, concrete abstraction.

If one of the primary attractions of science fiction is the examination of the question “How, then, shall we live?” then one could do much worse than Iain M. Banks as a complete buffet of fascinating riffs, postulates, improvisations, and dialogues on exactly that question—which, at its heart, becomes the question of what shall be done with virtually unlimited power.

All this would imply a dry, discursive study, plodding expositions, info-dumps that slow the action (what there may be) to a near halt.  That would be a mistake.  Banks’ skill has been to lay all this depth and contemplative meat, bone, and gristle into exceptional adventures with high stakes and finely-drawn characters.  Everything in a Banks novel is profoundly personal.

Space Opera has enjoyed a comeback since that first Culture novel came out.  Banks is now one of many well-respected practitioners of the form.  It may be that the field was ready to revisit it anyway.  But without Banks, it may be wondered how satisfying such a visit might have been.

As we shall be wondering when there are no more Culture novels.

Iain M. Banks has announced his last novel (not a Culture novel) because he has terminal cancer.  The 59-year-old writer of eleven Culture books and sixteen other novels says he has perhaps a year to live and his new novel, as yet unreleased, will be his last.

An appreciation of Banks’ Culture stories is only the half of it.  He has enjoyed the enviable ability to write so-called “mainstream” works under “Iain Banks” all along.  His first novel, The Wasp Factory, was an experimental work that bordered on SF, reminiscent of both J.G. Ballard and Philip K. Dick.  He has written thrillers, literary novels, satires.  Since 1984 his work has made a significant impression in the U.K. and has gained a large following in the United States.

He is only 59.  If there is any justice, he will be long remembered as a pivotal voice in Western Letters.  Treat yourself.  Go read one of his novels.  Then read another.  Repeat.