Easy Habits and the Consequences of Belief

At first glance, the two books could not be more different. Subject, tone, everything seems different. Not at odds so much as…nonoverlapping.

Which is ironic, since both deal, in their separate ways, with that very idea, the separation of areas of knowing.

It was probably because I read them so close together that I recognized their shared concerns as clearly as I did.  Whatever the reason, the connection struck me as obvious in so many ways that I began to recall all the other books from the last few years that could likewise be gathered within this same subset, all on distinct topics and yet all based, to some degree, on an analysis of the same human propensity to disregard evidence when it contradicts belief.

Let me begin with the more general of the two.

Jerry A. Coyne is an evolutionary biologist.  Compared to others with public profiles, he has published few books.  Three, to be precise.  His latest, Faith Versus Fact: Why Science and Religion Are Incompatible, is a direct challenge to Stephen Jay Gould’s idea of “nonoverlapping magisteria.”  Gould’s premise is that science and religion should not conflict with each other because they are concerned with entirely separate realms of knowing—hence the nonoverlapping part—and except for certain agenda-driven partisans, there is no reason for them to be in conflict.  Coyne sees this as accommodationism, which he thoroughly discredits in his book.

My claim is this: science and religion are incompatible because they have different methods for getting knowledge about reality, have different ways of assessing the reliability of that knowledge, and, in the end, arrive at conflicting conclusions about the universe.  “Knowledge” acquired by religion is at odds not only with scientific knowledge, but also with knowledge professed by other religions.  In the end, religion’s methods, unlike those of science, are useless for understanding reality.

Coyne identifies accommodationism as an attempt not to stir the hornet’s nest, because scientists are often dependent on politically sensitive funding.  Science, especially Big Science dealing with questions of origins, is expensive, and the days of the independently wealthy scientist are largely gone.  Rocking the boat by annoying those who hold the purse strings would seem ill-advised.

The conclusions of scientific inquiry, he goes on to argue, continually poke holes in those claims by religion that still assert authority over secular matters.

But more than that, such deferral to authority erodes our critical capacity and can lead to false conclusions and bad judgments, all because we accept the hegemony of “faith.”

…a word defined in the New Testament as “the substance of things hoped for, the evidence of things not seen.”  The philosopher Walter Kaufmann characterized it as “intense, usually confident, belief that is not based on evidence sufficient to command assent from every reasonable person.”

The sticking point for many would be that “reasonable person” proviso, for surely, as Coyne concedes—often—there are many reasonable people who nevertheless espouse a religious faith.  What are we to make of that?  Is the basis for the assessment of “reasonable” perhaps too ill-defined or is the scope of “belief” too broad?

Coyne takes us through the process, giving a thorough explication of why science and religion are incompatible, if not a convincing argument for the downside of belief for belief’s sake.  He tells an anecdote early in the book about the evolution course he teaches undergraduates each year: the majority do very well at it, but admit, after earning their A, to not believing a word of what he taught.  Because it contradicts their religious belief.

It is this that Coyne sees as the dangerous aspect of faith as promulgated through religion, the a priori rejection of evidence in favor of a set of usually unexamined beliefs.  He takes us through a catalogue of negative manifestations, from the rejection of medicines to the acts of terrorists to the rejection of solid science (like climate change), that have their underlying justifications in religion.

Coyne, an admitted atheist, puts all this forward while taking pains to admit the personal comfort to be found in religion by many people.  From a certain point of view he tends to go the extra kilometer to be fair.  Of course, going through the comments on various review sites, those predisposed to reject such arguments accuse him of profound bias if not outright malicious intent.  One cannot help but wonder if they bothered to read the book, all or even in part.

The book makes its case clearly and concisely.  It avoids the polemic outrage to be found in other tomes by big name atheists by sticking largely to evidentiary concerns and philosophical arguments.

But, one may ask, so what?  Religion and science are two realms in which most people would assume they have no stake. Castles in the air, esoteric arguments about things that have no impact on our daily lives.  Most people seem to keep a religion much the same way they have a pet and certainly in the West the majority live secular lives and only rarely feel compelled to apply their religious convictions to anything.  As for science, as long as the technology we depend on works, all the rest is so much theoretical handwaving.  It makes no difference if we have almost no understanding of quantum mechanics and certainly evolution is just tenure-building nonsense having to do with million-year-old bones and what kind of textbooks the school district might use next year.  Nothing to do with us in our daily lives. So what if people rely more on faith and belief in making their daily judgments than on evidence-based science?  We operate more on heuristics defended by aphorism than by reason and applied understanding, or so Daniel Kahneman tells us in his excellent study, Thinking, Fast and Slow, and we by and large get along perfectly well that way.

How does this argument concern me?

Johann Hari’s new book, Chasing The Scream, has an answer to that.  Of sorts, if we but make the epistemological leap.

Hari is writing about the drug war. It is, for him, as much a personal examination as a journalistic one, as he admits to having family and friends who are addicts.  He begins with a simple question.

I scribbled down some questions that had puzzled me for years.  Why did the drug war start, and why does it continue?  Why can some people use drugs without any problems, while others can’t?  What really causes addiction?  What happens if you choose a radically different policy?

Okay, four questions.  Still, simple, basic questions, to which any reasonable person might reasonably expect a reasonable answer.

Yet we spend billions, bully small countries, and destroy thousands if not millions of lives, all in pursuit of policies which rest on an appalling lack of informed justification.  By the end of the book you come to see that there are no answers to those simple questions which in any way validate our continuing on with things as they are.  As they have been for a hundred years.

Hari goes back to the beginning of the war, before drugs were illegal, and takes us through the history.  Back to Harry Anslinger, the head of the Federal Bureau of Narcotics, who fueled the drug war for the twin purposes of maintaining his agency and exorcizing demons that had possessed him since childhood, even in the face of substantive research and sound arguments denying that his approach to the problem had any merit, and more than ample evidence—Prohibition—that it would lead to catastrophe, not only on a national but on a global level.  Hari details Anslinger’s battle to destroy Billie Holiday and his use of intimidation and police tactics and, subsequently, U.S. foreign policy to assure the continued crusade to eradicate drugs and destroy addicts.

Not because of any evidence that led Anslinger to think this was the best, much less the only, way, but because he believed it in spite of growing mountains of evidence that it was a wrongheaded approach.  He suppressed evidence, hounded physicians who dared present alternative models of the drug problem, intimidated politicians—except those who could secure him funding—and strenuously denied the validity of any evidence that contradicted his belief.

He believed.

As many did and still do.  It is this that trumps hard evidence.

Even as a young adolescent I thought the argument for our drug policies was lacking.  I thought at the time that I just didn’t understand.  Never having been in the least interested in drugs and knowing few if any who were involved with anything more serious than marijuana, it seemed not to concern me.  Later, I did a short stint as a volunteer drug counselor, but the work was far more than I could handle at the time.  I trusted the people in charge knew what they were doing and certainly the gang violence associated with drugs seemed to make a persuasive case that this was a worthwhile and often desperate fight.

But as the years went by and the war continued, I began to notice bits of research here and there and how certain politicians contradicted themselves and how the prison population, especially in the wake of Nixon’s near militarization of the police community and the drug war ethos, was growing and in very worrisome ways.  I began to seriously rethink my position with Reagan’s zero tolerance policies and the mandatory sentencing guidelines he established through Ed Meese, one of the notable Puritans of the modern age.  Even so, I had the nagging suspicion that maybe I was just missing something.  Certainly I didn’t approve of drug addiction, but I more and more came to believe that these people needed help, not punishment.  We understand that process very well with alcohol, why is it different with narcotics?

And besides, there are plenty of people who receive perfectly legal prescription narcotics and never become addicts.

The number of holes in the picture kept growing.  I no longer trusted stated drug policy, but I didn’t understand the intransigence of people over reform.

Hari’s book lays it out very clearly.  Money is high on the list.  We fund too many of the wrong people at too high a level for them to be easily weaned from the teat.  Foreign policy is also tied to this, especially in these days of international terrorism which has a strong drug component.  But the factor that ties this in to Jerry A. Coyne’s book is the one Hari covers only glancingly.

Belief.

It is easier to rely on what we have always believed than to look at evidence that requires us to change our mind.

Many aspects of our lives are covered by this observation, but the problems arise in those areas with political and social ramifications.  We cling to persistent beliefs about the poor, about minorities, about people with different beliefs, even when evidence is provided which significantly challenges such beliefs and suggests strongly that not only are they wrong but that we would be better off discarding them.

Hence my conflation of these two books and the suggestion that they share a common idea.

Not that I argue that all beliefs are wrong.  What is wrong is the intractable nature of unquestioned belief.  The only reason the drug war continues is that it has considerable popular support and the only reason it has that is that many people cannot bring themselves to change their minds about it in the face of not only evidence but what we euphemistically call common sense.

But that can be said of so many things which directly and indirectly impact our lives.

Perhaps it is a stretch and perhaps I argue out of my own biases, but it seems to me the most valuable tool we can have in our intellectual toolbox is the ability to say “well, I believe that, but I might be wrong.”  Faith cannot maintain itself, however, among people who are able to say that and apply it to anything and everything.  Science is built on exactly that principle—all knowledge is conditional—but belief, as exemplified by religion, thrives on the absence of that principle.  It says some knowledge is absolute, not to be questioned.

In the contemplation of matters of theological concern, this perhaps offers consolation, comfort, a certain utility, to be sure.  But it is easy for unquestioning acceptance of arguments from authority to become habit, and then to be applied to anything our prejudices suggest we may wish to avoid examining.  Drug addiction is an unfortunate affliction and it may be uncomfortable for people to see it for what it is—a disease, like alcoholism.  That discomfort makes room for assertions from authority offered by people who claim to be working on our behalf.  Our unwillingness to ask evidentiary questions is a cozy environment for demagogues and despots.

When you ask “What does this have to do with me?” you might be surprised at the consequences of avoidance and the price of unquestioning trust.  Why should we learn science and how to apply the discipline of examining evidence?  So we don’t hurt ourselves out of the easy habit of belief.

Traditions and New Eyes

I recently finished rereading a book from last year, preparing to read the sequel. I should cop to the fact that my reading has rarely been what you might call “timely” and I’ve gotten worse over the last several years.  When I wrote reviews for actual pay this was not as much a problem, because I had to read current material.  But left to my own devices, I pick and choose from my to-be-read pile at random, pretty much the way I buy books to begin with.  So I might read an old Agatha Christie concurrently with a newer physics tome by Kip Thorne after having finished a very new history of the sinking of the Lusitania, then pick up a newish novel while at the same time rereading some Ted Sturgeon… So it goes.

So I am very much “behind” almost all the time.  I served as a judge for the PKD Award one year and managed to read an unbelievable number of recently-published SF novels.  I can commit and stay the course when required. But in general my reading keeps me in a kind of ever-imminent nontime in terms of how I encounter works.  I don’t sort by decade when I start reading, not unless something in the text forces me to recognize it.  So to me, it is not at all odd to see James Blish and Iain M. Banks and C.J. Cherryh and Ann Leckie as in some sense contemporaneous.

So when I encounter a novel like Charles E. Gannon’s Fire With Fire I have no trouble—in fact, take some delight—in seeing it as part of a continuous thread that connects Doc Smith to Poul Anderson to C.J. Cherryh to any number of others who over the past 70+ years have mined the fields of alien encounter/politico-military SF.  And when I say I found the closest affinity with Poul Anderson at the height of his Terran Empire/Flandry period, that is, for me, high praise.

I loved Dominic Flandry.  Not so much the character, though there is that, but the milieu Anderson created.  One of the appealing aspects of his future history, especially those stories, was the authentic “lived in” feel he achieved, rarely duplicated by his peers, and seldom realized to good effect now.  Gannon does this.

The story in Fire With Fire is nothing new.  Earth has begun to settle other worlds around other stars and it’s only a matter of time before we encounter other space-faring civilizations.  In fact, we have, only it isn’t public knowledge, and in some instances it’s not something the discoverers even want noticed.  While Anderson had the Cold War to work with, Gannon has the transnational world, with all its disquieting ambiguities over what constitutes nations and how they differ from corporations and the undeniable motivation of profit in almost all human endeavors, leading to an ever-shifting array of allies and enemies in arrangements not always easy to define, much less see.  He takes us through all this quite handily.  It’s not so much that he knows the pitfalls of human civilizations as that he recognizes that the field is nothing but pitfalls.

“All that is necessary for evil to succeed is that good men do nothing.”  The dictum commonly attributed to Edmund Burke plays in the background throughout as good guys dupe both bad guys and good guys, people are moved around and used like game pieces, power is unleashed—or not—based on calculi often having little to nothing to do with ethics and morality.  This is politics writ large, and individuals learn to surf the swells or drown.

Into which is tossed Caine Riordan, an investigative journalist who is also a good man.  He is unfortunately snatched out of his life through a security mishap, placed in cryogenic suspension, and awakened 14 years later with a hundred or so hours of missing memory dogging him through the rest of the book, memories having to do with the two men in whose thrall he now seems to be: Nolan Corcoran, retired Admiral, and Richard Downing, former SAS and often reluctant aide to Admiral Corcoran.  Not reluctant in the sense of being unwilling to serve, but reluctant about some of their methods.  They run a secret organization designed to prepare for exosapient first contact.  It practically doesn’t exist, sort of in the way gravity under certain conditions doesn’t exist, and now Caine has become their tool.

Without going into details, which are a major aspect of this novel, suffice it to say it is about that first contact and the political ramifications thereof.  This is not a new idea, and much of the book may, to some, feel like ground well trod, but there is ample pleasure to be had in the trek over familiar ground seen through fresh eyes.  What is done better here than usual is the economic and political backgrounding and the debates over the impact of first contact.  Furthermore, the seemingly impossible disaffection among the various political entities comprising the world we know is displayed to lend a plangent note of nail-biting despair to the very idea that we might pull ourselves together sufficiently for anything remotely resembling a world government.

To be sure, Gannon adroitly addresses the hoary old notion that when we meet the aliens they themselves will already have worked all this out long since and be in a position to pass elder judgment on our upstart species.  They haven’t.  They have a (barely) workable framework among themselves, but the introduction of another new race into their club proves to be an opportunity for old issues to be forged into new knives.

Gannon handles all this well.  He clearly has a grasp of how politics works and has imaginatively extended that knowledge to how nonhuman species might showcase their own realpolitik.  He has a flair for detail.  He handles description very well, sets scenes effectively, and even manages to disguise his infodumps as conversations we want to hear.  Most of the time, it has the pleasurable feel of listening to a good musician groove on an extended improvisation.  Throughout we feel simpatico with Caine and the people he cares for, and the situation is certainly compelling.

For me, this was a walk down a street I haven’t visited in some time.  I read this novel with a considerable sense of nostalgia.  It is part of a tradition.  A well-executed piece of an ongoing examination of issues we, as SF fans, presumably hope one day to see in reality.  We keep turning this particular Rubik’s Cube over in our collective hands, finding variations and new combinations, looking for the right face with which to walk into that future confrontation.  One may be forgiven if this particular form of the contemplation seems so often to turn on the prospect of war.  After all, aren’t we supposed to be past all that by the time we develop star travel and put down roots elsewhere?

There are two (at least) answers to that.  The first, quite cynically, is “Why would we be?”  Granted, most wars have at least something to do with resources.  One side wants what the other side has, and you can do the research and find cause to argue that even the most gloriously honor-driven wars had deep economic aspects to them.  Certainly the conduct of all wars has deep economic consequences.  But while that is true and might be argued for most wars, it is also true that many wars need not have been fought, as there were other means of securing those resources.  But that didn’t matter.  It was the war, for someone, that mattered more even than the well-being of the people, the polity.  That ill-defined and in retrospect absurd thing Glory is a very real ambition down through history.  Wars get fought as much for that as for anything else.  Suddenly having all your resource needs met would do nothing to dampen that.  In fact, it might exacerbate the Napoleonic impulse in some instances.

Because the reality is that Going There will do that.  Not survey missions, no, but if we assume the level of technology and capacity that allows for colonies on worlds in other solar systems, then we can assume a post-scarcity economy.  It’s the only way it makes sense.  We will not solve economic problems with an interstellar empire; the empire will be the result of those solutions.

So that leaves us with the second reason we may still face war.  No less cynical but more intractable. Racism.  Not the kind of small-minded nonsense we deal with in terms of skin color and language, but the real deal—wholly different biologies confronting each other over the question of intelligence and legal rights and the desirability of association.  Deeper even than that is the history and tradition brought to the question by two civilizations with absolutely nothing in common, having developed in isolation more profound than any we might imagine on the face of the Earth.

Not that either of these is inevitable, and it may well be that sophistication of technology and its responsible use breeds the requisite tolerances.  But however likely it sounds philosophically, this is not a given, any more than war with aliens is inevitable.  So we talk about it, in the pages of fictions with long traditions.  There are certainly other possibilities, other scenarios, and there are other writers dealing with those.  Gannon is dealing with this one.

And doing so with a thick cord of optimism that raises this above the level of the usual “Deltoid Phorce” clone in the tradition of Tom Clancy or some other purveyor of gadget-driven war porn.  Gannon asks some questions in the course of this novel which keep it from descending to the level of the field-manual-with-body-count-in-technicolor.  This is more like what Poul Anderson would have written, with no easy answers, and heroes who are not unalloyed icons.

It’s worth your time.

Nicely done, Mr. Gannon.

Motives and Revelations

There is a remarkable scene—one of many—in James Morrow’s new novel, Galapagos Regained, wherein the final straw is broken for Charles Darwin and we are shown the moment he decided to back his radical new view of nature and its processes. Wholly fictional, no doubt, yet based on reality, Darwin has come to London to confront a young woman who has betrayed his trust while working in his household. The confrontation with the fictional Chloe Bathhurst is not the one that matters.  Rather, it is the confrontation Darwin is having with the edifice of a loving god.  His daughter is dying—tuberculosis—and the scientist in him knows there is nothing to be done, that an indifferent nature cares nothing for her goodness, her innocence, and any human claim on justice and fairness is but the empty babblings of a minor species only recently transcendent upon the ancient stage of life.  Darwin is angry and resentful.  The transgressions which resulted in his dismissing Miss Bathhurst are insignificant now against this greater, vaster crime which, he believes, has no actual perpetrator.  The only thing he can do, he decides, is to give her his blessing in pursuit of her own goal, which pursuit got her fired from his service.


She was fired for attempting to steal the sketch he had written concerning the transmutation of species, a precursor to his epic On The Origin of Species.  She did this in order to procure a means to free her errant father from debtors’ prison, by using the work as the basis for winning the Shelley Prize, a competition which had been ongoing for some time in Oxford.  The purpose of the prize was to reward anyone who could prove or disprove the existence of God.  Chloe, during her employ as Darwin’s zookeeper, became aware of his theory and thought it ideal to present it and win the prize.

Darwin refused.  When she elected then to steal the notes and present it on her own, she was caught and dismissed.  Darwin was at the time unaware that she had already made a copy of the paper and thought he had caught her in the act.

Now, in the lobby of a London playhouse, where Chloe had once been employed as an actress, Darwin, aware that she in fact had stolen his treatise, is sanctioning her quest.

“Don’t overestimate my sympathy.  Had I two thousand surplus pounds, I would cover your father’s debts, then arrange for you to tell the world you no longer believe in transmutationism.  That said, I must allow as how a part of me wants you to claim the prize, for it happens that my relationship with God—”

“Assuming He exists.”

“Assuming He exists, our relationship is in such disarray that I should be glad to see Him thrown down…Get thee to South America, Miss Bathhurst.  Find your inverse Eden.  Who am I to judge your overweening ambition?  We’re a damned desperate species, the lot of us, adrift on a wretched raft, scanning the horizon with bloodshot eyes and hollow expectations.  Go to the Encantadas.  Go with my blessing.”

Because this is what Chloe has determined to do.  Go to the Galapagos Islands to gather specimens to support the argument for transmutation of species.  The Shelley Society fronts her the money to do so, she enlists her card-sharp brother in the expedition, they find a ship, and set sail.  The Society had already bankrolled an expedition to Turkey for the purpose of finding the remnants of Noah’s Ark, so this was only fair.

Accompanying her ship is Reverend Malcolm Chadwick, Anglican minister and formerly one of the judges of the Shelley contest—on the side of the deity.  He steps down from that post at the request of Bishop Wilberforce and is sent on this new mission to oversee what Chloe will do.  He departs with an uneasy conscience, made so by the second part of Bishop Wilberforce’s plot, which sends another minister in another ship with the intention of going to the Encantadas and setting in motion the ultimate destruction by slaughter of all the animals on the islands, thus depriving the forces of atheism of their troublesome evidence.  Chadwick finds this idea appalling, but he is faithful and says nothing.  He joins Chloe’s expedition, which becomes Odyssean in its complications and obstacles.

The novel proceeds from one adventure to another until Chloe herself, stricken ill in the Amazon basin, undergoes a kind of religious conversion, and decides she is wrong in her conviction that there is no god.  Morrow then expands on the struggle she engages with her fellow travelers and her own considerable intelligence.

What we are treated to in this novel is a thorough examination of human motivation in the face of shifting paradigms.  It may be clear where his sympathies lie, but he is too good a writer to load the dice in favor of his preferred viewpoint.  He gives his characters their heads and follows them where they would naturally lead.  He never denigrates faith, only the fickleness of our intentions in the face of conflicting desires and awkward choices.  Tempting as it may have been in the end to simply declare a winner, Morrow instead takes a more difficult and fulfilling tack by portraying the times in which this debate flared into full flame with the advent of a solid theory of evolution.

Chloe Bathhurst herself is an admirable character.  An actress, adept as a quick study, she proves herself intellectually versatile and equal to any challenge.  As well, those who both aid and oppose her are equally well-drawn and Morrow deftly clarifies their motives.

Along the way, he gives a field demonstration in observation and interpretation, showing us the process whereby new understanding takes us over and how revelation can be a problematic gift.

Morrow is one of our best writers plowing the ground of controversy.  He never takes the simplistic road.  The pleasure in reading one of his novels is that of being allowed free range of the imagination in pursuit of specific truths stripped of dogma.  In fact, he disassembles dogma in the course of his yarns, a fact that is often not apparent while we’re in the grip of his artifice.

An artifice made warm by the complete humanness of his characters.  One of his best creations is Chloe Bathhurst.  In her, several clichés and canards are undone, and many perhaps uncomfortable but rewarding questions asked.  She exemplifies the first rule of the explorer—never be afraid to go and see for yourself.  Do so and you’ll be amazed at what is revealed.

And what is lost.

The title parodies Milton’s Paradise Regained, from which perhaps Morrow took a bit of inspiration:

I, when no other durst, sole undertook
The dismal expedition to find out
And ruine Adam, and the exploit perform’d
Successfully; a calmer voyage now
Will waft me; and the way found prosperous once
Induces best to hope of like success.

Perhaps not so much to “ruin Adam” as to give us a view into a vaster garden, older and truer, and less a burden to our capacity for wonder.

Taste and Quality

Obliquely, this is about a current debate within science fiction. However, the lineaments of the argument pertain to literature as a whole.  I offer no solutions or answers here, only questions and a few observations.  Make of it what you will.

Reading experience is a personal thing. What one gets out of a novel or story is like what one gets out of any experience and being required to defend preferences is a dubious demand that ultimately runs aground on the shoals of taste.  I once attended a course on wine and the presenter put it this way: “How do you know you’re drinking a good wine? Because you like it.”  Obviously, this is too blanket a statement to be completely true, but he made his point.  If you’re enjoying something it is no one’s place to tell you you’re wrong to do so based on presumed “objective” criteria.  That $200.00 bottle of Sassicaia may fail to stack up against the $20.00 Coppola Claret as far as your own palate is concerned and no one can tell you your judgment is wrong based on the completely personal metric of “I like it/I don’t like it.”

However, that doesn’t mean standards of quality are arbitrary or that differences are indeterminate.  Such are the vagaries and abilities of human discernment that we can tell when something is “better” or at least of high quality even when we personally may not like it.

For instance, I can tell that Jonathan Franzen is a very good writer even though I have less than no interest in reading his fiction.  I can see that Moby-Dick is a Great Novel even while it tends to bore me.  I acknowledge the towering pre-eminence of Henry James and find him an unpalatable drudge at the same time.

On the other end of the spectrum, I can see how Dan Brown is a propulsive and compelling story-teller even while I find him intellectually vacuous and æsthetically tedious.

My own personal list of what may be described as guilty pleasures includes Ian Fleming, Edgar Rice Burroughs (but only the John Carter novels; never could get into Tarzan), and a score of others over the years who caught my attention, appealed for a time, and have since fallen by the wayside, leaving me with fond memories and no desire to revisit.  A lot of the old Ace Doubles were made up of short novels of dubious merit that were nevertheless great fun for a teenager on a lonely afternoon.

I would never consider them Great Art.

Taste is the final arbiter.  But using it to determine quality—rather than allowing quality to determine taste—is doomed because taste changes.  Works you might strenuously defend at one time in your life can over time suffer as your taste and discernment evolve.  It’s sad in one way because it would be a fine thing to be able to summon up the same reactions experienced on one of those lonely afternoons, aged 16, and poring through the deathless excitement of a pulp adventure you might, given your enthusiasm, mistake for Great Writing.

I try always to make a distinction between things I like and things I think are Good.  Often they’re the same thing, but not always, and like other judgments humans make, they tend to become confused with each other.  Hence, debate over merit can take on the aspects of an argument on that day at the base of the Tower of Babel when people stopped understanding each other.

But if that’s all true, then how do we ever figure out which standards are valid and which bogus?  I mean, if it’s ALL subjective, how can any measure of quality ever rise to set the bar?

Fortunately, while personal experience is significant, collective experience also pertains. History, if you will, has taught us, and because art is as much a conversation as a statement we learn what works best and creates the most powerful effects over time. Having Something To Say that does not desiccate over time is a good place to start, which is why Homer still speaks to us 2500 years after his first utterances.  We derive our ability to discern qualities from our culture, which includes those around us informing our daily experiences.  In terms of literature, the feedback that goes into developing our personal values is a bit more specific and focused, but we have inexhaustible examples and a wealth of possible instruction.  We do not develop our tastes in a vacuum.

Honest disagreement over the specific qualities of certain works is part of the process by which our tastes develop. I might make a claim for Borges being the finest example of the short story and you might counter with de Maupassant—or Alice Munro. Nothing is being denigrated in this. The conversation will likely be edifying.

That’s a conversation, though.  When it comes to granting awards, other factors intrude, and suddenly instead of exemplary comparisons we have competition, which can be a degrading affair unless standards are clear and processes fairly established.  In a competition, unlike a conversation, quality necessarily takes a back seat to simple preference.

Or not so simple, perhaps. Because any competition is going to assume at least a minimum of quality that may be universally acknowledged. So we’re right back to trying to make objective determinations of what constitutes quality.

If it seems that this could turn circular, well, obviously. But I would suggest it only becomes so when an unadmitted partisanship becomes a key factor in the process.

This can be anything, from personal acquaintance with the artist to political factors having nothing to do with the work in hand. Being unadmitted, perhaps even unrecognized, such considerations can be impossible to filter out, and for others very difficult to argue against. They can become a slow poison destroying the value of the awards. Partisanship—the kind that is not simple advocacy on behalf of a favored artist but is instead ideologically based, more against certain things rather than for something—can deafen, blind, reduce our sensibilities to a muted insistence on a certain kind of sensation that can be serviced by nothing else. It can render judgment problematic because it requires factors be met having little to do with the work.

Paradoxically, art movements, which are by definition partisan, have spurred innovation if only by reaction and have added to the wealth of æsthetic discourse. One can claim that such movements are destructive and indeed most seem to be by intent. Iconoclasm thrives on destroying that which is accepted as a standard and the most vital movements have been born of the urge to tilt at windmills, to try to bring down the perceived giants.  We gauge the success of such movements by remembering them and seeing how their influence survives in contemporary terms.

Those which did not influence or survive are legion. Perhaps the kindest thing to be said of most of them is that they lacked any solid grasp of their own intent. Many, it seems, misunderstood the very purpose of art, or, worse, lacked any comprehension of truth and meaning. More likely, they failed to distinguish between genuine art and base propaganda.

How to tell the difference between something with real merit and something which is merely self-serving?  All heuristics are suspect, but a clear signal that something other than pure artistic intent is at play is the advent of the Manifesto.  Most are hopelessly locked in their time, and the most innocent of them are cries against constraint.  But often there’s an embarrassing vulgarity to them, a demand for attention, an insistence that the work being pushed by the manifesto has merit if only people would see it.

Not all manifestos are signs of artistic vacuity, but those that front for worthwhile work usually fade quickly from service, supplanted by the work itself, and are soon forgotten.  Mercifully.  We are then left with the work, which is its own best advocate.  In hindsight it could be argued that such work would have emerged from the froth all on its own, without the need of a “movement” to advance its cause.  Unfortunately, art requires advocates, beginning with the simplest form of a purchase.  In crowded fields overfull of example, the likelihood of a lone artist succeeding on his or her own, without advocacy, is slim.

Advocacy for an individual artist, by a cadre of supporters, can make or break a career.  And this would of course be a natural development of widespread appreciation.  It’s organic.

Advocacy for a perceived type of art begins to suffer from the introduction of agendas having less to do with the artists than with a commitment to the aforementioned windmill-tilting.

The next phase is advocacy of a proscriptive nature—sorting out what belongs and doesn’t belong, measuring according to a prescribed set of protocols—which has little to do with individual works and much to do with the æsthetic and political prejudices of the movement.  The quality of a given work is less important at this stage than whether it “fits” the parameters set by the movement’s architects.  Taste plays a smaller and smaller role as the movement meets opposition or fails to advance its agenda. With the demotion of taste comes the desiccation of quality.  The evocative ability of art, its facility to communicate things outside the confines of the manifesto-driven movement, eventually becomes a kind of enemy.  We’re into the realm of cookie-cutter art, paint-by-numbers approaches, template-driven.  Themes are no longer explored but enforced, preferred message becomes inextricable from execution, and the essential worth of art is lost through disregard of anything that might challenge the prejudice of the movement.

This is a self-immolating process.  Such movements burn out from eventual lack of both material and artists, because the winnowing becomes obsessional, and soon no one is doing “pure” work according to the demands of the arbiters of group taste.

As it should be.  Anything worthwhile created during the life of the movement ends up salvaged and repurposed by other artists.  The dross is soon forgotten.  The concerns of these groups become the subject of art history discussions.  The dismissal of particular works because “well, he’s a Marxist” or “she was only an apologist for capitalism”—factors which, if the chief feature of a given work, might very well render it ephemeral, but in many instances have little to do with content—prompts head-scratching and amusement long after the fury of the controversy around them has faded.

Given this, it may seem only reasonable that an artist have nothing to do with a movement.  The work is what matters, not the fashions surrounding it.  Done well and honestly, it will succeed or fail on its own, or so we assume.

But that depends on those ineffable and impossible-to-codify realities of quality and taste.  Certainly on the part of the artist but also, and critically, on the part of the audience.

Here I enter an area difficult to designate.  The instant one demands a concrete description of what constitutes quality, the very point of the question is lost.  Again, we have heuristics bolstered by example.  Why, for instance, is Moby-Dick now regarded as a work of genius, by some even as the great American novel, when in its day it sold so poorly and its author died in near-complete obscurity?  Have we become smarter, more perceptive? Has our taste changed?  What is it about that novel which caused a later generation than Melville’s contemporaries to so thoroughly rehabilitate and resurrect it?  Conversely, why is someone like Jacqueline Susann virtually unremarked today after having been a huge presence five decades ago?

I have gone on at some length without bringing up many examples, because taste and quality are so difficult to assess.  What one “likes” and what one may regard as “good” are often two different things, as I said before, and have as much to do with our expectations on a given day of the week as with anything deeply considered and well-examined. My purpose in raising these questions—and that’s what I’ve been doing—has to do with a current struggle centering on the validity of awards as signs of intrinsic worth.

The best that can be said of awards as guideposts to quality is that if a group of people, presumably in possession of unique perspectives and tastes, can agree upon a given work as worthy of special note, then it is likely a sign that the work so judged possesses what we call Quality.  In other words, it is an excellent, indeed exceptional, example of its form.  I’ve served on a committee for a major award, and over the course of months the conversations among the judges proved educational for all of us, eventually shedding the chaff and leaving a handful of works under consideration that represented what we considered examples of the best that year of the kind of work we sought to award.

I never once found us engaged in a conversation about the politics of the work.  Not once.

Nor did we ever have a discussion about the need to advance the cause of a particular type of work.  Arguments over form were entirely about how the choice of one over another served the work in question.  When we were finished, it never occurred to me that a set of honest judges would engage in either of those topics as a valid metric for determining a “winner.”  No one said, “Well it’s space opera and space opera has gotten too many awards (or not enough)” and no one said, “The socialism in this work is not something I can support (or, conversely, because of the political content the faults of the work should be overlooked for the good of the cause).”  Those kinds of conversations never happened.  It was the work—did the prose support the premise, did the characters feel real, did the plot unfold logically, were we moved by the story of these people?

Consensus emerged.  It was not prescribed.

This is not to say other metrics have no value, but they can be the basis of their own awards.  (The Prometheus Award is candidly given to work of a political viewpoint, libertarianism.  It would be absurd for a group to try to hijack it based on the argument that socialism is underrepresented by it.)  But even then, there is this knotty question of quality.

Here’s the thorny question for advocates of predetermined viewpoints: if an artist does the work honestly, truthfully, it is likely that the confines of manifesto-driven movements will become oppressive and that artist will do work that, eventually, no longer fits within those limits.  To complain that the resulting work is “bad” because it no longer adheres to the expectations of that group is as wrongheaded as declaring a work “good” because it does toe the proper line.

Because that line has nothing to do with quality.  It may go to taste.  It certainly has little to do with truth.

Seeing At f.64

I have been involved in photography since I was 14 years old. My father gave me his Korean War-era Canon 35mm, a knock-off of a Leica model D, and I began a somewhat haphazard career which never quite went where I thought it would or should.  By the time I was 21 I had replaced that Canon with a pair of Minoltas, a Mamiya Universal (2 1/4 format), and a Linhof 4X5 view camera, plus the darkroom accoutrements to go with it all, including a massive Beseler 45M enlarger (which I still have) and pretensions of being a modern manifestation of a member of the Group f.64.

All I really knew about this group were the photographs and some names, and not even all of them.  Ansel Adams, Edward Weston, Imogen Cunningham.  Names to conjure with.  Sharp, full-tonal range images of landscapes, portraits, and details of objects made sublime by the attention of these artists of the lens.

In time I learned about other names—Alfred Stieglitz, Edward Steichen, Paul Strand, Walker Evans, Dorothea Lange, Ben Shahn—and picked up a bit of the history.  To be honest, photographing and printing were what interested me and I used the work of these past giants as goals and guides.  I bought a full set of Ansel Adams’s Basic Photo series, gleaned what I could, and set forth to do what he did, at least in my own small and considerably more modest way.  He had Yosemite, I had, when I got to it, the Ozarks.  Edward Weston had Big Sur, I had the Missouri and Meramec Rivers.  I spent too much time trying to duplicate their work and too little time finding what was worthwhile in my own backyard.

Which I did eventually and I’ve done some work of which I’m proud.  Ah, but there were giants in those days, and I wasn’t even getting to their shoulders.  The magical, mystical Group f.64 occupied in some sense the same place as the Fellowship of the Ring.  They were the first, the best, the most interesting.

And yet, little was available to be known about them between two covers.  Articles here and there, anecdotes, essays in their monographs, coincidental stories.  A great deal of myth can evolve from such piecemeal history.

Mary Street Alinder, who has previously written a biography of Ansel Adams, has published a history of the Group. Group f.64: Edward Weston, Ansel Adams, Imogen Cunningham, and the Community of Artists Who Revolutionized American Photography fills in the gaps of biography and history of this unlikely cadre of artists who took photography away from the Pictorialists who were dominant in the 1910s and 20s and gave it back to itself.

The primary movers came together in and around San Francisco to establish their own hallmark on what they thought photography should be, in part against the disdain of East Coast photographers—dominated by the towering personality of Alfred Stieglitz, one of the principal gallerists and arbiters of what constituted art in America—for so-called “western photographers.”  Dedicated to “pure” photography—that is, photographs that were themselves, making full use of the unique medium of lens, film, and chemistry, rather than the popular form of photographs that imitated paintings, with soft focus and elaborate stage settings—Ansel Adams, Willard Van Dyke, and, somewhat reluctantly, Edward Weston (who was simply not a joiner, yet philosophically supported what the group was trying to do, and lent his name because of them all he was the one already well known and respected) formed the Group.  Adams wrote a manifesto, and they set about arranging exhibits and establishing themselves as arbiters of a new vision of photography.

Young, enthusiastic, and perhaps a bit pretentious, the group eventually included Connie Kanaga, Dorothea Lange, Henry Swift, Alma Lavenson, Brett Weston, and a few others.  Both Van Dyke and Adams were aggressive about exhibits, and even opened their own (short-lived) galleries.  Eventually, though, they knew they had to break into the East Coast and convince Stieglitz of their worth.  Adams was the first to do so and after that the tastes of the country began to shift.

What emerges from the pages of Alinder’s book is in many ways the same struggles and passions that come with any artistic movement.  Also the conflicting personalities, the petty wars, the affairs, and the eventual dissolution of the Group as its members, all gifted and hard-working artistic obsessives, went their own ways.  The kind of work they did is today accepted as a given of what photography should be, at least as a base.  Adams and Weston shared a philosophy that the photograph should allow the subject to reveal its true nature.  For that to happen, the standard obliteration of detail that the Pictorialists held as necessary had to go away.  Clarity, sharpness, richness of detail and tonality.  Hence the name of the group, f.64, one of the smallest apertures a large format lens can achieve (though not the smallest—f.128 is common for large format lenses), which yields the widest possible depth of field (meaning the range of acuity from nearest to farthest) and therefore the greatest sharpness throughout the image.  By calling themselves this, they announced that they stood in direct opposition to the fuzzy, false images that dominated galleries and public awareness.  (The elaborate staging of the Pictorialists moved easily and naturally into Madison Avenue advertising photography, but even there Group f.64 became the dominant æsthetic in terms of detail and clarity.)
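Since the group’s very name turns on this optical fact, a minimal sketch of the standard hyperfocal-distance approximation shows numerically why stopping down produces such sweeping sharpness.  (This is my own illustration, not anything from Alinder’s book; the 150 mm focal length and 0.1 mm circle of confusion are assumed values typical for 4x5 work.)

```python
# Hyperfocal distance approximation: H = f^2 / (N * c) + f,
# where f is focal length, N the f-number, and c the circle of
# confusion. Focusing at H renders everything from roughly H/2
# to infinity acceptably sharp.
def hyperfocal_mm(focal_length_mm, f_number, circle_of_confusion_mm):
    f = focal_length_mm
    return f * f / (f_number * circle_of_confusion_mm) + f

# Illustrative values for a 150 mm lens on 4x5 film:
wide_open = hyperfocal_mm(150, 8, 0.1)      # ~28,275 mm (~28 m)
stopped_down = hyperfocal_mm(150, 64, 0.1)  # ~3,666 mm (~3.7 m)
```

At f/8 the photographer must focus nearly 28 meters out to hold the horizon sharp, and anything closer than about 14 meters softens; at f/64 everything from roughly 1.8 meters to infinity can be rendered crisply, which is exactly the look the Group prized.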

The successes of all these artists would probably have occurred in any event, but not this way and not so soon and not so magnificently.  Obviously, some emerged as superstars while others have faded from our memory.  Alinder does service here by reminding us of them and placing them in their proper relation to the history.

One remarkable aspect of the Group may seem in some ways minor, but it led to major consequences as the 20th Century unfolded.  Namely, they were the first artistic movement in the United States that made no distinction between male and female members.  All they cared about was the work, and none of them seemed to have any retrograde ideas about women being lesser artists.  Right from the start more or less half the group was female, and we remember them well.  Imogen Cunningham and Dorothea Lange, certainly, but others now somewhat forgotten, like Consuelo Kanaga and Sonia Noskowiak and Alma Lavenson, informed the dialogue of 20th Century photography in powerful and significant ways.

The other thing that surprised me was to realize who wasn’t in the Group.  Based on the work alone, seeing it all the first time as a young photographer, I just assumed membership on the part of photographers like Walker Evans, Paul Strand, and Margaret Bourke-White.  Walker Evans, according to Alinder, did not care for Group f.64.  Ironically, he thought they were too concerned with making “pretty” pictures, which was one of the complaints Group f.64 had toward the Pictorialists.  Evans was a chronicler—nothing should distract from the subject as found, not cropping, not composition, not fine printing.  Strand predated the group (and actually inspired Adams to change how he made images and prints), and Margaret Bourke-White was always a working commercial photographer following her own path.

Reading this and seeing how these people at the time had to modify and adapt their philosophy to account for some of the inconsistencies in their original intent, and to make sure they did not exclude excellent work by too narrow a set of constraints, was revelatory.  Alinder’s book has allowed me to better understand my own sources of inspiration.  As well, her prose is, like the people she writes about, an example of clarity, sharpness, and acuity.

Now A Word From Paradise Island

Of all the unlikely confluences you might stumble over in life, here’s a set: what do Margaret Sanger, lie detectors, and superheroes have in common?

If you answered “Wonder Woman!” you would be correct.  And if you could answer that before Jill Lepore’s The Secret History of Wonder Woman came out, you would be in rare company.

Wonder Woman was the creation of one William Moulton Marston, of the Massachusetts Marstons. If that familial pedigree escapes you, don’t feel bad, you didn’t miss it in history class.  The Marstons had come down a ways from their 19th Century heights, enough so that William was forced to apply for grants to finish his degree at Harvard in 1915 in psychology (Ph.D. earned in 1921).

Psychology was still a relatively “new” science and Harvard was in the vanguard.  Marston attached himself to the recently founded Harvard Psychological Laboratory and eventually ended up running it.  But only for a short while.  And that career curve would follow him the rest of his life.

Marston emerges in the pages of Lepore’s book as a brilliant man-child who just couldn’t seem to make a lasting success of anything.  It becomes evident as his story unfolds that he simply confused personal fame with genuine success and made a series of bad calls with regard to where and how to apply his energies.  He was a self-promoter but when it came to backing up his claims he was lacking in substance.  He “invented” the lie detector and established it as a viable methodology and then ruined its use in court for decades through sloppy application and shoddy research.  Not only that, but the court case which apparently set the precedent for the exclusion of lie detector results from courtrooms cast a pall on him personally, a circumstance he never quite recovered from.  If not for the three women he eventually lived with, it seems likely he would have been indigent.

That is where the story takes a turn toward the culturally amazing.  Marston was a hardcore feminist, of the “old school” that marched and advocated and disrupted (although being a man this rarely led to any inconvenience for him).  He seems to have been genuinely dedicated to the idea of equality for women and was also a devotee of free love, a parallel movement to the equal rights movement that led to considerable discord among feminists.  He met and eventually married Elizabeth Holloway, who shared his philosophy.  She earned a law degree from Boston University and of the two possessed the discipline to apply herself to her chosen tasks, a trait Marston apparently lacked.  Eventually, Holloway (who kept her maiden name) was the principal breadwinner in the house, which grew to include Marjorie Wilkes Huntley, a woman he met while doing work for the Army under Robert Yerkes, and then later Olive Byrne, who was Margaret Sanger’s niece.

At a time when “alternate lifestyles” could land you in prison, this was, to say the least, unorthodox.  Marston apparently possessed enormous personal charisma, and none of the women ever left him.  After his death, the three women more or less remained together for the rest of their lives.  Byrne and Holloway each bore two of Marston’s four children, but Byrne claimed to be a widowed cousin and refused to tell her own children that she was their mother.

How did all this produce one of the most well-known superheroes of all time?

Lepore, who obtained access to archives and letters never before made public, has detailed this remarkable story with clarity, humor, and insight.  Wonder Woman is probably one of the most rigorously constructed superheroes of the Golden Age of comics.  Marston did not see her as entertainment but as a tool for social propaganda, namely to advance the cause of equal rights for women.  In fact, Marston thought women should be in charge.  His Wonder Woman was no one’s servant and the first couple of years of her publication reads like an indictment of everything broken in society’s attitude toward women.

DC Comics was in a bit of a bind.  They were by and large uncomfortable with the feminist “messaging” but Wonder Woman, almost from the first issue, was one of the three most popular superheroes of all time, right behind Batman and Superman.  They were making too much money to just kill her off.  But once Marston lost control of the comic (due to illness, first polio then cancer) the editor put in charge seemed to do everything he could to declaw this difficult woman.  That she survived to once again be the powerful symbol of womanhood is remarkable, but it also goes some way to explain the bizarre manifestation of the Seventies TV show.  Comparing the Lynda Carter characterization to the forthcoming Gal Gadot realization shows a marked difference in attitude and approach.

The holdovers from the Silver Age era in Carter’s look, while “true” to the traditional Wonder Woman, nevertheless supported the highly sexualized aspect of the character, while the new look shows the underlying mythology that was originally loaded into the construct, that of a warrior, an Amazon, something out of the bloodier, martial past of the ancient Greeks.

Lepore’s book examines the period as much as the principals and shows us the problematic world in which suffrage, feminism, and tradition often came into conflict.  The urgency to “put Wonder Woman in her place,” especially after WWII, played out in the comics in often tragicomic ways.  Wonder Woman becomes a full-fledged member of the Justice League only to be made the secretary.

Some of this may seem silly to modern readers.  Why couldn’t people just get over their prejudices?  Wasn’t it obvious, etc etc.?  Maybe.  And maybe that it was obvious was part of the problem.  Justice often conflicts with privilege, ethics with desire, and change is never welcome to the fearful.

Marston himself, as a human being, seems by turns charming and repulsive.  Art rarely emerges from pristine sources, unalloyed with the questionable.  But then, Marston probably did not see himself as an artist or Wonder Woman as a work of art. He had causes to pursue, beliefs to validate, and…well, bills to pay.

Lest anyone believe that Wonder Woman was all Marston’s invention, though, Lepore makes it abundantly clear that the women around him contributed just as much if not more and served in many ways as models upon which he based the character.

The Secret History of Wonder Woman is a fascinating look at a time in America and an aspect of culture that for too long was underappreciated.  It is a serious and thorough examination of popular mythology and from whence many of our modern concepts came.  In these times when women are once more required to defend their rights and the very idea of equality, it might be useful to see how and in what ways this struggle has been going on for a long time and under some of the most unexpected banners in some of the most unusual ways.

Survival, Strategy, and Shakespeare

Of all the things imagined surviving past a global apocalypse, Shakespeare may be an obvious choice but not one often noted in the scores of stories and novels devoted to the idea of starting over.

That is, after all, the chief impulse behind such stories: the slate is wiped clean and humanity has a chance to begin again.  A few works have gone further—most notably Nevil Shute’s On The Beach and Cormac McCarthy’s The Road—and wipe humanity completely off the stage.  But for the most part, someone must tread upon that newly set stage to continue the story, and who better to serve notice that this is exactly what such stories are about than Shakespeare.  “All the world’s a stage…”

Shakespeare haunts Emily St. John Mandel’s Station Eleven like Banquo’s ghost from beginning to end. The novel begins with the death of Lear—more precisely, the actor portraying Lear on stage in a theater in Toronto the very day a devastating virus explodes across the planet, one that will go on to kill 90% or more of the human race.  It’s never quite clear if Arthur Leander is a victim of the flu or a heart attack, but his demise signals the beginning of the end for all that is familiar, and establishes the primacy of irony that runs through the novel.

Mandel has kept her focus on a fairly tight and circumstantial circle of people to tell her story. Arthur Leander, actor and a bit of a patriarch, anchors the narrative.  In some sense his life is Shakespearean—as a young man he escapes from an island which holds all that anyone could ever want, and his retelling of it takes on the glow of a mythic place people imagine as an impossible paradise.  The island, while wonderful in many ways, is not where he wants to spend the rest of his life.  He returns to foreign shores to seek his identity and becomes a mask of himself, an actor.  As he becomes famous he keeps returning, at least in memory and often in epistle, to that island.  He marries a woman who came from there, an artist who ends up working for a transnational corporation but privately draws a comic about a lost outpost in space, Station Eleven, that in many ways resembles Prospero’s island.  This is Miranda, the most stable of his three wives, all of whom are in some sense “rescues.”  But Miranda is of them all the most real, the most important.  As Prospero’s daughter, she is the foil to the worst of her father’s machinations.

As Leander is dying, the play is in the middle of act 4, scene 6, of King Lear, and the audience knows something is wrong when he delivers a line out of sequence.  But it’s a telling line for what is to follow.  “Down from the waist they are centaurs,” he says but then does not finish it and instead says “The wren goes to’t,” which is from earlier in the scene when Lear is comparing the progeny of adultery to his “lawfully got daughters” in their treatment of their father.  It’s a confused reordering but pertinent given what is later revealed.  The first quote, complete, reads: “Down from the waist they are centaurs, though women all above. But to the girdle do the gods inherit; beneath is all the fiends’. There’s hell, there’s darkness, there’s the sulfurous pit— burning, scalding, stench, consumption!”

Given Arthur Leander’s penchant throughout his career for drifting from one woman to another, ending finally with three bad marriages and apparently about to embark on a fourth, this may be nothing more than the fevered remorse of momentary self-analysis, but it serves too as a metaphor for all the misplaced confidence our civilization instills in its devices, which look so dependable and yet…the remorse is poorly placed.  Arthur Leander seems much like his mythic namesake—an idealist, in love, swimming a narrow strait every night to be with his beloved—who loses his way and drowns.

Like Lear, his apparent mistrust of women is also wrongly placed, as it would be women who ultimately save not only his memory but that which is important to him.

But really this isn’t about women, not in this context, but about the matrix of civilization.

Twenty years after the collapse, we join a company of players, the Traveling Symphony, which makes the rounds near Lake Michigan, playing music and performing plays.  Shakespeare has proved the most popular with their small audiences, made up of survivors who have settled in odd places—abandoned airports, old motels, campgrounds—and are relearning how to live without electricity or running water or antibiotics.  The Georgian Flu that killed so many left too few to maintain all the complex systems.  Civilization is retrenching at an 18th Century level, but the artifacts of that globe-spanning civilization are all around.

One of the principal members of the Traveling Symphony is Kirsten, who as a child was in that final performance of Lear with Arthur Leander.  While she remembers almost nothing from that time, she collects celebrity magazine articles and other trivia about him.  She also has in her possession two issues of the comic book that Leander’s first wife, Miranda, self-published.  Station Eleven will become a bizarre point of connection with another character who takes an even stranger path after the collapse.

At this point I’ll stop describing the plot.  Metaphors abound; the book is rich in irony. Shakespeare would recognize the various perversities and tragedies as Mandel flashes back over Leander’s life and those who surrounded or intersected with him, some of whom survive.  (There is a fascinating thread involving a paparazzo who appears in the first scene as a newly-minted paramedic who tries to administer CPR to Leander on stage.)  Mandel establishes her connections and the ley-lines of the chronicle very well and very plausibly.  The individual stories are affecting and compelling.

Rather, I would like to talk about how this differs from what many readers may expect from such a novel: its handling of the central conceit, that well-trod path of starting over.

Many worthwhile novels have been written in this vein.  I mentioned On The Beach, but a quick list of others includes Alas Babylon, Earth Abides, A Canticle For Leibowitz, The Postman, The Stand, The Long Tomorrow, Davy…the list is long because it’s such a tempting fantasy, the idea that we can dispense in a stroke with the contemporary world with all its problems and its uncooperative aspects and its stubborn, entrenched people and their privileges and start over.  It’s a desert island fantasy writ large.

Much of the canon is about how human ingenuity, exemplified by a plucky group of very smart survivors, manages to rebuild some semblance of the civilization just lost—only without all the pesky problems, like neurotic people or politicians, and usually there are no taxes in sight.  The science fiction approach focuses on wresting worthwhile components of civilization from the ruin and setting the stage for doing things right, however one might conceive of right.  Perhaps H.G. Wells was the first to put this view forward in The Shape of Things to Come, with his corps of engineers that rebuilds a high-tech civilization in the burnt-out remnants of the old.

The ones that stay with you, though, accept that this is fantasy and that reality never affords opportunity for such neat solutions.  That a collapse like this will be exactly that—a collapse, an end.  Some stories assume humanity can’t survive this final doom.  Most acknowledge that a few will but nothing will be preserved in any recognizable form.

For some this may seem like a thoroughgoing calamity.  For others, justice served.  Mandel—like Walter Miller, like Leigh Brackett, like, recently, Robert Charles Wilson in his Julian Comstock—recognizes that it is simply something that may happen. The question then is “What now?”

So her story is about how that first 20 years might look for a small group of people who are predisposed to preserving stories.

There was the flu that exploded like a neutron bomb over the surface of the Earth and the shock of the collapse that followed, the first unspeakable years when everyone was traveling, before everyone caught on that there was no place they could walk to where life continued as it had before and settled where they could, clustered close together for safety in truck stops and former restaurants and old motels…most people had settled somewhere, because the gasoline had all gone stale by Year Three and you can’t keep walking forever.  After six months of traveling from town to town—the word town used loosely; some of these places were four or five families living together in a former truck stop…

The landscape is peppered with the remnants of what came before and a new generation is growing up having never experienced any of it when it worked, only hearing stories of what it had once been like.  One can already see the rough shapes of future myth and lore emerging from the tales the older folks are telling the youngsters.

But over and through all this Mandel is telling stories about how people come to be where they end up and how they take meaning from that.  They all have escaped, in one way or another, from Prospero’s island, only to find themselves, like Viola, washed up on a foreign shore, another island, and having to improvise a new identity to fit a life they never expected to live.

That there is no technological answer to anything in Station Eleven should be no surprise; Mandel’s purposes lie elsewhere.  She’s not actually rescuing anything.  Nor is she rebuilding.  If anything she’s portraying a kind of evolution.  Start here, with these elements, and run them through those changes.  Where do we end up?

The result is a very good novel which happens to be science fiction (as opposed, perhaps, to science fiction which happens to be a good novel), one that lays out a number of intriguing questions for our contemplation.

Shakespeare, for instance, understood irony and tragedy, learned perhaps from the Greeks, who first perfected the form and built it on myths.  What kind of myths might emerge from a tradition based first on Shakespeare?

One of the purposes of stories like this is to dramatize in stark relief something that goes on all the time, namely the replacement of one world with another.  We tend not to experience it that way because the changes happen sporadically, cumulatively, resulting in one day appreciating the quaintness of a past that no longer pertains.  But there is no sudden shock of change, since the break points are small and myriad and feel “natural.”  Post-apocalyptic stories are about that very change, except overnight and all at once.  They all ask the same question, though—if you were washed up on an island, cut off from the world you always knew, what would you wish to find washed up with you?  And what do you think you might be able to rescue from a past you frankly might know very little about, even though you inhabited it as a citizen in good standing?

Of course, while you were fretting about that, life would, as it does, happen, and you would have to deal with it, as always.

Mandel avoids the trap of prescription.  She has no idea how things will turn out.  But she displays a sharp understanding of how people respond to shock.  That, and a Shakespearean sense of irony, elevate Station Eleven several rungs above the average.

Sword, Double-Edged, Metaphorical Steel

I finished reading Ann Leckie’s Ancillary Sword weeks ago and have been turning it over in my mind ever since, trying to decide on the best way to talk about it.  As a sequel to her surprisingly popular Ancillary Justice, it is exceptional and unexpected.  Yes, it carries forward the story of Breq, the lone surviving aspect of what was once a vast AI, a ship possessing a cadre of ancillaries which formed the extensible components of its intelligence.  Yes, it continues the examination of the universe she established and the civil war that is in the process of breaking out.  Yes, we find a continuation of many of the narrative devices and their concomitant concerns.

No, it does not actually go where one might expect such a sequel to go.

This raises the question of expectations, however, and of whether art is obligated to meet specific expectations at all.  Surprise, after all, is supposed to be one of the chief pleasures of art.  The surprise of the new, of discovery, of revelation.

That Leckie’s sequel does not bend to the predictable is a good thing.  That it then takes us to another level of questioning not only the premise of the work but of our own civilization is a bonus.  That it does this so well is a triumph.

Which brings me to my entrée into this review, because going where one might not expect is part of the overall pleasure of this series, which has at least one more novel to come but clearly offers possibilities for satellite works if not direct continuations.

One of Leckie’s tactics has been to replace the male pronoun with the female throughout.  A simple change, designating all people as “she” and “her” rather than “he” and “his” without venturing upon the complications of actual gender transformations.  A simple change…but with apparently complex consequences for readers.

You wouldn’t think, among self-identified science fiction readers, such a modification would have significant effects.  We should all be used to shifts in perspective.  Writing about the alien, after all, is what a good deal of SF is all about.  We have even grown accustomed to the idea of the familiar being alien, so much so that tales of possession, of cyborgs, of cloning, of genetic modification, even of next-stage evolution are part and parcel of the idiom we’ve been dealing in for decades.  Yet somehow, it seems, the idea of a human possessed by an alien, or an alien masquerading as a human, is far more decodable for some than that of the human, in all its familiarity, being alien.

We write from the basis of the culture with which we are most familiar.  More importantly, we read on the basis of that culture.  For or against or across, our culture, whether we like it or not, supplies the language, the metaphors, the analogies, the foundations of how we perceive all that we encounter.  Science fiction has been one of the most consistent forms of turning that foundation upside down and inside out in the pursuit of its primary effects—cognitive dissonance among them.

So it’s fascinating when a work does something that upsets even the well-traveled carts of the experienced SF reader and leaves confusion in its wake.

Confusion is an effect as well.  Often inadvertent and unintended, it’s a breakdown in the connection between Our World and the world of the text.  James Joyce is probably the one who used this (and both benefited and suffered from it by turns) to greatest effect, and although Ulysses is not science fiction per se, it nevertheless shares many æsthetic conceits with SF.

But Joyce dug deep and deployed many a device to skin the world and show us what lies beneath the comforting patina of “civilization” and his constructs are complex and sometimes labyrinthine.  In contrast, and by virtue of this language called SF, Leckie made a simple surface change and skinned us.

Upon first encountering the device in Ancillary Justice, I was confused.  But once I realized that the confusion came from my subconscious desire to easily and readily “visualize” each and every character without having to bother with “character,” I was delighted.  The device brought me face to face with my own biases and showed me just how dependent I was on a simple biological binary.

But not quite so simple.  As I read on I realized that this reliance on male-female identification markers allowed a certain laziness to creep into my experience of the world, not because male and female are in any way divisive so much as that they substitute for a suite of often unexamined expectations that come under the headings “normal” and “special.”  Leckie’s substitution of one standard pronoun for another erased those too-easy sets of assumptions and forced one to read everyone as “normal” unless otherwise designated by characterization.

Once I recognized what was happening in my own reception of the proffered device, I was delighted, and subsequently read more carefully, amused at each instance where my default assumptions were overturned and I was forced time and again to deal with each character as unique.  I accepted the text from that point on as a challenge to the norm and have since found occasion to be dismayed and delighted by other reactions which, in their turn, baffle.

Apparently, for some, the initial confusion never abates.  That persistent “she” throughout causes annoyance without ever becoming a normative aspect of the culture depicted.  The reader finds it difficult either to shed the bias being challenged or to accept that this is simply a mirror image of a cultural norm we already live with, with the addition of a special category.  Leckie includes that special category, of course, as an aspect of outside cultures that still retain separate male-female designations, and her main characters must check themselves in such encounters so they do not cause offense by getting the designation wrong.  The very confusion and annoyance complained of by some readers is right there, part of the background of the story.

Because, whether we choose to admit it or not, in our culture “she” and “her” are special category labels, having little to do with the purposes of biology and everything to do with the sociology of biology.  The male pronoun is normative, the default; one need never remark on someone’s maleness in conversation, because it marks no distinction our culture deems worth noting.  But the introduction of the female pronoun prompts a repositioning of mental stance, a reassessment, however unconscious, that “allows for” a difference our culture says is important regardless of context.

On its simplest level, we expect a binary representation of what is human—the norm and the other. Encountering Leckie’s work, the other is obscured almost to the point of nonexistence, and our expectation that one half of the population should be designated in some way special by virtue of biology is frustrated.  We’re forced to see each and every character as a person, period.

It can, indeed, be a bit annoying, especially when other markers are absent or obscured.  One finds oneself making assumptions about which characters are male and which female, assumptions one suspects are all wrong.  This becomes even more interesting with the inclusion of ancillaries, which are mere biological extensions of an artificial intelligence, sex characteristics rendered irrelevant by that fact.  In Ancillary Sword we find a fully human crew aspiring to behave like ancillaries as a sign of distinction, which adds a sort of third gender into an already obscured mix.

Naturally, sleeping arrangements become problematic.

All of which plays elegantly into the matter at hand in this second novel, which proves to be not only theoretically fascinating but serendipitously topical.  Ancillary Sword is a social justice story.

In the aftermath of the events in the first novel, Breq is made a fleet captain by the Lord of the Radch and charged with securing one of the systems still connected through a functioning gate.  The nature of the civil war beginning to unfold is in itself a twisted bit of political legerdemain—the Lord of the Radch, thousands of years old, is herself a distributed intelligence who has become divided over a policy question involving an alien race.  She is now at war with herself, each side feeling she is the legitimate repository of right action.  The entirety of the Radch (which in many ways reminds one of Austria-Hungary at its peak) is caught between the factions of what once was the embodiment of its identity.  Breq allies herself—itself, since Breq still feels not human, but a surviving ancillary-cum-ship—with the faction that seems to represent a measure of sanity in terms of the realpolitik at hand.  It’s a conditional alliance, to be sure, because Breq has little regard anymore for the Lord of the Radch in any context.

Arriving at the system, Breq finds a world with many problems buried beneath a surface that shimmers with the sophistication and wealth of all that the Radch is supposed to be.  The system itself was annexed in relatively recent history, and there are communities of other cultures that were imported as workers.  What soon becomes clear is that this is a plantation system, and the overlords have become so entrenched in their privilege that they do not seem remotely aware of the oppression they oversee.

Leckie adroitly sets privilege in opposition to right and shows it to be a dangerous distraction in a potential war with an alien race.  Revealing the deeply embedded dysfunction is necessary to preparing the system for larger problems ahead, but it is also something Breq, who has seen firsthand what petty power plays over position and privilege can cost, simply will not tolerate.  Overturning an entire system of behavior, though, cannot be done by simple fiat, and the subversion Breq employs to undo it is as trenchantly relevant to present politics as it is satisfying drama.

What proves equally satisfying is discovering at the end that the simple pronoun device is essential to revealing, for the reader, the nature of the oppression, because it establishes a norm of equity difficult to imagine shorn of the biases we bring to the story.  That pronoun challenges us and taunts us to pay continual attention to how we’re reacting and what justifications we use to ignore what may be similar problems within our own society.  It’s a lesson in labels and how potent they can be, especially when unexamined and unchallenged.  Leckie is using the female pronoun to establish a norm we honestly do not embrace, and against that norm she shows us the asymmetry with which we live quite willingly, powerless to change it not because of the force of social pressure but because we often just can’t see a reason to.

Now, that’s what science fiction does at its finest.

Roundup 2014

Time for a year in review.  I am bound to say, though, that my reading once more has been disappointingly thin.

When I am working on a novel, time for leisure reading necessarily goes down. Reading for research goes up, but that rarely requires me to finish an entire book.  I look at my reading list for the year and the only titles I ever include are those I’ve completed, so on such years I appear to be under-achieving.

That said, I completed 42 titles this year. (To be sure, I’ve probably read, by volume, closer to 90, but most of those I did not finish.  For instance, I am still plodding my way through Thomas Piketty’s Capital in the Twenty-First Century.  I’ll likely have to start it over.)

There were several that were rereads for me.  Unusual, in that I seldom if ever reread a book: I don’t read fast enough to feel good about covering old ground when there’s so much new to be trod.  But I started up a reading group at Left Bank Books—Great Novels of the 22nd Century—and I’ve been choosing classics to discuss, so among the rereads were: Dying of the Light by George R.R. Martin (I wanted to show people that he could write, write well, and write economically about something other than the Wars of the Roses, although to my surprise I found many of the same themes playing out in this, his first novel); Slow River by Nicola Griffith (her Nebula winner and still, I’m happy to say, a powerful, poignant novel); Downbelow Station by C.J. Cherryh, one of the best interstellar warfare novels ever penned and very much an inspiration in my own work (for one thing, one has seldom found such solid treatment of working-class issues in such a novel); Burning Chrome by William Gibson, which just made me wish he still did short fiction; Timescape by Gregory Benford, one of the best time travel novels ever written, although I’m bound to say it felt socially dated, though not fatally so; Nova by Samuel R. Delany, a lyrical, multilayered congeries of mixed mythos in an exuberantly realized interstellar setting; A Case of Conscience by James Blish; Gateway by Frederik Pohl; and now Leviathan Wakes by James S.A. Corey.

While some of these provided me with revelatory experiences (I missed that the first time through! and I never thought about it this way before), the chief benefit of this exercise for me was in seeing how these books have informed what came after.  Over the three-plus decades since its original publication, Timescape has come to read like a novel that escaped much of the social progress even of its own time.  Not egregiously so, but there is only one female scientist in the story and she is very much in the supporting-cast category.  Certain political strands feel thin.  None of this detracts from the primary story or from the fact that Benford is one of our better stylists (which really makes me wonder who was doing what in his recent collaboration with Larry Niven, which I found virtually unreadable because of simple clunkiness in the prose) and paid attention to character more than many of his contemporaries—or, I should say, realized such attention better.  On the page, his people feel real, whole, fleshed out.

The time travel device in the novel leads directly into one of the best books I read this past year, Gibson’s new one, The Peripheral, just recently reviewed here.  Along with H.G. Wells’ The Time Machine and a handful of others, this enters my personal canon as one of the finest time travel works ever written, even though the plot seems deceptively commercial.

The most telling revelation of my rereads has been in finding my own reactions to the texts so different.  I remember my initial response to many of these as being quite different.  True, I missed many very good things in retrospect, but also I forgave a lot more than I do now.  There are books I come across today which I find off-putting which I know 20 or 30 or 40 years ago I would have raved about.  Much of this comes down to simple artistry.

Or perhaps not so simple.  I found it interesting that my more positive response to Delany’s Nova for its elegance and its precision left others a bit cold.  One brings a history of reading to a book which largely determines how one’s expectations will be satisfied…or disappointed.

I did reread James Schmitz’s The Demon Breed.  Not for the reading group—it is sadly unavailable—but to refresh my memory for another project, and I still found it to be an exhilarating book, well ahead of its day in its basic assumptions about gender roles.  This is one I have now read four times since first discovering it as an Ace Special way back in 1969, and each time I’ve found it holds up extremely well and attests to an underappreciated genius.

Knowing now more clearly that elegance of execution is vitally important to me, my patience for certain kinds of writing has diminished.  I mentioned the Niven/Benford collaboration which I found impossible to get through, although it crackled with ideas.  What I have learned (for myself) is that the entire argument over style versus substance is a straw man.  It assumes they are not the same thing.  Quite the contrary, they are inextricably entwined.  Very simply, style emerges from a clear grasp of substance.  A sentence works at several levels, revealing information of different kinds in the way it presents its contents to the reader.  A lack of substance will show in a stylistic failure.  Too often we erroneously hear “style” as code for “decorative.”  Not at all.  The style is all important to the conveying of mood, of character, of setting, of theme.  But style cannot impose any of these things—the style is a result of the writer having a solid knowledge of what needs to be conveyed and an attention to how the sentence should be written in order to convey it.

Which is why I say style is an emergent property.  Almost no one gets to this level without a lot of practice, over time.  Which is also why most writers become clearer—“better”—as they go on.  They’re learning what matters, paring their words down, and revealing more.

For example, two novels I read this year which could not be more different serve to show how that experience and growing clarity result in unique styles.  Jim Harrison’s Brown Dog (which is a collection of linked novellas about the title character) and Richard Powers’ Orfeo.  On the page, the writing could not be more different.  Brown Dog is a semi-literate, often-itinerant aging naif who tells his story in what appears to be simple-minded affectlessness.  Things happen, he’s bounced around by events, lands (inexplicably) on his feet (wobbling often) and while clever is so guileless that one begins to believe in guardian angels.  The style reflects this.  Read carefully, though, and a world is revealed in each passing sentence.  Powers, on the other hand, reads like a musician scoring a great symphonic cycle.  The language is rich, evocative, challenging—and yet absolutely transparent, consistent with the story.  It can only be what it is in the telling of this particular tale of a failed composer who at the end of his life finds himself on the run and becoming an icon of his own life, with one more song to write and perform.  Each sentence reveals a different world, just as clearly, just as uniquely.

Style comes largely, therefore, from perspective.  Perspective informed a pair of books I read about the genre in which I labor, science fiction.  I finally read Brian Aldiss’s Billion Year Spree, which is an excellent history-qua-analysis of science fiction.  Because I had it to hand, I then read Margaret Atwood’s collection of essays about her experience of SF, In Other Worlds.  I wrote a longish examination of my gleanings from these two very different-yet-similar works, but let me just say that in them is revealed the font and consequence of perspective.  Atwood, for all her professed appreciation of science fiction, does not “get it” while Aldiss, who breathed it in like air in his youth, does, leading them both to unique understandings.

Another “paired reading” I did this year was of Dorothy Sayers’ Lord Peter Wimsey novels, Gaudy Night and Whose Body?  It was fascinating because the latter is the first Wimsey novel and the former comes late in the cycle.  What struck me was the growth of the character: the late Wimsey is very different from the early, and yet they are clearly the same man.  (Another instance where style is essential to the content, the revelation of such growth.)

One of the most interestingly written novels I found was The Wives of Los Alamos by TaraShea Nesbit, which can be said to be all about style, and yet nothing about style.  It is written in first-person plural, an ever-present “we,” the story told from a collective point of view that nevertheless reveals individual character.  The “wives” form an amalgam of experience in opposition to, judgment of, and distance from the events that formed the core of their subsequent lives as they followed their scientist and engineer husbands to Los Alamos to work on the atomic bomb.  A stunningly gutsy thing to do in a first novel, and marvelously successful.

I finished the immense Heinlein biography with volume 2 of the late William Patterson’s work on one of the major figures in science fiction.

There was also Thomas Pynchon’s newest, Bleeding Edge, which exhibits many of Pynchon’s trademark stylistic acrobatics in what may be one of his most accessible convolutions on the American obsession with conspiracy.  Often one encounters a Pynchon novel rather than reads it, coming away with a sense of having toured a vast foreign country, appreciating many things but knowing one hasn’t grasped it, possibly not even its most salient features, yet glad one made the trip.  Not this one.  It felt whole, penetrable, complete, and possessed a satisfying conclusion.

One of the most pleasant pairs of readings this year was Ann Leckie’s Ancillary Justice and its sequel, Ancillary Sword.  Ambitious and superbly realized, set in an interstellar milieu with fascinating aspects and a unique approach to empire, both books tell their tales from the viewpoint of an ancillary—basically a human body made into a robotic extension of a much larger AI, a ship mind (borrowing a bit, perhaps, from Iain M. Banks) that is destroyed in the first book, leaving a single ancillary survivor.  Breq remembers being a ship, being one facet among hundreds, having access to vast data resources, but now must function as a single consciousness in a lone body.  Leckie is indulging in an examination of the nature of empire, of morality, of political expedience, of what it means to be a part of something and also what it means to be outside of that something.  What I found most gratifying was that the second volume, while picking up the story a heartbeat after the first book, was a very different kind of book—not about something completely different, but about a completely different aspect of this enormous subject she’s chosen to tackle.  Serendipitously, it is a timely book as well, dealing as it does (effectively) with social justice and minority oppression.  I find myself looking very much forward to the third book.

One of the biggest surprises of the year was Meg Wolitzer’s The Interestings.  I reviewed this as well and have nothing to add to that.

I don’t think I read, cover to cover, a bad book.  I’ve largely gotten over the compulsion to finish any book I start; if it’s bad, it isn’t worth the time.  I readily admit I may be, and probably am, wrong about many books that strike me this way.  I’ll talk about them if I find something instructive in my negative reaction, but otherwise I’ll just put it down to taste.

A good number of the nonfiction books I read this year concerned the Napoleonic Era, because of one of the novels I’m working on.  One I can recommend wholeheartedly is Tom Reiss’s The Black Count, a biography of Alexandre Dumas’s father, a Creole who became a general under Napoleon.

I am hoping to read more next year.  I have a to-be-read pile on the verge of daunting.  Working in a bookstore as I now do is also a problem because every day I see another book or two I want to read.  When? I ask myself.  It’s not always sufficient to dissuade me.  As I said, I read slowly these days.  It’s been a long time since I’ve read a book in one sitting.  That said, though, I think I’m getting more out of them now than I used to.  An illusion, maybe, but…

Have a safe, book-filled 2015.

Time and Motion

William Gibson is, if nothing else, a careful writer.  You can feel it in the progress of any one of his novels and in the short stories.  Careful in his choice of topic, placement of characters, deployment of dialogue, style.  He sets each sentence in place with a jeweler’s eye to best effect.  The results often seem spare, even when they are not, and have invited comparisons to noir writers, minimalists, modernists.  Entering upon a Gibson novel is a step across a deceptively simple threshold into a finely-detailed maze that suggests multiple paths but inevitably leads to a conclusion that, in hindsight, was already determined had we but noticed just how sophisticated a writer it is with whom we’re dealing.

His last set of novels, the Bigend Trilogy, was not even science fiction, though the books felt like it.  The application of a science-fictional perception of how the world works produced a dazzling bit of dissonance in which the ground itself became familiar through alienation.  He does that: shows us something we should be utterly familiar with as if it were an alien artifact.  As a result, the shock of recognition at the end contains a thick cord of nostalgia and a sense of loss mingled with new discovery.  The chief discovery, of course, is the realization of just how close we are to what we think of as The Future.  Through this effect, he renders the future both less alien and stranger at the same time.

Which is something he indulges fully in the opening chapters of his new novel, The Peripheral.


For a while you don’t know that the two points of view are not in the same world.  It’s a masterpiece of misdirection achieved through the intermediary of a game.

Flynn Fisher’s brother is ex-special-ops military, living in an old Airstream in a town in the middle of a mid-21st-century rural America that is clearly struggling with an unstable economy.  To make extra money, he often moonlights as a beta tester on new games.  The novel opens when he brings Flynn in to sub for him one night while he goes off to confront a radical religious group he hates, known as Luke 4:5.  (The verse reads: Then leading him to a height, the devil showed him in a moment of time all the kingdoms of the world.  Even here, Gibson is playing at metaphors pertinent to the novel in its entirety.)  Flynn used to do this sort of work herself but quit when the games became more and more violent.  He assures her this isn’t like that; she’ll be running a security drone of some kind, keeping paparazzi away from a high-rise luxury apartment.  He’ll pay her well, as he’s being likewise well-paid.  Just one night, maybe two.  She agrees.

The simulation seems to take place in a city she half recognizes, which may be London, but it’s all different from the London she knows.  It’s as her brother claimed, flying interference, until the second night, when the woman living there is murdered most horrifically and Flynn is a witness.  Thinking it’s still a game, she wants nothing more to do with it.

Meanwhile, Wilf Netherton, a publicist living in London, is working with a performance artist who has been tasked as a negotiator to a colony of self-modified humans living on an artificial island of reformed debris.  Wilf’s job is to keep her on task, which can be very difficult as she is very much a rebel and can go in unexpected directions without any warning.  As she confronts those with whom she is supposed to negotiate, something goes wrong and she ends up killing the leader.  Another murder.

Netherton’s associate, an operative in government intelligence, must divorce herself from the fiasco and cut ties with Netherton.  He goes to ground with a friend of his, a member of a powerful family of Russian descent, who has a unique hobby—he operates a “stub” in history.

At this point we realize that Flynn and Netherton are not simply divided by class and place but by time itself.  Netherton’s London is 70 years in Flynn’s future and is the London wherein Flynn witnessed the murder of the woman, who turns out to be the sister of the performance artist who just committed a second murder.  For her part, Flynn is in their past, a past Netherton’s friend has been playing with via a form of time travel that is based on the transfer of information.

And we are now fully in the grip of one of the cleverest time travel stories in recent memory.  Nothing physical travels, only information.  Gibson has taken a page from Benford’s classic Timescape and wrought changes upon it.  Flynn and Netherton “meet” once a police inspector of Netherton’s time becomes involved and starts running the stub Netherton’s friend has set up.  She needs a witness to the murder before she can act.  Flynn is that witness.  What follows is a well-imagined set of antagonistic countermeasures that affect both worlds economically.

And that may be one of the most interesting subtexts.  Flynn finds herself the titular head of the American branch of a corporation which until then existed only as a device to explain the game she thought she was beta testing.  As such, she becomes enormously wealthy out of necessity, since she is under attack by the forces allied to the murderer in the future.  Politicians and corporations change hands, the economy is distorted, the world is severed from its previous course, and everything is changed.

Gibson is indulging one of his favorite ideas, that information is possibly the most potent force.  Data has consequences.

Flynn is one of Gibson’s best creations since Molly Millions.  Smart, gutsy, practical, and loyal to family and friends, she adapts quickly to the staggering reality into which she and hers have stumbled.  She manages in both time zones admirably but not implausibly.  As counterpart, Netherton is an interesting case study of a man who hates the times in which he lives, is far too intelligent to ignore that fact, and consequently suffers a number of self-destructive flaws, which he gradually comes to terms with as his interactions with Flynn progress.

At the heart of the novel is a question of causality, certainly, but also one of responsibility.  The pivotal point in history that separates Flynn’s world from Netherton’s is an event euphemistically called The Jackpot.  It’s a joke, of course, and a twisted one at that, as it was only a jackpot for the few who survived and became, ultimately, even wealthier than they had been.  The label refers to a collection of factors leading to the deaths of billions and the loss of an entire era, due to humanity’s inability to stop itself from doing all the things that guaranteed such an outcome.  It’s a cynical insight and not a particularly difficult one to achieve, but Gibson, as usual, portrays it with a dry assessment of how it will actually play out and how it will look to those who come after.  His conclusion seems to be, “Well, we really aren’t all in this together.”

The apparent simplicity of the narrative is another mask for the games Gibson plays.  It doesn’t feel like a profound or dense work.  Only afterward, in the assessment phase, do we begin to understand how much he says, how solid are his insights, and how rich are his conceits.  Gibson creates a surface over which the reader may glide easily.  But it’s a transparent surface and when you look down, there, below you, is a chasm of meaning, awaiting inspection, offered in a moment of time.
