Easy Habits and the Consequences of Belief

At first glance, the two books could not be more different. Subject, tone, everything seems different. Not at odds so much as…nonoverlapping.

Which is ironic, since both deal, in their separate ways, with that very idea, the separation of areas of knowing.

It is probably because I read them so close together that I recognized their shared concerns as clearly as I did.  Whatever the reason, it struck me as obvious in so many ways that I began to recall all the other books over the last few years that could likewise be gathered within this same subset, all on distinct topics and yet all based, to some degree, on an analysis of the same human propensity to disregard evidence when it contradicts belief.

Let me begin with the more general of the two.

Jerry A. Coyne is an evolutionary biologist.  Compared to others with public profiles, he has published few books.  Three, to be precise.  His latest, Faith Versus Fact: Why Science and Religion Are Incompatible, is a direct challenge to Stephen Jay Gould’s idea of “nonoverlapping magisteria.”  Gould’s premise is that science and religion should not conflict with each other because they are concerned with entirely separate realms of knowing—hence the nonoverlapping part—and except for certain agenda-driven partisans, there is no reason for them to be in conflict.  Coyne sees this as accommodationism, which he thoroughly discredits in his book.

My claim is this: science and religion are incompatible because they have different methods for getting knowledge about reality, have different ways of assessing the reliability of that knowledge, and, in the end, arrive at conflicting conclusions about the universe.  “Knowledge” acquired by religion is at odds not only with scientific knowledge, but also with knowledge professed by other religions.  In the end, religion’s methods, unlike those of science, are useless for understanding reality.

Coyne identifies accommodationism as an attempt not to stir the hornet’s nest, because scientists are often dependent on politically sensitive funding.  Science, especially Big Science dealing with questions of origins, is expensive, and the days of the independently wealthy scientist are largely gone.  Rocking the boat by annoying those who hold the purse strings would seem ill-advised.

He goes on to argue that the conclusions of scientific inquiry continually poke holes in those claims by which religion still asserts authority over secular matters.

But more than that, such deferral to authority erodes our critical capacity and can lead to false conclusions and bad judgments, all because we accept the hegemony of “faith.”

…a word defined in the New Testament as “the substance of things hoped for, the evidence of things not seen.”  The philosopher Walter Kaufmann characterized it as “intense, usually confident, belief that is not based on evidence sufficient to command assent from every reasonable person.”

The sticking point for many would be that “reasonable person” proviso, for surely, as Coyne concedes—often—there are many reasonable people who nevertheless espouse a religious faith.  What are we to make of that?  Is the basis for the assessment of “reasonable” perhaps too ill-defined or is the scope of “belief” too broad?

Coyne takes us through the process, giving a thorough explication of why science and religion are incompatible, if not an entirely convincing argument for the downside of belief for belief’s sake.  He tells an anecdote early in the book about the course in evolution he teaches undergraduates each year: the majority do very well in it but admit, after earning their A, that they do not believe a word of what he taught.  Because it contradicts their religious belief.

It is this that Coyne sees as the dangerous aspect of faith as promulgated through religion, the a priori rejection of evidence in favor of a set of usually unexamined beliefs.  He takes us through a catalogue of negative manifestations, from the rejection of medicines to the acts of terrorists to the rejection of solid science (like climate change), that have their underlying justifications in religion.

Coyne, an admitted atheist, puts all this forward while taking pains to admit the personal comfort to be found in religion by many people.  From a certain point of view he tends to go the extra kilometer to be fair.  Of course, going through the comments on various review sites, those predisposed to reject such arguments accuse him of profound bias if not outright malicious intent.  One cannot help but wonder if they bothered to read the book, all or even in part.

The book makes its case clearly and concisely.  It avoids the polemic outrage to be found in other tomes by big name atheists by sticking largely to evidentiary concerns and philosophical arguments.

But, one may ask, so what?  Religion and science are two realms in which most people would assume they have no stake. Castles in the air, esoteric arguments about things that have no impact on our daily lives.  Most people seem to keep a religion much the same way they have a pet and certainly in the West the majority live secular lives and only rarely feel compelled to apply their religious convictions to anything.  As for science, as long as the technology we depend on works, all the rest is so much theoretical handwaving.  It makes no difference if we have almost no understanding of quantum mechanics and certainly evolution is just tenure-building nonsense having to do with million-year-old bones and what kind of textbooks the school district might use next year.  Nothing to do with us in our daily lives. So what if people rely more on faith and belief in making their daily judgments than on evidence-based science?  We operate more on heuristics defended by aphorism than by reason and applied understanding, or so Daniel Kahneman tells us in his excellent study, Thinking, Fast and Slow, and we by and large get along perfectly well that way.

How does this argument concern me?

Johann Hari’s new book, Chasing The Scream, has an answer to that.  Of sorts, if we but make the epistemological leap.

Hari is writing about the drug war. It is, for him, as much a personal examination as a journalistic one, as he admits to having family and friends who are addicts.  He begins with a simple question.

I scribble down some questions that had puzzled me for years.  Why did the drug war start, and why does it continue?  Why can some people use drugs without any problems, while others can’t?  What really causes addiction?  What happens if you choose a radically different policy?

Okay, four questions.  Still, simple, basic questions, to which any reasonable person might reasonably expect a reasonable answer.

Yet we spend billions, bully small countries, destroy thousands if not millions of lives, all in pursuit of policies which rest on an appalling lack of informed justification.  By the end of the book you come to see that there are no answers to those simple questions which in any way validate our continuing on with things as they are.  As they have been for a hundred years.

Hari goes back to the beginning of the war, before drugs were illegal, and takes us through the history.  Back to Harry Anslinger, the head of the Federal Bureau of Narcotics, who fueled the drug war for the twin purposes of maintaining his agency and exorcizing demons that had possessed him since childhood, even in the face of substantive research and sound arguments denying that his approach to the problem had any merit, and more than ample evidence—Prohibition—that it would lead to catastrophe, not only on a national but on a global level.  Hari details Anslinger’s battle to destroy Billie Holiday and his use of intimidation and police tactics and, subsequently, U.S. foreign policy to ensure the continued crusade to eradicate drugs and destroy addicts.

Not because Anslinger possessed any evidence that led him to think this was the only, much less the best, way, but because he believed it in spite of a growing mountain of evidence that this was a wrongheaded approach.  He suppressed evidence, hounded physicians who dared present alternative models of the drug problem, intimidated politicians—except those who could secure him funding—and strenuously denied the validity of any evidence that contradicted his belief.

He believed.

As many did and still do.  It is this that trumps hard evidence.

Even as a young adolescent I thought the argument for our drug policies was lacking.  I thought at the time that I just didn’t understand.  Never having been in the least interested in drugs and knowing few if any who were involved with anything more serious than marijuana, it seemed not to concern me.  Later, I did a short stint as a volunteer drug counselor, but the work was far more than I could handle at the time.  I trusted the people in charge knew what they were doing and certainly the gang violence associated with drugs seemed to make a persuasive case that this was a worthwhile and often desperate fight.

But as the years went by and the war continued, I began to notice bits of research here and there and how certain politicians contradicted themselves and how the prison population, especially in the wake of Nixon’s near militarization of the police community and the drug war ethos, was growing and in very worrisome ways.  I began to seriously rethink my position with Reagan’s zero tolerance policies and the mandatory sentencing guidelines he established through Ed Meese, one of the notable Puritans of the modern age.  Even so, I had the nagging suspicion that maybe I was just missing something.  Certainly I didn’t approve of drug addiction, but I more and more came to believe that these people needed help, not punishment.  We understand that process very well with alcohol; why is it different with narcotics?

And besides, there are plenty of people who receive perfectly legal prescription narcotics and never become addicts.

The number of holes in the picture kept growing.  I no longer trusted stated drug policy, but I didn’t understand the intransigence of people over reform.

Hari’s book lays it out very clearly.  Money is high on the list.  We fund too many of the wrong people at too high a level for them to be easily weaned from the teat.  Foreign policy is also tied to this, especially in these days of international terrorism which has a strong drug component.  But the factor that ties this in to Jerry A. Coyne’s book is the one Hari covers only glancingly.

Belief.

It is easier to rely on what we have always believed than to look at evidence that requires us to change our mind.

Many aspects of our lives are covered by this observation, but the problems arise in those areas with political and social ramifications.  We hold persistent beliefs about the poor, about minorities, about people with different beliefs, and even when evidence is provided which significantly challenges those beliefs and strongly suggests not only that they are wrong but that we would be better off discarding them, we cling to them.

Hence my conflation of these two books and the suggestion that they share a common idea.

Not that I argue that all beliefs are wrong.  What is wrong is the intractable nature of unquestioned belief.  The only reason the drug war continues is that it has considerable popular support and the only reason it has that is that many people cannot bring themselves to change their minds about it in the face of not only evidence but what we euphemistically call common sense.

But that can be said of so many things which directly and indirectly impact our lives.

Perhaps it is a stretch and perhaps I argue out of my own biases, but it seems to me the most valuable tool we can have in our intellectual toolbox is the ability to say “well, I believe that, but I might be wrong.”  Faith cannot be maintained, however, among people who are able to say that and apply it to anything and everything.  Science is built on exactly that principle—all knowledge is conditional—but belief, as exemplified by religion, thrives on the absence of that principle.  It says some knowledge is absolute, not to be questioned.

In the contemplation of matters of theological concern, this perhaps offers consolation, comfort, a certain utility, to be sure.  But it is easy for unquestioning acceptance of arguments from authority to become habit and then be applied to anything our prejudices suggest we may wish to avoid examining.  Drug addiction is an unfortunate affliction and it may be uncomfortable for people to see it for what it is—a disease, like alcoholism.  That discomfort makes room for assertions from authority offered by people who claim to be working on our behalf.  Our unwillingness to ask evidentiary questions is a cozy environment for demagogues and despots.

When you ask “What does this have to do with me?” you might be surprised at the consequences of avoidance and the price of unquestioning trust.  Why should we learn science and how to apply the discipline of examining evidence?  So we don’t hurt ourselves out of the easy habit of belief.

Motives and Revelations

There is a remarkable scene—one of many—in James Morrow’s new novel, Galapagos Regained, wherein the last straw falls for Charles Darwin and we are shown the moment he decided to back his radical new view of nature and its processes.  The scene is wholly fictional, no doubt, yet grounded in reality: Darwin has come to London to confront a young woman who has betrayed his trust while working in his household.  The confrontation with the fictional Chloe Bathhurst is not the one that matters.  Rather, it is the confrontation Darwin is having with the edifice of a loving god.  His daughter is dying—tuberculosis—and the scientist in him knows there is nothing to be done, that an indifferent nature cares nothing for her goodness or her innocence, and that any human claim on justice and fairness is but the empty babbling of a minor species only recently ascendant upon the ancient stage of life.  Darwin is angry and resentful.  The transgressions which resulted in his dismissing Miss Bathhurst are insignificant now against this greater, vaster crime which, he believes, has no actual perpetrator.  The only thing he can do, he decides, is to give her his blessing in pursuit of her own goal, the very pursuit that got her fired from his service.


She was fired for attempting to steal the sketch he had written concerning the transmutation of species, a precursor work to his epic On The Origin of Species.  She did this in order to procure a means to free her errant father from debtors’ prison by using the work as the basis for winning the Shelley Prize, a competition that had been running for some time at Oxford.  The purpose of the prize is to reward anyone who can prove or disprove the existence of God.  Chloe, during her employ as Darwin’s zookeeper, became aware of his theory and thought it the ideal means to enter and win the prize.

Darwin refused.  When she then elected to steal the notes and present the work on her own, she was caught and dismissed.  Darwin was at the time unaware that she had already made a copy of the paper and thought he had caught her in the act.

Now, in the lobby of a London playhouse, where Chloe had once been employed as an actress, Darwin, aware that she in fact had stolen his treatise, is sanctioning her quest.

“Don’t overestimate my sympathy.  Had I two thousand surplus pounds, I would cover your father’s debts, then arrange for you to tell the world you no longer believe in transmutationism.  That said, I must allow as how a part of me wants you to claim the prize, for it happens that my relationship with God—“

“Assuming He exists.”

“Assuming He exists, our relationship is in such disarray that I should be glad to see Him thrown down…Get thee to South America, Miss Bathhurst.  Find your inverse Eden.  Who am I to judge your overweening ambition?  We’re a damned desperate species, the lot of us, adrift on a wretched raft, scanning the horizon with bloodshot eyes and hollow expectations.  Go to the Encantadas.  Go with my blessing.”

Because this is what Chloe has determined to do.  Go to the Galapagos Islands to gather specimens to support the argument for transmutation of species.  The Shelley Society fronts her the money to do so, she enlists her card-sharp brother in the expedition, they find a ship, and set sail.  The Society had already bankrolled an expedition to Turkey for the purpose of finding the remnants of Noah’s Ark, so this was only fair.

Accompanying her ship is Reverend Malcolm Chadwick, Anglican minister and formerly one of the judges of the Shelley contest—on the side of the deity.  He steps down from that post at the request of Bishop Wilberforce and is sent on this new mission to oversee what Chloe will do.  He departs with an uneasy conscience, made so by the second part of Bishop Wilberforce’s plot, which sends another minister in another ship to the Encantadas to set in motion the ultimate destruction, by slaughter, of all the animals on the islands, thus depriving the forces of atheism of their troublesome evidence.  Chadwick finds this idea appalling, but he is faithful and says nothing.  He joins Chloe’s expedition, which becomes Odyssean in its complications and obstacles.

The novel proceeds from one adventure to another until Chloe herself, stricken ill in the Amazon basin, undergoes a kind of religious conversion, and decides she is wrong in her conviction that there is no god.  Morrow then expands on the struggle she engages with her fellow travelers and her own considerable intelligence.

What we are treated to in this novel is a thorough examination of human motivation in the face of shifting paradigms.  It may be clear where Morrow’s sympathies lie, but he is too good a writer to load the dice in favor of his preferred viewpoint.  He gives his characters convictions of their own and follows them where they would naturally lead.  He never denigrates faith, only the fickleness of our intentions in the face of conflicting desires and awkward choices.  Tempting as it may have been in the end to simply declare a winner, Morrow instead takes a more difficult and fulfilling tack by portraying the times in which this debate flared into full flame with the advent of a solid theory of evolution.

Chloe Bathhurst herself is an admirable character.  An actress, adept as a quick study, she proves herself intellectually versatile and equal to any challenge.  As well, those who both aid and oppose her are equally well-drawn and Morrow deftly clarifies their motives.

Along the way, he gives a field demonstration in observation and interpretation, showing us the process whereby new understanding takes us over and how revelation can be a problematic gift.

Morrow is one of our best writers plowing the ground of controversy.  He never takes the simplistic road.  The pleasure in reading one of his novels is that of being allowed free range of the imagination in pursuit of specific truths stripped of dogma.  In fact, he disassembles dogma in the course of his yarns, a fact that is often not apparent while we’re in the grip of his artifice.

An artifice made warm by the complete humanness of his characters.  One of his best creations is Chloe Bathhurst.  In her, several clichés and canards are undone, and many perhaps uncomfortable but rewarding questions are asked.  She exemplifies the first rule of the explorer—never be afraid to go and see for yourself.  Do so and you’ll be amazed at what is revealed.

And what is lost.

The title parodies Milton’s Paradise Regained, from which perhaps Morrow took a bit of inspiration:

I, when no other durst, sole undertook
The dismal expedition to find out
And ruine Adam, and the exploit perform’d
Successfully; a calmer voyage now
Will waft me; and the way found prosperous once
Induces best to hope of like success.

Perhaps not so much to “ruin Adam” as to give us a view into a vaster garden, older and truer, and less a burden to our capacity for wonder.

Taste and Quality

Obliquely, this is about a current debate within science fiction. However, the lineaments of the argument pertain to literature as a whole.  I offer no solutions or answers here, only questions and a few observations.  Make of it what you will.

Reading experience is a personal thing. What one gets out of a novel or story is like what one gets out of any experience and being required to defend preferences is a dubious demand that ultimately runs aground on the shoals of taste.  I once attended a course on wine and the presenter put it this way: “How do you know you’re drinking a good wine? Because you like it.”  Obviously, this is too blanket a statement to be completely true, but he made his point.  If you’re enjoying something it is no one’s place to tell you you’re wrong to do so based on presumed “objective” criteria.  That $200.00 bottle of Sassicaia may fail to stack up against the $20.00 Coppola Claret as far as your own palate is concerned and no one can tell you your judgment is wrong based on the completely personal metric of “I like it/I don’t like it.”

However, that doesn’t mean standards of quality are arbitrary or that differences are indeterminate.  Such are the vagaries and abilities of human discernment that we can tell when something is “better” or at least of high quality even when we personally may not like it.

For instance, I can tell that Jonathan Franzen is a very good writer even though I have less than no interest in reading his fiction.  I can see that Moby-Dick is a Great Novel even while it tends to bore me.  I acknowledge the towering pre-eminence of Henry James and find him an unpalatable drudge at the same time.

On the other end of the spectrum, I can see how Dan Brown is a propulsive and compelling story-teller even while I find him intellectually vacuous and æsthetically tedious.

My own personal list of what may be described as guilty pleasures includes Ian Fleming, Edgar Rice Burroughs (but only the John Carter novels; never could get into Tarzan), and a score of others over the years who caught my attention, appealed for a time, and have since fallen by the wayside, leaving me with fond memories and no desire to revisit.  A lot of the old Ace Doubles were made up of short novels of dubious merit that were nevertheless great fun for a teenager on a lonely afternoon.

I would never consider them Great Art.

Taste is the final arbiter.  But using it to determine quality—rather than allowing quality to determine taste—is doomed because taste changes.  Works you might strenuously defend at one time in your life can over time suffer as your taste and discernment evolve.  It’s sad in one way because it would be a fine thing to be able to summon up the same reactions experienced on one of those lonely afternoons, aged 16, and poring through the deathless excitement of a pulp adventure you might, given your enthusiasm, mistake for Great Writing.

I try always to make a distinction between things I like and things I think are Good.  Often they’re the same thing, but not always, and, like other judgments humans make, the two tend to become confused with each other.  Hence, debate over merit can take on the aspects of an argument on that day at the base of the Tower of Babel when people stopped understanding each other.

But if that’s all true, then how do we ever figure out which standards are valid and which bogus?  I mean, if it’s ALL subjective, how can any measure of quality ever rise to set the bar?

Fortunately, while personal experience is significant, collective experience also pertains. History, if you will, has taught us, and because art is as much a conversation as a statement we learn what works best and creates the most powerful effects over time. Having Something To Say that does not desiccate over time is a good place to start, which is why Homer still speaks to us 2500 years after his first utterances.  We derive our ability to discern qualities from our culture, which includes those around us informing our daily experiences.  In terms of literature, the feedback that goes into developing our personal values is a bit more specific and focused, but we have inexhaustible examples and a wealth of possible instruction.  We do not develop our tastes in a vacuum.

Honest disagreement over the specific qualities of certain works is part of the process by which our tastes develop. I might make a claim for Borges being the finest example of the short story and you might counter with de Maupassant—or Alice Munro. Nothing is being denigrated in this. The conversation will likely be edifying.

That’s a conversation, though.  When it comes to granting awards, other factors intrude, and suddenly instead of exemplary comparisons we have competition, and that can be a degrading affair unless standards are clear and processes fairly established.  In a competition, unlike in a conversation, quality necessarily takes a back seat to simple preference.

Or not so simple, perhaps. Because any competition is going to assume at least a minimum of quality that may be universally acknowledged. So we’re right back to trying to make objective determinations of what constitutes quality.

If it seems that this could turn circular, well, obviously. But I would suggest it only becomes so when an unadmitted partisanship becomes a key factor in the process.

This can be anything, from personal acquaintance with the artist to political factors having nothing to do with the work in hand. Being unadmitted, perhaps even unrecognized, such considerations can be impossible to filter out, and for others very difficult to argue against. They can become a slow poison destroying the value of the awards. Partisanship—the kind that is not simple advocacy on behalf of a favored artist but is instead ideologically based, more against certain things rather than for something—can deafen, blind, reduce our sensibilities to a muted insistence on a certain kind of sensation that can be serviced by nothing else. It can render judgment problematic because it requires factors be met having little to do with the work.

Paradoxically, art movements, which are by definition partisan, have spurred innovation if only by reaction and have added to the wealth of æsthetic discourse. One can claim that such movements are destructive and indeed most seem to be by intent. Iconoclasm thrives on destroying that which is accepted as a standard and the most vital movements have been born of the urge to tilt at windmills, to try to bring down the perceived giants.  We gauge the success of such movements by remembering them and seeing how their influence survives in contemporary terms.

Those which did not influence or survive are legion. Perhaps the kindest thing to be said of most of them is that they lacked any solid grasp of their own intent. Many, it seems, misunderstood the very purpose of art or, worse, lacked any comprehension of truth and meaning. More likely, they failed to distinguish between genuine art and base propaganda.

How to tell the difference between something with real merit and something which is merely self-serving?  All heuristics are suspect, but a clear signal that something other than pure artistic intent is at play is the advent of the Manifesto.  Most are hopelessly locked in their time and the most innocent of them are cries against constraint.  But often there’s an embarrassing vulgarity to them, a demand for attention, an insistence that the work being pushed by the manifesto has merit if only people would see it.

Not all manifestos are signs of artistic vacuity, but those that front for worthwhile work usually fade quickly from service, supplanted by the work itself, and are soon forgotten.  Mercifully.  We are then left with the work, which is its own best advocate.  In hindsight it could be argued that such work would have emerged from the froth all on its own, without the need of a “movement” to advance its cause.  Unfortunately, art requires advocates, beginning with the simplest form of a purchase.  In crowded fields overfull of example, the likelihood of a lone artist succeeding on his or her own, without advocacy, is slim.

Advocacy for an individual artist, by a cadre of supporters, can make or break a career.  And this would of course be a natural development of widespread appreciation.  It’s organic.

Advocacy for a perceived type of art begins to suffer from the introduction of agendas having less to do with the artists than with a commitment to the aforementioned windmill-tilting.

The next phase is advocacy of a proscriptive nature—sorting out what belongs and what doesn’t, measuring according to a prescribed set of protocols—and it has little to do with individual works and much to do with the æsthetic and political prejudices of the movement.  The quality of a given work is less important at this stage than whether it “fits” the parameters set by the movement’s architects.  Taste plays a smaller and smaller role as the movement meets opposition or fails to advance its agenda.  With the demotion of taste comes the desiccation of quality.  The evocative ability of art, its facility to communicate things outside the confines of the manifesto-driven movement, eventually becomes a kind of enemy.  We’re into the realm of cookie-cutter art, paint-by-numbers approaches, template-driven work.  Themes are no longer explored but enforced, preferred message becomes inextricable from execution, and the essential worth of art is lost through disregard of anything that might challenge the prejudices of the movement.

This is a self-immolating process.  Such movements burn out from eventual lack of both material and artists, because the winnowing becomes obsessional, and soon no one is doing “pure” work according to the demands of the arbiters of group taste.

As it should be.  Anything worthwhile created during the life of the movement ends up salvaged and repurposed by other artists.  The dross is soon forgotten.  The concerns of these groups become the subject of art history discussions.  The dismissal of particular works because “well, he’s a Marxist” or “she was only an apologist for capitalism”—factors which, if they are the chief feature of a given work, might very well render it ephemeral, but which in many instances have little to do with content—prompts head-scratching and amusement long after the fury of the controversy surrounding them has faded.

Given this, it may seem only reasonable that an artist have nothing to do with a movement.  The work is what matters, not the fashions surrounding it.  Done well and honestly, it will succeed or fail on its own, or so we assume.

But that depends on those ineffable and impossible-to-codify realities of quality and taste.  Certainly on the part of the artist but also, and critically, on the part of the audience.

Here I enter an area difficult to designate.  The instant one demands a concrete description of what constitutes quality, the very point of the question is lost.  Again, we have heuristics bolstered by example.  Why, for instance, is Moby-Dick now regarded as a work of genius, by some even as the great American novel, when in its day it sold so poorly and its author almost died in complete obscurity?  Have we become smarter, more perceptive?  Has our taste changed?  What is it about that novel which caused a later generation than Melville’s contemporaries to so thoroughly rehabilitate and resurrect it?  Conversely, why is someone like Jacqueline Susann virtually unremarked today after having been a huge presence five decades ago?

I have gone on at some length without bringing up many examples, because taste and quality are so difficult to assess.  What one “likes” and what one may regard as “good” are often two different things, as I said before, and have as much to do with our expectations on a given day of the week as with anything deeply considered and well examined.  My purpose in raising these questions—and that’s what I’ve been doing—has to do with a current struggle centering on the validity of awards as signs of intrinsic worth.

The best that can be said of awards as guideposts to quality is that if a group of people, presumably in possession of unique perspectives and tastes, can agree upon a given work as worthy of special note, then it is likely a sign that the work so judged possesses what we call Quality.  In other words, it is an excellent, indeed exceptional, example of its form.  I’ve served on a committee for a major award, and over the course of months the conversations among the judges proved educational for all of us and eventually shed the chaff, leaving a handful of works under consideration that represented what we considered the best examples that year of the kind of work we sought to award.

I never once found us engaged in a conversation about the politics of the work.  Not once.

Nor did we ever have a discussion about the need to advance the cause of a particular type of work.  Arguments over form were entirely about how the choice of one over another served the work in question.  When we were finished, it never occurred to me that a set of honest judges would engage in either of those topics as a valid metric for determining a “winner.”  No one said, “Well it’s space opera and space opera has gotten too many awards (or not enough)” and no one said, “The socialism in this work is not something I can support (or, conversely, because of the political content the faults of the work should be overlooked for the good of the cause).”  Those kinds of conversations never happened.  It was the work—did the prose support the premise, did the characters feel real, did the plot unfold logically, were we moved by the story of these people.

Consensus emerged.  It was not prescribed.

This is not to say other metrics have no value, but they can be the basis of their own awards.  (The Prometheus Award is candidly given to work of a political viewpoint, libertarianism.  It would be absurd for a group to try to hijack it based on the argument that socialism is underrepresented by it.)  But even then, there is this knotty question of quality.

Here’s the thorny question for advocates of predetermined viewpoints: if an artist does the work honestly, truthfully, it is likely that the confines of manifesto-driven movements will become oppressive and that artist will do work that, eventually, no longer fits within those limits.  To complain that the resulting work is “bad” because it no longer adheres to the expectations of that group is as wrongheaded as declaring a work “good” because it does toe the proper line.

Because that line has nothing to do with quality.  It may go to taste.  It certainly has little to do with truth.

Seeing At f.64

I have been involved in photography since I was 14 years old. My father gave me his Korean War-era Canon 35mm, a knock-off of a Leica model D, and I began a somewhat haphazard career which never quite went where I thought it would or should.  By the time I was 21 I had replaced that Canon with a pair of Minoltas, a Mamiya Universal (2 1/4 format), and a Linhof 4x5 view camera, plus the darkroom accoutrements to go with it all, including a massive Beseler 45M enlarger (which I still have) and pretensions of being a modern manifestation of a member of the Group f.64.

All I really knew about this group were the photographs and some names, and not even all of them.  Ansel Adams, Edward Weston, Imogen Cunningham.  Names to conjure with.  Sharp, full-tonal range images of landscapes, portraits, and details of objects made sublime by the attention of these artists of the lens.

In time I learned about other names—Alfred Stieglitz, Edward Steichen, Paul Strand, Walker Evans, Dorothea Lange, Ben Shahn—and picked up a bit of the history.  To be honest, photographing and printing were what interested me and I used the work of these past giants as goals and guides.  I bought a full set of Ansel Adams’s Basic Photo series, gleaned what I could, and set forth to do what he did, at least in my own small and considerably more modest way.  He had Yosemite, I had, when I got to it, the Ozarks.  Edward Weston had Big Sur, I had the Missouri and Meramec Rivers.  I spent too much time trying to duplicate their work and too little time finding what was worthwhile in my own backyard.

Which I did eventually and I’ve done some work of which I’m proud.  Ah, but there were giants in those days, and I wasn’t even getting to their shoulders.  The magical, mystical Group f.64 occupied in some sense the same place as the Fellowship of the Ring.  They were the first, the best, the most interesting.

And yet, little was available to be known about them between two covers.  Articles here and there, anecdotes, essays in their monographs, coincidental stories.  A great deal of myth can evolve from such piecemeal history.

Mary Street Alinder, who has previously written a biography of Ansel Adams, has published a history of the Group. Group f.64: Edward Weston, Ansel Adams, Imogen Cunningham, and the Community of Artists Who Revolutionized American Photography fills in the gaps of biography and history of this unlikely cadre of artists who took photography away from the Pictorialists who were dominant in the 1910s and 20s and gave it back to itself.

Beginning with the disdain of East Coast photographers—dominated by the towering personality of Alfred Stieglitz, one of the principal gallerists and arbiters of what constituted art in America—for so-called “western photographers,” the primary movers came together in and around San Francisco to establish their own hallmark on what they thought photography should be.  Dedicated to “pure” photography—that is, photographs that were themselves, making full use of the unique medium of lens, film, and chemistry, rather than the popular form of photographs that imitated paintings, with soft focus and elaborate stage settings—Ansel Adams, Willard Van Dyke, and, somewhat reluctantly, Edward Weston (who was simply not a joiner, yet philosophically supported what the group was trying to do, and lent his name to it because, of them all, he was already well known and respected) formed the Group, Adams wrote a manifesto, and they set about arranging exhibits and establishing themselves as arbiters of a new vision of photography.

Young, enthusiastic, and perhaps a bit pretentious, the group eventually included Connie Kanaga, Dorothea Lange, Henry Swift, Alma Lavenson, Brett Weston, and a few others.  Both Van Dyke and Adams were aggressive about exhibits, and even opened their own (short-lived) galleries.  Eventually, though, they knew they had to break into the East Coast and convince Stieglitz of their worth.  Adams was the first to do so and after that the tastes of the country began to shift.

What emerges from the pages of Alinder’s book is in many ways the same struggles and passions that come with any artistic movement.  Also the conflicting personalities, the petty wars, the affairs, and the eventual dissolution of the Group as its members, all gifted and hard-working artistic obsessives, went their own ways.  The kind of work they did is today accepted as a given in what photography should be, at least as a base.  Adams and Weston shared a philosophy that the photograph should allow the subject to reveal its true nature.  For that to happen, the standard obliteration of detail that the Pictorialists held as necessary had to go away.  Clarity, sharpness, richness of detail and tonality.  Hence the name of the group, f.64, which is one of the smallest apertures a large-format lens can achieve (though not the smallest—f.128 is common for large-format lenses), and by which the widest possible depth of field (meaning the range of acuity from nearest to farthest), and therefore the greatest sharpness throughout the image, is achieved.  By calling themselves this, they announced that they stood in direct opposition to the fuzzy, false images that dominated galleries and public awareness.  (The elaborate staging of the Pictorialists moved easily and naturally into Madison Avenue advertising photography, but even there Group f.64 became the dominant æsthetic in terms of detail and clarity.)
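To put a rough number on why that tiny aperture mattered (this is my own back-of-the-envelope illustration, not anything from Alinder’s book), the standard hyperfocal-distance approximation shows how quickly front-to-back sharpness grows as a lens is stopped down.  The focal length and circle-of-confusion figures below are assumptions, chosen as ballpark values for a 4x5 camera.

```python
# Illustrative sketch only -- assumed values, not drawn from Alinder's book:
# a 150 mm large-format lens and a 0.1 mm circle of confusion for 4x5 film.

def hyperfocal_mm(focal_mm: float, f_number: float, coc_mm: float) -> float:
    """Distance at which to focus so that everything from roughly half that
    distance out to infinity is acceptably sharp (standard approximation)."""
    return focal_mm ** 2 / (f_number * coc_mm) + focal_mm

for n in (8, 22, 64):
    h_m = hyperfocal_mm(150.0, n, 0.1) / 1000.0
    print(f"f/{n}: focus at ~{h_m:.1f} m; sharp from ~{h_m / 2:.1f} m to infinity")

# Roughly: f/8 gives ~28 m, f/22 gives ~10 m, f/64 gives ~3.7 m -- stopping
# down to f/64 lets a view camera hold detail from a few feet away to the horizon.
```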

The successes of all these artists would probably have occurred in any event, but not this way and not so soon and not so magnificently.  Obviously, some emerged as superstars while others have faded from our memory.  Alinder does service here by reminding us of them and placing them in their proper relation to the history.

One remarkable aspect of the Group is in some ways perhaps minor, but it led to major consequences as the 20th Century unfolded.  Namely, they were the first artistic movement in the United States that made no distinction between male and female members.  All they cared about was the work, and none of them seemed to have any retrograde ideas about women being lesser artists.  Right from the start, roughly half the group was female, and we remember them well.  Imogen Cunningham and Dorothea Lange, certainly, but others now somewhat forgotten, like Consuelo Kanaga, Sonia Noskowiak, and Alma Lavenson, informed the dialogue of 20th Century photography in powerful and significant ways.

The other thing that was a surprise to me was realizing who wasn’t in the Group.  Based on the work alone, seeing it all the first time as a young photographer, I had just assumed membership on the part of photographers like Walker Evans, Paul Strand, and Margaret Bourke-White.  Walker Evans, according to Alinder, did not care for Group f.64.  Ironically, he thought they were too concerned with making “pretty” pictures, which was one of the complaints Group f.64 had toward the Pictorialists.  Evans was a chronicler—nothing should distract from the subject as found, not cropping, not composition, not fine printing.  Strand predated the group (and actually inspired Adams to change how he made images and prints) and Margaret Bourke-White was always a working commercial photographer following her own path.

Reading this, and seeing how these people had to modify and adapt their philosophy at the time to account for some of the inconsistencies in their original intent and to make sure they did not exclude excellent work through too narrow a set of constraints, was revelatory.  Alinder’s book has allowed me to better understand my own sources of inspiration.  As well, her prose is, like the work of the people she writes about, an example of clarity, sharpness, and acuity.

Now A Word From Paradise Island

Of all the unlikely confluences you might stumble over in life, here’s a set: what do Margaret Sanger, lie detectors, and superheroes have in common?

If you answered “Wonder Woman!” you would be correct.  And if you could answer that before Jill Lepore’s The Secret History of Wonder Woman came out, you would be in rare company knowing that.

Wonder Woman was the creation of one William Moulton Marston, of the Massachusetts Marstons. If that familial pedigree escapes you, don’t feel bad; you didn’t miss it in history class.  The Marstons had come down a ways from their 19th Century heights, enough so that William was forced to apply for grants to finish his degree in psychology at Harvard in 1915 (the Ph.D. was earned in 1921).

Psychology was still a relatively “new” science and Harvard was in the vanguard.  Marston attached himself to the recently founded Harvard Psychological Laboratory and eventually ended up running it.  But only for a short while.  And that career curve would follow him the rest of his life.

Marston emerges in the pages of Lepore’s book as a brilliant man-child who just couldn’t seem to make a lasting success of anything.  It becomes evident as his story unfolds that he simply confused personal fame with genuine success and made a series of bad calls with regard to where and how to apply his energies.  He was a self-promoter, but when it came to backing up his claims he was lacking in substance.  He “invented” the lie detector and established it as a viable methodology, then ruined its use in court for decades through sloppy application and shoddy research.  Not only that, but the court case which apparently set the precedent for the exclusion of lie detector results from courtrooms cast a pall on him personally, a circumstance from which he never quite recovered.  If not for the three women he eventually lived with, it seems likely he would have been indigent.

That is where the story takes a turn toward the culturally amazing.  Marston was a hardcore feminist, of the “old school” that marched and advocated and disrupted (although, being a man, he rarely suffered any inconvenience for it).  He seems to have been genuinely dedicated to the idea of equality for women and was also a devotee of free love, a parallel movement to the equal rights movement that led to considerable discord among feminists.  He met and eventually married Elizabeth Holloway, who shared his philosophy.  She earned a law degree from Boston University and of the two possessed the discipline to apply herself to her chosen tasks, a trait Marston apparently lacked.  Eventually, Holloway (who kept her maiden name) was the principal breadwinner in the house, which grew to include Marjorie Wilkes Huntley, a woman he met while doing work for the Army under Robert Yerkes, and then later Olive Byrne, who was Margaret Sanger’s niece.

At a time when “alternate lifestyles” could land you in prison, this was, to say the least, unorthodox.  Marston apparently possessed enormous personal charisma, and none of the women ever left him.  After his death, the three more or less remained together almost until the end of their lives.  Byrne and Holloway each bore two of Marston’s four children, but Byrne claimed to be a widowed cousin and refused to tell her own children that she was their mother.

How did all this produce one of the most well-known superheroes of all time?

Lepore, who obtained access to archives and letters never before made public, has detailed this remarkable story with clarity, humor, and insight.  Wonder Woman is probably one of the most rigorously constructed superheroes of the Golden Age of comics.  Marston did not see her as entertainment but as a tool for social propaganda, namely to advance the cause of equal rights for women.  In fact, Marston thought women should be in charge.  His Wonder Woman was no one’s servant and the first couple of years of her publication reads like an indictment of everything broken in society’s attitude toward women.

DC Comics was in a bit of a bind.  They were by and large uncomfortable with the feminist “messaging” but Wonder Woman, almost from the first issue, was one of the three most popular superheroes of all time, right behind Batman and Superman.  They were making too much money to just kill her off.  But once Marston lost control of the comic (due to illness, first polio then cancer) the editor put in charge seemed to do everything he could to declaw this difficult woman.  That she survived to once again be the powerful symbol of womanhood is remarkable, but it also goes some way to explain the bizarre manifestation of the Seventies TV show.  Comparing the Lynda Carter characterization to the forthcoming Gal Gadot realization shows a marked difference in attitude and approach.

The holdovers from the Silver Age era in Carter’s look, while “true” to the traditional Wonder Woman, nevertheless supported the highly sexualized aspect of the character, while the new look shows the underlying mythology that was originally loaded into the construct, that of a warrior, an Amazon, something out of the bloodier, martial past of the ancient Greeks.

Lepore’s book examines the period as much as the principals and shows us the problematic world in which suffrage, feminism, and tradition often came into conflict.  The urgency to “put Wonder Woman in her place,” especially after WWII, played out in the comics in often tragicomic ways.  Wonder Woman becomes a full-fledged member of the Justice Society of America only to be made its secretary.

Some of this may seem silly to modern readers.  Why couldn’t people just get over their prejudices?  Wasn’t it obvious, etc etc.?  Maybe.  And maybe that it was obvious was part of the problem.  Justice often conflicts with privilege, ethics with desire, and change is never welcome to the fearful.

Marston himself, as a human being, seems by turns charming and repulsive.  Art rarely emerges from pristine sources, unalloyed with the questionable.  But then, Marston probably did not see himself as an artist or Wonder Woman as a work of art. He had causes to pursue, beliefs to validate, and…well, bills to pay.

Lest anyone believe that Wonder Woman was all Marston’s invention, though, Lepore makes it abundantly clear that the women around him contributed just as much if not more and served in many ways as models upon which he based the character.

The Secret History of Wonder Woman is a fascinating look at a time in America and an aspect of culture that for too long was underappreciated.  It is a serious and thorough examination of popular mythology and from whence many of our modern concepts came.  In these times when women are once more required to defend their rights and the very idea of equality, it might be useful to see how and in what ways this struggle has been going on for a long time and under some of the most unexpected banners in some of the most unusual ways.

On Heinlein and Expectations

William Patterson Jr. finished and delivered the second volume of his copious biography of Robert A. Heinlein not long before he passed away of a heart attack.  He was too young.  Having read his opus, I suspect he had another book about Heinlein in him, one which we will now not see.

I base that on the fact that while volume 2—The Man Who Learned Better: 1948 to 1988—is filled with the minutiae of a crowded life, there seems little in-depth analysis and assessment of Heinlein’s work.  Given the few and scattered remarks about the shortcomings of other books of criticism published during Heinlein’s lifetime, one might reasonably expect such an assessment from a writer of evident skill and insight.  It is not out of the realm of probability that he may have intended such analyses for a third volume devoted exclusively to such an assessment.

To be sure, there are brief critical passages about several of the books that are useful.  (Detailing the travails of writing a given work, while fascinating to anyone interested in Heinlein’s life, is no substitute for a thorough study of the work in question.  This is not intended as a criticism of what is in the book, only an observation that the wealth of information spurs a desire for more, especially when we are presented with tantalizing explanations of some problematic works that alter past perceptions.)  For instance, in discussing one of Heinlein’s most poorly understood later-period novels, I Will Fear No Evil, Patterson reveals that Heinlein’s ambition in writing it was a response to postmodernism, taking as apparent inspiration John Barth’s Giles Goat-Boy and work by Philip Roth.  If true—and I have no reason to doubt him, as Heinlein himself discussed this in his own correspondence—this casts a very different light on what has become the Heinlein novel even ardent fans seem to dislike, often hate.

Although Heinlein rarely discussed his process with the story that became I Will Fear No Evil, …[i]t was as if he was working on crafting a New Wave kind of story that worked as story—the kind of thing for fiction that Frank Lloyd Wright had done with the Bauhaus when he designed Fallingwater in 1935…

He had Nabokov on his mind, as well as the New Wave movement (this would have been right in the middle of it) and postmodernism, and he was reacting against the enshrinement going on in fandom of Campbellian Golden Age conventions.  He wanted to shake everyone up.

If in fact that was the nature of the work, it becomes clear why the book seemed to have no “natural” audience and served to confuse people more than reinforce Heinlein’s reputation as the “dean of space age fiction.”  The core readership of science fiction—fandom—would have loathed the postmodernist ambiguities while mainstream critics still treated science fiction as a fad and a not very good one at that.  Had someone told the New York Times reviewers that the book was a postmodern allegory, they would have (perhaps silently) laughed in dismay.

At this point a deeper analysis of the book might have been in order.

But Patterson was not doing literary analysis, he was chronicling a fascinating life.

Heinlein has long been the largest head on the Mount Rushmore of science fiction.  The myths about him, from his first sale to his unhindered success to his idolization of redheads to his supposed fascism, have stood in for any real knowledge about him, seasoned here and there with personal anecdotes.  In fact, Heinlein was almost pathologically private and resented anyone poking into his personal life.  He had a public persona, which he apparently enjoyed using, based on certain aspects of his character, and those who saw only that persona took it to be the whole man.  In later years his critics viewed him as hopelessly anachronistic, conservative to the point of feudalism, a reactionary, and, despite sales figures, marginal to the field.  The service Patterson has done, besides the obvious demythologizing (especially in the first volume), is the extensive contextualizing of the man, the filling in of events, and the examination of how surfaces hide as much as reflect what lies behind what the public sees.

Heinlein was nothing if not experimental.  Often, because he was conducting his experiments at the times he did, the experiments were misperceived and misunderstood.  One can sympathize with his repeated desire not to have his work “analyzed” in an academic sense because he felt it would rob readers of seeing for themselves.  He likely disliked the idea of seeing his own motives and character analyzed through the lens of his work, something which happens often, especially in academic works.  He did not wish to be “psychologized” by people who may well not “get” what he was trying to do in the first place.

He was very much about control in this regard.

As in much of the rest of his life.  His detractors occasionally riff on the idea that he was in some ways a fraud, that his desire for control was only to mask a deep sense of incompetence or even incomprehension.  This is an unfortunately shallow reading.  Consider: Heinlein’s one ambition as a youth was to have a Navy career.  He worked himself into physical breakdown to get through Annapolis only to find out a short time into what he thought would be a lifetime calling that his own health was sabotaging him.  He had to leave the Navy because his body failed him.  The one thing he truly wanted to do was denied him.

Some people might give up and sell siding for the rest of their lives.  Heinlein tried many things.  He ran for political office, he tried mining, pursued his education, finally coming to writing.  Even after early success at that, he continued trying to serve his country and ran a research lab.

That he may have felt some ambivalence about the thing that eventually became his most successful endeavor might be understood given all this.  Rather than hiding incompetence, it is perhaps more accurate to say that he lived with continued fear that some new malady or accident might put an end to this as well.  It is not inconceivable that he expected, however minutely, that the bottom would fall out in the next step or two.  Reading about the speed with which he turned out clearly superior novels, it is not hard to imagine a nagging imp of doubt that he might not be able to do this next week, for reasons completely out of his control.

Misrepresentation and fraud have nothing to do with this.

What is most interesting in all this is seeing the bell curve of influence with each new book.  Heinlein’s work was audacious when written, groundbreaking when published, influential throughout the period when other writers reacted to it, and then reassigned as exemplary of some shortcoming on the author’s part as the culture caught up with it and passed it by.  In hindsight, the flaws are myriad, some profound, but I can think of no other science fiction writer to suffer such extremes of regard, especially within their lifetime.

What becomes apparent in reading the 1000 plus pages of Patterson’s work is that the one thing Heinlein intended with each book was to start a discussion.  What so many seem to have taken as pronouncements from on high, Heinlein intended as the opening gambit in a long conversation.  Instead of engaging in the argument, too many people made him their personal guru, something he consistently rejected, and when they realized finally that some of the things Heinlein said were problematic or downright inflammatory, they turned on him.  He wanted to be Socrates, not Aristotle as remade by the Church.  He wanted people to disagree, to engage.

How else to explain the wild variations of philosophy between works like Starship Troopers and Stranger In A Strange Land, Beyond This Horizon and Farnham’s Freehold, Methuselah’s Children and The Moon Is A Harsh Mistress?

On the other hand, he seemed often to work in a vacuum of his own making.  He bridled at the confines of expected SF forms, yet he did not avail himself of relationships with the mainstream literary establishment he longed to be part of.  He wanted to write work that transcended genre boundaries—and read extensively outside the field—and yet he rarely seemed to engage in the cultural discourse going on outside the SF “ghetto.”  He and Virginia, his third wife, were usually politically isolated, even while trying to fully interact with the ongoing political dynamic.  Heinlein’s politics were more of the “curse on both your houses” variety than anything categorizably useful.  He claimed affinity with libertarianism, yet had no real respect for much that passed for political philosophy under that banner.  Neither fish nor fowl, it came to others to try to define him, and he gave them little assistance.  The country moved in directions with which he disagreed, but his reactions gave no support to others who thought the same way and wanted to do this or that to change it.  He lived by a definition of liberal that was being quickly left behind by those working under that label.  His consistent message through his fiction was “Think for yourself” and yet it came across more and more as “if you don’t think like me you’re an idiot.”  Those looking for ready-made answers in his work could only see the latter.

Narratively, volume 2 is packed too tightly to be as good a read as the first book.  No doubt this is a result of trying to keep it usefully in hand in combination with the increased wealth of information available about this forty-year period.  But it nevertheless offers a fascinating look at a genuine iconoclast within his context, and for that it is a very worthy book.

Finally, as much as detractors would like to make Heinlein an irrelevancy, the very obsessiveness with which many of them attend his deconstruction suggests that while one may disagree over him profoundly, he is not easily ignored or dismissed.  Whatever else, he did succeed in getting a conversation going.  Sometimes it’s actually about what he considered important.

About Hild, A New Novel, A New World

It is completely fitting that science fiction writers should write historical fiction.  Both forms deal with the same background—alien worlds.

Because we live in a story-saturated era where access to the ages is easily had with a visit to the library, the local bookstore, the internet, movies, it is easy to assume we know—that we understand—the past, with the same cordial familiarity with which we experience our own personal history.  That people lived differently “back then” seems more a matter of fashion and technology than a question of thought process or philosophy or world view.*  People lacked central heating and air conditioning, cars, television, telephones, indoor plumbing, antibiotics…but they lived essentially the same way.

Well, one could make a case that they did, but then you have to ask, “In what ways did they live the same way?”  Therein lies the heart of good historical analysis and extrapolation.

Because while we can connect with people of the past in many very broad ways—they were human, they loved, they hated, they were greedy and generous, they were driven by passions, they dreamed—the specifics can school us in the range of the possible.  What does it mean to be human?

Far more than we might imagine.

But that’s where the novelist comes in, the writer who takes the time to grapple with those myriad distinctions and give us a look into those differences that are still, regardless of how remote they seem from our personal understanding of “human,”  part of who we are, at least potentially.

I mention science fiction at the beginning because at a certain level, if we’re dealing with something deeper than costume drama or plot-driven adventure fiction, the exercise of finding, comprehending, and actualizing on the page an entire period from the past—Republican Rome, Hellenic Greece, the Mesopotamia of the Sumerians, the Kingdom of Chin, or post-Roman England—is much the same as building a world out of logic and broad-based knowledgeable extrapolation.  In some instances, extrapolation is all-important because the fact is we simply do not know enough to more or less copy a record into a fictional setting.  Instead, we have to take the tantalizing scraps of what remain of that world and supply the connective tissue by imagining what must, what probably, what could have been there.  And in the process we discover a new world.

If done well, that newness becomes a mirror for us to perceive what we have overlooked in ourselves.  (Which is what good fiction ought to do anyway, but in the well-constructed historical it is a special kind of revelation.)

Seventh Century England is rich with the unknown, the ambiguous, the seductively out-of-reach.  It existed between one deceptively homogeneous era and another, between the Roman Empire and the emergence of the Holy Roman Empire.  More, it held some of the last vestiges of the once vast Celtic Empire.  It was a land where shadow-pasts vied for hegemony over the mythic substrate defining meaning for the warlords, petty kings, and mystics serving them. Pagan religions found themselves competing with this new Christianity, which had been around a while but was finally beginning to make significant headway among the competing kingdoms, looking for the leverage it needed to make itself an “official” religion with the authority to shove the others aside.

Into this came a woman who eventually mattered enough, given the overwhelming patriarchal structure of the day, to deserve a mention from the Venerable Bede (who saw women much as most men of his time did, necessary creatures in need of guidance and by dint of their sex lesser beings).  In Book 4 of his Ecclesiastical History of the English People we’re told of St. Hilda, who was by any measure of the era (and even ours) astonishing.  “Her prudence was so great…that even kings and princes asked and received her advice.”

A good novel starts with a good question and in this case it would be: Who was this woman and how did she get to this place?

A question to which Nicola Griffith impressively supplies an answer in her new novel, Hild (Farrar, Straus and Giroux).

Hild, later St. Hilda of Whitby, lived from 614 to 680.  She was a second daughter of minor nobility whose father died, leaving the family at the mercy of rival kingdoms.  Later she founded an abbey, where she remained the rest of her life, and was a teacher of prelates and princes.

Note that.  In the seventh century, at a time and in a place where women were little more than property, Hild could not only read but commanded respect.  That alone would make her a fit subject for a big historical novel.  Certainly she would serve as the basis for a cathartic life-lesson to modern audiences about the innate power of women and the need to find and act upon one’s own identity.

But Griffith avoids this in some ways too-easy path to sympathy for her character and does what superb history should—provides context and shows her character in situ, living as she would have.  Hild had her own problems to face and they are not ours.  Through the course of 560 pages of well-chosen and seemingly hand-polished words, Hild is given to us as a person, fully realized, of her own time.  This is a different world and these people did not see it as we do.

The success of a novel is in its ability to bring the reader entirely in and hold them, enmeshed, for the duration.  Griffith’s past novels have demonstrated that she can achieve this in both science fiction (Ammonite, Slow River) and noir thriller (The Blue Place, Stay, Always).  But in some ways those novels presented less of a challenge in their immersive requirements—they were closer to home, nearer to our own world, and allowed for reader assumptions to come into play.  (This is deceptive, of course, and is more a question of laziness on the part of the reader than of any artistic shortcuts a writer might take.)  Hild represents an order of magnitude greater risk on Griffith’s part, a kind of dance through a minefield of possible failures that could cause reader disconnect or, worse, a betrayal of her characters.  It is a great pleasure to note that she made no such missteps, got all the way to the other side, world intact, with a character very much herself.

This is what historical fiction ought to do.  Take you and put you in a world that is quantitatively and qualitatively different and still engage your sympathies.  As we follow Hild from birth, through her education (under the guidance of her mother, who is herself remarkable) and into a young adulthood in which she comes into possession of some authority, we find ourselves shifting out of our comfort zones with respect to the givens of the world.

Hild is the first book of a trilogy, which will cover Hild’s whole life.  If the next two books are done with as much care, diligence, and grace as this, we are all in for a remarkable experience.

And out of the richly-wrought tapestry of difference, we really do find a connection across the centuries.  Just not where one might ordinarily look for one.

______________________________________________________

*World view is itself a phrase fraught with change, for to have one requires we have some notion of The World, and that has changed constantly over time.  What world?  How big?  Who is in it?  Look at the changes in the past five centuries, which some historians identify as the modern era.  We have gone from a flat earth at the center of a solar system which defined the limits of space to an uneven sphere orbiting an insignificant middle range star of a small galaxy that is one out of billions and billions of galaxies, with no evident limit to what comprises the universe.

Quantum Branching…As Literature Embraces Science Fiction, the Past is Again and Again

Kate Atkinson’s latest novel, Life After Life, is a remarkable achievement.  Its several hundred pages of exquisitely controlled prose contain the story of Ursula Todd, who is, over the course of the story, born again and again and again.  Each life, some of them so very brief, ends in a tragic death, accidental, malevolent, heroic, painful, and each time she starts over, comes again to the point where the fatal mistake was made, and now it is sidestepped, turned away, avoided.  She lives multiple times, each one different, and yet she remains herself.

The novel opens with a shocking scene—Ursula, a young woman living in Berlin, enters a café wherein she finds Adolf Hitler, surrounded by sycophants, enjoying his celebrity.  She pulls a pistol and takes aim…

Then she is born.

It is 1910, in the English countryside, and snowing heavily.  The scene is reminiscent of Dickens.  She is born.  First she dies from strangulation, the umbilical cord wrapped around her with no one around who knows what to do.  Then in the next life that obstacle is overcome.  And so it goes, as she ages, staggers through one life after another, growing a little older each time, her family battered by one damn thing after another.  Ursula herself, a middle child, watches as much as participates in the homely evolution of this middle-class English family, and we are treated to an almost microscopic study of its composition—its hypocrisies, its crises, its successes, its failures.

Ursula endures.  As her name almost punningly suggests, she Bears Death, over and over.  She never quite remembers, though.  She has intense feelings of déjà vu, she knows such and such should be avoided, this and that must be manipulated, but she never quite knows why.  At times she comes perilously close to recognition, but like so much in life her actions are more ideas that seemed good at the time than the products of any deeper understanding.

Contrary to the rigor of traditional time travel, the past does change here, but then this is not a time travel novel, at least not in any traditional sense.  You might almost say it’s a reincarnation story, but it’s not that, either, because Ursula never comes back as anyone other than herself.  At one point in the novel, time is described not as circular but as a palimpsest—layers, one atop another, compiling.  The result is a portrait, more complete than most, not of a life lived but of life as potential.  But for this or that, there wandered the future.  It is a portrait of possibility.

The big events of history are not changed, though.  Nothing Ursula does in her manifold existences alters the inevitability of WWII or Hitler or the Spanish Flu or any of the mammoth occurrences that dominate each and every life she experiences.

What she does change is herself.  And, by extension, her family, although all of them remain persistently themselves throughout.  It is only the consequences of their self-expression that become shaped and altered.

We see who the genuine heroes are, who the fools, the cowards, the victims and victors; where in a single life none of this might emerge clearly, in the repeated dramas with their minor changes character comes inexorably to the fore.

Atkinson does not explain how any of this happens.  It’s not important, because she isn’t doing the kind of fiction we might encounter as straight-up science fiction, where the machinery matters.  She’s examining ramifications of the personal in a world that is in constant flux on the day-to-day level even as the accumulation of all that movement builds a kind of monolithic structure against which our only real choice is to choose what to do today.  Consequently, we have one of the most successful co-options of a science fiction-like conceit into a literary project in recent memory.

On a perhaps obvious level, isn’t this exactly what writers do?  Reimagine the personal histories of their characters in order to reveal possibility?

Future Historicity

History, as a discipline, seems to improve the further away from events one moves. Close up, it’s “current events” rather than “history.”  At some point, the possibility of objective analysis emerges and thoughtful critiques may be written.

John Lukacs, Emeritus Professor of History at Chestnut Hill College, understands this and at the outset of his new study, A Short History of the Twentieth Century, allows for the improbability of what he has attempted:

Our historical knowledge, like nearly every kind of human knowledge, is personal and participatory, since the knower and the known, while not identical, are not and cannot be entirely separate.

He then proceeds to give an overview of the twentieth century as someone—though he never claims this—living a century or more further on might.  He steps back as much as possible and looks at the period under examination—he asserts that the 20th Century ran from 1914 to 1989—as a whole, the way we might now look at, say, the 14th Century or the 12th and so on.  The virtue of our distance from these times is our perspective—the luxury of seeing how disparate elements interacted even as the players on the ground could not see them, how decisions taken in one year affected outcomes thirty, forty, even eighty years down the road.  We can then bring an analysis and understanding of trends, group dynamics, political movements, demographics, all that goes into what we term culture or civilization, to the problem of understanding what happened and why.

Obviously, for those of us living through history, such perspective is rare if not impossible.

Yet Lukacs has done an admirable job.  He shows how the outbreak and subsequent end of World War I set the stage for the collapse of the Soviet Empire in 1989, the two events he chooses as the bookends of the century.  He steps back and looks at the social and political changes as the result of economic factors largely invisible to those living through those times, and how the ideologies that seemed so very important at every turn were more or less byproducts of larger, less definable components.

It is inevitable that the reader will argue with Lukacs.  His reductions—and expansions—often run counter to what may be cherished beliefs in the right or wrong of this or that.  But that, it seems, is exactly what he intends.  This is not a history chock full of the kind of detail used in defending positions—Left, Right, East, West, etc.—and it is often stingy with specifics.  Rather, this is a broad outline with telling opinions and the kind of assertions one might otherwise not question in a history of some century long past.  It is intended, I think, to spur discussion.

We need discussion.  In many ways, we are trapped in the machineries constructed to deal with the problems of that century, and the machinery keeps grinding even though the problems have changed.  Pulling back—or even out of—the in situ reactivity seems necessary if we are to stop running in the current Red Queen’s Race.

To be sure, Lukacs makes a few observations to set one’s back teeth on edge.  For instance, he dismisses the post-World War II women’s consciousness and equality movements as byproducts of purely economic conditions and the mass movement of the middle class to the suburbs.  He has almost nothing good to say about any president of the period but Franklin Roosevelt.

He is, certainly, highly critical of the major policy responses throughout the century, but explains them as the consequence of ignorance, which is probably true enough.  The people at the time simply did not know what they needed to know to do otherwise.

As I say, there is ample here with which to argue.

But it is a good place to start such debates, and it is debate—discussion, interchange, conversation—that seems the ultimate goal of this very well-written assay.  As long as it remains debate, this is a worthy place to begin.

He provides one very useful definition, which is not unique to Lukacs by any means, yet remains one of those difficult-to-parse distinctions for most people and leads to profound misunderstandings.  He makes clear the difference between nations and states.  They are not the same thing, though they usually overlap.  States, he shows, are artificial constructs with borders, governmental apparatus, policies.  Nations, however, are simply Peoples.  Hence Hitler was able to command the German nation even though he was an Austrian citizen.  Austria, like Germany, was merely a state.  The German People constituted the nation.

Lukacs—valuably—shows the consequences of confusing the two, something which began with Wilson and has tragically rumbled through even to this day.  States rarely impose a national identity; they rely on one already extant—though often largely unrealized.  And when things go wrong between states, quite often it is because one or the other has negotiated national issues with the wrong party.

Which leads to an intriguing speculation about why nativist sympathies have such a difficult time taking root in this country.  Americans do not, by this definition, comprise a Nation.  A country, a state, a polity, certainly.  But not really a Nation.

And yet we often act as if we were.

Questions.  Discussion.  Dialogue.  This is the utility and virtue of this slim volume.

Greatless Illusion

The third book I read recently which resonated thematically with the previous two is one I have come to somewhat late, given my inclinations.  But a new paperback edition was recently released and I considered buying it.  I hesitated, as I was uncertain whether anything new or substantively unique was contained therein to make it worth having on my shelf.  I have other books along similar lines and, while I am fond of the author, it seemed unlikely this book would offer anything not already covered.

Christopher Hitchens was a journalist and essayist who became one of our best commentators on current events, politics, and related subjects.  Even when I disagreed with him, I always found his arguments cogent and insightful, and never less than solidly grounded in available fact.

So when he published a book of his views on religion, it seemed a natural addition to my library, yet I missed it when it first came out.  Instead, I read Richard Dawkins’ The God Delusion, which I found useful and well-reasoned, but pretty much a sermon to one who needed no convincing.  Such books are useful for the examples they offer to underpin their arguments.

Such is the case with God Is Not Great: How Religion Poisons Everything.  Hitchens’ extensive travels and his experiences in the face of conflict between opposing groups, often ideologically-driven, promised a surfeit of examples, and he did not fail to provide them amply.

The title is a challenge, a gauntlet thrown at the feet of those with whom Hitchens had sizeable bones to pick.  In the years since its initial publication it has acquired a reputation, developed a set of expectations, and become something of a cause célèbre, sufficient for people to take sides without having read it.  I found myself approaching the book with a set of expectations of my own and, with mild surprise, had those expectations undermined.

Yes, the book is a statement about the nature of religion as an abusive ideology—regardless of denomination, sect, theological origin—and offers a full range of examples of how conflicts, both between people and peoples, are generally made worse by (or, more often than not, occur because of) religious infusions into the situation.  It is in many ways a depressing catalog of misuse, misinterpretation, misstatement, misunderstanding, and sometimes misanthropy born out of religious conviction.  Hitchens analyzes the sources of these problems, charts some of the history, and gives us modern-day examples.

But he tempers much of this by drawing a distinction between individuals and ideologies.

He also opens with a statement that, in his opinion, we shall never be rid of religion.  This is quite unlike Dawkins and others, who actually seem to feel humankind can be educated out of any need of religion.  Hitchens understood human nature too well to have any hope that this was possible.

He does allow that religion may permit some good people to be better, but he does not believe it makes anyone good who is not already so inclined.

By the end of the book, there will likely be two reactions.  One, possibly the more common, will be to dismiss much of his argument as one-sided.  “He overlooks all the good that has been done.”  It is interesting to me that such special pleading only ever gets applied consistently when religion is at issue.  In so much else, one or two missteps and trust is gone, but not so with religion, which offers an arena in which not only mistakes but serious abuse can occur time and time again and yet the driving doctrine is never called into question.  The other reaction will be to embrace the serious critique on offer, even the condemnations, and pay no attention to the quite sincere attempt to examine human nature in the grip of what can only be described as a pathology.

Because while Hitchens was a self-proclaimed atheist, he does take pains to point out that he is not talking about any sort of actual god in this book, only the god at the heart of human-made religions.  For some this may be a distinction without a difference, but for the thoughtful reader it is a telling distinction: that at the end of it all, Hitchens sees all—all—manifestations of gods, in the terms of their religions, as artifices.  And he wonders then why people continue to inflict upon themselves and each other straitjackets of behavior and ideology that, pushed to one extreme or another, seem always to result in some sort of harm, not only for the people who do not believe a given trope but for the believers themselves.

We are, being story-obsessed, caught in the amber of our narratives.  Per Mr. Thompson’s analysis of myth, we are never free of those stories—even their evocation for the purposes of ridicule brings us fully within them and determines the ground upon which we move.  The intractable differences over unprovable and ultimately unsubstantiated assumptions of religious dictate, per the history chronicled around the life of Roger Smith, have left us upon a field of direst struggle with our fellows, whose lack of belief is often perceived as a direct threat to a salvation we are ourselves unwilling to examine and question as valid, resulting in abuse and death borne out of tortured constructs of love.  Christopher Hitchens put together a bestiary of precedent demonstrating that treating as real the often inarticulate longings to be “right” in the sight of a god we ourselves have invented too often leads to heartache, madness, and butchery.

The sanest religionists, it would seem by this testament, are those with the lightest affiliation, the flimsiest of dedications to doctrine.  They are the ones who can step back when the call to massacre the infidel goes out.

All of which is ultimately problematic due simply to the inexplicable nature of religion’s appeal to so many.

But it is, to my mind, an insincere devotee who will not, in order to fairly assess the thing itself, look at all that has been wrought in the name of a stated belief.  Insincere and ultimately dangerous, especially when what under any other circumstance would be completely wrong can be justified by that which is supposed to redeem us.