2016

Tardiness comes in direct proportion to chaos. The year ended and all was in flux.

However, reading goes on.

I did not finish nearly as many books in 2016 as I tried to. At least, not other people’s books.  I did finish drafts of two of my own.  My desk, at the moment, is clear, and maybe I can do a better job in 2017 of keeping abreast here.

A good deal of my science fiction reading this year was for the reading group I host at Left Bank Books. That group affords me the opportunity and motivation to read novels I might not otherwise get to. So I reread Alfred Bester’s The Stars My Destination for the first time in three decades, but I also read The Left Hand of Darkness for the first time ever. I do not regret the delay. It is a mature novel, with a great deal my younger self may well have missed. As to the former, it came very close to not holding up. I had forgotten (if I ever realized it this way) just how brutal a novel it is, and not just in the character of Gully Foyle. Bester’s achievement way back in the Fifties remains remarkable for its unyielding insistence on a fragmented, painful, chaotic, and historically consistent future.

I also reacquainted myself with Tiptree, in the form of Her Smoke Rose Up Forever. It seems fitting in this period of reassessment and revolution, when the face of science fiction is—has—changed and brought forth a volatile reaction to that change.  Tiptree was doing much of what is being so rancorously challenged within the field today, but as she was a singular voice and not a “trend” she provoked different challenges then while becoming accepted generally as a brilliant writer and a jewel in the crown of SF stars.

I also reread (for the first time since it came out) Robert Silverberg’s Dying Inside, which I reviewed in the previous post.  I was much too inexperienced a reader the first time to appreciate everything Silverberg was doing, so I probably forgot the book as soon as I finished it.

It is true that some books must be “grown into”—I am currently rereading Samuel R. Delany’s Trouble On Triton for the book group and realizing that, while I read it eagerly the first time, I probably missed almost everything important about it. Likewise with another reread, Gene Wolfe’s The Fifth Head of Cerberus, which is ostensibly a novel about colonialism. I say “ostensibly” but that does not mean it isn’t. It very much is about colonialism, in all three of the novellas which comprise the whole. But it is as much about how we colonize ourselves, sometimes to our loss, as it is about colonizing foreign soil, in this case another world with a native population that strives to adapt but may have found in the end that their only options were extinction or counter-colonization. As always, Wolfe’s subtlety is rigorously slippery, his points less direct, corrosive of expectation.

Titan Books has rereleased Michael Moorcock’s Cornelius Chronicles, a story cycle that is the very definition of indirect. Moorcock took as his template the Romantic poets—Byron, Shelley, et al.—and displaced them into a near-future chaos in the form of his “hero” Jerry Cornelius, who wants to save the world only to resurrect his dead sister so they can be together. The prose is rife with Sixties hip, but not so overwhelmingly anachronistic that the novels aren’t just as readable now as they were then. The response to them is perhaps necessarily altered, and certainly the themes play out differently. Moorcock may have been the grown-up in the room at the advent of the New Wave. He went on to write some marvelously rich books after these.

I finished Ann Leckie’s delightfully subversive Ancillary trilogy.  I need to do a full review soon.  Treat yourself.

A smattering of other SF titles I can recommend wholeheartedly: Lavie Tidhar’s Central Station; Sylvain Neuvel’s Sleeping Giants; Carter Scholz’s Gypsy; Binti by Nnedi Okorafor.

And Nisi Shawl’s wonderful Everfair.  An alternate history steampunk done the way steampunk ought to be done.  I owe it a full review, but let me say here that this is one of the best first novels I’ve read in a long time.

I read two China Miéville books this year, one very good. This Census-Taker I have to count as a failure. It has good writing and fascinating bits, but failed to come together the way I’ve come to expect from Miéville. The other, newer one is The Last Days of New Paris, which is excellent. This pair allowed me to understand that one of the primary passions Miéville indulges in his work is cities. His best work portrays a city as a complete character. This Census-Taker lacked that.

Of the non-science-fiction reading this year, I did Moby-Dick with my other reading group. I resisted doing this book. I’ve never liked it. I find it turgid, convoluted, often opaque. There is also a darkness to it that can be suffocating. Over several months we tackled it, dissected it, ran through various analyses. I conclude that it is a superb work, fully deserving of its reputation. It is A great American novel, if not The American Novel, because America is its subject, though it takes place on a whaling ship far at sea. It is not a flattering picture, though, displaying throughout the contradictions, hypocrisies, and shortcomings of the then-young nation which continue to plague us. It does this brilliantly.

I still don’t like it.  I find little pleasure in the actual reading.  That, as they say, is my problem.

A colleague and coworker, Kea Wilson, published her first novel, We Eat Our Own. I commend it.  I reviewed it here.

A novel that straddles genre boundaries, and that caused some controversy upon its initial publication, is Kazuo Ishiguro’s The Buried Giant. This is a post-Arthurian quest story with much to say about memory, community, and the price of vengeance.

This was a big year for nonfiction.

James Gleick’s new tome, Time Travel: A History, is an exceptional soliloquy on the concept, science, and cultural use of time travel, beginning with Wells and covering both the scientific realm and the popular-fiction realm, showing how they have played off each other and how the idea has evolved and worked through our modern view of the universe and our own lives. Earlier in the year I’d read his magnificent biography of Richard Feynman, Genius. Gleick is a great explainer and a fine craftsman.

As well, Carlo Rovelli’s Seven Brief Lessons About Physics.  They are brief, they are accessible, they are to be enjoyed.  And, along the same lines, Void by James Owen Weatherall, about the physics of empty space.  It’s far more fascinating than it might sound.

I can recommend Peter Frankopan’s Silk Roads, which is a history of the world from the viewpoint of the Orient.  The shift in perspective is enlightening.  Along the same lines I read Charles Mann’s 1491, which was eye-opening and thought-provoking—and in some ways quite humbling.

I also read Arlie Russell Hochschild’s Strangers In Their Own Land, especially in the wake of what I think I can safely call the most surprising election result in recent history. This book is a study of the right-wing culture that has developed in many startlingly contradictory ways.  I believe this would be worth reading for anyone trying to make sense of the people who continually vote in ways that seem to make no sense—and also for those who do vote that way just so they might understand what it is about their movement that seems so incomprehensible to many of their fellow citizens.

I read a few short of 50 books in 2016 cover to cover.  I will be reviewing some of them in the future.

Here’s hoping for a good year of reading to come.


Defending Angels

It is arguable that we live in a post-colonial age. We no longer see major powers moving into previously independent places and usurping the land and the people and declaring them to now be part of some empire. Not the way we did in the 18th and 19th centuries. (We wink at smaller-scale examples of roughly the same thing, but while Ukraine may be prey to Russia, we don’t see Russia trying to occupy New Zealand.) The scramble for Africa was the last eruption of such hubris. And there are now plenty of studies indicating that it was never a profitable enterprise anyway, that every power that indulged its imperialist urge did so at great expense that was never recouped, not in the long run. At best, such endeavors paid for the re-formation of both the imperial power and its colonies into more modern forms independent of each other.  At worst, it was pillage that benefited a few individuals and large companies and resulted in short-term wealth-building and long-term grief for everyone involved.

Yet the impulse drove relocations of population, experiments in applied bureaucratic overreach, and an ongoing debate over the ethics of intrusion. One could argue that the Aztec civilization was a horrible construct with human sacrifice at its aesthetic center and that the world is well rid of it. On the other hand, it is equally true that the Spaniards who toppled it had no right to do so and unleashed a different sort of ugliness on the indigenous populations. Every European power that followed them into the so-called New World bears the same weight of shame for the wanton destruction of things they could not understand. If here and there something positive came out of it, that something was by accident and played no real part in the initial decision to Go There.

With what we now know—ethically, scientifically, behaviorally—if given the chance to do it again, would we?  And if we decided to go ahead anyway, would we do anything differently or would we still be dominated by a subconscious obsession to exploit for resources to fuel a growing population trapped within an economic system that seems custom made to produce the necessary excuses to do what we want with whatever we find?

We seem forever to be doing things that go sour on us and then having to clean up the mess and apologize and figure out how to prevent a repeat performance. The problem with that is, one situation is not so exactly like another that the lessons do not come with big loopholes and the opportunity for rationalizing our hubristic avarice.

In short, we never learn.

At least, not in aggregate.  We understand this as well and so a good part of our political theorizing is geared toward a place wherein the individual moral insight can be effectively balanced against the rock-stupid momentum of the group; and in which the common wisdom of historical experience as exemplified by the group can temper the less enlightened passions of the individual.  In other words, to find the point at which we can allow for the individual who is correct to trump the so-called “will of the people” and conversely where that common will can morally check the individual who may only be thinking of him or herself, the group be damned.

Underneath, threaded into, and informing Marguerite Reed’s Philip K. Dick Award nominated novel, Archangel, we find this ongoing debate carried on at several levels.

Ubastis is a world seemingly ideal for large-scale human settlement. Two waves of advance “scouts” grounded to conduct extensive surveys and impact studies and to establish trial settlements. It became clear that this was a vital ecosphere and that, compatibility aside, questions of too much too soon drove the negotiations that prevented a rush to fill it with human excess. Dr. Vashti Loren, widow of the spiritual and moral leader of these two waves, is one of the principal advisors on the ad hoc committee overseeing Ubastis, which exists as a kind of protectorate. The rest of the human polity is hungry for it to be opened to a larger human presence, which the people who live there know will mean the ruin of a unique biome. Vashti becomes the focus of all the efforts to forestall such open colonization. As the widow of a slain “hero” she carries great weight.

She is also a problematic figure in this culture.  She is a genetically unmodified human in a larger culture where modification has become so widespread that “Natches” are special. That she is a protector of an “unmodified” ecosphere is only the first layer of what becomes a deeply meaningful representation of not only human moral responsibility but also human potential in an alien cosmos.

Reed gives us a civilization where aggression is being gene-modified out of individual humans, even though wars are ostensibly still fought, uprisings happen, and a certain strain of bloodlust remains a given in controlled contexts. That Vashti is wholly unmodified adds to the irony that she also hunts native species, as part of her job as an exobiologist and as a kind of PR component to assuage outworlders who are curious, acquisitive, and in need of persuading that Ubastis requires the time to be understood before the exploitation full-scale human settlement will bring. She takes outworld visitors on safari to hunt the local big game.

Her deceased husband, Lasse, was murdered by a renegade “soldier”—a Beast, a BioEngineered ASsault Tactician, a member of a clone experiment in super soldiers—as a result of trying to prevent poaching. The Interests trying to discard the treaty that keeps Ubastis inviolate have all along been probing at the defenses, trying to engineer excuses for open incursions. Vashti kills the Beast. That action calls into question her sanity, but she effectively defends herself against charges that would see her “re-educated.”

What she did not know was the deeper game her husband was playing to bring about a future independent Ubastis—and that it involved the Beasts, the lot of which have been presumably destroyed as too dangerous. Vashti begins to learn what her husband never told her when she is confronted with a Beast that has been smuggled onto Ubastis by the governor’s wife.  She vows to kill it, but that impulse itself gradually morphs into powerfully conflicted responsibilities, the details of which comprise the plot of this densely-detailed and finely-realized novel.

Vashti. The name has history. She was the queen of a Persian ruler who demanded she appear naked at a banquet he was holding in honor of other kings. A “higher politics” was obviously in play, and his demand of his wife was part of the impression he was trying to make on his fellow kings. Vashti refused. Harriet Beecher Stowe later declared that Vashti’s refusal was the first blow for women’s rights. She followed her own code. Her husband’s request was deeply inappropriate even in that culture. Vashti stood by her own values.

Make of that what you will.  Reed’s Vashti is a woman dedicated to a set of principles which are sorely tested in the course of the novel.  Watching her come to terms with political, ecological, and moral realities and steer a course between the shoals of competing colonial, imperial, and personal demands makes for a compelling read.  She is a superbly realized, flawed character, and the questions she raises, wrestles with, and reacts to lend themselves to consideration long after the last page.

This is excellent science fiction.  It takes the abstract, the conjectural, and the epistemology of human systems and moral dictates and makes them personal, the stakes high, and answers often problematic, leaving us with a great deal to think about.

In Review

2015 is done and I have read what I read.  It was a year fraught with turmoil in science fiction, a year prompting reassessments, a year when required reading competed with reading for pleasure, and the time constraints of working on a new novel (two, in fact) impeded chipping away at my to-be-read pile, which mounds higher.

As in the past, I count only books I have read cover to cover here.  If I added in total pages of unfinished reading, I’m probably up with my usual volume (somewhere around 90 books), but that would be a cheat.  That said, I read 50 books in 2015.

One thing I concluded, both from what I read and from the upheaval in the background about what is or is not worthy science fiction, is that the decades-long pseudowar between mainstream and genre is over. Skirmishes will continue to be fought here and there, certain elements will refuse to yield or concede, but by and large the evidence suggests that, on the part of literary writers at least, SF has made its point. A couple of examples:

Station Eleven by Emily St. John Mandel is science fiction. In fact, after talking it over for nearly a year since I read it, it seems to me to be Heinleinesque. Better written, the characters less exemplars than real people, but in basic conceit and plot, this is a Heinlein novel. It has all the elements—survivors, a plucky heroine, a global catastrophe forcing those who remain to learn quickly a whole suite of new skills, and an ongoing discussion throughout about what is of value and ought to be preserved. It is a superbly written work, and that alone made the identification difficult. Heinlein, at his best, could be as good as anyone in any genre, but to see the form raised to this level shows both his virtues and his weaknesses. The population of the Earth is reduced by a superflu. The novel flashes back and forth around the life of a kind of patriarch whose biological and artistic progeny struggle in a post-technological world both to survive and to preserve the best of that former world. The novel prompts questions, challenges preconceptions, and draws us in. It was not marketed as science fiction and it has continued to sell very well. It is science fiction, and no one has batted an eye.

The Water Knife by Paolo Bacigalupi.  An ecological thriller, an examination of a different kind of breakdown, a different kind of survival, peopled by characters as real as can be.  In a decade this will be historical fiction, probably, but it is SF and also mainstream and also uncategorizable.  Exceptional.

Straddling the boundary is Jeff VanderMeer’s Annihilation, which is a curiosity. It proceeds like a straightforward “survey mission” novel—specialists set down upon an alien world and struggling to unravel its mysteries before said world kills them. Only in this case the “alien” world is a patch of reclaimed wilderness somewhere along the eastern seaboard, probably north Florida, that is undergoing strange transformations due to an experiment gone wrong. There are touches of zombie fiction, government conspiracy, and even Lovecraftian uber-malignancy, but the story, as told by the Biologist, feels more meta than any of those suggest. The landscape works to inform the soul-wrenching recognitions and evolutions within the Biologist as she works to understand what is going on in the aptly named Area X. VanderMeer has created a work bordering on genius here by virtue of externalizing and foregrounding mystical revelation as ecological transmutation, but as you read you can’t tease the meta passages from the plot in any clear way, so the experience, when you give yourself over to it, is wholly immersive.

So what I’m seeing—in many more titles still on my TBR pile—is the embrace of science fiction by what was formerly an ambivalent cadre of artists who are using it to ends traditionally ignored by main-body SF.

In the other direction, the infusion of literary concerns into genre writing, which necessarily drags real-world issues in with it, has prompted a squeal of protest from those who wish to keep their starships pure, their aliens obvious, and their weapons decisive. “Good writing” remains a poorly understood quality among too many in the genres (by no means a problem exclusive to SF, though the nature of SF yields far more obvious failures), and the clinging to an aesthetic attributed to the so-called Golden Age, exemplified by writers probably more often revered than actually read (and therefore misperceived in intent), has exacerbated the old antagonisms and prompted a final flaring up of fires dying to ash. The clunky sentence is a hallmark of much of this, more likely as consequence than intent, and the clichéd scenario becomes more obviously so as the whole point of what we mean by “literary” in its most useful mode is overlooked or, perhaps, willfully ignored in a fit of defensive refusal to pay attention to what matters: the truth of human experience and the profitable examination of, for want of a better word, the Soul.

Where the cross-fertilization of mainstream and genre has been successfully accomplished, we’ve been seeing novels and stories of marvelous effect. We have been seeing them all along, and in the past such examples were readily offered as proof that SF was “just as good” as anything published as mainstream. I’ve always felt that being “just as good” was selling our potential short, but the work has to rise to the challenge, and there always have been such works.

Among such that I read this past year were a few from that rich past, mainly for the reading group I host at work: The Two of Them by Joanna Russ; Extra(Ordinary) People, also by Russ; Doomsday Book by Connie Willis; Mythago Wood by Robert Holdstock; The Sparrow by Mary Doria Russell; and Engine Summer by John Crowley. In retrospect, there have always been writers working in the full embrace of science fiction but without any of the stylistic constraints of its pulp origins, and such works remain trenchant and readable and offer surprising commentary still on current questions.

The Sparrow was a highlight. I have known since its publication that it was sort of a riff on James Blish’s classic, A Case Of Conscience, but it is so much more. Russell’s elegant reversal of the moral question elevates this novel to the top tier of useful literary works. I have not yet read its sequel, but I am looking forward to it after this treat.

I also reread Harlan Ellison’s Shatterday for the reading group. It’s been a good long while since I did so and I was not disappointed, although I read many of the stories through a more cynical eye. The opening tale, “Jeffty Is Five,” remains, for me, one of the most gut-wrenching short stories of all time.

Another highpoint this past year was James Morrow’s new novel, Galapagos Regained, a neatly unclassifiable work of speculative history.  I gave it a lengthy review here and recommend a look. This is a superbly done work that deserves more attention than it has received.

I also read Morrow’s amusing novella, The Madonna and the Starship, which runs a delightful game via Fifties television and alien visitors who come to bestow an award and offer assistance in exterminating the irrational on Earth. Morrow is acerbic even as he is funny.

Among the most interesting new works of science fiction I read this year is The Three-Body Problem by Cixin Liu, translated by Ken Liu. This is the first part of a trilogy about alien invasion and resistance, written from a Chinese perspective. It is an exceptional translation. It won the Hugo Award, the first translation, I believe, to do so, and certainly the first Asian novel to win. There is high-end physics, nasty politics, murder, and the conundrums of committed action. The cultural quirks made it even more interesting.

Like almost everyone, it seems, I read The Martian by Andy Weir. This was great fun and well executed. My quibble, along with many others’, was with the opening gambit to explain the marooning of the astronaut, but I’m content to see it as a mere dramatic choice. It didn’t prevent me from enjoying the rest of the book, which, in the words of the screen adaptation, “scienced the shit out of all this,” and did so in an accessible and entertaining manner which I applaud. I couldn’t help seeing it as a newer version of an older film, Robinson Crusoe On Mars, and naturally this one works a bit better. Hell, we know more, there’s no excuse for bad science, and Mr. Weir knows that. He wrote a realistic piece of speculation and followed through admirably.

Another novel that gave a far more “realistic” view of an old, favorite SF trope, is Kim Stanley Robinson’s Aurora.  There is much to love about this book, but it is not lovable.  It’s a clear-eyed look at what an interstellar generation ship would actually be like.  And it is bleak, in terms of the traditions of SF.  Suffice it to say without giving away too much that Robinson fully incorporates entropy into his formula with predictably gloomy results, but for all that it is a thoroughly engaging work.

At the other end of the “hard” SF spectrum is Charles Gannon’s Fire With Fire.  Future interstellar expansion brings humanity into contact with our neighbors.  The resulting tensions drive the novel.  I reviewed it here.

Science fiction is a broad, broad field and has room for a magnificently wide range even on the same subjects.  It even has room, as I noted above, for exceptional style.  One of the most enjoyable reads for me, on that note, was Ian McDonald’s new novel, Luna.  There will be comparisons made to Heinlein’s The Moon Is A Harsh Mistress.  Look for an upcoming review where I will argue that the comparison, while in some ways valid, is superficial.  Anyone who has not read McDonald, treat yourself.  This would be a good one with which to begin.

In a completely different area of the playground, there is Daryl Gregory’s Afterparty, which I found excellent. It’s about drug abuse and the workings of delusion and murder. Anything I might say here would spoil it. Go. Find it. Imbibe.

The bulk of my reading, after that and a few other titles, has been scattered. I found a brand-new history of Group f/64, which was the first dedicated group of photographers to push the pure art of the straight photograph. Ansel Adams, Edward Weston, Imogen Cunningham, and several others established, in the Twenties and Thirties, the ground upon which all photography came to be viewed for the rest of the 20th century and even, arguably, into today. Mary Street Alinder, who has previously written a biography of Ansel Adams, did a superb job chronicling this group of prickly, independent artists.

I read a history of a superhero, Wonder Woman, and discovered that the story of her creation was even stranger than the character herself.

A new work by journalist Johann Hari, Chasing The Scream, opened my eyes to the thorny issue of the Drug War.

In the wake of seeing the film Interstellar and beginning work on my own novel about (partly) interstellar travel, I dove into Kip Thorne’s Black Holes & Time Warps and had my mind bent in some ways I didn’t think it could be bent.  This has prompted a reengagement with science on this level which is proving difficult, tedious, and yet rewarding.  My mind no longer has the plasticity it once enjoyed.  On the other hand, experience has proven a benefit in that I seem to be absorbing and comprehending at a much deeper level.  We shall see.

Quite a bit of history, much of it unfinished. In a separate reading group, I’m going through Victor Hugo’s Les Misérables, and reading in the history of the French Revolution, the Republic, and its fall, all partly to complete the third novel of my trilogy, but also because the literature available is so rich and surprising that it has become its own pleasure. It would seem now I’m about to embark on early American history again, anchored by Ron Chernow’s biography of Alexander Hamilton.

There was a new Mary Russell novel this past year, Dreaming Spies, by Laurie R. King.  I discovered a Dan Simmons novel about Holmes which I’d overlooked when it came out, The Fifth Heart, in which he is paired with Henry James, one more in a long line of novels and stories concerning Holmes’ unlikely interaction with historical figures.  Simmons is a terrific writer, but even he tended toward the tedious in this one.  He needs to learn to leave his research in his files.  But it was a unique take on Holmes and he even managed to elicit my sympathy toward James, a writer I find problematic at best, insufferable at worst, and annoying the rest of the time.

So much for the highlights.  Let me end by noting that the Best American series has finally realized that science fiction and fantasy are a real thing and launched one of their annual collections to cover it.  This after both Best Of infographics and comics.  Better late than never, I suppose.  The series editor is John Joseph Adams—difficult to imagine better hands—and this first volume was edited by Joe Hill, which I found interesting to say the least.  Mr. Hill is a horror writer.  Certainly many of the stories have a strong horror element, but over all this is a collection full of marvels, from the writing to the ideas.  I’ll try to keep track of this one in future.

So while not numerically great, 2015 was filled with many very excellent books.  I’m looking forward to 2016.  My stack awaits.

Happy New Year.


Easy Habits and the Consequences of Belief

At first glance, the two books could not be more different. Subject, tone, everything seems different. Not at odds so much as…nonoverlapping.

Which is ironic, since both deal, in their separate ways, with that very idea, the separation of areas of knowing.

Probably because I read them so close together I recognized their shared concerns as clearly as I did. Whatever the reason, it struck me as obvious in so many ways that I began to recall all the other books over the last few years that could be likewise gathered within this same subset, all on distinct topics and yet all based on, to some degree, an analysis of the same human propensity to disregard evidence when it contradicts belief.

Let me begin with the more general of the two.

Jerry A. Coyne is an evolutionary biologist.  Compared to others with public profiles, he has published few books.  Three, to be precise.  His latest, Faith Vs. Fact: Why Science and Religion Are Incompatible, is a direct challenge to Stephen Jay Gould’s idea of “nonoverlapping magisteria.”  Gould’s premise is that science and religion should not conflict with each other because they are concerned with entirely separate realms of knowing—hence the nonoverlapping part—and except for certain agenda-driven partisans, there is no reason for them to be in conflict.  Coyne sees this as accommodationism, which he thoroughly discredits in his book.

My claim is this: science and religion are incompatible because they have different methods for getting knowledge about reality, have different ways of assessing the reliability of that knowledge, and, in the end, arrive at conflicting conclusions about the universe.  “Knowledge” acquired by religion is at odds not only with scientific knowledge, but also with knowledge professed by other religions.  In the end, religion’s methods, unlike those of science, are useless for understanding reality.

Coyne identifies accommodationism as an attempt not to stir the hornet’s nest, because scientists are often dependent on politically sensitive funding. Science, especially Big Science dealing with questions of origins, is expensive, and the days of the independently wealthy scientist are largely gone. Rocking the boat by annoying those who hold the purse strings would seem ill-advised.

The conclusions, he goes on to argue, of scientific inquiry continually poke holes in those claims by religion that still assert authority over secular matters.

But more than that, such deferral to authority erodes our critical capacity and can lead to false conclusions and bad judgments, all because we accept the hegemony of “faith.”

…a word defined in the New Testament as “the substance of things hoped for, the evidence of things not seen.”  The philosopher Walter Kaufmann characterized it as “intense, usually confident, belief that is not based on evidence sufficient to command assent from every reasonable person.”

The sticking point for many would be that “reasonable person” proviso, for surely, as Coyne concedes—often—there are many reasonable people who nevertheless espouse a religious faith.  What are we to make of that?  Is the basis for the assessment of “reasonable” perhaps too ill-defined or is the scope of “belief” too broad?

Coyne takes us through the process, giving a thorough explication of why science and religion are incompatible, if not a convincing argument for the downside of belief for belief’s sake.  He tells an anecdote early in the book about the course in evolution he teaches undergraduates each year: the majority do very well in it, but many admit, after earning their A, to not believing a word of what he taught.  Because it contradicts their religious belief.

It is this that Coyne sees as the dangerous aspect of faith as promulgated through religion, the a priori rejection of evidence in favor of a set of usually unexamined beliefs.  He takes us through a catalogue of negative manifestations, from the rejection of medicines to the acts of terrorists to the rejection of solid science (like climate change), that have their underlying justifications in religion.

Coyne, an admitted atheist, puts all this forward while taking pains to admit the personal comfort to be found in religion by many people.  From a certain point of view he tends to go the extra kilometer to be fair.  Of course, going through the comments on various review sites, those predisposed to reject such arguments accuse him of profound bias if not outright malicious intent.  One cannot help but wonder if they bothered to read the book, all or even in part.

The book makes its case clearly and concisely.  It avoids the polemic outrage to be found in other tomes by big name atheists by sticking largely to evidentiary concerns and philosophical arguments.

But, one may ask, so what?  Religion and science are two realms in which most people would assume they have no stake. Castles in the air, esoteric arguments about things that have no impact on our daily lives.  Most people seem to keep a religion much the same way they have a pet and certainly in the West the majority live secular lives and only rarely feel compelled to apply their religious convictions to anything.  As for science, as long as the technology we depend on works, all the rest is so much theoretical handwaving.  It makes no difference if we have almost no understanding of quantum mechanics and certainly evolution is just tenure-building nonsense having to do with million-year-old bones and what kind of textbooks the school district might use next year.  Nothing to do with us in our daily lives. So what if people rely more on faith and belief in making their daily judgments than on evidence-based science?  We operate more on heuristics defended by aphorism than by reason and applied understanding, or so Daniel Kahneman tells us in his excellent study, Thinking, Fast and Slow, and we by and large get along perfectly well that way.

How does this argument concern me?

Johann Hari’s new book, Chasing The Scream, has an answer to that.  Of sorts, if we but make the epistemological leap.

Hari is writing about the drug war. It is, for him, as much a personal examination as a journalistic one, as he admits to having family and friends who are addicts.  He begins with a simple question.

I scribble down some questions that had puzzled me for years.  Why did the drug war start, and why does it continue?  Why can some people use drugs without any problems, while others can’t?  What really causes addiction?  What happens if you choose a radically different policy?

Okay, four questions.  Still, simple, basic questions, to which any reasonable person might reasonably expect a reasonable answer.

Yet we spend billions, bully small countries, destroy thousands if not millions of lives, all in pursuit of policies which rest on an appalling lack of informed justification.  By the end of the book you come to see that there are no answers to those simple questions which in any way validate our continuing on with things as they are.  As they have been for a hundred years.

Hari goes back to the beginning of the war, before drugs were illegal, and takes us through the history.  Back to Harry Anslinger, the head of the Federal Bureau of Narcotics, who fueled the drug war for the twin purposes of maintaining his agency and exorcizing demons that had possessed him since childhood, even in the face of substantive research and sound arguments denying that his approach to the problem had any merit, and more than ample evidence—Prohibition—that it would lead to catastrophe, not only on a national but on a global level.  Hari details Anslinger’s battle to destroy Billie Holiday and his use of intimidation and police tactics and, subsequently, U.S. foreign policy to assure the continued crusade to eradicate drugs and destroy addicts.

Not because any evidence led Anslinger to think this was the best, or only, way, but because he believed it in spite of growing mountains of evidence that this was a wrongheaded approach.  He suppressed evidence, hounded physicians who dared present alternative models of the drug problem, intimidated politicians—except those who could secure him funding—and strenuously denied the validity of any evidence that contradicted his belief.

He believed.

As many did and still do.  It is this that trumps hard evidence.

Even as a young adolescent I thought the argument for our drug policies was lacking.  I thought at the time that I just didn’t understand.  Never having been in the least interested in drugs and knowing few if any who were involved with anything more serious than marijuana, it seemed not to concern me.  Later, I did a short stint as a volunteer drug counselor, but the work was far more than I could handle at the time.  I trusted the people in charge knew what they were doing and certainly the gang violence associated with drugs seemed to make a persuasive case that this was a worthwhile and often desperate fight.

But as the years went by and the war continued, I began to notice bits of research here and there and how certain politicians contradicted themselves and how the prison population, especially in the wake of Nixon’s near militarization of the police community and the drug war ethos, was growing and in very worrisome ways.  I began to seriously rethink my position with Reagan’s zero tolerance policies and the mandatory sentencing guidelines he established through Ed Meese, one of the notable Puritans of the modern age.  Even so, I had the nagging suspicion that maybe I was just missing something.  Certainly I didn’t approve of drug addiction, but I more and more came to believe that these people needed help, not punishment.  We understand that process very well with alcohol, why is it different with narcotics?

And besides, there are plenty of people who receive perfectly legal prescription narcotics and never become addicts.

The number of holes in the picture kept growing.  I no longer trusted stated drug policy, but I didn’t understand the intransigence of people over reform.

Hari’s book lays it out very clearly.  Money is high on the list.  We fund too many of the wrong people at too high a level for them to be easily weaned from the teat.  Foreign policy is also tied to this, especially in these days of international terrorism which has a strong drug component.  But the factor that ties this in to Jerry A. Coyne’s book is the one Hari covers only glancingly.

Belief.

It is easier to rely on what we have always believed than to look at evidence that requires us to change our mind.

Many aspects of our lives are covered by this observation, but problems arise where there are political and social ramifications.  We cling to persistent beliefs about the poor, about minorities, about people with different beliefs, even when we are given evidence which significantly challenges those beliefs and suggests strongly that not only are they wrong but that we would be better off discarding them.

Hence my conflation of these two books and the suggestion that they share a common idea.

Not that I argue that all beliefs are wrong.  What is wrong is the intractable nature of unquestioned belief.  The only reason the drug war continues is that it has considerable popular support and the only reason it has that is that many people cannot bring themselves to change their minds about it in the face of not only evidence but what we euphemistically call common sense.

But that can be said of so many things which directly and indirectly impact our lives.

Perhaps it is a stretch and perhaps I argue out of my own biases, but it seems to me the most valuable tool we can have in our intellectual toolbox is the ability to say “well, I believe that but I might be wrong.”  Faith cannot be maintained, however, among people who are able to say that and apply it to anything and everything.  Science is built on exactly that principle—all knowledge is conditional—but belief, as exemplified by religion, thrives in the absence of that principle.  It says some knowledge is absolute, not to be questioned.

In the contemplation of matters of theological concern, this perhaps offers consolation, comfort, a certain utility, to be sure.  But it is easy for unquestioning acceptance of arguments from authority to become a habit, one we then apply to anything our prejudices suggest we may wish to avoid examining.  Drug addiction is an unfortunate affliction and it may be uncomfortable for people to see it for what it is—a disease, like alcoholism.  That discomfort makes room for assertions from authority offered by people who claim to be working on our behalf.  Our unwillingness to ask evidentiary questions is a cozy environment for demagogues and despots.

When you ask “What does this have to do with me?” you might be surprised at the consequences of avoidance and the price of unquestioning trust.  Why should we learn science and how to apply the discipline of examining evidence?  So we don’t hurt ourselves out of the easy habit of belief.

Traditions and New Eyes

I recently finished rereading a book from last year, preparing to read the sequel. I should cop to the fact that my reading has rarely been what you might call “timely” and I’ve gotten worse over the last several years.  When I wrote reviews for actual pay this was not as much a problem, because I had to read current material.  But left to my own devices, I pick and choose from my to-be-read pile at random, pretty much the way I buy books to begin with.  So I might read an old Agatha Christie concurrently with a newer physics tome by Kip Thorne after having finished a very new history of the sinking of the Lusitania, then pick up a newish novel while at the same time rereading some Ted Sturgeon… So it goes.

So I am very much “behind” almost all the time.  I served as a judge for the PKD Award one year and managed to read an unbelievable number of recently-published SF novels.  I can commit and stay the course when required. But in general my reading keeps me in a kind of ever-imminent nontime in terms of how I encounter works.  I don’t sort by decade when I start reading, not unless something in the text forces me to recognize it.  So to me, it is not at all odd to see James Blish and Iain M. Banks and C.J. Cherryh and Ann Leckie as in some sense contemporaneous.

So when I encounter a novel like Charles E. Gannon’s Fire With Fire I have no trouble—in fact, take some delight—in seeing it as part of a continuous thread that connects Doc Smith to Poul Anderson to C.J. Cherryh to any number of others who over the past 70+ years have mined the fields of alien encounter/politicomilitary SF.  And when I say I found the closest affinity with Poul Anderson at the height of his Terran Empire/Flandry period, that is, for me, high praise.

I loved Dominic Flandry.  Not so much the character, though there is that, but the milieu Anderson created.  One of the appealing aspects of his future history, especially those stories, was the authentic “lived in” feel he achieved, rarely duplicated by his peers, and seldom realized to good effect now.  Gannon does this.

The story in Fire With Fire is nothing new.  Earth has begun to settle other worlds around other stars and it’s only a matter of time before we encounter other space-faring civilizations.  In fact, we have, only it isn’t public knowledge, and in some instances it’s not something the discoverers even want noticed.  While Anderson had the Cold War to work with, Gannon has the transnational world, with all its disquieting ambiguities over what constitutes nations and how they differ from corporations and the undeniable motivation of profit in almost all human endeavors, leading to an ever-shifting array of allies and enemies in arrangements not always easy to define, much less see.  He takes us through all this quite handily.  It’s not so much that he knows the pitfalls of human civilizations as that he recognizes that the field is nothing but pitfalls.

“All that is necessary for evil to succeed is that good men do nothing.”  The dictum, commonly attributed to Edmund Burke, plays in the background throughout as good guys dupe both bad guys and good guys, people are moved around and used like game pieces, power is unleashed—or not—based on calculi often having little to nothing to do with ethics and morality.  This is politics writ large, and individuals learn to surf the swells or drown.

Into which is tossed Caine Riordan, an investigative journalist who is also a good man.  He is unfortunately snatched out of his life through a security mishap, placed in cryogenic suspension, and awakened 14 years later with a hundred or so hours of missing memory dogging him through the rest of the book, memories having to do with the two men in whose thrall he now seems to be: Nolan Corcoran, retired admiral, and Richard Downing, former SAS and often reluctant aide to Corcoran.  Not reluctant in being unwilling to serve, but reluctant about some of their methods.  They run a secret organization designed to prepare for exosapient first contact.  It practically doesn’t exist, sort of in the way gravity under certain conditions doesn’t exist, and now Caine has become their tool.

Without going into details, which are a major aspect of this novel, suffice it to say it is about that first contact and the political ramifications thereof.  This is not a new idea, and much of the book may, to some, feel like ground well trod, but there is ample pleasure to be had in the trek over familiar ground seen through fresh eyes.  What is done better here than the usual is the economic and political backgrounding and the debates over the impact of first contact.  Furthermore, the seemingly impossible disaffection among the various political entities comprising the world we know is displayed to lend a plangent note of nailbiting despair to the very idea that we might pull ourselves together sufficiently for anything remotely resembling a world government.

To be sure, Gannon adroitly addresses the hoary old notion that when we meet the aliens they themselves will already have worked all this out long since and be in a position to pass elder judgment on our upstart species.  They haven’t.  They have a (barely) workable framework among themselves, but the advent of another new race in their club proves to be an opportunity for old issues to be forged into new knives.

Gannon handles all this well.  He clearly has a grasp of how politics works and has imaginatively extended that knowledge to how nonhuman species might showcase their own realpolitik.  He has a flair for detail.  He handles description very well, sets scenes effectively, and even manages to disguise his infodumps as conversations we want to hear.  Most of the time, it has the pleasurable feel of listening to a good musician groove on an extended improvisation.  Throughout we feel simpatico with Caine and the people he cares for, and the situation is certainly compelling.

For me, this was a walk down a street I haven’t visited in some time.  I read this novel with considerable nostalgia.  It is part of a tradition.  A well-executed piece of an ongoing examination of issues we, as SF fans, presumably hope one day to see in reality.  We keep turning this particular Rubik’s Cube over in our collective hands, finding variations and new combinations, looking for the right face with which to walk into that future confrontation.  One may be forgiven if this particular form of the contemplation seems so often to turn on the prospect of war.  After all, aren’t we supposed to be past all that by the time we develop star travel and put down roots elsewhere?

There are two (at least) answers to that.  The first, quite cynically, is “Why would we be?”  Granted that most wars have at least something to do with resources.  One side wants what the other side has, and you can do the research and find cause to argue that even the most gloriously honor-driven wars had deep economic aspects to them.  Certainly the conduct of all wars has deep economic consequences.  But while that is true and might be argued for most wars, it is also true that many wars need not have been fought, as there were other means of securing those resources.  But that didn’t matter.  It was the war, for someone, that mattered more even than the well-being of the people, the polity.  That ill-defined and in retrospect absurd thing called Glory is a very real ambition down through history.  Wars get fought as much for that as for anything else.  Suddenly having all your resource needs met would do nothing to dampen that.  In fact, it might exacerbate the Napoleonic impulse in some instances.

Because the reality is that Going There will do that.  Not survey missions, no, but if we assume the level of technology and capacity that allows for colonies on worlds in other solar systems, then we can assume a post-scarcity economy.  It’s the only way it makes sense.  We will not solve economic problems with an interstellar empire; the empire will be the result of those solutions.

So that leaves us with the second reason we may still face war.  No less cynical but more intractable. Racism.  Not the kind of small-minded nonsense we deal with in terms of skin color and language, but the real deal—wholly different biologies confronting each other over the question of intelligence and legal rights and the desirability of association.  Deeper even than that is the history and tradition brought to the question by two civilizations with absolutely nothing in common, having developed in isolation more profound than any we might imagine on the face of the Earth.

Not that either of these is inevitable, and it may well be that sophistication of technology and its responsible use breeds the requisite tolerances.  But this is, however likely it sounds philosophically, not a given, any more than war with aliens is inevitable.  So we talk about it, in the pages of fictions with long traditions.  There are certainly other possibilities, other scenarios, and there are other writers dealing with those.  Gannon is dealing with this one.

And doing so with a thick cord of optimism that raises this above the level of the usual “Deltoid Phorce” clone in the tradition of Tom Clancy or some other purveyor of gadget-driven war porn.  Gannon asks some questions in the course of this novel which keep it from descending to the level of the field-manual-with-body-count-in-technicolor.  This is more like what Poul Anderson would have written, with no easy answers, and heroes who are not unalloyed icons.

It’s worth your time.

Nicely done, Mr. Gannon.

Taste and Quality

Obliquely, this is about a current debate within science fiction. However, the lineaments of the argument pertain to literature as a whole.  I offer no solutions or answers here, only questions and a few observations.  Make of it what you will.

Reading experience is a personal thing. What one gets out of a novel or story is like what one gets out of any experience and being required to defend preferences is a dubious demand that ultimately runs aground on the shoals of taste.  I once attended a course on wine and the presenter put it this way: “How do you know you’re drinking a good wine? Because you like it.”  Obviously, this is too blanket a statement to be completely true, but he made his point.  If you’re enjoying something it is no one’s place to tell you you’re wrong to do so based on presumed “objective” criteria.  That $200.00 bottle of Sassicaia may fail to stack up against the $20.00 Coppola Claret as far as your own palate is concerned and no one can tell you your judgment is wrong based on the completely personal metric of “I like it/I don’t like it.”

However, that doesn’t mean standards of quality are arbitrary or that differences are indeterminate.  Such are the vagaries and abilities of human discernment that we can tell when something is “better” or at least of high quality even when we personally may not like it.

For instance, I can tell that Jonathan Franzen is a very good writer even though I have less than no interest in reading his fiction.  I can see that Moby-Dick is a Great Novel even while it tends to bore me.  I acknowledge the towering pre-eminence of Henry James and find him an unpalatable drudge at the same time.

On the other end of the spectrum, I can see how Dan Brown is a propulsive and compelling story-teller even while I find him intellectually vacuous and æsthetically tedious.

My own personal list of what may be described as guilty pleasures includes Ian Fleming, Edgar Rice Burroughs (but only the John Carter novels; never could get into Tarzan), and a score of others over the years who caught my attention, appealed for a time, and have since fallen by the wayside, leaving me with fond memories and no desire to revisit.  A lot of the old Ace Doubles were made up of short novels of dubious merit that were nevertheless great fun for a teenager on a lonely afternoon.

I would never consider them Great Art.

Taste is the final arbiter.  But using it to determine quality—rather than allowing quality to determine taste—is doomed because taste changes.  Works you might strenuously defend at one time in your life can over time suffer as your taste and discernment evolve.  It’s sad in one way because it would be a fine thing to be able to summon up the same reactions experienced on one of those lonely afternoons, aged 16, and poring through the deathless excitement of a pulp adventure you might, given your enthusiasm, mistake for Great Writing.

I try always to make a distinction between things I like and things I think are Good.  Often they’re the same thing, but not always, and, like other judgments humans make, they tend to become confused with each other.  Hence, debate over merit can take on the aspects of an argument on that day at the base of the Tower of Babel when people stopped understanding each other.

But if that’s all true, then how do we ever figure out which standards are valid and which bogus?  I mean, if it’s ALL subjective, how can any measure of quality ever rise to set the bar?

Fortunately, while personal experience is significant, collective experience also pertains. History, if you will, has taught us, and because art is as much a conversation as a statement we learn what works best and creates the most powerful effects over time. Having Something To Say that does not desiccate over time is a good place to start, which is why Homer still speaks to us 2500 years after his first utterances.  We derive our ability to discern qualities from our culture, which includes those around us informing our daily experiences.  In terms of literature, the feedback that goes into developing our personal values is a bit more specific and focused, but we have inexhaustible examples and a wealth of possible instruction.  We do not develop our tastes in a vacuum.

Honest disagreement over the specific qualities of certain works is part of the process by which our tastes develop. I might make a claim for Borges being the finest example of the short story and you might counter with de Maupassant—or Alice Munro. Nothing is being denigrated in this. The conversation will likely be edifying.

That’s a conversation, though.  When it comes to granting awards, other factors intrude, and suddenly instead of exemplary comparisons we have competition, and that can be a degrading affair unless standards are clear and processes fairly established.  Unlike in a conversation, however, quality necessarily takes a back seat to simple preference.

Or not so simple, perhaps. Because any competition is going to assume at least a minimum of quality that may be universally acknowledged. So we’re right back to trying to make objective determinations of what constitutes quality.

If it seems that this could turn circular, well, obviously. But I would suggest it only becomes so when an unadmitted partisanship becomes a key factor in the process.

This can be anything, from personal acquaintance with the artist to political factors having nothing to do with the work in hand. Being unadmitted, perhaps even unrecognized, such considerations can be impossible to filter out, and for others very difficult to argue against. They can become a slow poison destroying the value of the awards. Partisanship—the kind that is not simple advocacy on behalf of a favored artist but is instead ideologically based, more against certain things rather than for something—can deafen, blind, reduce our sensibilities to a muted insistence on a certain kind of sensation that can be serviced by nothing else. It can render judgment problematic because it requires factors be met having little to do with the work.

Paradoxically, art movements, which are by definition partisan, have spurred innovation if only by reaction and have added to the wealth of æsthetic discourse. One can claim that such movements are destructive and indeed most seem to be by intent. Iconoclasm thrives on destroying that which is accepted as a standard and the most vital movements have been born of the urge to tilt at windmills, to try to bring down the perceived giants.  We gauge the success of such movements by remembering them and seeing how their influence survives in contemporary terms.

Those which did not influence or survive are legion.  Perhaps the kindest thing to be said of most of them was that they lacked any solid grasp of their own intent.  Many, it seems, misunderstood the very purpose of art or, worse, lacked any comprehension of truth and meaning.  More likely, they failed to distinguish between genuine art and base propaganda.

How to tell the difference between something with real merit and something which is merely self-serving?  All heuristics are suspect, but a clear signal that something other than pure artistic intent is at play is the advent of the Manifesto.  Most are hopelessly locked in their time, and the most innocent of them are cries against constraint.  But often there’s an embarrassing vulgarity to them, a demand for attention, an insistence that the work being pushed by the manifesto has merit if only people would see it.

Not all manifestos are signs of artistic vacuity, but those that front for worthwhile work usually fade quickly from service, supplanted by the work itself, and are soon forgotten.  Mercifully.  We are then left with the work, which is its own best advocate.  In hindsight it could be argued that such work would have emerged from the froth all on its own, without the need of a “movement” to advance its cause.  Unfortunately, art requires advocates, beginning with the simplest form of a purchase.  In crowded fields overfull of example, the likelihood of a lone artist succeeding on his or her own, without advocacy, is slim.

Advocacy for an individual artist, by a cadre of supporters, can make or break a career.  And this would of course be a natural development of widespread appreciation.  It’s organic.

Advocacy for a perceived type of art begins to suffer from the introduction of agendas having less to do with the artists than with a commitment to the aforementioned windmill-tilting.

The next phase is advocacy of a proscriptive nature—sorting out what belongs and doesn’t belong, measuring according to a prescribed set of protocols—which has little to do with individual works and much to do with the æsthetic and political prejudices of the movement.  The quality of a given work is less important at this stage than whether it “fits” the parameters set by the movement’s architects.  Taste plays a smaller and smaller role as the movement meets opposition or fails to advance its agenda. With the demotion of taste comes the desiccation of quality.  The evocative ability of art, its facility to communicate things outside the confines of the manifesto-driven movement, eventually becomes a kind of enemy.  We’re into the realm of cookie-cutter art, paint-by-numbers approaches, template-driven.  Themes are no longer explored but enforced, preferred message becomes inextricable from execution, and the essential worth of art is lost through disregard of anything that might challenge the prejudice of the movement.

This is a self-immolating process.  Such movements burn out from eventual lack of both material and artists, because the winnowing becomes obsessional, and soon no one is doing “pure” work according to the demands of the arbiters of group taste.

As it should be.  Anything worthwhile created during the life of the movement ends up salvaged and repurposed by other artists.  The dross is soon forgotten.  The concerns of these groups become the subject of art history discussions.  The dismissal of works in particular because “well, he’s a Marxist” or “she was only an apologist for capitalism”—factors which, if they are the chief feature of a given work, might very well render it ephemeral, but which in many instances have little to do with content—prompts head-scratching and amusement well after the fury of controversy around them has faded.

Given this, it may seem only reasonable that an artist have nothing to do with a movement.  The work is what matters, not the fashions surrounding it.  Done well and honestly, it will succeed or fail on its own, or so we assume.

But that depends on those ineffable and impossible-to-codify realities of quality and taste.  Certainly on the part of the artist but also, and critically, on the part of the audience.

Here I enter an area difficult to designate.  The instant one demands a concrete description of what constitutes quality, the very point of the question is lost.  Again, we have heuristics bolstered by example.  Why, for instance, is Moby-Dick now regarded as a work of genius, by some even as the great American novel, when in its day it sold so poorly and its author almost died in complete obscurity?  Have we become smarter, more perceptive? Has our taste changed?  What is it about that novel which caused a later generation than Melville’s contemporaries to so thoroughly rehabilitate and resurrect it?  Conversely, why is someone like Jacqueline Susann virtually unremarked today after having been a huge presence five decades ago?

I have gone on at some length without bringing up many examples, because taste and quality are so difficult to assess.  What one “likes” and what one may regard as “good” are often two different things, as I said before, and have as much to do with our expectations on a given day of the week as with anything deeply considered and well examined. My purpose in raising these questions—and that’s what I’ve been doing—has to do with a current struggle centering on the validity of awards as signs of intrinsic worth.

The best that can be said of awards as guideposts to quality is that if a group of people, presumably in possession of unique perspectives and tastes, can agree upon a given work as worthy of special note, then it is likely a sign that the work so judged possesses what we call Quality.  In other words, it is an excellent, indeed exceptional, example of its form.  I’ve served on a committee for a major award, and over the course of months the conversations among the judges proved educational for all of us, eventually shedding the chaff and leaving a handful of works that represented what we judged to be the best that year of the kind of work we sought to award.

I never once found us engaged in a conversation about the politics of the work.  Not once.

Nor did we ever have a discussion about the need to advance the cause of a particular type of work.  Arguments over form were entirely about how the choice of one over another served the work in question.  When we were finished, it never occurred to me that a set of honest judges would engage in either of those topics as a valid metric for determining a “winner.”  No one said, “Well it’s space opera and space opera has gotten too many awards (or not enough)” and no one said, “The socialism in this work is not something I can support (or, conversely, because of the political content the faults of the work should be overlooked for the good of the cause).”  Those kinds of conversations never happened.  It was the work—did the prose support the premise, did the characters feel real, did the plot unfold logically, were we moved by the story of these people.

Consensus emerged.  It was not prescribed.

This is not to say other metrics have no value, but they can be the basis of their own awards.  (The Prometheus Award is candidly given to work of a political viewpoint, libertarianism.  It would be absurd for a group to try to hijack it based on the argument that socialism is underrepresented by it.)  But even then, there is this knotty question of quality.

Here’s the thorny question for advocates of predetermined viewpoints: if an artist does the work honestly, truthfully, it is likely that the confines of manifesto-driven movements will become oppressive and that artist will do work that, eventually, no longer fits within those limits.  To complain that the resulting work is “bad” because it no longer adheres to the expectations of that group is as wrongheaded as declaring a work “good” because it does toe the proper line.

Because that line has nothing to do with quality.  It may go to taste.  It certainly has little to do with truth.

Time and Motion

William Gibson is, if nothing else, a careful writer.  You can feel it in the progress of any one of his novels and in the short stories.  Careful in his choice of topic, placement of characters, deployment of dialogue, style.  He sets each sentence in place with a jeweler’s eye to best effect.  The results often seem spare, even when they are not, and have invited comparisons to noir writers, minimalists, modernists.  Entering upon a Gibson novel is a step across a deceptively simple threshold into a finely-detailed maze that suggests multiple paths but inevitably leads to a conclusion that, in hindsight, was already determined had we but noticed just how sophisticated a writer it is with whom we’re dealing.

His last novels, the Bigend Trilogy, were not even science fiction, though they felt like it.  The application of a science-fictional perception of how the world works produced a dazzling bit of dissonance in which the ground itself became familiar through alienation.  He does that, shows us something we should be utterly familiar with as if it were an alien artifact.  As a result, the shock of recognition at the end contains a thick cord of nostalgia and a sense of loss mingled with new discovery.  The chief discovery, of course, is the realization of just how close we are to what we think of as The Future.  Through this effect, he renders the future as both less alien and stranger at the same time.

Which is something he indulges fully in the opening chapters of his new novel, The Peripheral.

Author William Gibson. (by Michael O'Shea)

For a while you don’t know that the two points of view are not in the same world.  It’s a masterpiece of misdirection achieved through the intermediary of a game.

Flynn Fisher’s brother is ex-special ops military, living in an old Airstream in a town in the middle of a mid-21st century rural America that is clearly struggling with the unstable economy.  To make extra money, he often moonlights as a beta tester on new games.  The novel opens when he brings Flynn in to sub for him one night while he goes off to confront a radical religious group he hates, known as Luke 4:5.  (The verse reads: Then leading him to a height, the devil showed him in a moment of time all the kingdoms of the world.  Even here, Gibson is playing at metaphors pertinent to the novel in its entirety.)  Flynn used to do this sort of work herself but quit when the games became more and more violent.  He assures her this isn’t like that; she’ll be running a security drone of some kind, keeping paparazzi away from a high-rise luxury apartment.  He’ll pay her well, as he’s being likewise well paid.  Just one night, maybe two.  She agrees.

The simulation seems to take place in a city she half recognizes, which may be London, but it’s all different from the London she knows.  It’s just as her brother claimed, flying interference, until the second night, when the woman living there is most horrifically murdered and Flynn is a witness.  Thinking it’s still a game, she wants nothing more to do with it.

Meanwhile, Wilf Netherton, a publicist living in London, is working with a performance artist who has been tasked as a negotiator to a colony of self-modified humans living on an artificial island of reformed debris.  Wilf’s job is to keep her on task, which can be very difficult as she is very much a rebel and can go in unexpected directions without any warning.  As she confronts those with whom she is supposed to negotiate, something goes wrong and she ends up killing the leader.  Another murder.

Netherton’s associate, an operative in government intelligence, must divorce herself from the fiasco and cut ties with Netherton.  He goes to ground with a friend of his, a member of a powerful family of Russian descent, who has a unique hobby—he operates a “stub” in history.

At this point we realize that Flynn and Netherton are not simply divided by class and place but by time itself.  Netherton’s London is 70 years in Flynn’s future and is the London wherein Flynn witnessed the murder of the woman, who turns out to be the sister of the performance artist who just committed a second murder.  For her part, Flynn is in their past, a past Netherton’s friend has been playing with via a form of time travel that is based on the transfer of information.

And we are now fully in the grip of one of the cleverest time travel stories in recent memory.  Nothing physical travels, only information.  Gibson has taken a page from Benford’s classic Timescape and wrought changes upon it.  Flynn and Netherton “meet” once a police inspector of Netherton’s time becomes involved and starts running the stub Netherton’s friend has set up.  She needs a witness to the murder before she can act.  Flynn is that witness.  What follows is a well-imagined set of antagonistic countermeasures that affect both worlds economically.

And that may be one of the most interesting subtexts.  Flynn finds herself the titular head of the American branch of a corporation which till then only existed as a device to explain the game she thought she was beta testing.  As such, she becomes enormously wealthy out of necessity—she is under attack by the forces allied to the murderer in the future.  Politicians and corporations change hands, the economy is distorted, the world is severed from its previous course, and everything is changed.

Gibson is indulging one of his favorite ideas, that information is possibly the most potent force.  Data has consequences.

Flynn is one of Gibson’s best creations since Molly Millions.  Smart, gutsy, practical, and loyal to family and friends, she adapts quickly to the staggering reality into which she and hers have stumbled.  She manages in both time zones admirably but not implausibly.  As counterpart, Netherton is an interesting case study of a man who hates the times in which he lives, is far too intelligent to ignore that fact, and consequently suffers a number of self-destructive flaws, which he gradually comes to terms with as his interactions with Flynn progress.

At the heart of the novel is a question of causality, certainly, but also one of responsibility.  The pivotal point in history that separates Flynn’s world from Netherton’s is an event euphemistically called The Jackpot.  It’s a joke, of course, and a twisted one at that, as it was only a jackpot for a few who survived and became, ultimately, even wealthier than they had been.  The label refers to a collection of factors leading to the deaths of billions and the loss of an entire era, due to humanity’s inability to stop itself from doing all the things that guaranteed such an outcome.  It’s a cynical insight and not a particularly difficult one to achieve, but Gibson, as usual, portrays it with a dry assessment of how it will actually play out and how it will look to those who come after.  His conclusion seems to be, “Well, we really aren’t all in this together.”

The apparent simplicity of the narrative is another mask for the games Gibson plays.  It doesn’t feel like a profound or dense work.  Only afterward, in the assessment phase, do we begin to understand how much he says, how solid are his insights, and how rich are his conceits.  Gibson creates a surface over which the reader may glide easily.  But it’s a transparent surface and when you look down, there, below you, is a chasm of meaning, awaiting inspection, offered in a moment of time.

Future Historicity

History, as a discipline, seems to improve the further away from events one moves. Close up, it’s “current events” rather than “history.”  At some point, the possibility of objective analysis emerges and thoughtful critiques may be written.

John Lukacs, Emeritus Professor of History at Chestnut Hill College, understands this and at the outset of his new study, A Short History of the Twentieth Century, allows for the improbability of what he has attempted:

Our historical knowledge, like nearly every kind of human knowledge, is personal and participatory, since the knower and the known, while not identical, are not and cannot be entirely separate.

He then proceeds to give an overview of the twentieth century as someone—though he never claims this—living a century or more further on might.  He steps back as much as possible and looks at the period under examination—he asserts that the 20th Century ran from 1914 to 1989—as a whole, the way we might now look at, say, the 14th Century or the 12th and so on.  The virtue of our distance from these times is our perspective—the luxury of seeing how disparate elements interacted even as the players on the ground could not see them, how decisions taken in one year affected outcomes thirty, forty, even eighty years down the road.  We can then bring an analysis and understanding of trends, group dynamics, political movements, demographics, all that goes into what we term culture or civilization, to the problem of understanding what happened and why.

Obviously, for those of us living through history, such perspective is rare if not impossible.

Yet Lukacs has done an admirable job.  He shows how the outbreak and subsequent end of World War I set the stage for the collapse of the Soviet Empire in 1989, the two events he chooses as the bookends of the century.  He steps back and looks at the social and political changes as the result of economic factors largely invisible to those living through those times, and at how the ideologies that seemed so very important at every turn were more or less byproducts of larger, less definable components.

It is inevitable that the reader will argue with Lukacs.  His reductions—and expansions—often run counter to what may be cherished beliefs in the right or wrong of this or that.  But that, it seems, is exactly what he intends.  This is not a history chock-full of the kind of detail used in defending positions—Left, Right, East, West, etc.—and it is often stingy with detail.  Rather, this is a broad outline with telling opinions and the kind of assertions one might otherwise not question in a history of some century long past.  It is intended, I think, to spur discussion.

We need discussion.  In many ways, we are trapped in the machineries constructed to deal with the problems of the last century, and the machinery keeps grinding even though the problems have changed.  Pulling back from—or even out of—the in situ reactivity seems necessary if we are to stop running in the current Red Queen’s Race.

To be sure, Lukacs makes a few observations to set one’s back teeth on edge.  For instance, he dismisses the post-World War II women’s consciousness and equality movements as byproducts of purely economic conditions and the mass movement of the middle class to the suburbs.  He has almost nothing good to say about any president of the period but Franklin Roosevelt.

He is, certainly, highly critical of the major policy responses throughout the century, but explains them as the consequence of ignorance, which is probably true enough.  The people at the time simply did not know what they needed to know to do otherwise.

As I say, there is ample here with which to argue.

But it is a good place to start such debates, and it is debate—discussion, interchange, conversation—that seems the ultimate goal of this very well-written assay.  As long as it remains debate, this could be a worthy place to begin.

He provides one very useful definition, which is not unique to Lukacs by any means, yet remains one of those difficult-to-parse distinctions for most people and leads to profound misunderstandings.  He makes clear the difference between nations and states.  They are not the same thing, though the two usually overlap.  States, he shows, are artificial constructs with borders, governmental apparatus, policies.  Nations, however, are simply Peoples.  Hence Hitler was able to command the German nation even though he was an Austrian citizen.  Austria, like Germany, was merely a state.  The German People constituted the nation.

Lukacs—valuably—shows the consequences of confusing the two, something which began with Wilson and has rumbled tragically through even to this day.  States rarely impose a national identity; they rely on one already extant—though often largely unrealized.  And when things go wrong between states, quite often it is because one or the other has negotiated national issues with the wrong party.

Which leads to an intriguing speculation: nativist sympathies really do have a difficult time taking root in this country.  Americans do not, by this definition, comprise a Nation.  A country, a state, a polity, certainly.  But not really a Nation.

And yet we often act as if we were.

Questions.  Discussion.  Dialogue.  This is the utility and virtue of this slim volume.

Greatless Illusion

The third book I read recently which resonated thematically with the previous two is one I have come to somewhat late, given my inclinations.  But a new paperback edition was recently released and I considered buying it.  I hesitated, as I was uncertain whether anything new or substantively unique was contained therein to make it worth having on my shelf.  I have other books along similar lines, and while I am fond of the author, it seemed unlikely this book would offer anything not already covered.

Christopher Hitchens was a journalist and essayist who became one of our best commentators on current events, politics, and related subjects.  Even when I disagreed with him, I found his arguments cogent and insightful, and never less than solidly grounded in available fact.

So when he published a book of his views on religion, it seemed a natural addition to my library, yet I missed it when it first came out.  Instead, I read Richard Dawkins’ The God Delusion, which I found useful and well-reasoned, but pretty much a sermon to one who needed no convincing.  Such books are useful for the examples they offer to underpin their arguments.

Such is the case with God Is Not Great: How Religion Poisons Everything.  Hitchens’ extensive travels and his experiences in the face of conflict between opposing groups, often ideologically driven, promised a surfeit of examples, and he did not fail to provide them amply.

The title is a challenge, a gauntlet thrown at the feet of those with whom Hitchens had sizeable bones to pick.  In the years since its initial publication it has acquired a reputation, developed a set of expectations, and become something of a cause célèbre, sufficient for people to take sides without having read it.  I found myself approaching the book with a set of expectations of my own and, with mild surprise, had those expectations undermined.

Yes, the book is a statement about the nature of religion as an abusive ideology—regardless of denomination, sect, theological origin—and offers a full range of examples of how conflicts, both between people and peoples, are generally made worse by (or, more often than not, occur because of) religious infusions into the situation.  It is in many ways a depressing catalog of misuse, misinterpretation, misstatement, misunderstanding, and sometimes misanthropy born out of religious conviction.  Hitchens analyzes the sources of these problems, charts some of the history, and gives us modern-day examples.

But he tempers much of this by drawing a distinction between individuals and ideologies.

He also opens with the statement that, in his opinion, we shall never be rid of religion.  This is quite unlike Dawkins, who actually seems to feel humankind can be educated out of any need of it.  Hitchens understood human nature too well to hold any hope that this was possible.

He does allow that religion may possibly enable some good people to be better, but he does not believe it makes anyone good who is not already so inclined.

By the end of the book, there will likely be two reactions.  One, possibly the more common, will be to dismiss much of his argument as one-sided.  “He overlooks all the good that has been done.”  It is interesting to me that such special pleading only ever gets applied consistently when religion is at issue.  In so much else, one or two missteps and trust is gone, but not so in religion, wherein an arena is offered in which not only mistakes but serious abuse can occur time and time again and yet the driving doctrine is never called into question.  The other reaction will be to embrace the serious critique on offer, even the condemnations, and pay no attention to the quite sincere attempt to examine human nature in the grip of what can only be described as a pathology.

Because while Hitchens was a self-proclaimed atheist, he does take pains to point out that he is not talking about any sort of actual god in this book, only the god at the heart of human-made religions.  For some this may be a distinction without a difference, but for the thoughtful reader it is a telling one.  At the end of it all, Hitchens sees all—all—manifestations of gods through the terms of their religions as artifices.  And he wonders then why people continue to inflict upon themselves and each other straitjackets of behavior and ideology that, pushed to one extreme or another, seem always to result in some sort of harm, not only for the people who do not believe a given trope but for the believers themselves.

We are, being story-obsessed, caught in the amber of our narratives.  Per Mr. Thompson’s analysis of myth, we are never free of those stories—even their evocation for the purposes of ridicule brings us fully within them and determines the ground upon which we move.  The intractable differences over unprovable and ultimately unsubstantiated assumptions of religious dictate, per the history chronicled around the life of Roger Smith, have left us upon a field of direst struggle with our fellows, whose lack of belief is often perceived as a direct threat to a salvation we are unwilling ourselves to examine and question as valid, resulting in abuse and death borne out of tortured constructs of love.  Christopher Hitchens put together a bestiary of precedent demonstrating that treating as real the often inarticulate longings to be “right” in the sight of a god we ourselves have invented too often leads to heartache, madness, and butchery.

The sanest religionists, it would seem by this testament, are those with the lightest affiliation, the flimsiest of dedications to doctrine.  They are the ones who can step back when the call to massacre the infidel goes out.

All of which is ultimately problematic due simply to the inexplicable nature of religion’s appeal to so many.

But it is, to my mind, an insincere devotee who will not, in order to fairly assess the thing itself, look at all that has been wrought in the name of a stated belief.  Insincere and ultimately dangerous, especially when what under any other circumstance is completely wrong can be justified by that which is supposed to redeem us.

Monstrous Partiality

In keeping with the previous review, we turn now to a more modern myth, specifically that of our nation’s founding.  More specifically, one component which has from time to time erupted into controversy and distorted the civil landscape by its insistence on truth and right.

But first, a question:  did you know that once upon a time, in Massachusetts, it was illegal to live alone?

There was a law requiring all men and women to abide with families—either their own or others—and that no one, man or woman, was permitted to build a house and inhabit it by themselves.

John M. Barry details this and much more about early America which, to my knowledge, never makes it into history classes, at least not in primary or secondary schools, in his excellent book Roger Williams and the Creation of the American Soul: Church, State, and the Birth of Liberty.


Discussion of the Founding—and most particularly the Founding Fathers—centers upon the Revolutionary Era collection of savants who shaped what became the United States.  It is sometimes easy to forget that Europeans had been on these shores, attempting settlements, for almost two centuries by then.  It’s as if that period, encapsulated as it is in quaint myths of Puritans, Pocahontas, Squanto, John Smith, and Plymouth Rock, occupies a kind of nontime, a pre-political period of social innocence in which Individuals, whose personalities loom large yet isolated, like Greek Gods, prepared the landscape for our later emergence as a nation.  My own history classes, I recall, did little to connect the English Civil War to the Puritan settlements and even less to connect the major convulsions in English jurisprudence of that period to the evolution of political ideas we tend to take for granted today.  In fact, it seems pains are taken to sever those very connections, as if to say that once here, on North American soil, what happened in Europe was inconsequential to our national mythos.

That illusion is shattered by Barry in this biography of not only one of the most overlooked and misunderstood Founders but of that entire morass of religious and political struggle which resulted in the beginnings of our modern understanding of the wall of separation between church and state.  More, he makes it viscerally real why that wall not only came into being but had to be.

If you learned about Roger Williams at all in high school, probably the extent of it was “Roger Williams was a Puritan who established the colony that became Rhode Island.  He contributed to the discussion over individual liberty.”  Or something like that.  While true, it grossly undervalues what Williams actually did and how important he was to everything that followed.

In a way, it’s understandable why this is the case.  Williams occupies a time in our history that is both chaotic and morally ambiguous.  We like to think differently of those who settled here than they actually were, and any deeper examination of that period threatens to open a fractal abyss of soul searching that might cast a shadow over the period we prefer to exalt.

But the seeds of Williams’ contribution were sown in the intellectual soil which to this day has produced a troubling crop of discontent between two different conceptions of what America is.

The Puritans (whom we often refer to as The Pilgrims) were religious malcontents who opposed the English church.  They had good reason to do so.  King James I (1566 – 1625) and then his son, Charles I (1600 – 1649), remade the Church of England into a political institution of unprecedented intrusive power, establishing it as the sole legitimate church in England and gradually driving out, delegitimizing, and anathematizing any and all deviant sects—including and often most especially the Puritans.  Loyalty oaths included mandatory attendance at Anglican services and the adoption of the Book of Common Prayer.  The reason this was such a big deal at the time was because England had become a Protestant nation under Queen Elizabeth I and everything James and Charles were doing smacked of Catholicism (or Romishness), which the majority of common folk had rejected, and not without cause.  The history of the religious whipsaw England endured in these years is a blood-soaked one.  How people prayed, whether or not they could read the Bible themselves, and their private affiliations to their religious conceptions became the stuff of vicious street politics and uglier national power plays.

So when we hear that the Pilgrims came to America in order to worship as they saw fit, we sympathize.  Naturally, we feel, everyone should be allowed to worship in their own way.  We have internalized the idea of private worship and the liberty of conscience—an idea that had no currency among the Puritans.

The Puritans were no more tolerant than the high church bishops enforcing Anglican conformity in England.  They thought—they believed—their view of christian worship was right and they had come to the New World to build their version of perfection.  A survey of the laws and practices of those early colonies gives us a picture of ideological gulags where deviation was treated as a dire threat, a disease, which sometimes required the amputation of the infected individual: banishment.

Hence the law forbidding anyone from living alone.  It was thought that in isolation, apart from people who could keep watch over you and each other, the mind’s natural proclivity to question would create nonconformity.

Conformity is sometimes a dirty word today.  We pursue it but we reserve the right to distance ourselves from what we perceive as intrusiveness in the name of conformity.  Among the Puritans, conformity was essential to bring closer the day of Jesus’ return.  Everyone had to be on the same page for that to occur.

(Which gave them a lot of work to do.  Not only did they have to establish absolute conformism among themselves, but they would at some point have to go back to England and overthrow the established—i.e. the King’s—order and convert their fellow Britons, and then invade the Continent and overthrow Catholicism, and all the while they had to go out into the wilderness of North America and convert all the Indians…but first things first, they needs must become One People within their own community—something they were finding increasingly difficult to do.)

Into this environment came Roger Williams and his family.  Williams was a Puritan.  But he also had a background as apprentice to one of the most formidable jurists in English history, Sir Edward Coke, the man who ultimately curtailed the power of the king and established the primacy of Parliament.  Coke was no Puritan—it is an open question whether he was anything, in terms of religious affiliation, beyond a christian—but he was one of the sharpest minds and most consistent political theorists of his day.  He brought Williams into the fray, where the boy saw first-hand how power actually worked.  He saw kings behave pettily and injustices imposed out of avarice, vice, and vengeance in the name of nobly stated principles.  And, most importantly, he saw how the church was corrupted by direct involvement in state matters.

This is a crucial point of difference between Williams and later thinkers on this issue.  Williams was a devout christian.  What he objected to was the way politics poisoned the purity that was possible in religious observance.  He wanted a wall of separation in order to keep the state out of the church, not the other way around.  But eventually he came to see that the two, mingled for any reason, were ultimately destructive to each other.

Williams was an up-and-coming mover among the Puritans, but the situation for him and many others became untenable and he decamped to America in 1631, where he was warmly received by the governor of Massachusetts, John Winthrop.  In fact, he was eagerly expected by the whole established Puritan community—his reputation was that great—and was immediately offered a post.

Which he turned down.

Already he was thinking hard about what he had witnessed and learned and soon enough he came into conflict with the Puritan regime over matters of personal conscience.

What he codified eloquently was his observation that the worst abuse of religiously informed politics (or politically motivated religion) was the inability of people to be objective.  A “monstrous partiality” inevitably emerged to distort reason in the name of sectarian partisanship, and this was destructive to communities, to conscience, to liberty.

For their part, the Puritans heard this as a trumpet call to anarchy.

The Massachusetts Puritans came very close to killing Williams.  He was forced to flee his home in the midst of a snowstorm while he was still recovering from a serious illness.  He was succored by the Indian friends he had made, primarily because he was one of the very few Europeans who had bothered to learn their language.  They gave him land, which eventually became Providence Plantation, and he attracted the misfits from all over.  Naturally, Massachusetts saw this as a danger to their entire program.  If there was a place where nonconformity could flourish, what then became of their City on the Hill and the advent toward which they most fervently worked?

The next several years saw Williams travel back and forth across the Atlantic to secure the charter for his colony.  He knew Cromwell and the others, and he wrote his most famous book, The Bloody Tenent of Persecution, for Cause of Conscience, in 1644, right before returning to America to shepherd his new colony.  In this book, for the first time, the argument for a firm wall of separation is clearly stated.  It is the cornerstone upon which the later generation of Founders built and upon which today rests the history of religious freedom we take as a natural right.

But the struggle was anything but civil and the abuses to which Williams responded in his call for a “Liberty of conscience” are not the general picture we have of the quaint Pilgrims.

Barry sets this history out in vivid prose, with extensively sourced research, and grounds the story in terms we can easily understand as applicable to our current dilemma.  One may wonder why Williams is not more widely known, why his contributions are obscured in the shadow of what came later.  Rhode Island was the first colony with a constitution that did not mention god, and it existed for over fifty years before a church was built in Providence.

Williams himself was not a tolerant man.  He loathed Baptists and positively hated Quakers.  But he valued his principles more.  Perhaps he saw in his own intolerance the very reason for adoption of what then was not merely radical but revolutionary.