Tardiness comes in direct proportion to chaos. The year ended and all was in flux.

However, reading goes on.

I did not finish nearly as many books in 2016 as I tried to. At least, not other people’s books.  I did finish drafts of two of my own.  My desk, at the moment, is clear, and maybe I can do a better job in 2017 of keeping abreast here.

A good deal of my science fiction reading was for the reading group I host at Left Bank Books. That group affords me the opportunity and motivation to read novels I might not otherwise get to. So I reread Alfred Bester’s The Stars My Destination for the first time in three decades, but I also read The Left Hand of Darkness for the first time ever. I do not regret the delay. It is a mature novel, with a great deal my younger self may well have missed. As to the former, it came very close to not holding up. I had forgotten (if I ever realized it this way) just how brutal a novel it is, and not just in the character of Gully Foyle. Bester’s achievement way back in the Fifties remains remarkable for its unyielding insistence on a fragmented, painful, chaotic, and historically consistent future.

I also reacquainted myself with Tiptree, in the form of Her Smoke Rose Up Forever. It seems fitting in this period of reassessment and revolution, when the face of science fiction is—has—changed and brought forth a volatile reaction to that change.  Tiptree was doing much of what is being so rancorously challenged within the field today, but as she was a singular voice and not a “trend” she provoked different challenges then while becoming accepted generally as a brilliant writer and a jewel in the crown of SF stars.

I also reread (for the first time since it came out) Robert Silverberg’s Dying Inside, which I reviewed in the previous post.  I was much too inexperienced a reader the first time to appreciate everything Silverberg was doing, so I probably forgot the book as soon as I finished it.

It is true that some books must be “grown into”—I am currently rereading Samuel R. Delany’s Trouble On Triton for the book group and realizing that, while I read it eagerly the first time, I probably missed almost everything important about it. Likewise with another reread, Gene Wolfe’s The Fifth Head of Cerberus, which is ostensibly a novel about colonialism. I say “ostensibly” but that does not mean it isn’t. It very much is about colonialism, all three of the novellas which comprise the whole. But it is as much about how we colonize ourselves, sometimes to our loss, as it is about colonizing foreign soil, in this case another world with a native population that strives to adapt but may have found in the end that their only options were extinction or counter-colonization. As always, Wolfe’s subtlety is rigorously slippery, his points less direct, corrosive of expectation.

Titan Books has rereleased Michael Moorcock’s Cornelius Chronicles, a story cycle that is the very definition of indirect. Moorcock took as his template the Romantic poets—Byron, Shelley, et al—and displaced them into a near-future chaos in the form of his “hero” Jerry Cornelius, who wants to save the world, if only to resurrect his dead sister so they can be together. The prose is rife with Sixties hip, but not so overwhelmingly anachronistic that the novels aren’t just as readable now as they were then. The response to them is perhaps necessarily altered and certainly the themes play out differently. Moorcock may have been the grown-up in the room at the advent of the New Wave. He did go on to write some marvelously rich books after these.

I finished Ann Leckie’s delightfully subversive Ancillary trilogy.  I need to do a full review soon.  Treat yourself.

A smattering of other SF titles I can recommend whole-heartedly: Lavie Tidhar’s Central Station; Sylvain Neuvel’s Sleeping Giants; Carter Scholz’s Gypsy; Binti by Nnedi Okorafor.

And Nisi Shawl’s wonderful Everfair.  An alternate history steampunk done the way steampunk ought to be done.  I owe it a full review, but let me say here that this is one of the best first novels I’ve read in a long time.

I read two China Mieville books this year, one very good. This Census-Taker I have to count as a failure. It has good writing and fascinating bits, but failed to come together the way I’ve come to expect from Mieville. The other, newer one, is The Last Days of New Paris, which is excellent. This pair allowed me to understand that one of the primary passions Mieville indulges in his work is cities. His best work portrays a city as a complete character. This Census-Taker lacked that.

Of the non-science-fiction I read this year, I did Moby-Dick with my other reading group. I resisted doing this book. I’ve never liked it. I find it turgid, convoluted, often opaque. There is also a darkness to it that can be suffocating. Over several months we tackled it, dissected it, ran through various analyses. I conclude that it is a superb work, fully deserving of its reputation. It is A great American novel if not The American Novel, because America is its subject, though it takes place on a whaling ship far at sea. It is not a flattering picture, though, displaying throughout the contradictions, hypocrisies, and shortcomings of the then-young nation which continue to plague us. It does this brilliantly.

I still don’t like it.  I find little pleasure in the actual reading.  That, as they say, is my problem.

A colleague and coworker, Kea Wilson, published her first novel, We Eat Our Own. I commend it.  I reviewed it here.

A novel that straddles the genre boundaries somewhat, and that caused some controversy upon its initial publication, is Kazuo Ishiguro’s The Buried Giant. This is a post-Arthurian quest story with much to say about memory and community and the price of vengeance.

This was a big year for nonfiction.

James Gleick’s new tome, Time Travel: A History, is an exceptional soliloquy on the concept, science, and cultural use of time travel, beginning with Wells and covering both the scientific realm and the popular fiction realm, showing how they have played off each other and how the idea has evolved and worked through our modern view of the universe and our own lives. Previously in the year I’d read his magnificent biography of Richard Feynman, Genius. Gleick is a great explainer and a fine craftsman.

As well, Carlo Rovelli’s Seven Brief Lessons on Physics. They are brief, they are accessible, they are to be enjoyed. And, along the same lines, Void by James Owen Weatherall, about the physics of empty space. It’s far more fascinating than it might sound.

I can recommend Peter Frankopan’s Silk Roads, which is a history of the world from the viewpoint of the Orient.  The shift in perspective is enlightening.  Along the same lines I read Charles Mann’s 1491, which was eye-opening and thought-provoking—and in some ways quite humbling.

I also read Arlie Russell Hochschild’s Strangers In Their Own Land, particularly timely in the wake of what I think I can safely call the most surprising election result in recent history. This book is a study of the right-wing culture that has developed in many startlingly contradictory ways. I believe this would be worth reading for anyone trying to make sense of people who continually vote in ways that seem to make no sense—and also for those who do vote that way, so they might understand what it is about their movement that seems so incomprehensible to many of their fellow citizens.

I read a few short of 50 books in 2016 cover to cover.  I will be reviewing some of them in the future.

Here’s hoping for a good year of reading to come.




Urban Character

After a number of extraordinary novels, one thing is clear about China Mieville’s work: he loves cities.

New Crobuzon, Embassytown, London, Beszel/Ul Qoma—each distinctive, layered, multifaceted, richly alive, and impossible to map as any living being’s soul, these remarkable urban spaces center, anchor, and frame the human (and not human) people who inhabit them. Consequently, the revelations of their interactions acquire architectonic depth fully evocative of the qualities of amazement, wonder, and dismay good science fiction should produce.

But each is unique, a character unto itself. Likewise in his new novel, The Last Days of New Paris, which gives us a Nazi-occupied Paris that has swallowed its conquerors in the very decadence their ideology sought to suppress by supplanting it with their own.

In the early days of the occupation, a young American schooled in the occult symbolist morphologies of Crowley and company infiltrates the city to find the enclave of French Surrealists holed up in the city center. He finds them ensconced in a kind of internal exile, playing at resistance by ignoring the Nazis and pretending they are the gatekeepers and caretakers of the essential Paris. Breton, Varo, Lamba, others. The American has brought a device—Americans have always been good at devices—which, in one frenetic evening, manages to capture the surrealist essence of these imagineers and store it. The “battery” is conceivably a tremendous weapon with which to fight the Nazis, but it is stolen, and then, at the end of a series of tragic inevitabilities, it explodes, unleashing the transformative power it contains on the very fabric of Paris.

Which becomes a living, shifting, mutating landscape of surrealist manifs, blind alleys, cavernous enclaves, culs-de-sac, and psychic traps and pitfalls. The most effective fighters are those who navigate this landscape, understand at least what is happening if not how, and can tap into the indeterminate loyalties of the now-living architecture.

The action shifts between 1941 and 1950, a year in this universe wherein the Nazis are still in Paris and, presumably, in a large portion of Europe, and the war continues. Events are building toward some kind of climax, with the Nazis attempting to manufacture their own manifs. They lack the necessary turn of mind, though, and all their attempts are stillborn or ruinously self-destructive. But they doggedly continue until it seems hell itself feels threatened by their machinations.

The novel itself is riddled with Surrealist quotes, riffs, nods, and inspirations. This is an alternate history built on the notion that imagination and art can be as brutally decisive in war as any martial technology—but that the deployment of such visions must be done with care. The Nazis used symbols and a dark vision of æsthetic insistence to drive their machine. It can be argued that they failed because they did not fully understand either the power of imagery or the way in which human imagination will never be yoked to serve a purely nihilist aim. The humanitarian drives that confronted them and stopped them in our reality are kin to the bizarre visions which in Mieville’s skilled renderings shackled the Nazis to a fight that could not be finished, certainly not in their favor. The climax and denouement are equally decisive and inconclusive, as they should be.

But the tour through this externalized, foregrounded metaphor of a city is a brilliant odyssey through the power of human imagination.

Sleeping Dragons

Kazuo Ishiguro works a consistent theme. Even in his earliest novels, he explores the manner in which people refuse to acknowledge the reality through which they move. Many of his characters display a kind of aphasia, an inability to grasp the issues surrounding them, the motives of people, even those they are close to, or what is unfolding before their eyes. In a way, they are peculiarly narcissistic. I say peculiar because quite often their sense of themselves is the last thing they seem concerned with, even when others are.

At times this has led him to experiment with tactics of evasion that result in novels that resist our attempts to connect, even to access what is going on, but we read them anyway because he cloaks the experiments with plots and devices that hold our interest, but which we suspect are little more than extensions of the evasions at the core of his characters’ lives.

In a few instances, he has his characters actually go out in search of the mystery that seems to enshroud their worlds, though usually they look in the wrong places or simply fail to comprehend what they discover.

Such is the motive behind Axl and Beatrice as they leave their small village in the heart of a post-Arthurian England to find their long-absent and possibly estranged son and perhaps get to the bottom of the cloying fog suffocating memory. Their journey takes them to the source of a strange amnesia in The Buried Giant.

The landscape is mythic. This is a land occupied by Britons and Saxons. It is a land that has only recently been host to the epic struggles of King Arthur, Merlin, his knights, and the aspirations of Camelot. If there is any doubt how real Ishiguro intends us to treat this, he dispels such doubt by having Axl and Beatrice encounter the aging Sir Gawain, one of the few survivors of those days.

There is much of the Quixote in this Gawain, although his skills are impressive. Age alone has blunted his abilities. Ostensibly, he is still on a quest. Not the Grail. No, that is never mentioned. Rather he claims to be on a mission to slay the she-dragon Querig.

Joining them is a young Saxon warrior, Wistan, and a boy he has rescued from a village where, because of a wound the boy suffered from ogres, the villagers intend to kill him for fear that he will become an ogre himself. As, indeed, he is destined to—but not in the way superstition would have it.

Wistan for his part is also on a mission.  He, too, is on the hunt for Querig. But for him Querig’s demise is but a means to an end, and a terrible end at that. He and Gawain come into conflict over it eventually and thereby we learn both the source of the Mist, which robs people of their memory, and a truth about King Arthur not recorded in the myths.

Through all this, even as it would seem rich material for a dense fantasy about knights and dragons and kings and ogres, Ishiguro’s focus is on Axl and Beatrice and the nature and quality of commitment and forgiveness.  For in the mists of poorly-glimpsed memory there are terrible things between them and as they progress on their journey to find their son Axl begins to have second thoughts, not at all sure he wants to remember, afraid that perhaps he had been the cause of great pain and sorrow.  Ishiguro is concerned here primarily—and almost exclusively—with the nature of time, memory, and forgiveness and the many ways they are the same essential thing.

In that sense, the controversy he stirred when the novel appeared by claiming that he was not writing a fantasy—that he did not want to be seen as plowing the same fields as George R.R. Martin or Patrick Rothfuss—was unfortunate. He spoke truly. This is not a fantasy in the sense of contemporary sword & sorcery or secondary-world fantasies. He is not doing the same thing as Martin, although he may have borrowed a subtheme or two from Tolkien. His disclaimer was taken as a derogation of fantasy, yet one can see from the text that he is fond of those elements of the book taken directly from the long tradition of English fantasy.

If there is a fantasy element here worthy of the name it is in his portrayal of the end of a mythology and the terminus of one world as it transforms into another.  The Buried Giant is about remembering as much as it is about things forgotten.  The changes soon to be wrought by the conclusion of Wistan’s quest and Gawain’s final stand have to do with how history turns and what is taken after a time of interregnum during which things lost are grasped, reshaped, and put to new uses.

But it is always about what is between people and how we use memory and its infelicities.

As in other Ishiguro novels, there is much that annoys. His characters talk. And talk and talk and talk, and often it is about nothing, until we realize that it is all tactic. Dissimulation as replacement for substantive communication—until finally the act of avoidance itself becomes the point and the things hidden are revealed by inference. Axl and Beatrice are like the blind men of the parable, trying to perceive the elephant they explore with tentative fingers. That it is to a purpose, however, makes it no less frustrating, but it would be a mistake to see this as anything other than absolutely intended.

The point of the quest–for all of them–becomes evident when at last they find Querig and it turns out not to be what they had all expected.  And we then see how myth sometimes is more useful than reality.


James Gleick’s biography of physicist Richard Feynman ought to be part of all high school science classes. Not only does he chronicle the life of one of the preeminent scientists of the 20th Century, not only does he portray the chief puzzle of physics in that century clearly, he manages to convey the substance of the work in enviably accessible prose without once “dumbing down” any of it. All this while using remarkably few equations and those usually only in description of what they do, not in the numbers and symbols themselves. One comes away from the book—Genius: The Life And Science of Richard Feynman—feeling that this would not be such a difficult subject, or at least feeling that it would be as much a human endeavor as art or music or engineering or accounting.

Science is encased in an opaque mythography that seems designed to make people feel inferior. In the main, this is a consequence of language.  At one time, the language of science was abstruse in the extreme. Isaac Asimov once wrote a short story poking fun at the tortured convolutions of scientific jargon.

I say at one time. An effort has been made in recent decades to make it clearer. It occurred to some that the density and deliberate complexifying of scientific papers had itself done unintended damage to the field by making it inaccessible to the very people it is ultimately intended to benefit. We might not have such frustrating debates going on today in the social and political realms over climate or vaccination had scientists themselves not, as part of in-group cultural arcana, kept the lay public at such a distance by making what they do appear simultaneously elitist and impenetrable.

Feynman himself rejected such practices throughout his career.  He never assumed people—average people—couldn’t understand.  Just say it plain.  If you could not explain it clearly, he believed, then you yourself did not understand it. He was constantly railing against “stupid questions.”  Questions either too vague, too big, or too beside the point for any kind of reasonable answer.

But he wanted the questions. He wanted to see that spark of interest, and if he saw a glimmering he would try to fan it into flame. His enthusiasm was infectious.

Richard Feynman became one of the most important questioners in 20th Century science. Partly this was due to his idiosyncratic approach to problem-solving. For example, he rarely finished reading a paper by someone else. He would read just enough to understand the problem and then work it out for himself. He didn’t want the solutions handed to him; he wanted the challenge and, evidently, the deep pleasure of doing it himself. Of course, in that way he also found errors, logical inconsistencies, even fraud on occasion. He was a prodigious calculator, often able to do complex equations in his head. He intimidated and fascinated in equal measure.

What some mistook for slapdash, undisciplined thinking was rather underpinned by a vigorously compulsive commitment to fact and, ultimately, truth. The rigor of his work proved the equal of, or superior to, that of many if not most of his contemporaries. He insisted that the universe make sense and, crucially, he was unafraid of admitting he did not know something.

He lost the love of his life while working on the atomic bomb, a perhaps unfortunate pairing of profound experiences which, while he seldom talked about either, perhaps informed his seemingly random path through post-WWII physics. He was late in receiving a Nobel Prize, partly because of his inability to find the “right problem” to work on. But in the course of his search, the work he did informed and solidified work done by many others.

Feynman may have been a genius.  In a scintillating chapter, Gleick examines the subject of genius itself, partly to address the peculiar circumstance that we seem no longer to have any geniuses. This, he suggests, is a byproduct of the fact that we have so many and in a wide range of fields.  What we seem to lack is the singular personality to which the label readily appends. We have no public space anymore for an Einstein.

Or a Feynman.  But that does not mean we do not have them…or that we do not need them. Now perhaps more than ever.

Gleick has humanized Feynman in this, although in Feynman’s case that may never have been needed.  He was known for playing bongos, he was a raconteur, he spoke in a thick New York accent, and he came across often as a kind of rural wit, plainspoken and commonsensical to the core.  Yet his major work was in one of the most difficult and demanding aspects of modern science and he was a major presence. Appearances too often supplant substance.

Knowing this, Gleick also humanized the subject Feynman devoted his life to, making the science accessible and, to a surprisingly large extent, comprehensible to the nonspecialist.

In an era in which “hero” is too often applied to the physical—athletes, models, soldiers, actors—and may itself be a term corrupted by overuse and inconsistent application, it might serve us well to draw back and consider how little attention is paid to thinking and the solving of problems.  The process alone is contributory and ought not be beholden to budget committees or P&L charts or mere application review. That there are people who spend their time unraveling the mysteries of the universe, not in some mystical sense of gurus on mountaintops but in the sense of an Einstein figuring out why the universe is the way it is, should be a source of inspiration.  In the final analysis, it is likely that people like Richard Feynman give us more as a culture and a civilization than all the pseudo-philosophical mouthings of all the gurus that have ever lived.  That one can pull a device out of one’s pocket, almost anywhere on the planet, and look up any or all of those gurus is a consequence of people like Feynman figuring out the nature of the real unseen, the quantum level of reality, which, as Feynman would have insisted, is Reality.


In Review

2015 is done and I have read what I read.  It was a year fraught with turmoil in science fiction, a year prompting reassessments, a year when required reading competed with reading for pleasure, and the time constraints of working on a new novel (two, in fact) impeded chipping away at my to-be-read pile, which mounds higher.

As in the past, I count only books I have read cover to cover here.  If I added in total pages of unfinished reading, I’m probably up with my usual volume (somewhere around 90 books), but that would be a cheat.  That said, I read 50 books in 2015.

One thing I concluded, both from what I read and the upheaval in the background about what is or is not worthy science fiction, is that the decades-long pseudowar between mainstream and genre is over. Skirmishes will continue to be fought here and there, certain elements will refuse to yield or concede, but by and large the evidence suggests that, on the part of the literary writers at least, SF has made its point. A couple of examples:

Station Eleven by Emily St. John Mandel is science fiction. In fact, after talking it over for nearly a year since I read it, it seems to me to be Heinleinesque. Better written, the characters less exemplars than real people, but in basic conceit and plot, this is a Heinlein novel. It has all the elements—survivors, a plucky heroine, a global catastrophe forcing those who remain to learn quickly a whole suite of new skills, and an ongoing discussion throughout about what is of value and ought to be preserved. It is a superbly written work and that alone made the identification difficult. Heinlein, at his best, could be as good as anyone in any genre, but to see the form raised to this level shows both his virtues and his weaknesses. The population of the Earth is reduced by a superflu. The novel flashes back and forth around the life of a kind of patriarch whose biological and artistic progeny struggle in a post-technological world to both survive and preserve the best of that former world. The novel prompts questions, challenges preconceptions, and draws us in. It was not marketed as science fiction and it has continued to sell very well. It is science fiction and no one has batted an eye.

The Water Knife by Paolo Bacigalupi.  An ecological thriller, an examination of a different kind of breakdown, a different kind of survival, peopled by characters as real as can be.  In a decade this will be historical fiction, probably, but it is SF and also mainstream and also uncategorizable.  Exceptional.

Straddling the boundary is Jeff VanderMeer’s Annihilation, which is a curiosity. It proceeds like a straightforward “survey mission” novel—specialists set down upon an alien world and struggling to unravel its mysteries before said world kills them. Only in this case the “alien” world is a patch of reclaimed wilderness somewhere along the eastern seaboard, probably north Florida, that is undergoing some strange transformations due to an experiment gone wrong. There are touches of zombie fiction, government conspiracy, and even Lovecraftian uber-malignancy evoked, but the story, as told by the Biologist, feels more meta than any of those suggest. The landscape works to inform the soul-wrenching recognitions and evolutions within the Biologist as she works to understand what is going on in the aptly named Area X. VanderMeer has created a work bordering on genius here by virtue of externalizing and foregrounding mystical revelation as ecological transmutation, but as you read you can’t tease the meta passages from the plot in any clear way, so the experience, when you give yourself over to it, is wholly immersive.

So what I’m seeing—in many more titles still on my TBR pile—is the embrace of science fiction by what was formerly an ambivalent cadre of artists who are using it to ends traditionally ignored by main-body SF.

In the other direction, the infusion of literary concerns, which necessarily drag real-world issues in with them, into genre writing has prompted a squeal of protest from those who wish to keep their starships pure, their aliens obvious, and their weapons decisive. “Good writing” is still a poorly understood quality by too many in the genres (by no means a problem exclusive to SF, but because of the nature of SF one that yields far more obvious failures), and the clinging to an aesthetic attributed to the so-called Golden Age and exemplified by writers probably more often revered than actually read (and therefore misperceived in intent) has exacerbated the old antagonisms, a final flaring up of fires dying to ash. The clunky sentence is a hallmark of much of this, more likely as consequence than intent, and the clichéd scenario becomes more obviously so as the whole point of what we mean by “literary” in its most useful mode is overlooked or, perhaps, willfully ignored in a fit of defensive refusal to pay attention to what matters, namely the truth of human experience and the profitable examination of, for want of a better word, the Soul.

Where the cross-fertilization of mainstream and genre has been successfully accomplished, we’ve been seeing novels and stories of marvelous effect. We have been seeing them all along, and in the past such examples were readily offered as proof that SF was “just as good” as anything published as mainstream. I’ve always felt that being “just as good” was selling our potential short, but the work has to rise to the challenge, and there always have been such works.

Among such that I read this past year were a few from that rich past, mainly for the reading group I host at work. The Two of Them by Joanna Russ; Extra(Ordinary) People, also by Russ; Doomsday Book by Connie Willis; Mythago Wood by Robert Holdstock; The Sparrow by Mary Doria Russell; and Engine Summer by John Crowley. In retrospect, there have always been writers writing in the full embrace of science fiction but without any of the stylistic constraints of its pulp origins, and such works remain trenchant and readable and offer surprising commentary still on current questions.

The Sparrow was a highlight. I have known since its publication that it was sort of a riff on James Blish’s classic, A Case Of Conscience, but it is so much more. Russell’s elegant reversal of the moral question elevates this novel to the top tiers of useful literary works. I have not yet read its sequel, but I am looking forward to it after this treat.

I also reread Harlan Ellison’s Shatterday for the reading group. It’s been a good long while since I did so and I was not disappointed, although I read many of the stories through a more cynical eye. The opening tale, “Jeffty Is Five,” remains, for me, one of the most gutwrenching short stories of all time.

Another highpoint this past year was James Morrow’s new novel, Galapagos Regained, a neatly unclassifiable work of speculative history.  I gave it a lengthy review here and recommend a look. This is a superbly done work that deserves more attention than it has received.

I also read Morrow’s amusing novella, The Madonna and the Starship, which runs a delightful game via Fifties television and alien visitors who come to bestow an award and offer assistance in exterminating the irrational on Earth. Morrow is acerbic even as he is funny.

Among the most interesting new works of science fiction I read this year is The Three-Body Problem by Cixin Liu, translated by Ken Liu. This is the first part of a trilogy about alien invasion and resistance as written from a Chinese perspective. It is an exceptional translation. It won the Hugo Award, the first translation, I believe, to do so, and certainly the first Asian novel to win. There is high-end physics, nasty politics, murder, and the conundrums of committed action. The cultural quirks made it even more interesting.

Like almost everyone, it seems, I read The Martian by Andy Weir. This was great fun and well executed. My quibble, along with many others’, was with the opening gambit to explain the marooning of the astronaut, but I’m content to see it as a mere dramatic choice. It didn’t prevent me from enjoying the rest of the book, which, in the words of the screen adaptation, “scienced the shit out of all this” and did so in an accessible and entertaining manner which I applaud. I couldn’t help seeing it as a newer version of an older film, Robinson Crusoe On Mars, and naturally this one works a bit better. Hell, we know more, there’s no excuse for bad science, and Mr. Weir knows that. He wrote a realistic piece of speculation and followed through admirably.

Another novel that gave a far more “realistic” view of an old, favorite SF trope, is Kim Stanley Robinson’s Aurora.  There is much to love about this book, but it is not lovable.  It’s a clear-eyed look at what an interstellar generation ship would actually be like.  And it is bleak, in terms of the traditions of SF.  Suffice it to say without giving away too much that Robinson fully incorporates entropy into his formula with predictably gloomy results, but for all that it is a thoroughly engaging work.

At the other end of the “hard” SF spectrum is Charles Gannon’s Fire With Fire.  Future interstellar expansion brings humanity into contact with our neighbors.  The resulting tensions drive the novel.  I reviewed it here.

Science fiction is a broad, broad field and has room for a magnificently wide range even on the same subjects.  It even has room, as I noted above, for exceptional style.  One of the most enjoyable reads for me, on that note, was Ian McDonald’s new novel, Luna.  There will be comparisons made to Heinlein’s The Moon Is A Harsh Mistress.  Look for an upcoming review where I will argue that the comparison, while in some ways valid, is superficial.  Anyone who has not read McDonald, treat yourself.  This would be a good one with which to begin.

In a completely different area of the playground, there is Daryl Gregory’s AfterParty, which I found excellent.  It’s about drug abuse and the workings of delusion and murder.  Anything I might say here would spoil it.  Go.  Find it.  Imbibe.

The bulk of my reading, after that and a few other titles, has been scattered.  I found a brand new history of Group f/64, the first dedicated group of photographers to push the pure art of the straight photograph.  Ansel Adams, Edward Weston, Imogen Cunningham, and several others established, in the 20s and 30s, the ground upon which all photography came to be viewed for the rest of the 20th century and even, arguably, into today. Mary Street Alinder, who has previously written a biography of Ansel Adams, did a superb job chronicling this group of prickly, independent artists.

I read a history of a superhero, Wonder Woman, and discovered that the story of her creation was even stranger than the character herself.

A new work by journalist Johann Hari, Chasing The Scream, opened my eyes to the thorny issue of the Drug War.

In the wake of seeing the film Interstellar and beginning work on my own novel about (partly) interstellar travel, I dove into Kip Thorne’s Black Holes & Time Warps and had my mind bent in some ways I didn’t think it could be bent.  This has prompted a reengagement with science on this level which is proving difficult, tedious, and yet rewarding.  My mind no longer has the plasticity it once enjoyed.  On the other hand, experience has proven a benefit in that I seem to be absorbing and comprehending at a much deeper level.  We shall see.

Quite a bit of history, much of it unfinished.  In a separate reading group, I’m going through Victor Hugo’s Les Miserables, and reading in the history of the French Revolution, the Republic, its fall, all partly to complete the third novel of my trilogy, but also because the literature available is so rich and surprising that it has become its own pleasure.  It would seem now I’m about to embark on early American history again, anchored by Ron Chernow’s biography of Alexander Hamilton.

There was a new Mary Russell novel this past year, Dreaming Spies, by Laurie R. King.  I discovered a Dan Simmons novel about Holmes which I’d overlooked when it came out, The Fifth Heart, in which he is paired with Henry James, one more in a long line of novels and stories concerning Holmes’ unlikely interaction with historical figures.  Simmons is a terrific writer, but even he tended toward the tedious in this one.  He needs to learn to leave his research in his files.  But it was a unique take on Holmes and he even managed to elicit my sympathy toward James, a writer I find problematic at best, insufferable at worst, and annoying the rest of the time.

So much for the highlights.  Let me end by noting that the Best American series has finally realized that science fiction and fantasy are a real thing and launched one of their annual collections to cover it.  This after Best American volumes for both infographics and comics.  Better late than never, I suppose.  The series editor is John Joseph Adams—difficult to imagine better hands—and this first volume was edited by Joe Hill, which I found interesting to say the least.  Mr. Hill is a horror writer.  Certainly many of the stories have a strong horror element, but over all this is a collection full of marvels, from the writing to the ideas.  I’ll try to keep track of this one in future.

So while not numerically great, 2016 was filled with many very excellent books.  I’m looking forward to 2017.  My stack awaits.

Happy New Year.



Easy Habits and the Consequences of Belief

At first glance, the two books could not be more different. Subject, tone, everything seems different. Not at odds so much as…nonoverlapping.

Which is ironic, since both deal, in their separate ways, with that very idea, the separation of areas of knowing.

Probably because I read them so close together I recognized their shared concerns as clearly as I did. Whatever the reason, it struck me as obvious in so many ways that I began to recall all the other books over the last few years that could be likewise gathered within this same subset, all on distinct topics and yet all based on, to some degree, an analysis of the same human propensity to disregard evidence when it contradicts belief.

Let me begin with the more general of the two.

Jerry A. Coyne is an evolutionary biologist.  Compared to others with public profiles, he has published few books.  Three, to be precise.  His latest, Faith Vs. Fact: Why Science and Religion Are Incompatible, is a direct challenge to Stephen Jay Gould’s idea of “nonoverlapping magisteria.”  Gould’s premise is that science and religion should not conflict with each other because they are concerned with entirely separate realms of knowing—hence the nonoverlapping part—and except for certain agenda-driven partisans, there is no reason for them to be in conflict.  Coyne sees this as accommodationism, which he thoroughly discredits in his book.

My claim is this: science and religion are incompatible because they have different methods for getting knowledge about reality, have different ways of assessing the reliability of that knowledge, and, in the end, arrive at conflicting conclusions about the universe.  “Knowledge” acquired by religion is at odds not only with scientific knowledge, but also with knowledge professed by other religions.  In the end, religion’s methods, unlike those of science, are useless for understanding reality.

Coyne identifies Accommodationism as an attempt not to stir the hornet’s nest, because scientists are often dependent on politically sensitive funding.  Science, especially Big Science dealing with questions of origins, is expensive and the days of the independently wealthy scientist are largely gone.  Rocking the boat by annoying those who hold the purse strings would seem ill-advised.

The conclusions of scientific inquiry, he goes on to argue, continually poke holes in those claims by religion that still assert authority over secular matters.

But more than that, such deferral to authority erodes our critical capacity and can lead to false conclusions and bad judgments, all because we accept the hegemony of “faith.”

…a word defined in the New Testament as “the substance of things hoped for, the evidence of things not seen.”  The philosopher Walter Kaufmann characterized it as “intense, usually confident, belief that is not based on evidence sufficient to command assent from every reasonable person.”

The sticking point for many would be that “reasonable person” proviso, for surely, as Coyne concedes—often—there are many reasonable people who nevertheless espouse a religious faith.  What are we to make of that?  Is the basis for the assessment of “reasonable” perhaps too ill-defined or is the scope of “belief” too broad?

Coyne takes us through the process, giving a thorough explication of why science and religion are incompatible if not a convincing argument for the down-side of belief for belief’s sake.  He tells an anecdote early in the book about how he yearly teaches undergraduates a course in evolution, which the majority do very well at, but admit, after earning their A, to not believing a word of what he taught.  Because it contradicts their religious belief.

It is this that Coyne sees as the dangerous aspect of faith as promulgated through religion, the a priori rejection of evidence in favor of a set of usually unexamined beliefs.  He takes us through a catalogue of negative manifestations, from the rejection of medicines to the acts of terrorists to the rejection of solid science (like climate change), that have their underlying justifications in religion.

Coyne, an admitted atheist, puts all this forward while taking pains to admit the personal comfort to be found in religion by many people.  From a certain point of view he tends to go the extra kilometer to be fair.  Of course, going through the comments on various review sites, those predisposed to reject such arguments accuse him of profound bias if not outright malicious intent.  One cannot help but wonder if they bothered to read the book, all or even in part.

The book makes its case clearly and concisely.  It avoids the polemic outrage to be found in other tomes by big name atheists by sticking largely to evidentiary concerns and philosophical arguments.

But, one may ask, so what?  Religion and science are two realms in which most people would assume they have no stake. Castles in the air, esoteric arguments about things that have no impact on our daily lives.  Most people seem to keep a religion much the same way they have a pet and certainly in the West the majority live secular lives and only rarely feel compelled to apply their religious convictions to anything.  As for science, as long as the technology we depend on works, all the rest is so much theoretical handwaving.  It makes no difference if we have almost no understanding of quantum mechanics and certainly evolution is just tenure-building nonsense having to do with million-year-old bones and what kind of textbooks the school district might use next year.  Nothing to do with us in our daily lives. So what if people rely more on faith and belief in making their daily judgments than on evidence-based science?  We operate more on heuristics defended by aphorism than by reason and applied understanding, or so Daniel Kahneman tells us in his excellent study, Thinking, Fast and Slow, and we by and large get along perfectly well that way.

How does this argument concern me?

Johann Hari’s new book, Chasing The Scream, has an answer to that.  Of sorts, if we but make the epistemological leap.

Hari is writing about the drug war. It is, for him, as much a personal examination as a journalistic one, as he admits to having family and friends who are addicts.  He begins with a simple question.

I scribble down some questions that had puzzled me for years.  Why did the drug war start, and why does it continue?  Why can some people use drugs without any problems, while others can’t?  What really causes addiction?  What happens if you choose a radically different policy?

Okay, three questions.  Still, simple, basic questions, to which any reasonable person might reasonably expect a reasonable answer.

Yet we spend billions, bully small countries, and destroy thousands if not millions of lives, all in pursuit of policies which rest on an appalling lack of informed justification.  By the end of the book you come to see that there are no answers to those simple questions which in any way validate our continuing on with things as they are.  As they have been for a hundred years.

Hari goes back to the beginning of the war, before drugs were illegal, and takes us through the history.  Back to Harry Anslinger, the head of the Federal Bureau of Narcotics, who fueled the drug war for the twin purposes of maintaining his agency and exorcizing demons that had possessed him since childhood, even in the face of substantive research and sound arguments denying that his approach to the problem had any merit, and more than ample evidence—Prohibition—that it would lead to catastrophe, not only on a national but on a global level.  Hari details Anslinger’s battle to destroy Billie Holiday and his use of intimidation, police tactics, and, subsequently, U.S. foreign policy to assure the continued crusade to eradicate drugs and destroy addicts.

Not because any evidence he understood led him to think this was the best, or only, way, but because he believed it in spite of growing mountains of evidence that this was a wrongheaded approach.  He suppressed evidence, hounded physicians who dared present alternative models of the drug problem, intimidated politicians—except those who could secure him funding—and strenuously denied the validity of any evidence that contradicted his belief.

He believed.

As many did and still do.  It is this that trumps hard evidence.

Even as a young adolescent I thought the argument for our drug policies was lacking.  I thought at the time that I just didn’t understand.  Never having been in the least interested in drugs and knowing few if any who were involved with anything more serious than marijuana, it seemed not to concern me.  Later, I did a short stint as a volunteer drug counselor, but the work was far more than I could handle at the time.  I trusted the people in charge knew what they were doing and certainly the gang violence associated with drugs seemed to make a persuasive case that this was a worthwhile and often desperate fight.

But as the years went by and the war continued, I began to notice bits of research here and there and how certain politicians contradicted themselves and how the prison population, especially in the wake of Nixon’s near militarization of the police community and the drug war ethos, was growing and in very worrisome ways.  I began to seriously rethink my position with Reagan’s zero tolerance policies and the mandatory sentencing guidelines he established through Ed Meese, one of the notable Puritans of the modern age.  Even so, I had the nagging suspicion that maybe I was just missing something.  Certainly I didn’t approve of drug addiction, but I more and more came to believe that these people needed help, not punishment.  We understand that process very well with alcohol, why is it different with narcotics?

And besides, there are plenty of people who receive perfectly legal prescription narcotics and never become addicts.

The number of holes in the picture kept growing.  I no longer trusted stated drug policy, but I didn’t understand the intransigence of people over reform.

Hari’s book lays it out very clearly.  Money is high on the list.  We fund too many of the wrong people at too high a level for them to be easily weaned from the teat.  Foreign policy is also tied to this, especially in these days of international terrorism which has a strong drug component.  But the factor that ties this in to Jerry A. Coyne’s book is the one Hari covers only glancingly.


It is easier to rely on what we have always believed than to look at evidence that requires us to change our mind.

Many aspects of our lives are covered by this observation, but the problems arise where beliefs have political and social ramifications.  Consider the persistent beliefs about the poor, about minorities, about people with different convictions: even when evidence is provided which significantly challenges such beliefs, and suggests strongly that not only are they wrong but that we would be better off discarding them, we cling to them.

Hence my conflation of these two books and the suggestion that they share a common idea.

Not that I argue that all beliefs are wrong.  What is wrong is the intractable nature of unquestioned belief.  The only reason the drug war continues is that it has considerable popular support and the only reason it has that is that many people cannot bring themselves to change their minds about it in the face of not only evidence but what we euphemistically call common sense.

But that can be said of so many things which directly and indirectly impact our lives.

Perhaps it is a stretch and perhaps I argue out of my own biases, but it seems to me the most valuable tool we can have in our intellectual toolbox is the ability to say “well, I believe that but I might be wrong.”  Faith cannot maintain, however, among people who are able to say that and apply it to anything and everything.  Science is built on exactly that principle—all knowledge is conditional—but belief, as exemplified by religion, thrives by the absence of that principle.  It says some knowledge is absolute, not to be questioned.

In the contemplation of matters of theological concern, this perhaps offers consolation, comfort, a certain utility, to be sure.  But it is easy for unquestioning acceptance of arguments from authority to become habit, and for that habit then to be applied to anything our prejudices suggest we may wish to avoid examining.  Drug addiction is an unfortunate affliction and it may be uncomfortable for people to see it for what it is—a disease, like alcoholism.  That discomfort makes room for assertions from authority offered by people who claim to be working on our behalf.  Our unwillingness to ask evidentiary questions is a cozy environment for demagogues and despots.

When you ask “What does this have to do with me?” you might be surprised at the consequences of avoidance and the price of unquestioning trust.  Why should we learn science and how to apply the discipline of examining evidence?  So we don’t hurt ourselves out of the easy habit of belief.

Motives and Revelations

There is a remarkable scene—one of many—in James Morrow’s new novel, Galapagos Regained, wherein the final straw is broken for Charles Darwin and we are shown the moment he decided to back his radical new view of nature and its processes. Wholly fictional, no doubt, yet based on reality, Darwin has come to London to confront a young woman who has betrayed his trust while working in his household. The confrontation with the fictional Chloe Bathhurst is not the one that matters.  Rather, it is the confrontation Darwin is having with the edifice of a loving god.  His daughter is dying—tuberculosis—and the scientist in him knows there is nothing to be done, that an indifferent nature cares nothing for her goodness, her innocence, and any human claim on justice and fairness is but the empty babblings of a minor species only recently transcendent upon the ancient stage of life.  Darwin is angry and resentful.  The transgressions which resulted in his dismissing Miss Bathhurst are insignificant now against this greater, vaster crime which, he believes, has no actual perpetrator.  The only thing he can do, he decides, is to give her his blessing in pursuit of her own goal, which pursuit got her fired from his service.


She was fired for attempting to steal the sketch he had written concerning the transmutation of species, a precursor work to his epic On The Origin of Species.  She did this in order to procure a means to free her errant father from debtors’ prison by using the work as the basis for winning the Shelley Prize, a competition which has been ongoing for some time in Oxford.  The purpose of the prize is to reward anyone who can prove or disprove the existence of God.  Chloe, during her employ as Darwin’s zookeeper, became aware of his theory and thought it the ideal entry with which to win the prize.

Darwin refused.  When she elected then to steal the notes and present it on her own, she was caught and dismissed.  Darwin was at the time unaware that she had already made a copy of the paper and thought he had caught her in the act.

Now, in the lobby of a London playhouse, where Chloe had once been employed as an actress, Darwin, aware that she in fact had stolen his treatise, is sanctioning her quest.

“Don’t overestimate my sympathy.  Had I two thousand surplus pounds, I would cover your father’s debts, then arrange for you to tell the world you no longer believe in transmutationism.  That said, I must allow as how a part of me wants you to claim the prize, for it happens that my relationship with God—“

“Assuming He exists.”

“Assuming He exists, our relationship is in such disarray that I should be glad to see Him thrown down…Get thee to South America, Miss Bathhurst.  Find your inverse Eden.  Who am I to judge your overweening ambition?  We’re a damned desperate species, the lot of us, adrift on a wretched raft, scanning the horizon with bloodshot eyes and hollow expectations.  Go to the Encantadas.  Go with my blessing.”

Because this is what Chloe has determined to do.  Go to the Galapagos Islands to gather specimens to support the argument for transmutation of species.  The Shelley Society fronts her the money to do so, she enlists her card-sharp brother in the expedition, they find a ship, and set sail.  The Society had already bankrolled an expedition to Turkey for the purpose of finding the remnants of Noah’s Ark, so this was only fair.

Accompanying her ship is Reverend Malcolm Chadwick, Anglican minister and formerly one of the judges of the Shelley contest—on the side of the deity.  He steps down from that post at the request of Bishop Wilberforce and is sent on this new mission to oversee what Chloe will do.  He departs with an uneasy conscience, made so by the second part of Bishop Wilberforce’s plot, which sends another minister in another ship to the Encantadas with the intention of setting in motion the slaughter of all the animals on the islands, thus depriving the forces of atheism of their troublesome evidence.  Chadwick finds this idea appalling, but he is faithful and says nothing.  He joins Chloe’s expedition, which becomes Odyssean in its complications and obstacles.

The novel proceeds from one adventure to another until Chloe herself, stricken ill in the Amazon basin, undergoes a kind of religious conversion, and decides she is wrong in her conviction that there is no god.  Morrow then expands on the struggle she engages with her fellow travelers and her own considerable intelligence.

What we are treated to in this novel is a thorough examination of human motivation in the face of shifting paradigms.  It may be clear where his sympathies lie, but he is too good a writer to load the dice in favor of his preferred viewpoint.  He gives his characters viewpoints of their own and follows them where they would naturally lead.  He never denigrates faith, only the fickleness of our intentions in the face of conflicting desires and awkward choices.  Tempting as it may have been in the end to simply declare a winner, Morrow instead takes a more difficult and fulfilling tack by portraying the times in which this debate flared into full flame with the advent of a solid theory of evolution.

Chloe Bathhurst herself is an admirable character.  An actress, adept as a quick study, she proves herself intellectually versatile and equal to any challenge.  As well, those who both aid and oppose her are equally well-drawn and Morrow deftly clarifies their motives.

Along the way, he gives a field demonstration in observation and interpretation, showing us the process whereby new understanding takes us over and how revelation can be a problematic gift.

Morrow is one of our best writers plowing the ground of controversy.  He never takes the simplistic road.  The pleasure in reading one of his novels is that of being allowed free range of the imagination in pursuit of specific truths stripped of dogma.  In fact, he disassembles dogma in the course of his yarns, a fact that is often not apparent while we’re in the grip of his artifice.

An artifice made warm by the complete humanness of his characters.  One of his best creations is Chloe Bathhurst.  In her, several clichés and canards are undone, and many perhaps uncomfortable but rewarding questions asked.  She exemplifies the first rule of the explorer—never be afraid to go and see for yourself.  Do so and you’ll be amazed at what is revealed.

And what is lost.

The title parodies Milton’s Paradise Regained, from which perhaps Morrow took a bit of inspiration:

I, when no other durst, sole undertook
The dismal expedition to find out
And ruine Adam, and the exploit perform’d
Successfully; a calmer voyage now
Will waft me; and the way found prosperous once
Induces best to hope of like success.

Perhaps not so much to “ruin Adam” as to give us a view into a vaster garden, older and truer, and less a burden to our capacity for wonder.

Taste and Quality

Obliquely, this is about a current debate within science fiction. However, the lineaments of the argument pertain to literature as a whole.  I offer no solutions or answers here, only questions and a few observations.  Make of it what you will.

Reading experience is a personal thing. What one gets out of a novel or story is like what one gets out of any experience and being required to defend preferences is a dubious demand that ultimately runs aground on the shoals of taste.  I once attended a course on wine and the presenter put it this way: “How do you know you’re drinking a good wine? Because you like it.”  Obviously, this is too blanket a statement to be completely true, but he made his point.  If you’re enjoying something it is no one’s place to tell you you’re wrong to do so based on presumed “objective” criteria.  That $200.00 bottle of Sassicaia may fail to stack up against the $20.00 Coppola Claret as far as your own palate is concerned and no one can tell you your judgment is wrong based on the completely personal metric of “I like it/I don’t like it.”

However, that doesn’t mean standards of quality are arbitrary or that differences are indeterminate.  Such are the vagaries and abilities of human discernment that we can tell when something is “better” or at least of high quality even when we personally may not like it.

For instance, I can tell that Jonathan Franzen is a very good writer even though I have less than no interest in reading his fiction.  I can see that Moby-Dick is a Great Novel even while it tends to bore me.  I acknowledge the towering pre-eminence of Henry James and find him an unpalatable drudge at the same time.

On the other end of the spectrum, I can see how Dan Brown is a propulsive and compelling story-teller even while I find him intellectually vacuous and æsthetically tedious.

My own personal list of what may be described as guilty pleasures includes Ian Fleming, Edgar Rice Burroughs (but only the John Carter novels; never could get into Tarzan), and a score of others over the years who caught my attention, appealed for a time, and have since fallen by the wayside, leaving me with fond memories and no desire to revisit.  A lot of the old Ace Doubles were made up of short novels of dubious merit that were nevertheless great fun for a teenager on a lonely afternoon.

I would never consider them Great Art.

Taste is the final arbiter.  But using it to determine quality—rather than allowing quality to determine taste—is doomed because taste changes.  Works you might strenuously defend at one time in your life can over time suffer as your taste and discernment evolve.  It’s sad in one way because it would be a fine thing to be able to summon up the same reactions experienced on one of those lonely afternoons, aged 16, and poring through the deathless excitement of a pulp adventure you might, given your enthusiasm, mistake for Great Writing.

I try always to make a distinction between things I like and things I think are Good.  Often they’re the same thing, but not always, and like other judgments humans make, the two tend to become confused with each other.  Hence, debate over merit can take on the aspects of an argument on that day at the base of the Tower of Babel when people stopped understanding each other.

But if that’s all true, then how do we ever figure out which standards are valid and which bogus?  I mean, if it’s ALL subjective, how can any measure of quality ever rise to set the bar?

Fortunately, while personal experience is significant, collective experience also pertains. History, if you will, has taught us, and because art is as much a conversation as a statement we learn what works best and creates the most powerful effects over time. Having Something To Say that does not desiccate over time is a good place to start, which is why Homer still speaks to us 2500 years after his first utterances.  We derive our ability to discern qualities from our culture, which includes those around us informing our daily experiences.  In terms of literature, the feedback that goes into developing our personal values is a bit more specific and focused, but we have inexhaustible examples and a wealth of possible instruction.  We do not develop our tastes in a vacuum.

Honest disagreement over the specific qualities of certain works is part of the process by which our tastes develop. I might make a claim for Borges being the finest example of the short story and you might counter with de Maupassant—or Alice Munro. Nothing is being denigrated in this. The conversation will likely be edifying.

That’s a conversation, though.  When it comes to granting awards, other factors intrude, and suddenly, instead of exemplary comparisons, we have competition, which can be a degrading affair unless standards are clear and processes fairly established.  In a competition, unlike a conversation, quality necessarily takes a back seat to simple preference.

Or not so simple, perhaps. Because any competition is going to assume at least a minimum of quality that may be universally acknowledged. So we’re right back to trying to make objective determinations of what constitutes quality.

If it seems that this could turn circular, well, obviously. But I would suggest it only becomes so when an unadmitted partisanship becomes a key factor in the process.

This can be anything, from personal acquaintance with the artist to political factors having nothing to do with the work in hand. Being unadmitted, perhaps even unrecognized, such considerations can be impossible to filter out, and for others very difficult to argue against. They can become a slow poison destroying the value of the awards. Partisanship—the kind that is not simple advocacy on behalf of a favored artist but is instead ideologically based, more against certain things rather than for something—can deafen, blind, reduce our sensibilities to a muted insistence on a certain kind of sensation that can be serviced by nothing else. It can render judgment problematic because it requires factors be met having little to do with the work.

Paradoxically, art movements, which are by definition partisan, have spurred innovation if only by reaction and have added to the wealth of æsthetic discourse. One can claim that such movements are destructive and indeed most seem to be by intent. Iconoclasm thrives on destroying that which is accepted as a standard and the most vital movements have been born of the urge to tilt at windmills, to try to bring down the perceived giants.  We gauge the success of such movements by remembering them and seeing how their influence survives in contemporary terms.

Those which did not influence or survive are legion. Perhaps the kindest thing to be said of most of them is that they lacked any solid grasp of their own intent. Many, it seems, misunderstood the very purpose of art or, worse, lacked any comprehension of truth and meaning. More likely, they failed to distinguish between genuine art and base propaganda.

How to tell the difference between something with real merit and something which is merely self-serving?  All heuristics are suspect, but a clear signal that something other than pure artistic intent is at play is the advent of the Manifesto.  Most are hopelessly locked in their time, and the most innocent of them are cries against constraint.  But often there’s an embarrassing vulgarity to them, a demand for attention, an insistence that the work being pushed by the manifesto has merit if only people would see it.

Not all manifestos are signs of artistic vacuity, but those that front for worthwhile work usually fade quickly from service, supplanted by the work itself, and are soon forgotten.  Mercifully.  We are then left with the work, which is its own best advocate.  In hindsight it could be argued that such work would have emerged from the froth all on its own, without the need of a “movement” to advance its cause.  Unfortunately, art requires advocates, beginning with the simplest form of a purchase.  In crowded fields overfull of example, the likelihood of a lone artist succeeding on his or her own, without advocacy, is slim.

Advocacy for an individual artist, by a cadre of supporters, can make or break a career.  And this would of course be a natural development of widespread appreciation.  It’s organic.

Advocacy for a perceived type of art begins to suffer from the introduction of agendas having less to do with the artists than with a commitment to the aforementioned windmill-tilting.

The next phase is advocacy of a proscriptive nature—sorting out what belongs and what doesn’t, measuring according to a prescribed set of protocols—which has little to do with individual works and much to do with the æsthetic and political prejudices of the movement.  The quality of a given work is less important at this stage than whether it “fits” the parameters set by the movement’s architects.  Taste plays a smaller and smaller role as the movement meets opposition or fails to advance its agenda. With the demotion of taste comes the desiccation of quality.  The evocative ability of art, its facility for communicating things outside the confines of the manifesto-driven movement, eventually becomes a kind of enemy.  We’re into the realm of cookie-cutter art: paint-by-numbers approaches, template-driven work.  Themes are no longer explored but enforced, the preferred message becomes inextricable from execution, and the essential worth of art is lost through disregard of anything that might challenge the prejudices of the movement.

This is a self-immolating process.  Such movements burn out from eventual lack of both material and artists, because the winnowing becomes obsessional, and soon no one is doing “pure” work according to the demands of the arbiters of group taste.

As it should be.  Anything worthwhile created during the life of the movement ends up salvaged and repurposed by other artists.  The dross is soon forgotten.  The concerns of these groups become the subject of art history discussions.  The dismissal of particular works because “well, he’s a Marxist” or “she was only an apologist for capitalism”—factors which, if the chief feature of a given work, might very well render it ephemeral, but which in many instances have little to do with content—prompts head-scratching and amusement long after the fury of controversy around them has faded.

Given this, it may seem only reasonable that an artist have nothing to do with a movement.  The work is what matters, not the fashions surrounding it.  Done well and honestly, it will succeed or fail on its own, or so we assume.

But that depends on those ineffable and impossible-to-codify realities of quality and taste.  Certainly on the part of the artist but also, and critically, on the part of the audience.

Here I enter an area difficult to designate.  The instant one demands a concrete description of what constitutes quality, the very point of the question is lost.  Again, we have heuristics bolstered by example.  Why, for instance, is Moby-Dick now regarded as a work of genius, by some even as the great American novel, when in its day it sold so poorly and its author died in almost complete obscurity?  Have we become smarter, more perceptive? Has our taste changed?  What is it about that novel which caused a later generation than Melville’s contemporaries to so thoroughly rehabilitate and resurrect it?  Conversely, why is someone like Jacqueline Susann virtually unremarked today after having been a huge presence five decades ago?

I have gone on at some length without bringing up many examples, because taste and quality are so difficult to assess.  What one “likes” and what one may regard as “good” are often two different things, as I said before, and have as much to do with our expectations on a given day of the week as with anything deeply considered and well examined. My purpose in raising these questions—and that’s what I’ve been doing—has to do with a current struggle centering on the validity of awards as signs of intrinsic worth.

The best that can be said of awards as guideposts to quality is that if a group of people, presumably in possession of unique perspectives and tastes, can agree upon a given work as worthy of special note, then it is likely a sign that the work so judged possesses what we call Quality.  In other words, it is an excellent, indeed exceptional, example of its form.  I’ve served on a committee for a major award, and over the course of months the conversations among the judges proved educational for all of us, eventually shedding the chaff and leaving a handful of works under consideration that represented what we considered examples of the best that year of the kind of work we sought to award.

I never once found us engaged in a conversation about the politics of the work.  Not once.

Nor did we ever have a discussion about the need to advance the cause of a particular type of work.  Arguments over form were entirely about how the choice of one over another served the work in question.  When we were finished, it never occurred to me that a set of honest judges would engage in either of those topics as a valid metric for determining a “winner.”  No one said, “Well it’s space opera and space opera has gotten too many awards (or not enough)” and no one said, “The socialism in this work is not something I can support (or, conversely, because of the political content the faults of the work should be overlooked for the good of the cause).”  Those kinds of conversations never happened.  It was the work—did the prose support the premise, did the characters feel real, did the plot unfold logically, were we moved by the story of these people.

Consensus emerged.  It was not prescribed.

This is not to say other metrics have no value, but they can be the basis of their own awards.  (The Prometheus Award is candidly given to work of a political viewpoint, libertarianism.  It would be absurd for a group to try to hijack it based on the argument that socialism is underrepresented by it.)  But even then, there is this knotty question of quality.

Here’s the thorny question for advocates of predetermined viewpoints: if an artist does the work honestly, truthfully, it is likely that the confines of manifesto-driven movements will become oppressive and that artist will do work that, eventually, no longer fits within those limits.  To complain that the resulting work is “bad” because it no longer adheres to the expectations of that group is as wrongheaded as declaring a work “good” because it does toe the proper line.

Because that line has nothing to do with quality.  It may go to taste.  It certainly has little to do with truth.

Seeing At f.64

I have been involved in photography since I was 14 years old. My father gave me his Korean War-era Canon 35mm, a knock-off of a Leica model D, and I began a somewhat haphazard career which never quite went where I thought it would or should.  By the time I was 21 I had replaced that Canon with a pair of Minoltas, a Mamiya Universal (2 1/4 format), and a Linhof 4X5 view camera, plus the darkroom accoutrements to go with it all, including a massive Beseler 45M enlarger (which I still have) and pretensions of being a modern manifestation of a member of the Group f.64.

All I really knew about this group were the photographs and some names, and not even all of them.  Ansel Adams, Edward Weston, Imogen Cunningham.  Names to conjure with.  Sharp, full-tonal range images of landscapes, portraits, and details of objects made sublime by the attention of these artists of the lens.

In time I learned about other names—Alfred Stieglitz, Edward Steichen, Paul Strand, Walker Evans, Dorothea Lange, Ben Shahn—and picked up a bit of the history.  To be honest, photographing and printing were what interested me and I used the work of these past giants as goals and guides.  I bought a full set of Ansel Adams’s Basic Photo series, gleaned what I could, and set forth to do what he did, at least in my own small and considerably more modest way.  He had Yosemite, I had, when I got to it, the Ozarks.  Edward Weston had Big Sur, I had the Missouri and Meramec Rivers.  I spent too much time trying to duplicate their work and too little time finding what was worthwhile in my own backyard.

Which I did eventually and I’ve done some work of which I’m proud.  Ah, but there were giants in those days, and I wasn’t even getting to their shoulders.  The magical, mystical Group f.64 occupied in some sense the same place as the Fellowship of the Ring.  They were the first, the best, the most interesting.

And yet, little was available to be known about them between two covers.  Articles here and there, anecdotes, essays in their monographs, coincidental stories.  A great deal of myth can evolve from such piecemeal history.

Mary Street Alinder, who has previously written a biography of Ansel Adams, has published a history of the Group. Group f.64: Edward Weston, Ansel Adams, Imogen Cunningham, and the Community of Artists Who Revolutionized American Photography fills in the gaps of biography and history of this unlikely cadre of artists who took photography away from the Pictorialists who were dominant in the 1910s and 20s and gave it back to itself.

Beginning with the disdain of East Coast photographers (a scene dominated by the towering personality of Alfred Stieglitz, one of the principal gallerists and arbiters of what constituted art in America) for so-called “western photographers,” the primary movers came together in and around San Francisco to establish their own hallmark on what they thought photography should be.  Dedicated to “pure” photography—that is, photographs that were themselves, making full use of the unique medium of lens, film, and chemistry, rather than the popular form of photographs that imitated paintings, with soft focus and elaborate stage settings—Ansel Adams, Willard Van Dyke, and, somewhat reluctantly, Edward Weston (who was simply not a joiner, yet philosophically supported what the group was trying to do, and lent his name because, of them all, he was already well known and respected) formed the Group, Adams wrote a manifesto, and they set about arranging exhibits and establishing themselves as arbiters of a new vision of photography.

Young, enthusiastic, and perhaps a bit pretentious, the group eventually included Connie Kanaga, Dorothea Lange, Henry Swift, Alma Lavenson, Brett Weston, and a few others.  Both Van Dyke and Adams were aggressive about exhibits, and even opened their own (short-lived) galleries.  Eventually, though, they knew they had to break into the East Coast and convince Stieglitz of their worth.  Adams was the first to do so and after that the tastes of the country began to shift.

What emerges from the pages of Alinder’s book is in many ways the same struggles and passions that come with any artistic movement.  Also the conflicting personalities, the petty wars, the affairs, and the eventual dissolution of the Group as its members, all gifted and hard-working artistic obsessives, went their own ways.  The kind of work they did is today accepted as a given in what photography should be, at least as a base.  Adams and Weston shared a philosophy that the photograph should allow the subject to reveal its true nature.  For that to happen, the standard obliteration of detail that the Pictorialists held as necessary had to go away.  Clarity, sharpness, richness of detail and tonality.  Hence the name of the group, f.64, which is one of the smallest apertures a large format lens can achieve (though not the smallest—f.128 is common for large format lenses), and which yields the widest possible depth of field (meaning the range of acuity from nearest to farthest) and therefore the greatest sharpness throughout the image.  By calling themselves this, they announced that they stood in direct opposition to the fuzzy, false images that dominated galleries and public awareness.  (The elaborate staging of the Pictorialists moved easily and naturally into Madison Avenue advertising photography, but even there Group f.64 became the dominant æsthetic in terms of detail and clarity.)
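The optics behind the name can be made concrete with the standard hyperfocal-distance formula (H = f²/(N·c) + f).  The figures below are my own illustration, not Alinder’s: a hypothetical 300mm large-format lens and a 0.1mm circle of confusion, values commonly cited for 4X5 work.

```python
# Sketch (my numbers, not from the book): the standard hyperfocal-distance
# formula, showing why stopping down to f/64 yields such deep focus.

def hyperfocal_mm(focal_length_mm: float, f_number: float, coc_mm: float) -> float:
    """Hyperfocal distance H = f^2 / (N * c) + f.
    Focused at H, everything from H/2 to infinity is acceptably sharp."""
    return focal_length_mm ** 2 / (f_number * coc_mm) + focal_length_mm

# Assumed: 300mm lens, 0.1mm circle of confusion for 4x5 film.
h64 = hyperfocal_mm(300, 64, 0.1)  # f/64
h8 = hyperfocal_mm(300, 8, 0.1)    # f/8, for comparison

print(f"f/64 hyperfocal: {h64 / 1000:.1f} m")  # about 14.4 m
print(f"f/8  hyperfocal: {h8 / 1000:.1f} m")   # about 112.8 m
```

At f/64 the photographer can focus at roughly fourteen meters and hold everything from about seven meters to infinity in acceptable sharpness; at f/8 the same lens would need to be focused at over a hundred meters to achieve that, which is why the tiny aperture became the Group’s emblem.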

The successes of all these artists would probably have occurred in any event, but not this way and not so soon and not so magnificently.  Obviously, some emerged as superstars while others have faded from our memory.  Alinder does service here by reminding us of them and placing them in their proper relation to the history.

One remarkable aspect of the Group is in some ways perhaps minor, but it led to major consequences as the 20th Century unfolded: they were the first artistic movement in the United States to make no distinction between male and female members.  All they cared about was the work, and none of them seemed to have any retrograde ideas about women being lesser artists.  Right from the start, roughly half the group was female, and we remember them well.  Imogen Cunningham and Dorothea Lange, certainly, but others now somewhat forgotten, like Consuelo Kanaga, Sonia Noskowiak, and Alma Lavenson, informed the dialogue of 20th Century photography in powerful and significant ways.

The other surprise for me was realizing who wasn’t in the Group.  Based on the work alone, seeing it all for the first time as a young photographer, I simply assumed membership on the part of photographers like Walker Evans, Paul Strand, and Margaret Bourke-White.  Walker Evans, according to Alinder, did not care for Group f.64.  Ironically, he thought they were too concerned with making “pretty” pictures, which was one of the complaints Group f.64 had toward the Pictorialists.  Evans was a chronicler—nothing should distract from the subject as found, not cropping, not composition, not fine printing.  Strand predated the group (and actually inspired Adams to change how he made images and prints), and Margaret Bourke-White was always a working commercial photographer following her own path.

Reading this, and seeing how these people had to modify and adapt their philosophy at the time to account for inconsistencies in their original intent and to make sure they did not exclude excellent work through too narrow a set of constraints, was revelatory.  Alinder’s book has allowed me to better understand my own sources of inspiration.  As well, her prose is, like the people she writes about, an example of clarity, sharpness, and acuity.

Now A Word From Paradise Island

Of all the unlikely confluences you might stumble over in life, here’s a set: what do Margaret Sanger, lie detectors, and superheroes have in common?

If you answered “Wonder Woman!” you would be correct.  And if you could have answered that before Jill Lepore’s The Secret History of Wonder Woman came out, you would be in rare company.

Wonder Woman was the creation of one William Moulton Marston, of the Massachusetts Marstons. If that familial pedigree escapes you, don’t feel bad, you didn’t miss it in history class.  The Marstons had come down a ways from their 19th Century heights, enough so that William was forced to apply for grants to finish his degree at Harvard in 1915 in psychology (Ph.D. earned in 1921).

Psychology was still a relatively “new” science and Harvard was in the vanguard.  Marston attached himself to the recently founded Harvard Psychological Laboratory and eventually ended up running it.  But only for a short while.  And that career curve would follow him the rest of his life.

Marston emerges in the pages of Lepore’s book as a brilliant man-child who just couldn’t seem to make a lasting success of anything.  It becomes evident as his story unfolds that he simply confused personal fame with genuine success and made a series of bad calls with regard to where and how to apply his energies.  He was a self-promoter, but when it came to backing up his claims he was lacking in substance.  He “invented” the lie detector and established it as a viable methodology and then ruined its use in court for decades through sloppy application and shoddy research.  Not only that, but the court case which apparently set the precedent for the exclusion of lie detector results from courtrooms cast a pall on him personally, a circumstance he never quite recovered from.  If not for the three women he eventually lived with, it seems likely he would have been indigent.

That is where the story takes a turn toward the culturally amazing.  Marston was a hardcore feminist, of the “old school” that marched and advocated and disrupted (although, being a man, this rarely led to any inconvenience for him).  He seems to have been genuinely dedicated to the idea of equality for women and was also a devotee of free love, a parallel movement to the equal rights movement that led to considerable discord among feminists.  He met and eventually married Elizabeth Holloway, who shared his philosophy.  She earned a law degree from Boston University and of the two possessed the discipline to apply herself to her chosen tasks, a trait Marston apparently lacked.  Eventually, Holloway (who kept her maiden name) was the principal breadwinner in the house, which grew to include Marjorie Wilkes Huntley, a woman he met while doing work for the Army under Robert Yerkes, and then later Olive Byrne, who was Margaret Sanger’s niece.

At a time when “alternate lifestyles” could land you in prison, this was, to say the least, unorthodox.  Marston apparently possessed enormous personal charisma, and none of the women ever left him.  After his death, the three more or less remained together until their own deaths.  Byrne and Holloway each bore two of Marston’s four children, but Byrne claimed to be a widowed cousin and refused to tell her own children that she was their mother.

How did all this produce one of the most well-known superheroes of all time?

Lepore, who obtained access to archives and letters never before made public, has detailed this remarkable story with clarity, humor, and insight.  Wonder Woman is probably one of the most rigorously constructed superheroes of the Golden Age of comics.  Marston did not see her as entertainment but as a tool for social propaganda, namely to advance the cause of equal rights for women.  In fact, Marston thought women should be in charge.  His Wonder Woman was no one’s servant, and the first couple of years of her publication read like an indictment of everything broken in society’s attitude toward women.

DC Comics was in a bit of a bind.  They were by and large uncomfortable with the feminist “messaging” but Wonder Woman, almost from the first issue, was one of the three most popular superheroes of all time, right behind Batman and Superman.  They were making too much money to just kill her off.  But once Marston lost control of the comic (due to illness, first polio then cancer) the editor put in charge seemed to do everything he could to declaw this difficult woman.  That she survived to once again be the powerful symbol of womanhood is remarkable, but it also goes some way to explain the bizarre manifestation of the Seventies TV show.  Comparing the Lynda Carter characterization to the forthcoming Gal Gadot realization shows a marked difference in attitude and approach.

The holdovers from the Silver Age era in Carter’s look, while “true” to the traditional Wonder Woman, nevertheless supported the highly sexualized aspect of the character, while the new look shows the underlying mythology that was originally loaded into the construct: that of a warrior, an Amazon, something out of the bloodier, martial past of the ancient Greeks.

Lepore’s book examines the period as much as the principals and shows us the problematic world in which suffrage, feminism, and tradition often came into conflict.  The urgency to “put Wonder Woman in her place,” especially after WWII, played out in the comics in often tragicomic ways.  Wonder Woman becomes a full-fledged member of the Justice League only to be made its secretary.

Some of this may seem silly to modern readers.  Why couldn’t people just get over their prejudices?  Wasn’t it obvious, etc etc.?  Maybe.  And maybe that it was obvious was part of the problem.  Justice often conflicts with privilege, ethics with desire, and change is never welcome to the fearful.

Marston himself, as a human being, seems by turns charming and repulsive.  Art rarely emerges from pristine sources, unalloyed with the questionable.  But then, Marston probably did not see himself as an artist or Wonder Woman as a work of art. He had causes to pursue, beliefs to validate, and…well, bills to pay.

Lest anyone believe that Wonder Woman was all Marston’s invention, though, Lepore makes it abundantly clear that the women around him contributed just as much if not more and served in many ways as models upon which he based the character.

The Secret History of Wonder Woman is a fascinating look at a time in America and an aspect of culture that for too long was underappreciated.  It is a serious and thorough examination of popular mythology and from whence many of our modern concepts came.  In these times when women are once more required to defend their rights and the very idea of equality, it might be useful to see how and in what ways this struggle has been going on for a long time and under some of the most unexpected banners in some of the most unusual ways.