2016

Tardiness comes in direct proportion to chaos. The year ended and all was in flux.

However, reading goes on.

I did not finish nearly as many books in 2016 as I tried to. At least, not other people’s books.  I did finish drafts of two of my own.  My desk, at the moment, is clear, and maybe I can do a better job in 2017 of keeping abreast here.

A good deal of my science fiction reading was pretty much for the reading group I host at Left Bank Books. That group affords me opportunity and motivation to read novels I might not otherwise get to.  So I reread Alfred Bester’s The Stars My Destination for the first time in three decades, but I also read The Left Hand of Darkness for the first time ever. I do not regret the delay. It is a mature novel, with a great deal my younger self may well have missed.  As to the former, it came very close to not holding up.  I had forgotten (if I ever realized it this way) just how brutal a novel it is, and not just in the character of Gully Foyle. Bester’s achievement way back in the Fifties remains remarkable for its unyielding insistence on a fragmented, painful, chaotic, and historically consistent future.

I also reacquainted myself with Tiptree, in the form of Her Smoke Rose Up Forever. It seems fitting in this period of reassessment and revolution, when the face of science fiction is—has—changed and brought forth a volatile reaction to that change.  Tiptree was doing much of what is being so rancorously challenged within the field today, but as she was a singular voice and not a “trend” she provoked different challenges then while becoming accepted generally as a brilliant writer and a jewel in the crown of SF stars.

I also reread (for the first time since it came out) Robert Silverberg’s Dying Inside, which I reviewed in the previous post.  I was much too inexperienced a reader the first time to appreciate everything Silverberg was doing, so I probably forgot the book as soon as I finished it.

It is true that some books must be “grown into”—I am currently rereading Samuel R. Delany’s Trouble On Triton for the book group and realizing that, while I read it eagerly the first time, I probably missed almost everything important about it. Likewise with another reread, Gene Wolfe’s The Fifth Head of Cerberus, which is ostensibly a novel about colonialism.  I say “ostensibly” but that does not mean it isn’t.  It very much is about colonialism, all three of the novellas which comprise the whole.  But it is as much about how we colonize ourselves, sometimes to our loss, as it is about colonizing foreign soil, in this case another world with a native population that strives to adapt but may have found in the end their only options were extinction or counter-colonization.  As always, Wolfe’s subtlety is rigorously slippery, his points less direct, corrosive of expectation.

Titan Books has rereleased Michael Moorcock’s Cornelius Chronicles, a story cycle that is the very definition of indirect.  Moorcock took as his template the Romantic poets—Byron, Shelley, et al—and displaced them into a near future chaos in the form of his “hero” Jerry Cornelius, who wants to save the world only to resurrect his dead sister so they can be together.  The prose is rife with Sixties hip, but not so overwhelmingly anachronistic that the novels aren’t just as readable now as they were then.  The response to them is perhaps necessarily altered and certainly the themes play out differently. Moorcock may have been the grown-up in the room at the advent of New Wave.  He did go on to write some marvelously rich books after these.

I finished Ann Leckie’s delightfully subversive Ancillary trilogy.  I need to do a full review soon.  Treat yourself.

A smattering of other SF titles I can recommend wholeheartedly:  Lavie Tidhar’s Central Station; Sylvain Neuvel’s Sleeping Giants; Carter Scholz’s Gypsy; Binti by Nnedi Okorafor.

And Nisi Shawl’s wonderful Everfair.  An alternate history steampunk done the way steampunk ought to be done.  I owe it a full review, but let me say here that this is one of the best first novels I’ve read in a long time.

I read two China Miéville books this year, one very good.  This Census-Taker I have to count as a failure.  It has good writing and fascinating bits, but failed to come together the way I’ve come to expect from Miéville.  The other, newer one, is The Last Days of New Paris, which is excellent.  This pair allowed me to understand that one of the primary passions Miéville indulges in his work is cities.  His best work portrays a city as a complete character.  This Census-Taker lacked that.

Of the non science fiction read this year, I did Moby-Dick with my other reading group.  I resisted doing this book.  I’ve never liked it.  I find it turgid, convoluted, often opaque.  There is also a darkness to it that can be suffocating. Over several months we tackled it, dissected it, ran through various analyses.  I conclude that it is a superb work, fully deserving of its reputation.  It is A great American novel if not The American Novel, because America is its subject, though it takes place on a whaling ship far at sea.  It is not a flattering picture, though, displaying throughout the contradictions, hypocrisies, and shortcomings of the then young nation which continue to plague us.  It does this brilliantly.

I still don’t like it.  I find little pleasure in the actual reading.  That, as they say, is my problem.

A colleague and coworker, Kea Wilson, published her first novel, We Eat Our Own. I commend it.  I reviewed it here.

A novel that straddles the genre boundaries somewhat, and caused some controversy upon its initial publication, is Kazuo Ishiguro’s The Buried Giant.  This is a post-Arthurian quest story with much to say about memory and community and the price of vengeance.

This was a big year for nonfiction.

James Gleick’s new tome, Time Travel: A History, is an exceptional soliloquy on the concept, science, and cultural use of time travel, beginning with Wells and covering both the scientific realm and the popular fiction realm, showing how they have played off each other and how the idea has evolved and worked through our modern view of the universe and our own lives.  Earlier in the year I’d read his magnificent biography of Richard Feynman, Genius.  Gleick is a great explainer and a fine craftsman.

As well, Carlo Rovelli’s Seven Brief Lessons on Physics.  They are brief, they are accessible, they are to be enjoyed.  And, along the same lines, Void by James Owen Weatherall, about the physics of empty space.  It’s far more fascinating than it might sound.

I can recommend Peter Frankopan’s The Silk Roads, which is a history of the world from the viewpoint of the Orient.  The shift in perspective is enlightening.  Along the same lines I read Charles Mann’s 1491, which was eye-opening and thought-provoking—and in some ways quite humbling.

I also read Arlie Russell Hochschild’s Strangers In Their Own Land, especially in the wake of what I think I can safely call the most surprising election result in recent history. This book is a study of the right-wing culture that has developed in many startlingly contradictory ways.  I believe this would be worth reading for anyone trying to make sense of the people who continually vote in ways that seem to make no sense—and also for those who do vote that way just so they might understand what it is about their movement that seems so incomprehensible to many of their fellow citizens.

I read a few short of 50 books in 2016 cover to cover.  I will be reviewing some of them in the future.

Here’s hoping for a good year of reading to come.

Smartness

James Gleick’s biography of physicist Richard Feynman ought to be part of all high school science classes. Not only does he chronicle the life of one of the preeminent scientists of the 20th Century, not only does he portray the chief puzzle of physics in that century clearly, he manages to convey the substance of the work in enviably accessible prose without once “dumbing down” any of it. All this while using remarkably few equations and those usually only in description of what they do, not in the numbers and symbols themselves. One comes away from the book—Genius: The Life And Science of Richard Feynman—feeling that this would not be such a difficult subject, or at least feeling that it would be as much a human endeavor as art or music or engineering or accounting.

Science is encased in an opaque mythography that seems designed to make people feel inferior. In the main, this is a consequence of language.  At one time, the language of science was abstruse in the extreme. Isaac Asimov once wrote a short story poking fun at the tortured convolutions of scientific jargon.

I say at one time. An effort has been made in recent decades to make it clearer. It occurred to some that the density and deliberate complexifying of scientific papers itself had done unintended damage to the field by making it inaccessible to the very people it is ultimately intended to benefit.  We might not have such frustrating debates going on today in the social and political realms over climate or vaccination had scientists themselves, as part of in-group cultural arcana, not kept the lay public at such a distance by making what they do appear simultaneously elitist and impenetrable.

Feynman himself rejected such practices throughout his career.  He never assumed people—average people—couldn’t understand.  Just say it plain.  If you could not explain it clearly, he believed, then you yourself did not understand it. He was constantly railing against “stupid questions.”  Questions either too vague, too big, or too beside the point for any kind of reasonable answer.

But he wanted the questions. He wanted to see that spark of interest, and if he saw a glimmering he would try to fan it into flame. His enthusiasm was infectious.

Richard Feynman became one of the most important questioners in 20th Century science. Partly this was due to his idiosyncratic approach to problem-solving. For example, he rarely ever finished reading a paper by someone else.  He would read just enough to understand the problem and then work it out for himself. He didn’t want the solutions handed to him; he wanted the challenge and, evidently, deep pleasure of doing it himself. Of course, in that way he also found errors, logical inconsistencies, even fraud on occasion.  He was a prodigious calculator, often able to do complex equations in his head. He intimidated and fascinated in equal measure.

What some mistook for slapdash, undisciplined thinking was rather underpinned by a vigorously compulsive commitment to fact and, ultimately, truth. The rigor of his work proved equal or superior to that of many if not most of his contemporaries. He insisted that the universe make sense and, crucially, he was unafraid of admitting he did not know something.

He lost the love of his life while working on the atomic bomb, a perhaps unfortunate pairing of profound experiences which, while he seldom talked about either, may have informed his seemingly random path through post-WWII physics.  He was late in receiving a Nobel Prize, partly because of his inability to find the “right problem” to work on. But in the course of his search, the work he did informed and solidified work done by many others.

Feynman may have been a genius.  In a scintillating chapter, Gleick examines the subject of genius itself, partly to address the peculiar circumstance that we seem no longer to have any geniuses. This, he suggests, is a byproduct of the fact that we have so many and in a wide range of fields.  What we seem to lack is the singular personality to which the label readily appends. We have no public space anymore for an Einstein.

Or a Feynman.  But that does not mean we do not have them…or that we do not need them. Now perhaps more than ever.

Gleick has humanized Feynman in this, although in Feynman’s case that may never have been needed.  He was known for playing bongos, he was a raconteur, he spoke in a thick New York accent, and he came across often as a kind of rural wit, plainspoken and commonsensical to the core.  Yet his major work was in one of the most difficult and demanding aspects of modern science and he was a major presence. Appearances too often supplant substance.

Knowing this, Gleick also humanized the subject Feynman devoted his life to, making the science accessible and, to a surprisingly large extent, comprehensible to the nonspecialist.

In an era in which “hero” is too often applied to the physical—athletes, models, soldiers, actors—and may itself be a term corrupted by overuse and inconsistent application, it might serve us well to draw back and consider how little attention is paid to thinking and the solving of problems.  The process alone is contributory and ought not be beholden to budget committees or P&L charts or mere application review. That there are people who spend their time unraveling the mysteries of the universe, not in some mystical sense of gurus on mountaintops but in the sense of an Einstein figuring out why the universe is the way it is, should be a source of inspiration.  In the final analysis, it is likely that people like Richard Feynman give us more as a culture and a civilization than all the pseudo-philosophical mouthings of all the gurus that have ever lived.  That one can pull a device out of one’s pocket, almost anywhere on the planet, and look up any or all of those gurus is a consequence of people like Feynman figuring out the nature of the real unseen, the quantum level of reality, which, as Feynman would have insisted, is Reality.

In Review

2015 is done and I have read what I read.  It was a year fraught with turmoil in science fiction, a year prompting reassessments, a year when required reading competed with reading for pleasure, and the time constraints of working on a new novel (two, in fact) impeded chipping away at my to-be-read pile, which mounds higher.

As in the past, I count only books I have read cover to cover here.  If I added in total pages of unfinished reading, I’m probably up with my usual volume (somewhere around 90 books), but that would be a cheat.  That said, I read 50 books in 2015.

One thing I concluded, both from what I read and the upheaval in the background about what is or is not worthy science fiction, is that the decades-long pseudowar between mainstream and genre is over.  Skirmishes will continue to be fought here and there, certain elements will refuse to yield or concede, but by and large the evidence suggests that, on the part of the literary writers at least, SF has made its point. A couple of examples:

Station Eleven by Emily St. John Mandel is science fiction.  In fact, after talking it over for nearly a year since I read it, it seems to me to be Heinleinesque.  Better written, the characters less exemplars than real people, but in basic conceit and plot, this is a Heinlein novel. It has all the elements—survivors, a plucky heroine, a global catastrophe forcing those who remain to learn quickly a whole suite of new skills, and an ongoing discussion throughout about what is of value and ought to be preserved.  It is a superbly written work and that alone made the identification difficult.  Heinlein, at his best, could be as good as anyone in any genre, but to see the form raised to this level shows both his virtues and his weaknesses.  The population of the Earth is reduced by a superflu.  The novel flashes back and forth around the life of a kind of patriarch whose biological and artistic progeny struggle in a post-technological world to both survive and preserve the best of that former world.  The novel prompts questions, challenges preconceptions, and draws us in.  It was not marketed as science fiction and it has continued to sell very well.  It is science fiction and no one has batted an eye.

The Water Knife by Paolo Bacigalupi.  An ecological thriller, an examination of a different kind of breakdown, a different kind of survival, peopled by characters as real as can be.  In a decade this will be historical fiction, probably, but it is SF and also mainstream and also uncategorizable.  Exceptional.

Straddling the boundary is Jeff VanderMeer’s Annihilation, which is a curiosity.  It proceeds like a straightforward “survey mission” novel—specialists set down upon an alien world and struggling to unravel its mysteries before said world kills them.  Only in this case the “alien” world is a patch of reclaimed wilderness somewhere along the eastern seaboard, probably north Florida, that is undergoing some strange transformations due to an experiment gone wrong.  There are touches of zombie fiction, government conspiracy, and even Lovecraftian uber-malignancy evoked, but the story, as told by the Biologist, feels more meta than any of those suggest.  The landscape works to inform the soul-wrenching recognitions and evolutions within the Biologist as she works to understand what is going on in the aptly named Area X.  VanderMeer has created a work bordering on genius here by virtue of externalizing and foregrounding mystical revelation as ecological transmutation, but as you read you can’t tease the meta passages from the plot in any clear way, so the experience, when you give yourself over to it, is wholly immersive.

So what I’m seeing—in many more titles still on my TBR pile—is the embrace of science fiction by what was formerly an ambivalent cadre of artists who are using it to ends traditionally ignored by main-body SF.

In the other direction, the infusion of literary concerns into genre writing—concerns which necessarily drag real-world issues in with them—has prompted a squeal of protest from those who wish to keep their starships pure, their aliens obvious, and their weapons decisive.  “Good writing” is still a poorly understood quality by too many in the genres (by no means a problem exclusive to SF, but one which, because of the nature of SF, yields far more obvious failures), and the clinging to an aesthetic attributed to the so-called Golden Age—exemplified by writers probably more often revered than actually read, and therefore misperceived in intent—has exacerbated the old antagonisms and fanned a final flaring up of fires dying to ash.  The clunky sentence is a hallmark of much of this, more likely as consequence than intent, and the clichéd scenario becomes more obviously so when the whole point of what we mean by “literary” in its most useful mode is overlooked or, perhaps, willfully ignored in a fit of defensive refusal to pay attention to what matters, namely the truth of human experience and the profitable examination of, for want of a better word, the Soul.

Where the cross-fertilization of mainstream and genre has been successfully accomplished, we’ve been seeing novels and stories of marvelous effect.  We have been seeing them all along, and in the past such examples were readily offered as proof that SF was “just as good” as anything published as mainstream.  I’ve always felt that being “just as good” was selling our potential short, but the work has to rise to the challenge, and there always have been such works.

Among such that I read this past year were a few from that rich past, mainly for the reading group I host at work.  The Two of Them by Joanna Russ; Extra(Ordinary) People, also by Russ; Doomsday Book by Connie Willis; Mythago Wood by Robert Holdstock; The Sparrow by Mary Doria Russell; and Engine Summer by John Crowley.  In retrospect, there have always been writers writing in the full embrace of science fiction but without any of the stylistic constraints of its pulp origins, and such works remain trenchant and readable and offer surprising commentary still on current questions.

The Sparrow was a highlight. I have known since its publication that it was sort of a riff on James Blish’s classic, A Case Of Conscience, but it is so much more. Russell’s elegant reversal of the moral question elevates this novel to the top tiers of useful literary works. I have not yet read its sequel, but I am looking forward to it after this treat.

I also reread Harlan Ellison’s Shatterday for the reading group. It’s been a good long while since I did so and I was not disappointed, although I read many of the stories through a more cynical eye. The opening tale, Jeffty Is Five, remains, for me, one of the most gutwrenching short stories of all time.

Another highpoint this past year was James Morrow’s new novel, Galapagos Regained, a neatly unclassifiable work of speculative history.  I gave it a lengthy review here and recommend a look. This is a superbly done work that deserves more attention than it has received.

I also read Morrow’s amusing novella, The Madonna and the Starship, which runs a delightful game via Fifties television and alien visitors who come to bestow an award and offer assistance in exterminating the irrational on Earth.  Morrow is acerbic even as he is funny.

Among the most interesting new works of science fiction I read this year is The Three-Body Problem by Cixin Liu, translated by Ken Liu.  This is the first part of a trilogy about alien invasion and resistance as written from a Chinese perspective.  It is an exceptional translation.  It won the Hugo Award, the first translation, I believe, to do so, and certainly the first Asian novel to win.  There is high-end physics, nasty politics, murder, and the conundrums of committed action. The cultural quirks made it even more interesting.

Like almost everyone, it seems, I read The Martian by Andy Weir. This was great fun and well executed.  My quibble, along with many others, was with the opening gambit to explain the marooning of the astronaut, but I’m content to see it as a mere dramatic choice.  It didn’t prevent me from enjoying the rest of the book, which, in the words of the screen adaptation, “scienced the shit out of all this” and did so in an accessible and entertaining manner which I applaud.  I couldn’t help seeing it as a newer version of an older film, Robinson Crusoe On Mars, and naturally this one works a bit better.  Hell, we know more, there’s no excuse for bad science, and Mr. Weir knows that.  He wrote a realistic piece of speculation and followed through admirably.

Another novel that gave a far more “realistic” view of an old, favorite SF trope, is Kim Stanley Robinson’s Aurora.  There is much to love about this book, but it is not lovable.  It’s a clear-eyed look at what an interstellar generation ship would actually be like.  And it is bleak, in terms of the traditions of SF.  Suffice it to say without giving away too much that Robinson fully incorporates entropy into his formula with predictably gloomy results, but for all that it is a thoroughly engaging work.

At the other end of the “hard” SF spectrum is Charles Gannon’s Fire With Fire.  Future interstellar expansion brings humanity into contact with our neighbors.  The resulting tensions drive the novel.  I reviewed it here.

Science fiction is a broad, broad field and has room for a magnificently wide range even on the same subjects.  It even has room, as I noted above, for exceptional style.  One of the most enjoyable reads for me, on that note, was Ian McDonald’s new novel, Luna.  There will be comparisons made to Heinlein’s The Moon Is A Harsh Mistress.  Look for an upcoming review where I will argue that the comparison, while in some ways valid, is superficial.  Anyone who has not read McDonald, treat yourself.  This would be a good one with which to begin.

In a completely different area of the playground, there is Daryl Gregory’s AfterParty, which I found excellent.  It’s about drug abuse and the workings of delusion and murder.  Anything I might say here would spoil it.  Go.  Find it.  Imbibe.

The bulk of my reading, after that and a few other titles, has been scattered.  I found a brand new history of Group f/64, which was the first dedicated group of photographers to push the pure art of the straight photograph.  Ansel Adams, Edward Weston, Imogen Cunningham, several others, in the 20s and 30s established the ground upon which all photography came to be viewed for the rest of the 20th century and even, arguably, into today. Mary Street Alinder, who has previously written a biography of Ansel Adams, did a superb job chronicling this group of prickly, independent artists.

I read a history of a superhero, Wonder Woman, and discovered that the story of her creation was even stranger than the character herself.

A new work by journalist Johann Hari, Chasing The Scream, opened my eyes to the thorny issue of the Drug War.

In the wake of seeing the film Interstellar and beginning work on my own novel about (partly) interstellar travel, I dove into Kip Thorne’s Black Holes & Time Warps and had my mind bent in some ways I didn’t think it could be bent.  This has prompted a reengagement with science on this level which is proving difficult, tedious, and yet rewarding.  My mind no longer has the plasticity it once enjoyed.  On the other hand, experience has proven a benefit in that I seem to be absorbing and comprehending at a much deeper level.  We shall see.

Quite a bit of history, much of it unfinished.  In a separate reading group, I’m going through Victor Hugo’s Les Misérables, and reading in the history of the French Revolution, the Republic, its fall, all partly to complete the third novel of my trilogy, but also because the literature available is so rich and surprising that it has become its own pleasure.  It would seem now I’m about to embark on early American history again, anchored by Ron Chernow’s biography of Alexander Hamilton.

There was a new Mary Russell novel this past year, Dreaming Spies, by Laurie R. King.  I discovered a Dan Simmons novel about Holmes which I’d overlooked when it came out, The Fifth Heart, in which he is paired with Henry James, one more in a long line of novels and stories concerning Holmes’ unlikely interaction with historical figures.  Simmons is a terrific writer, but even he tended toward the tedious in this one.  He needs to learn to leave his research in his files.  But it was a unique take on Holmes and he even managed to elicit my sympathy toward James, a writer I find problematic at best, insufferable at worst, and annoying the rest of the time.

So much for the highlights.  Let me end by noting that the Best American series has finally realized that science fiction and fantasy are a real thing and launched one of their annual collections to cover it.  This after volumes for both infographics and comics.  Better late than never, I suppose.  The series editor is John Joseph Adams—difficult to imagine better hands—and this first volume was edited by Joe Hill, which I found interesting to say the least.  Mr. Hill is a horror writer.  Certainly many of the stories have a strong horror element, but overall this is a collection full of marvels, from the writing to the ideas.  I’ll try to keep track of this one in future.

So while not numerically great, 2015 was filled with many very excellent books.  I’m looking forward to 2016.  My stack awaits.

Happy New Year.

Easy Habits and the Consequences of Belief

At first glance, the two books could not be more different. Subject, tone, everything seems different. Not at odds so much as…nonoverlapping.

Which is ironic, since both deal, in their separate ways, with that very idea, the separation of areas of knowing.

Probably because I read them so close together I recognized their shared concerns as clearly as I did. Whatever the reason, it struck me as obvious in so many ways that I began to recall all the other books over the last few years that could be likewise gathered within this same subset, all on distinct topics and yet all based on, to some degree, an analysis of the same human propensity to disregard evidence when it contradicts belief.

Let me begin with the more general of the two.

Jerry A. Coyne is an evolutionary biologist.  Compared to others with public profiles, he has published few books.  Three, to be precise.  His latest, Faith Versus Fact: Why Science and Religion Are Incompatible, is a direct challenge to Stephen Jay Gould’s idea of “nonoverlapping magisteria.”  Gould’s premise is that science and religion should not conflict with each other because they are concerned with entirely separate realms of knowing—hence the nonoverlapping part—and except for certain agenda-driven partisans, there is no reason for them to be in conflict.  Coyne sees this as accommodationism, which he thoroughly discredits in his book.

My claim is this: science and religion are incompatible because they have different methods for getting knowledge about reality, have different ways of assessing the reliability of that knowledge, and, in the end, arrive at conflicting conclusions about the universe.  “Knowledge” acquired by religion is at odds not only with scientific knowledge, but also with knowledge professed by other religions.  In the end, religion’s methods, unlike those of science, are useless for understanding reality.

Coyne identifies accommodationism as an attempt not to stir the hornet’s nest, because scientists are often dependent on politically sensitive funding.  Science, especially Big Science dealing with questions of origins, is expensive and the days of the independently wealthy scientist are largely gone.  Rocking the boat by annoying those who hold the purse strings would seem ill-advised.

The conclusions, he goes on to argue, of scientific inquiry continually poke holes in those claims by religion that still assert authority over secular matters.

But more than that, such deferral to authority erodes our critical capacity and can lead to false conclusions and bad judgments, all because we accept the hegemony of “faith.”

…a word defined in the New Testament as “the substance of things hoped for, the evidence of things not seen.”  The philosopher Walter Kaufmann characterized it as “intense, usually confident, belief that is not based on evidence sufficient to command assent from every reasonable person.”

The sticking point for many would be that “reasonable person” proviso, for surely, as Coyne concedes—often—there are many reasonable people who nevertheless espouse a religious faith.  What are we to make of that?  Is the basis for the assessment of “reasonable” perhaps too ill-defined or is the scope of “belief” too broad?

Coyne takes us through the process, giving a thorough explication of why science and religion are incompatible if not a convincing argument for the downside of belief for belief’s sake.  He tells an anecdote early in the book about how he yearly teaches undergraduates a course in evolution, which the majority do very well at, but admit, after earning their A, to not believing a word of what he taught.  Because it contradicts their religious belief.

It is this that Coyne sees as the dangerous aspect of faith as promulgated through religion, the a priori rejection of evidence in favor of a set of usually unexamined beliefs.  He takes us through a catalogue of negative manifestations, from the rejection of medicines to the acts of terrorists to the rejection of solid science (like climate change), that have their underlying justifications in religion.

Coyne, an admitted atheist, puts all this forward while taking pains to admit the personal comfort to be found in religion by many people.  From a certain point of view he tends to go the extra kilometer to be fair.  Of course, going through the comments on various review sites, those predisposed to reject such arguments accuse him of profound bias if not outright malicious intent.  One cannot help but wonder if they bothered to read the book, all or even in part.

The book makes its case clearly and concisely.  It avoids the polemic outrage to be found in other tomes by big name atheists by sticking largely to evidentiary concerns and philosophical arguments.

But, one may ask, so what?  Religion and science are two realms in which most people would assume they have no stake. Castles in the air, esoteric arguments about things that have no impact on our daily lives.  Most people seem to keep a religion much the same way they have a pet and certainly in the West the majority live secular lives and only rarely feel compelled to apply their religious convictions to anything.  As for science, as long as the technology we depend on works, all the rest is so much theoretical handwaving.  It makes no difference if we have almost no understanding of quantum mechanics and certainly evolution is just tenure-building nonsense having to do with million-year-old bones and what kind of textbooks the school district might use next year.  Nothing to do with us in our daily lives. So what if people rely more on faith and belief in making their daily judgments than on evidence-based science?  We operate more on heuristics defended by aphorism than by reason and applied understanding, or so Daniel Kahneman tells us in his excellent study, Thinking, Fast and Slow, and we by and large get along perfectly well that way.

How does this argument concern me?

Johann Hari’s new book, Chasing The Scream, has an answer to that.  Of sorts, if we but make the epistemological leap.

Hari is writing about the drug war. It is, for him, as much a personal examination as a journalistic one, as he admits to having family and friends who are addicts.  He begins with a simple question.

I scribble down some questions that had puzzled me for years.  Why did the drug war start, and why does it continue?  Why can some people use drugs without any problems, while others can’t?  What really causes addiction?  What happens if you choose a radically different policy?

Okay, three questions.  Still, simple, basic questions, to which any reasonable person might reasonably expect a reasonable answer.

Yet we spend billions, bully small countries, destroy thousands if not millions of lives, all in pursuit of policies which rest on an appalling lack of informed justification.  By the end of the book you come to see that there are no answers to those simple questions which in any way validate our continuing on with things as they are.  As they have been for a hundred years.

Hari goes back to the beginning of the war, before drugs were illegal, and takes us through the history.  Back to Harry Anslinger, the head of the Federal Bureau of Narcotics, who fueled the drug war for the twin purposes of maintaining his agency and exorcizing demons that had possessed him since childhood, even in the face of substantive research and sound arguments denying his approach to the problem had any merit and more than ample evidence—Prohibition—that it would lead to catastrophe, not only on a national but on a global level.  Hari details Anslinger’s battle to destroy Billie Holiday and his use of intimidation and police tactics and, subsequently, U.S. foreign policy to assure the continued crusade to eradicate drugs and destroy addicts.

Not because any evidence led Anslinger to think this was the only, much less the best, way, but because he believed it in spite of growing mountains of evidence that the approach was wrongheaded.  He suppressed evidence, hounded physicians who dared present alternative models to the drug problem, intimidated politicians—except those who could secure him funding—and strenuously denied the validity of any evidence that contradicted his belief.

He believed.

As many did and still do.  It is this that trumps hard evidence.

Even as a young adolescent I thought the argument for our drug policies was lacking.  I thought at the time that I just didn’t understand.  Never having been in the least interested in drugs, and knowing few if any who were involved with anything more serious than marijuana, I felt it did not concern me.  Later, I did a short stint as a volunteer drug counselor, but the work was far more than I could handle at the time.  I trusted that the people in charge knew what they were doing, and certainly the gang violence associated with drugs seemed to make a persuasive case that this was a worthwhile and often desperate fight.

But as the years went by and the war continued, I began to notice bits of research here and there and how certain politicians contradicted themselves and how the prison population, especially in the wake of Nixon’s near militarization of the police community and the drug war ethos, was growing and in very worrisome ways.  I began to seriously rethink my position with Reagan’s zero tolerance policies and the mandatory sentencing guidelines he established through Ed Meese, one of the notable Puritans of the modern age.  Even so, I had the nagging suspicion that maybe I was just missing something.  Certainly I didn’t approve of drug addiction, but I more and more came to believe that these people needed help, not punishment.  We understand that process very well with alcohol, why is it different with narcotics?

And besides, there are plenty of people who receive perfectly legal prescription narcotics and never become addicts.

The number of holes in the picture kept growing.  I no longer trusted stated drug policy, but I didn’t understand the intransigence of people over reform.

Hari’s book lays it out very clearly.  Money is high on the list.  We fund too many of the wrong people at too high a level for them to be easily weaned from the teat.  Foreign policy is also tied to this, especially in these days of international terrorism which has a strong drug component.  But the factor that ties this in to Jerry A. Coyne’s book is the one Hari covers only glancingly.

Belief.

It is easier to rely on what we have always believed than to look at evidence that requires us to change our mind.

Many aspects of our lives are covered by this observation, but problems arise where beliefs have political and social ramifications.  Persistent beliefs about the poor, about minorities, about people with different beliefs endure; even when evidence is provided which significantly challenges such beliefs and suggests strongly that not only are they wrong but that we would be better off discarding them, we cling to them.

Hence my conflation of these two books and the suggestion that they share a common idea.

Not that I argue that all beliefs are wrong.  What is wrong is the intractable nature of unquestioned belief.  The only reason the drug war continues is that it has considerable popular support and the only reason it has that is that many people cannot bring themselves to change their minds about it in the face of not only evidence but what we euphemistically call common sense.

But that can be said of so many things which directly and indirectly impact our lives.

Perhaps it is a stretch and perhaps I argue out of my own biases, but it seems to me the most valuable tool we can have in our intellectual toolbox is the ability to say “well, I believe that but I might be wrong.”  Faith cannot maintain itself, however, among people who are able to say that and apply it to anything and everything.  Science is built on exactly that principle—all knowledge is conditional—but belief, as exemplified by religion, thrives by the absence of that principle.  It says some knowledge is absolute, not to be questioned.

In the contemplation of matters of theological concern, this perhaps offers consolation, comfort, a certain utility, to be sure.  But it is easy for unquestioning acceptance of arguments from authority to become habit and then be applied to anything that our prejudices suggest we may wish to avoid examining.  Drug addiction is an unfortunate affliction and it may be uncomfortable for people to see it for what it is—a disease, like alcoholism.  That discomfort makes room for assertions from authority offered by people who claim to be working on our behalf.  Our unwillingness to ask evidentiary questions is a cozy environment for demagogues and despots.

When you ask “What does this have to do with me?” you might be surprised at the consequences of avoidance and the price of unquestioning trust.  Why should we learn science and how to apply the discipline of examining evidence?  So we don’t hurt ourselves out of the easy habit of belief.

Inside Outside: Two Views of Science Fiction

Histories and analyses of science fiction are often fragmentary. Like histories of rock’n’roll, there are just too many different facets to be meaningfully comprehensive. That is not to say there aren’t excellent works that manage to deal with essential elements of science fiction, only that inevitably something will be left out or overlooked or, now and then, misunderstood.

I recently read two books about the subject that represent the poles of such analyses—those done from the inside and those done from the outside—and between them a clarity emerges about the fundamental misunderstandings that abound about the nature of science fiction.

Brian W. Aldiss’s almost majestic Billion Year Spree was published in 1973, a good year to attempt an overview like this, which covers precursor works as well as traces the development of the specific qualities of the genre through the 19th Century and then treats the major corpus of what we have come to recognize as science fiction from the 20th Century. Aldiss is very smart, very savvy, and his wit is equal to his intelligence in putting things in perspective. It is in this book that the idea that Mary Shelley’s Frankenstein is the first genuine science fiction novel was presented. Most dedicated readers of science fiction may be acquainted with this proposition, which has gone viral within the field, but may not have read Aldiss’s arguments in support. They are worth the time.

The second book is very recent: Margaret Atwood’s In Other Worlds, which does not purport to be an overview like Aldiss’s work. Instead it is a very personal history with opinions and judgments. It covers Atwood’s association with science fiction and showcases her take on it as a genre. In some ways it resembles a memoir. On the question of what the first SF work was, Atwood is much less rigorous and far more concerned with SF as myth than Aldiss, so we find allusions to Gilgamesh and several other works along the way, which she does not specifically name as the primogenitor.

Which makes perfect sense by the end of the book because—and she pretends to nothing else—she doesn’t know. She doesn’t seem to know what science fiction is as practiced by those who work mainly within the field, nor does she seem to understand the nature of the particular pleasure of SF for the dedicated fan. And as I say, she never claims to.

This would normally not even be an issue but for the fact that Atwood has been committing science fiction for some time now. But it’s not her primary interest, as represented by a long and successful career writing and publishing what is generally regarded as mainstream literary fiction and commentary upon it. It’s not her sandbox, even though she is clearly attracted to it and likes to come over and play.

The different focus of her appreciation of science fiction highlights aspects of the long-running and disputatious relationship between the so-called literary establishment and the déclassé realms of genre fiction. Especially after having read Aldiss on science fiction, the bases of mutual incomprehension across the fictive divide become clearer.

Aldiss establishes his premises early:

No true understanding of science fiction is possible until its origin and development are understood. In this respect, almost everyone who has written on science fiction has been (I believe) in error—for reasons of aggrandisement or ignorance. To speak of science fiction as beginning with the plays of Aristophanes or some Mycenean fragment concerning a flight to the Sun on a goose’s back is to confuse the central function of the genre; to speak of it as beginning in a pulp magazine in 1926 is equally misleading.

In chapter one he then sets out his operating definition:

Science fiction is the search for a definition of man and his status in the universe which will stand in our advanced but confused state of knowledge (science), and is characteristically cast in the Gothic or post-Gothic mould.

Contrast this to Atwood’s opening stab at definitions:

Much depends on your nomenclatural allegiances, or else on your system of literary taxonomy…I realized that I couldn’t make a stand at the answer because I didn’t really grasp what the term science fiction means anymore. Is this term a corral with real fences or is it merely a shelving aid, there to help workers in bookstores place the book in a semi-accurate or at least lucrative way?
…sci fic includes, as a matter of course, spaceships and Mad Scientists, and Experiments Gone Awfully Wrong…

Then later, this:

In a public discussion with Ursula K. Le Guin in the fall of 2010…I found that what she means by “science fiction” is speculative fiction about things that really could happen, whereas things that really could not happen she classifies under “fantasy.”
…In short, what Le Guin means by “science fiction” is what I mean by “speculative fiction,” and what she means by “fantasy” would include some of what I mean by “science fiction.”

There are harbingers in this which emerge meaningfully later in the book.

My own definition of science fiction is less specific than Aldiss’s and far more rigorous than Atwood’s—science fiction is at heart epistemological fiction: it is concerned with how knowledge (and subsequently technology) forces change on humans. You might argue that any good spy novel would meet that criterion, and certainly many spy novels (and movies) contain large dollops of science fiction, but only as collateral concerns. The change in a spy novel is earnestly resisted and often successfully so—the status quo is all important. Science fiction usually starts with the (authorial) belief that any status quo is an illusion and goes from there. Again, any surrealist novel might meet that definition, but I said epistemological, which is the tell-tale, because we’re talking about knowledge and knowing and acting, which is a communal experience, across society. And so the Federation of Star Trek qualifies as an epistemological proposition while the Isle of Avalon does not. And of course the second important condition—force—is essential in this regard. If there is a classical myth at the heart of SF it is Pandora’s Box. Open that lid—which is an act of will—and then deal with the consequences of uncontrollable environmental change.

I take it as read that there are other definitions of science fiction. This one is mine. It has the virtue of being completely independent of tropes—those spaceships and Mad Scientists of which Atwood speaks. Which brings something like Hermann Hesse’s Magister Ludi into the fold quite plausibly while leaving something like Allen Drury’s The Throne of Saturn out.

Aldiss proceeds in chapter one to make his case for Frankenstein and he does so adroitly. For SF to be true to itself, a change must be apparent that can be prompted and shaped no other way than by the conceit of the SFnal idea. Dr. Frankenstein has learned how to reanimate dead tissue. The change this causes in him is to be faced quite unmetaphorically with the responsibility of being a god.

What separates this effectively from a straightforward horror novel is the utter humanity of Victor Frankenstein and the absence of any hint of either the divine or the demonic. What unfolds is a human drama anyone would face under similar circumstances. Frankenstein is not “mad” but becomes so. The Creature is not supernatural, it’s a construct. The questions of soul and moral responsibility permeate the drama—unresolved and unresolvable. Frankenstein has made a change in the world and has to figure out how to deal with it. He fails, but it’s the wrestling with it that brings the book into the fold of science fiction, because the change is both external and personal and depicted as humanly possible.

The rest of the novel is a Gothic—namely, it partakes of the tropes that define the Gothic: lonely castles, empty landscapes, isolation, darkness, and a kind of vastness that seems ponderously empty (but may not be). In that respect, Aldiss is correct about SF being in the tradition of the Gothic. It deals with vastness, isolation, the alien as landscape—and moral conundrum.

Atwood seems to think it’s all about utopias, which is why she seems unable to locate a definable beginning to the genre. There is a palpable reluctance throughout her book to deal with the subject directly, in a way that addresses the particular history of the stories that comprise the principal body of what we call science fiction, as if by searching around the perimeter she might find the point where it can all be subsumed into the larger, primary literary history of the last couple of millennia.

Aldiss talks throughout Billion Year Spree about the writers who informed the genre ever since it split off into its own distinct digs in 1926 with the founding of Amazing Stories by Hugo Gernsback, whom Atwood barely mentions in passing. In Aldiss we have a complete discussion of Gernsback, of Edgar Rice Burroughs, of E.E. “Doc” Smith, Leigh Brackett, A.E. Van Vogt, Heinlein, Clarke, Asimov—names which are oddly absent from the Atwood even though it is hardly possible to discuss SF meaningfully in their absence.

The writers they do cover, both of them, are Aldous Huxley, Jonathan Swift, George Orwell. Aldiss talks about them as what they are—literary writers who found useful tools in the SF toolbox, but who in most ways barely acknowledged the existence of the genre. (In Swift’s case, obviously so, since the genre did not exist in his day. But this itself is telling, since Swift is excluded by Aldiss as a precursor SF writer while Atwood sees him as primary.) Aldiss is remarking on how the same observations led writers of quite different dispositions to do work recognizable to the main body of SF in its own day. To be sure, such writers are often used by the genre in a kind of reflexive self-defense, as if to say “See, serious writers do it, too!” But while Aldiss shows how these are basically one-offs, Atwood seems to think these writers represent the central goal of the genre—that all SF writers might be aspiring to the level of Huxley and Orwell. Perhaps in matters of craft and even art, but not necessarily in terms of theme or subject.

Atwood begins the biographical parts of her association with the genre in an understandable but curious place—in comics. (She also read H. Rider Haggard as a child, which left a distinct impression on her.) The trouble seems to be that she did not move from comics to the major magazines, and so what she shows is an attempt to make whole the literary connections between the superhero motifs of the 30s and 40s and classical myth. A valid and fruitful analysis, certainly, but it leaves one of the principal distinguishing features of the science fiction of the same period unaddressed—technology. Greek myths care not a fig for how Zeus generates his lightning bolts. They are supernatural, beyond such understanding, as befits the divine. Science fiction is all over those bolts and how they are made—and, consequently, why.

I would argue that while he did not create the first SF, Homer gave us the first SF character in Odysseus. In his own way, he was a technophile and a geek. He did not believe the gods were utterly inscrutable and unchallengeable and spent the length of the Odyssey figuring out how to beat them. He was a clever man, a man of reason, who clearly believed there was something to be understood about everything.

The mistake many literary critics make in their regard toward science fiction is in consistently assuming SF is all about its gadgets—i.e. its tropes—when it is really about the people who make them, understand them, use them, and all those who are changed by them.

Aldiss clearly understands this. He rarely argues for less science and tech, only for better human depictions. Because SF is about the world those tools are allowing us to make.

The question that springs to mind while reading Atwood’s examination is whether or not she ever read anything “of the canon,” so to speak—like Sturgeon or Herbert or Niven or Brin or Cherryh or even Butler—or if, having read it, she simply found it not worth discussing in the same breath as her token SF writer, Le Guin, and the others she selects to dissect, like Marge Piercy. Even in the case of Piercy, the work she chooses to examine is the one that can be read differently, Woman On The Edge Of Time, rather than the less ambiguous He, She, and It. In the closing paragraph of her examination on Piercy’s time travel-cum-woman-under-pressure novel, Atwood says:

Woman On The Edge Of Time is like a long inner dialogue in which Piercy answers her own questions about how a revised American society would work. The curious thing about serious utopias, as opposed to the satirical or entertainment variety, is that their authors never seem to write more than one of them; perhaps because they are products, finally, of the moral rather than the literary sense.

Even in praise, there seems to be a reservation about the work in question. Not literary, then, but a moral work. In this regard, Aldiss would seem to agree with her:

The great utopias have better claim to our attention, for utopianism or its opposite, dystopianism, is present in every vision of the future—there is little point in inventing a future state unless it provides a contrast with our present one. This is not to claim that the great utopias are science fiction. Their intentions are moral or political…
The idea of utopianists, like our town-planners, is to produce something that is orderly and functions well.

One of the chief drawbacks of utopias is this achievement of function. Basically, the whole point of them is to end history. They are “nowhere” because once attained there is theoretically no further need for people to change. In fact, they must not change, lest they destroy the perfection. As Aldiss goes on to say:

The trouble with utopias is that they are too orderly. They rule out the irrational in man, and the irrational is the great discovery of the last hundred years. They may be fantasy, but they reject fantasy as part of man—and this is a criticism that applies to most of the eighteenth-century literature…

Given this, one wonders what it is that Atwood is attempting in implicitly—and sometimes explicitly—treating SF as utopianism without a nod toward the thing at its core, namely the embrace of inexorable change. Because change is the driving fascination in science fiction and for it to have any valence in the imagination or utility in its constructs, it must present as something other than metaphor. Let me give you two quotes from a pair of SF writers, one of whom seems to be Atwood’s choice of exceptional ability:

Science fiction is a tool to help you think; and like anything that really helps you think, by definition it doesn’t do the thinking for you. It’s a tool to help you think about the present—a present that is always changing, a present in which change itself assures there is always a range of options for actions, actions presupposing different commitments, different beliefs, different efforts (of different qualities, different quantities) different conflicts, different processes, different joys. It doesn’t tell you what’s going to happen tomorrow. It presents alternative possible images of futures, and presents them in a way that allows you to question them as you read along in an interesting, moving, and exciting story.
Samuel R. Delany, The Necessity of Tomorrows

If science fiction has a major gift to offer literature, I think it is just this: the capacity to face an open universe. Physically open, psychically open. No doors shut.
What science, from physics to astronomy to history and psychology, has given us is the open universe: a cosmos that is not a simple, fixed hierarchy but an immensely complex process in time. All the doors stand open, from the prehuman past through the incredible present to the terrible and hopeful future. All connections are possible. All alternatives are thinkable. It is not a comfortable, reassuring place. It’s a very large house, a very drafty house. But it’s the house we live in…and science fiction seems to be the modern literary art which is capable of living in that huge and drafty house, and feeling at home there, and playing games up and down the stairs, from basement to attic.
Ursula K. Le Guin, Escape Routes

Taken together, these point to the disconnect with traditional literary forms, traditional literary expectations. Science fiction contains utopias, certainly (and dystopias, clearly) but it is not in the main about them. Nor is it about some desired escape from the present into an alternative world that may offer some kind of release for a mind at odds with itself, which seems to be the basis of so much neurotic fiction. That criticism focuses on the wrong point. Science fiction is about living in a changed milieu.

The problem with utopias was summed up concisely by Virginia Woolf: “There are no Mrs. Browns in Utopia.” Like all such sweeping claims, it admits counterexamples, but in the main this is a self-consistent criticism of the form which Atwood seems intent on using as her functional definition of science fiction. There is no room for ordinary people in Thomas More’s Utopia—if they are ordinary, they aren’t people, they’re memes. If they aren’t ordinary, Utopia doesn’t stand a chance of surviving.

And most ordinary people, when you get down to it, are not ordinary.

Which seems to be the major concern of most literary fiction—ordinary people. Which, by a tortuous logic of taxonomic reassessment, means, since Atwood seems to believe SF is principally utopian, that science fiction cannot deal with ordinary people and therefore, though she does not come right out and say this, cannot be considered relevant to mainstream literary concerns.

Welcome back to the ghetto.

In a blatantly dismissive review of Atwood’s own Oryx and Crake, Sven Birkerts asserted that SF can never be [true] literature because it “privileges premise over character.” In other words, the world at hand is more important than the people in it—which, of course, would make it utopian.

Henry James famously claimed “Landscape is character.” (Of course, he then criticized H.G. Wells for dealing more with “things” than characters—in other words, his landscapes.)

Birkerts and Atwood are on the same page, it seems, though Atwood is striving to come to terms with a form she clearly likes, even while misapprehending it. Perhaps had she found a stack of Astounding Stories instead of H. Rider Haggard and comics in the attic as a child she might have understood where the divergence happened and SF split off from two millennia of myth-driven fantasy. Novelty can overwhelm truth-seeking, and a great deal of SF falls into the pit of self-involved gizmo geekery, but at those times when the work rises out of that pit to deal with the future and science and their immanence within the human soul, it is unfair not to see its true worth. It’s like comparing Sherlock Holmes to the Hardy Boys and dismissing Holmes because he comes from the same stock.

It’s interesting that Atwood chooses Marge Piercy’s Woman On The Edge Of Time as her example, because Piercy worked a further subversion, perhaps unwittingly so, in the scenario she examines. Connie is regarded by everyone around her as insane. But she knows she isn’t; she’s dealing with a real situation, the future. But the world she lives in, the given world, her context, insists on denying the reality of that future and treating her involvement with it as symptom rather than legitimate experience. The parallel to the way in which the science fiction writer and his or her work is treated by those who see themselves as the keepers of context is remarkable. This is a metaphor which Atwood overlooks. The question of whether Piercy is writing what Atwood thinks she is, or has understood the nature of the form she’s indulging, is open.

The misunderstanding is simple but with complex consequences. Most genre fiction—mystery, western, war, spies, even romance—takes advantage of altered context to set mood or establish a range of possible action. Done well, these shifts target different thematic concerns and aim at specific moral (or telic) points. But in all but science fiction (and to a lesser extent the related genre of fantasy) the context would seem to be more attitudinal than material. The western is an exception, but we tend to treat the context of the western as “our” world insofar as it is historical and therefore, legitimately or not, we see it as familiar. The differences fade into the background and the metaphor runs out of our sight, almost as window dressing.

Science fiction dramatically reverses this relationship.

Which makes it a very uncomfortable place, especially for the writer who has spent his or her career writing from character rather than from landscape through character. Instead of seeing the world as a consequence of character, in science fiction the world is a character and must be dealt with concretely, as if to say “Here’s your new reality (context), now learn to live in it.”

It is precisely that discomfort that is the drug of choice for the reader of SF.

Attempts to corral it into a more familiar tradition run up against what must often seem like a perverse and intractable exoticism on the part of the writers.

Of the two books at hand, the Aldiss is the more taxonomically useful as well as æsthetically relevant. Aldiss, after all, is a science fiction writer. He has lived within the genre, knows it to its marrow, and, while critical of its excesses and irrelevancies, clearly loves it for itself, redheaded stepchild though it may be to others.

Which is not to say the Atwood is a failure. She is just as clearly fond of science fiction and has done considerable grappling with its conventions and conceits. But for her, it feels as if SF were an important love affair that lasted a summer or a year and then ended, leaving her with good memories and an impression of something missed, a road not taken. Nothing she regrets, but it might have been nice for it to have lasted longer. She doesn’t know it the way Aldiss does, but she doesn’t fear it the way some of her colleagues have in the past and may still. So while her observations may seem incidental, there’s worthy insight in them, if only of the tourist variety. Taken together, the two books give one a view of SF both from the inside and from the outside, and the distinctions are telling.

Way back in my youth, when rock’n’roll had muscled its way into the serious attention of people who, not too many years earlier, had derided it as loud, obnoxious “kids’ stuff,” I found an album by Andre Kostelanetz, who led an orchestra that specialized in symphonic renditions of popular music. He would take Sinatra or Como or Crosby or film themes or light jazz and turn them into quasi-classical pieces. This album was his take on the band Chicago. I remember listening to it bemused. It was interesting and it was “accurate” but it lacked some vitality that I at first couldn’t define. But then I realized that he had stripped everything out of it that said “rock’n’roll” and all that remained was the melody, the chord changes, and the form, but none of the guts. He’d taken music that could, in its original form, get you churned up, excited, and agitated in a particular way and converted it into something palatable for the inspection of people who did not understand rock music but may have been curious about it. Unfortunately, he missed the point and the result was merely “interesting.”

I often feel that way about attempts at science fiction by people who do not understand it.

More important, however, is the dialogue between those who get it and those who don’t, and in this respect Atwood has written a very useful book with considerable care and insight. It is, ultimately, less about science fiction than about her attempts to alchemically transform it into something familiar to her own early impressions of magical and dissociative fictive experiences. This is underscored by the Aldiss, which is about the heart and soul of science fiction. Reading them in tandem clarifies the ongoing misapprehensions and perhaps shows us how and why SF seems to be infecting much of today’s literary fiction. There must be a good reason why someone like Atwood now writes it, even if she doesn’t seem entirely to embrace it for itself.

 

Light Fallen

I’ve read three books in tandem which are connected by subtle yet strong filaments.  Choosing which one to begin with has been a bit vexatious, but in the end I’ve decided to do them in order of reading.

The first is an older book, handed me by a friend who thought I would find it very much worth my while.  I did, though possibly not for the reasons he may have thought I would.  But it grounds a topic over which we’ve been engaged in occasionally vigorous debate for some time and adds a layer to it which I had not expected.

William Irwin Thompson’s The Time Falling Bodies Take To Light is about myth.  It is also about history.  It is also about grinding axes and challenging paradigms.  The subtitle declares: Mythology, Sexuality & the Origins of Culture.  This is a lot to cover in a mere 270-some pages, but Mr. Thompson tackles his subject with vigor and wrestles it almost into submission.

His thesis is twofold.  The first is that Myth is not something dead and in the past, but a living thing, an aggregate of vital memes, if you will, which recover any lost force simply by being evoked, even as satire or in dismissal.  Paying attention to myth, even as a laboratory study, brings it into play and informs our daily lives.

Which means that myth does not have a period.  It is ever-present, timeless, and most subtle in its influence.

His other thesis, which goes hand in hand with this, is that culture as we know it is derived entirely from the tension within us concerning sex.  Not sex as biology, although that is inextricably part of it, but sex as identifier and motivator.  The argument we have been having, apparently ever since desire took on mythic power within us, over what sex means, how it should be engaged, and where it takes us has determined the shapes of our various cultural institutions, pursuits, and explications.

It all went somehow terribly wrong, however, when sex was conjoined with religious tropism and homo sapiens sapiens shifted from a goddess-centered basis to a god-centered one and elevated the male above the female.  The result has been the segregation of the female, the isolation of the feminine, and the restriction of intracultural movement based on the necessity to maintain what amounts to a master-slave paradigm in male-female relationships.

Throughout all this “fallen” power play, ancient myths concerning origins and the latent meanings of mutual apprehensions between men and women (and misapprehensions) have continued to inform the dialogue, often twisted into contortions barely recognizable one generation to the next but still in force.

There is much here to consider.  Thompson suggests the rise of the great monotheisms is a direct result of a kind of cultural lobotomy in which the Father-God figure must be made to account for All, subjugating if not eliminating the female force necessary for even simple continuation.  The necessity of women to propagate the species, in this view, is accommodated with reluctance and they are, as they have been, shoved into cramped confines and designated foul and evil and unclean in their turn, even as they are still desired.  The desire transforms the real into the ideal and takes on the aspects of a former goddess worship still latent in mythic tropes.

Certainly there is obvious force to this view.

The book is marred by two problems.  I mentioned the grinding of axes.  Time was originally published in 1981, and, mostly in the first third but sprinkled throughout, there is an unmasked loathing of evolutionary psychology and sociobiology.  He takes especial aim at E. O. Wilson for promulgating certain reductive explanations for prehistoric cultural evolution based wholly on biological determinants.  Thompson’s prejudice is clear: he wants even early homo sapiens to be special in its cultural manifestations, and he derides attempts at exclusively materialist explanations.  One hopes that the fact that E. O. Wilson himself has since moved away from these earlier “purely” biological positions would result in an updating.

But interestingly, part of Thompson’s rejection of such early modeling comes from an apparent belief in Race Memory.  Not, as I might find plausible, race memory as deeply-entrenched memes, but apparently as some undiscovered aspect of our genome.  He never quite comes out and claims that such race memory is encoded in our DNA, but he leaves little room for alternative views.

Hence, he asserts, the genuine power of myth, since it is carried not only culturally, but quasi-biologically, as race memory.  Which we ignore at our peril.

He does not once mention Joseph Campbell, whose work on the power of myth I think goes farther than most in explicating how myth informs our lives, how myth is essentially meaning encoded in ideas carried in the fabric of civilization.  He does, however, credit Marija Gimbutas, whose work on goddess cultures extending back before the rise of Sumer and the constellation of civilizations commonly recognized as the “birth” of civilization was attacked with serious allegations of fraud in order to undermine her legitimacy and negate her thesis that early civilizations were certainly more gender equal if not outright female dominated.  (Just a comment on the so-called “birth” of civilization: it has long been remarked that ancient Sumeria appeared to “come out of nowhere,” a full-blown culture with art and some form of science.  But common sense would tell us that such a “birth” had to be preceded by a long pregnancy, one which must have contained all the components of what emerged.  The “coming out of nowhere” trope, which sounds impressive on its face, would seem to be the cultural equivalent of the virgin birth myth that has informed so many civilizations and myth cycles since…)

My complaint, if there is any, is that he undervalues the work of geneticists, biologists, and sociometricians, seeking apparently to find a causation that cannot be reduced to a series of pragmatic choices taken in a dramatically changing ecosystem or evolutionary responses to local conditions.  Fair enough, and as far as it goes, I agree.  Imagination, wherever and whenever it sprang into being, fits badly into the kind of steady-state hypothesizing of the harder sciences when it comes to how human society has evolved.  But to dismiss them as irrelevant in the face of an unverifiable and untestable proposition like Race Memory is to indulge in much the same kind of reductionist polemic that has handed us the autocratic theologies of “recorded history.”

Once Thompson moves out of the speculative field of, say, 8,000 B.C.E. and older and into the period wherein we have records, his attack on cherished paradigms acquires heft and momentum and the charm of the outsider.  (His mention, however, of Erich von Daniken threatens to undo the quite solid examination of the nature of “ancient” civilizations.)  It is easy enough to see, if we choose to step out of our own prejudices, how the march of civilization has been one of privileging male concerns and desires over the female and diminishing any attempt at egalitarianism in the name of power acquisition.  The justification of the powerful is and probably has always been that they are powerful, and therefore it is “natural” that they command.  Alternative scenarios suffer derision or oxygen deprivation until a civilization is old enough that the initial thrill and charm of conquest and dominance fades and more abstruse concerns acquire potency.

But the value of The Time Falling Bodies Take To Light may be in its relentless evocation of institutional religion as a negation of the spiritual, as if to say that since we gave up any kind of natural and sane attitude toward sexuality and ignored the latent meaning in our mythologies, we have been engaged in an ongoing and ever more destructive program to capture god in a bottle and settle once and for all what it is we are and should be.  When one looks around at the religious contention today, it is difficult if not impossible to say it is not all about men being in charge and women being property.  Here and there, from time to time, we hear a faint voice of reason crying out that this is a truly stupid thing to kill each other over.

Is the Novel Still Dying?

In 1955, Norman Mailer was declaring the death of the novel.  A bit more than a decade later, it was John Barth’s turn.  There has now been a string of writers of a certain sort who clang the alarm and declare the imminent demise of the novel, the latest being a selection of former enfants terribles like Jonathan Franzen and David Foster Wallace.

Philip Roth did so a few years back, adding that reading is declining in America.  The irony of this is that he made such claims at a time when polls suggested exactly the opposite: more people were reading books in 2005 (as a percentage of the adult population) than ever before.  In my capacity as one-time president of the Missouri Center for the Book I was happily able to address a group of bright adolescents with the fact that reading among their demographic had, for the first time since such things had been tracked, gone up precipitously in 2007.

And yet in a recent piece in the Atlantic, we see a rogues’ gallery of prominent littérateurs making the claim again that the novel is dying and the art of letters is fading and we are all of us doomed.

Say what you will about statistics, such a chasm between fact and the claims of those one might expect to know has rarely been greater.  The Atlantic article goes on to point out that these are all White Males who seem to be overlooking the product of everyone but other White Males.  To a large extent, this is true, but it is also partly deceptive.  I seriously doubt if directly challenged any of them would say works by Margaret Atwood or Elizabeth Strout fall short of any of the requirements for vital, relevant fiction at novel length.  I doubt any of them would gainsay Toni Morrison, Mat Johnson, or David Anthony Durham.

But they might turn up an elitist lip at Octavia Butler, Samuel R. Delany, Tananarive Due, Nalo Hopkinson, Walter Mosley, or, for that matter, Dennis Lehane, William Gibson, and Neal Stephenson (just to throw some White Males into the mix as comparison).  Why?

Genre.

The declaration back in the 1950s that “the novel is dead” might make more sense if we capitalize The Novel.  “The Novel”—the all-encompassing, universal work that attempts to make definitive observations and pronouncements about The Human Condition has been dead since it was born, but because publishing was once constrained by technology and distribution to publishing a relative handful of works in a given year compared to today, it seemed possible to write the Big Definitive Book.  You know, The Novel.

Since the Fifties, it has become less and less possible to do so, at least in any self-conscious way.  For one thing, the Fifties saw the birth of the cheap paperback, which changed the game for many writers working in the salt mines of the genres.  The explosion of inexpensive titles that filled the demand for pleasurable reading (as opposed to “serious” reading) augured the day when genre would muscle The Novel completely onto the sidelines and eventually create a situation in which the most recent work by any self-consciously “literary” author had to compete one-on-one with the most recent work by the hot new science fiction or mystery author.

(We recognize today that Raymond Chandler was a wonderful writer, an artist, “despite” his choice of detective fiction.  No one would argue that Ursula K. Le Guin is a pulp writer because most of her work has been science fiction or fantasy.  But it is also true that the literary world tries to coopt such writers by remaking them into “serious” authors who “happened” to be writing in genre, trying ardently to hold back the idea that genre can ever be the artistic equivalent of literary fiction.)

The Novel is possible only in a homogenized culture.  Its heyday would have been when anything other than the dominant (white, male-centric, protestant) cultural model was unapologetically dismissed as inferior.  As such, The Novel was as much a meme supporting that culture as any kind of commentary upon it, and a method of maintaining a set of standards reassuring the keepers of the flame that they had a right to be snobs.

Very few of Those Novels, I think, survived the test of time.

And yet we have, always, a cadre of authors who very much want to write The Novel and when it turns out they can’t, rather than acknowledge that the form itself is too irrelevant to sustain its conceits at the level they imagine for it, they blame the reading public for bad taste.

If the function of fiction (one of its functions, a meta-function, if you will) is to tell us who we are today, then just looking around it would seem apparent that the most relevant fiction today is science fiction.  When this claim was made back in the Sixties, those doing what they regarded as serious literature laughed.  But in a world that has been qualitatively as well as quantitatively changed by technologies stemming from scientific endeavors hardly imagined back then, it gets harder to laugh this off.  (Alvin Toffler, in his controversial book Future Shock, argued that science fiction would become more and more important because it taught “the anticipation of change” and buffered its devotees from the syndrome he described, future shock.)

Does this mean everyone should stop writing anything else and just do science fiction?  Of course not.  Science fiction is not The Novel.  But it is a sign of where relevance might be found.  Society is not homogeneous (it never was, but there was a time we could pretend it was) and the fragmentation of fiction into genre is a reflection that all the various groups comprising society see the world in different ways, ways which often converge and coalesce, but which nevertheless retain distinctive perspectives and concerns.

A novel about an upper middle class white family disagreeing over Thanksgiving Dinner is not likely to overwhelm the demand for fiction that speaks to people who do not experience that as a significant aspect of their lives.

A similar argument can be made for the continual popularity and growing sophistication of the crime novel.  Genre conventions become important in direct proportion to the recognition of how social justice functions, especially in a world with fracturing and proliferating expectations.

Novel writing is alive and well and very healthy, thank you very much, gentlemen.  It just doesn’t happen to be going where certain self-selected arbiters of literary relevance think it should be going.  If they find contemporary literary fiction boring, the complaint should be aimed at the choice of topic or the lack of perception on the part of the writer, not at any kind of creeping morbidity in the fiction scene.

Besides, exactly what is literary fiction?  A combination of craft, salient observation, artistic integrity, and a capacity to capture truth as it reveals itself in story?  As a description, that will do.

But then what in that demands that the work eschew all attributes that might be seen as genre markers?

What this really comes down to, I suspect, is a desire on the part of certain writers to be some day named in the same breath with their idols, most of whom one assumes are long dead and basically 19th Century novelists.  Criticizing the audiences for not appreciating what they’re trying to offer is not likely to garner that recognition.

On the other hand, most of those writers—I’m thinking Dickens, Dumas, Hugo, Hardy, and the like—weren’t boring.  And some of the others—Sabatini, Conan Doyle, Wells—wrote what would be regarded today as genre.

To be fair, it may well be that writers today find it increasingly difficult to address the moving target that is modern culture.  It is difficult to write coherently about a continually fragmenting and dissolving landscape.  The speed of change keeps going up.  If such change were just novelty, and therefore essentially meaningless, then it might not be so hard, but people are being forced into new constellations of relationships and required to reassess standards almost continually, with information coming to them faster and faster, sometimes so thickly it is difficult to discern shape or detail.  The task of making pertinent and lasting observations about such a kaleidoscopic view is daunting.

To do it well also requires that that world be better understood almost down to its blueprints, which are also being redrafted all the time.

That, however, would seem to me to be nothing but opportunity to write good fiction.

But it won’t be The Novel.

Iain Banks Is Gone

I have nothing much to say that I didn’t already say.  He wrote some of my all-time favorite books.  I envied the scope and depth of his creations.  If I imagined what kind of work I wanted to write in my ideal world, Banks’ Culture stories would be one of the examples.

He went much too soon.  He thought he’d have more time.  We thought so, too.

One of the pitfalls of science fiction is that we can read about all these wonderful places and times where things like this can be dealt with and the world is more at our command than it is, but when the book is finished and we close the cover, we still live here.  And here we lose people every day to things we know we should be able to beat.  Because we’ve seen that future, laid out for us by fine writers and great minds.

Some day.  Writers like Iain Banks showed us.  Some day.

Veering Into The Present

An attractive pitfall of popular history is the Pivotal Moment.  The writer centers on an event or an idea that signals a shift in the course of history, leading somewhere other than where it had been heading.  The Donation of Constantine, the First Crusade,  the invention of moveable type, Galileo’s confrontation with the Church, Newton’s codification of the law of gravity, things like that.  The point being made is that these events are so tectonic that Everything Changes.

The pitfall is not so much that they are wrong but that they are taken as solely responsible, isolated moments, forks in the road.  It is easy to ignore or forget everything else around them.  Focusing only on the Emperor Constantine can suggest that without him, Christianity might not have become the official religion of Rome and thus history might have taken a different course.  (Personally, I think Constantine’s moving the capital of the empire east was far more significant as something he alone could have done, or caused to be done.)  It overlooks the fact that Christianity had become a tremendous movement by then.  Had Constantine been of a mind to resist it, he might have delayed its ascension until another emperor’s reign, but it would have become what it did in any event.  Constantine was being politically astute.  (After all, he left Rome to the Church even as he moved the center of imperial power to the new city of Constantinople.  It’s telling that he chose to isolate them geographically.)  The Crusades were important as expressions of political currents leading to a contraction of Rome’s vision of itself and certainly set the stage for subsequent events in the Levant, but not even the death of Richard the Lionheart changed all that much even in British history.

Newton might arguably be more important, at least for the calculus, but such things were in the wind.  Leibniz, rival and competitor to Newton, invented a calculus, and while the debate goes on as to who was first and which was better, such a mathematical tool was going to emerge.

Picking pivotal events, therefore, is a challenge.  Placing them in context is a duty and one it is often tempting to underplay.  It makes a better story if the singular event is the hero, as it were.  But it can sometimes make for bad history.

Stephen Greenblatt avoids that problem admirably in The Swerve: How The World Became Modern.  Even though the title is a bit hyperbolic and suggests the kind of history more consistent with a tabloid approach, what one finds within is first-rate history written for a general audience about a rather arcane subject: the way ideas can change entire cultures.

The story is about the discovery of a manuscript, De Natura Rerum, an epic poem by the Roman Lucretius (99 B.C.E. to 55 B.C.E.), an acolyte of Epicurean philosophy who died just before Rome became an Empire instead of a Republic.  De Natura Rerum—“On the Nature of Things”—is a surprising work in that it espouses ideas we now think of as wholly modern: that the universe is composed of atoms, that time and space are unbounded, that life evolves, that matter is all there is.  If one squints, one sees in all this the foundational ideas of contemporary physics and cosmology.

But it continues on to suggest that pleasure is the highest moral purpose, that doing that which increases one’s pleasure and the pleasure of those around us is the primary aim of a moral life.

It’s easy to see how this might run afoul of the kind of moral philosophy that has dominated Western culture since before the rise of Christianity.  But Lucretius was advocating not hedonism but the more constrained program of Epicurus, the 4th Century B.C.E. Greek philosopher who built his philosophy on the two standards of ataraxia and aponia, namely peace and freedom from fear (ataraxia) and the absence of pain (aponia).  To achieve these, one must lead a self-sufficient life surrounded by friends, occupying the mind with constructive and pleasing contemplations and treating the body to that which brings pleasure.  Though the name has been linked to self-indulgence, hedonistic abandon, and all the ills of unrestrained pleasure-seeking, what Epicurus had in mind was something very different, and defined by moderation.  The kind of self-indulgence we assume attends such a life he did not see as peaceful, pleasurable, or free from pain.

He also believed that when we die, nothing survives.  The soul is an aspect of our physical existence like anything else and fades to nothing once the container ceases to function.

Lucretius wrote a poem of purportedly great beauty in support of this philosophy.

It is a common misapprehension that the Greco-Roman world of that time would have embraced all this eagerly.  The fact is, Christianity rather easily took root in the Roman Empire because it bore much in common with ordinary Roman morality.  Epicurus was almost as disdained under the Caesars as his ideas were later despised under the popes.  Christianity succeeded largely because of its commonalities with pagan culture, a culture which found Epicurean ideas almost as off-putting as any later devout Catholic might.

A culture which fully embraced the spiritual side of attitudes toward the material world that relegated this life to a condition of transient, burdensome necessity, pain, and suffering which must be borne with the faith and dignity of an acolyte who seeks a better existence in an afterlife, fully convinced that nothing in this realm matters.  A culture that had no use for the idea of atoms, that believed the universe to be bound tightly in a very local set of spheres, and with a time limit on its existence that was easily comprehensible—a few thousands of years.  People wanted the comfort of believing existence to be closely bound, finite, with a way out.

Lucretius’ poem faded from memory.  Rome’s collapse was as much a result of neglect as of catastrophe, and by the 9th century, much of the written legacy was sequestered in monasteries, scattered, mouldering, often ignored, certainly unstudied.  It required that civilization rise back to a certain material level before interest in ideas, old manuscripts, and the past could matter.

Enter Poggio Bracciolini, Florentine, scholar, humanist.

Humanist meant something a bit different in the 14th and 15th centuries than it does today, but it is possible to see the connection.  Poggio was one of that group of avid collectors who scoured the monastic libraries for old books.  Most of them were copies of even older books, the remnants of a vast ancient world epitomized by the Library of Alexandria, most of which seemed to offer glimpses into a Golden Age.  Aristotle had long been seen as the basis for rationalizing certain troublesome aspects of Christian theology.  The flood of recovered books from the Reconquista had been both a benefit to a slowly recovering European civilization and a troublesome bane to a Church that saw itself as the final arbiter of what it was proper to know, to consider, to believe.

Poggio worked for a succession of popes.  In his “spare” time, he hunted manuscripts, and helped return them to circulation.  He found Lucretius’ tome in Germany, a 9th century copy.  According to Greenblatt, he may not even have realized what it was.  He’d only heard it mentioned with respect and some reverence in other ancient manuscripts.

The Swerve reveals the events surrounding the poem’s creation, loss, rediscovery, and subsequent dissemination throughout a culture that was on the verge of becoming something other than what it had been.  The ideas embraced in the eloquent lines are ideas with which we are more than familiar today.  Indeed, they are common coin in debates on the right and the good and resonate in the foundations of modern science.  Greenblatt suggests that it was this book—its reintroduction to a wide audience—that caused the veer into what has become a secular civilization.

He is careful, however, to contextualize his assertions.  Something like this, it seems, would have had to be invented if it hadn’t been found.  Its arrival at the onset of the Renaissance was fortuitous.  Coming along when it did—when science was beginning to coalesce out of the mish-mash of alchemy and reactions to Aristotelianism, when people like Bruno, Galileo, Newton, and many others were present to respond—hastened events, gave focus to certain schools of thought, fed the furnace that was recasting conceptualizations of nature and the universe.  It lent the weight of a more complete philosophical conception to the fragmented components of what would one day become the modern world.

It is perhaps surprising (and somewhat disillusioning) that the arguments spawned by De Natura Rerum are still being waged today.  Reading Greenblatt’s examination of the central ideas of the poem and the subsequent responses to it is itself a lesson in historical context, because we can look around and find exactly the same kinds of debates—and sometimes bitter battles—going on around us.

But it is also encouraging.  Ideas survive.  People keep them alive, even over centuries, millennia.  Greenblatt is, in his own way, continuing that fragile, necessary, and yet astonishingly powerful tradition, passing on to the future what is important not only for today but what has been important all along.

Clarity

One of the most perverse aspects of American culture is the contradiction between our self-professed guiding ethos and what many of us actually do.  This is the country of the self-made, the independent thinker, the individualist.  We build elaborate mythologies extolling the virtues and victories of our heroes, who are all of a piece, wholly their own creatures, dependent on no one and nothing to be what they are.  From Daniel Boone to Thomas Edison to Steve Jobs and Bill Gates, the self-sufficient American is our national role model.

Yet a look at our actual history shows that we as a people are surpassingly great joiners.  We attach ourselves to collectives, to movements, to institutions, and borrow ideologies from them, speaking with a group voice and shunning those whose independence of thought causes them to criticize whatever party our fellows have joined that gives them a sense of worth.  We have been known as the most religious country on Earth, per capita, and any close look at the religious movements that have swept this country over more than two centuries shows deep approval of and support for such causes, even at the expense (sometimes especially at the expense) of those who are genuinely independent in thought and action.  Americans often readily bury their freedom of conscience in support of all manner of mass social incarnations, be they labor unions, political parties, or churches.

For a nation founded on an idea of letting people be who they wish to be, America has a questionable track record, with periods of tolerance punctuated by spasms of intolerance, but always with an apparent acceptance of a preference for belonging that runs counter to our professed pride of independence.  This also runs counter to the related “virtue” we like to boast of, that of being hard-nosed skeptics.  To be sure, many of us are, and most of us exercise a degree of skepticism at least in certain areas of our lives, but again we are inconsistent, especially, it seems, when it comes to religions.

Lawrence Wright’s new book, Going Clear: Scientology, Hollywood, & the Prison of Belief, delves into one of the most quintessentially American religions of the 20th Century.  Generated in the 1950s out of the imagination of one man, it has grown to international proportions, and along the way has been subject to as much if not more controversy than any other movement of comparable size, in some ways akin to Mormonism.  (In significant ways, Scientology and Mormonism share a great deal—both creations of single individuals who then went on to uproot a community of followers, creating an insular ideology that separated members from the wider world, based on cosmologies invented almost from whole cloth, establishing themselves in the minds of their adherents with such visceral force that no amount of fact seems capable of dislodging faith in the central tenets, fact in both instances far more easily produced and demonstrated than in most other religions.)

Many books have been written about Scientology, the majority by or about former members whose objectivity may be doubted.  This is not, on the inside, a religion that seems content to allow its membership the kind of options we expect from more mainstream faiths.  You may join the Baptists, stay awhile, and then, if it doesn’t suit, leave.  According to most accounts by ex-Scientologists, there is no apparent regard for such an option, and those who do leave are rarely left alone.  (By contrast, when a Mormon repudiates the faith, the opposite tends to happen—they are closed out and shunned.)

Wright has no axes to grind.  He is an investigative journalist telling a story.  He did exhaustive research, covered as much material as he could, found many people to talk to, both in and out of the church, and has produced what may be to date one of the most evenhanded treatments of the subject yet published.  The evolution of the movement, from the imagination of its founder, Lafayette Ronald Hubbard, is charted clearly, as is the growth of the church from the size of a club to a cult to a major religion boasting millions of members.   One of his guiding questions, however, has to do with volition:

If Scientology is based on a lie…what does it say about the many people who believe in its doctrine…?

Throughout the book, this question hovers in the background.  We see people from all walks of life encounter Scientology and then surrender themselves to it, sometimes for life, sometimes for a few years, for a myriad of reasons.  Wright finds people who swear by the efficacy of the doctrines, who use it to be better people.  He seems to find just as many who have apparently few other options for self-discovery and actualization.  After long enough, it becomes difficult if not impossible to conceive of life outside the church.

The ones that cause the deepest stirrings of concern are those born into it, at least those born into it within the deepest circles, the Sea Org and administration.  They grow up never knowing enough, if anything, about the outside world to be able to function anywhere but within the church.

There are orders of renunciates the world over, retiring groups who close themselves off from the world at large.  Their existence calls into question criticism of Scientology for doing essentially the same thing.  However, as the story of the interior world Hubbard created unfolds, we see a disturbing absence of all the aspects of free will, free choice that we take for granted.  Yes, strictly speaking, these people joined on their own and stay by choice.

But so, too, did the followers of Jim Jones or David Koresh.  A close look at Sun Myung Moon’s Unification Church reveals a similar break from the standards of free association we associate with the exercise of rights.  Coercion takes many forms and the most effective are those that manage to convince people to place the chains on themselves.

And yet…and yet…

The doctrines created—invented—by Hubbard come straight out of science fiction.  Hubbard was a pulp writer in the 1930s; he wrote fantastic fiction (as in content, not necessarily quality); he was a colleague of Heinlein, de Camp, and others who established the idioms of what we know today as science fiction.  When you read the ideas that informed Hubbard’s central mythos for the church, they are science fiction of an earlier era, one in which some of the constraints of science, even in passing, did not pertain.  It is difficult to take any of it seriously.  Much of it flies in the face of physical fact (the universe is 14 billion years old, not 4 quadrillion) and defies the logic of evolution.  It combines elements of pop psychology with Atlantean mythology and flights of fancy that would be ridiculed today by savvy readers if the attempt were made to foist it onto them.  How can anyone swallow this stuff, we may ask, incredulous at the apparent gullibility of adherents.

But, then, the same could be said of the basic doctrines of any religion.  Joseph Smith was a con artist and his frauds were documented, yet people virtually worship him as the avatar of their theological universe.  Fact has little bearing on the need to join and believe exhibited by so many people.  Cordons sanitaires are drawn around the primary ideologies of any religion, exempting them from even the most mundane critical analysis.

Few have been so closely guarded as those of Scientology.

What is striking, though, is the apparent ease with which such movements attract followers in a place where supposedly the defining cultural motifs all promote the idea of not being gulled, not being fooled, not being led unquestioningly.  Wright has no answers to such dilemmas.  What he has given us, however, is a clear-eyed look at method and process and, it may be hoped, a possible antidote to self-imposed slavery.