Intrusions

The latest eruption of reaction from certain viewers of the new Sandman series on Netflix is another example of a phenomenon that I, in my 20s, would never have thought to indulge: the intrusion of the audience directly into the aesthetic choices of an author. I grew up in a time in which you either liked or did not like something, and if you did not like it you would then go off to find something you did like. What you did not do was presume to publicly dictate to the creators what was wrong with the work as if you had any place in that process.

Professional (and amateur) critics would analyze and examine and write pieces about a given work to explain what does and does not work, but rarely, if ever, would you find a demand that a work be different. Certainly lively discussions among those interested over a given work were common and healthy, but that work would be accepted as presented, to be dissected and studied, liked or disliked, as it stood.

Today it would seem the audiences harbor elements that take it as given that there is a right to tell the creator to rewrite, reconstruct, or otherwise revise a given work, based on the apprehension that said work is “wrong” and should be fixed. Among this group there seems little interest in examining those objectionable aspects to discern the whys of the creator’s choices—and thereby maybe learn something from them—or even the consideration to simply say “this is not for me” and go find something else. This intrusion of a self-assumed participation (which becomes strident, because obviously it ought not and seldom does have any effect on the work in question) has become a fixture of the current literary and media zeitgeist.

We see this presently in the splenetic condemnation of so-called Woke aspects in something and an implied—or explicit—demand that they be gotten rid of. It seems not to occur to such tyros that maybe an examination—of self as well as the work (which, in the best of worlds, become one and the same, because that is what the best work does for us)—would be edifying and perhaps personal growth might result. It seems not to occur to them (and others not so vocal about their personal discontents) that the whole purpose of engaging with a work that may challenge preconceptions is to force a bit of self-analysis.

Given that the United States now ranks far down in world literacy ratings, it would seem that we have a massive group of people who have decided that the literary world, be it in print or film, must conform to their definition of acceptable and allow them the comfort of never getting out of their own heads.

This is a level of intrusion I find toxic. Even though it may well be a minority, these days numbers seem not to matter in relation to the degree of attention. For the purposes of this essay, let me just speak to the lone individual who, disgusted by Doctor Who being a woman, or the sight of two boys or two girls kissing, or the appearance of any minority in a role long assumed to be the province of white people, reacts with a public display of condemnation and a demand that this not be allowed.

You are to be pitied. You have locked your soul into a box so that it is never touched by anything other than the presumptions chasing each other inside your skull. You do not know how to read (and by that I mean the vicarious immersion through connection with a character and a text that offers something New for consideration; indeed, consideration itself would seem a foreign and hateful thing to you) and you no doubt have caged your empathy in such a way that you flinch at any suggestion that the world is not what you wish it to be. You see something like this (Sandman) and you look forward to being dazzled by the special effects and the novelty of magic and otherworldly mysteries, yet any hint of the personal that might challenge your prejudices is unwelcome because what you want is to be wowed, not enlarged. Literature is, at its best, a gateway to parts and places in the world you have not had and might never have direct access to—that is the point.

You do not have the right—nor fortunately, as yet, the authority—to tell a writer he or she should take something out because it disturbs you. Go read/view something else and leave this to those who do appreciate it.

It’s this attitude, this sense of privilege that suggests that because you are a fan you own the property and can dictate the landscape, that troubles me. It’s ugly. It’s selfish and small and poisonous. And, as I said, pitiable.

And just an observation…if something bothers you that much, odds are it’s not irrelevant at all. Rather, it may be the most relevant thing about it, and it would be a good idea to maybe look into that a bit deeper. If it were genuinely gratuitous, it likely would not cause even a minor stir in your psyche.

Equations and Kindness

Over the course of my “literary” life, I’ve encountered numerous divisions, prejudices, aversions, proclivities, and preferences. Most of them come down to taste—this school parts company from that one, fans of one writer cannot abide this other one, subject matter produces occasional extreme reactions. Then there are the endless sortings according to style or period or region. Genre can be a minefield of antagonisms, categorical dismissals, harsh critical responses, or simple disinterest. Taste, aesthetics, predilection—all personal, really, even when a more substantial case is made involving theory, academic attitudes, or even ethics, but by and large it comes down to a kind of triage: what do you want to spend your time on, that satisfies or fulfills?

In my youth, the most prominent division among those of us reading the so-called Classics was best exemplified by those who loved Jane Austen…and those who did not. I fell into the latter category. For years, Austen, for me, was a mannered, formalized, high-end kind of soap opera. I would hear people declare her genius and scratch my head. Many years later, having indulged my personal interest by way of thousands of novels and short stories in science fiction, I came back to Austen and discovered a vein of brilliance I had theretofore missed. While the “soap” aspect was certainly there, the fact is she was writing insightfully about systems. Social systems, mainly, but there were ancillary systems. She examined the social milieu of her day as sets of constraining protocols, barriers, and arrangements that dictated individual choice. 

I describe that in order to explain how most divisions among the wide range of literary forms are often arbitrary, petty, and at best only serve to point us in preferential directions—here be what you like. Reading widely enough, we find what we like in places we thought devoid of our preferred pleasures, and hence the distinctions are…porous.

Most of them are harmless and serve at times as sources of productive discourse. One, however, has always dismayed me, because it extends beyond the literary to permeate many other aspects of our lives. What C.P. Snow labeled the Two Cultures—the division between art and science.

As if the two are incompatible, that somehow science is anti-art, and by extension anti-human. (It is one of the underlying dismissals by some of science fiction.) At some point since Newton, this idea has become more entrenched and has led to some arguably toxic consequences. 

In the 20th Century, many people recognized the negative aspects of this division and sought to bridge the divide. Notable among them were Carl Sagan, Isaac Asimov, Stephen Jay Gould, Rachel Carson, Lynn Margulis, Lisa Randall, and Michelle Thaller. The ability to write and convey science in language accessible to the lay public has become something apparently deserving of celebrity status, as in the case of Neil deGrasse Tyson.

While it is understandably difficult to convey the details of certain aspects of science, perhaps one of the problems has been that for too long it was just accepted that these things are too complex for the nonspecialist to grasp. It’s difficult to know because examples of excellent communication for the general public do seem to be rare. (Not as rare as it seems, but to know that one would have to be inclined to look, and if through life one is constantly told not only how hard science is but also, in some instances, how “inhuman” it is, the odds are good that one has been set up to be disinclined to pay attention.)

I think it is safe to say that never before has a public understanding of science been so important. After all, public policy, which has always required some level of understanding of science, is now being directly shaped by the extent of that comprehension.

So the so-called Popularizer has never been more important.

But in order for the message to reach people, it is fair to say it must be made relevant to our humanity.

Enter Carlo Rovelli.

Rovelli is an Italian theoretical physicist (his field is quantum gravity) who over the last several years has published a handful of exemplary books, beginning with Seven Brief Lessons On Physics, which, in a very short space, covers much of the important history and nature of modern physics. In each of his books, threaded through the explications of science, is a humanness that renders the work emotionally accessible.

His latest, however, is something different. There Are Places In The World Where Rules Are Less Important Than Kindness is a collection of essays which share the theme of a scientist looking at the world at large and revealing the empathy through which the intellect sees. There are historical pieces about Newton and Einstein and revolution and geology, and political pieces touching on policy and the consequences of both understanding and ignorance, and travel pieces ranging from Africa to Scandinavia. Throughout it all, we see through the eyes of a scientist who loves and is delighted and laughs and is occasionally afraid—who is, basically, human.

The problem science presents for some people is the point at which it seems to throw up a wall and tells us no, you cannot do that, you cannot go there, you cannot have a particular way. Entropy is unsympathetic, and the apparently non-negotiable rejections of certain preferences can be off-putting. What Rovelli does is show us another door, because while science reveals a universe with certain restrictions, it shows us new possibilities all the time. It offers more options than we knew existed. 

But it is also important, if we are to increase our understanding of the world, to learn science as a human art.

That divide I spoke about, between art and science, is the most artificial of divisions. It grew out of the point at which philosophy seemed to lose relevance in the face of answers provided by science that fulfilled certain demands for useful answers. We forgot somewhere along the line that Aristotle was as much a scientist as a moral philosopher, and that he saw no meaningful distinction between the physical world and human ethics.

Rovelli talks about that and many other “points of departure” where some healing is in order, and perhaps a few new bridges. 

And he writes well. He observes very well. He conveys the essential humanness of science and somehow makes it a warmer thing to contemplate. There is hopefulness in his observations. Joy as well, and above all a kindness rarely encountered in any specialty.

Once you read this, I would recommend continuing with his other books. This is fun material as well as challenging and enlightening. Rovelli conveys an almost childlike exuberance when talking about science and his own field. It is infectious, and perhaps these days being caught up in the delight of exploring—which is, after all, where science begins—might just see us all through to a kinder place.

Visceral Coding

Few things generate sustained anxiety as much as genetic engineering. Both positive and negative, for the possibilities and the dread. Since Watson and Crick revealed the double helix of DNA, the science has proceeded apace, and we now live in an era wherein “programming” can refer to both computers and our genes.

Jennifer Doudna is a name to conjure with in this transformational time. In 2020 she won the Nobel Prize with Emmanuelle Charpentier for their work on CRISPR-Cas9. CRISPR has become the label in media stories for a process of “editing” genes with the use of a form of RNA. (Almost no one outside the biochemistry and medical communities seems to know what it stands for: Clustered Regularly Interspaced Short Palindromic Repeats.) Basically, these are minute segments of code that repeat and can be used, with a strand of guide RNA, to effectively insert modified segments of code into a gene sequence.

What began as “pure” research into the methods by which bacteria defend against viruses became a revolutionary method of dealing with all manner of genetic circumstances, including potential treatments and vaccines for the most recent scourge, COVID-19.

Walter Isaacson’s new biography of Doudna (pronounced DOWD-na), The Code Breaker, is also a history of the sometimes chaotic, sometimes life-affirming, often unexpected world of scientific research and its interface with the rest of the world.

Isaacson has given us not only a biography of a remarkable individual, but a look at the often surprising world of research and development. The image of the scientist, austere and removed, still to some extent dominates our imagination. It comes as a surprise (and occasionally something of a betrayal) when we are forced to recognize that scientists are human, just like the rest of us, with all the flaws and foibles to which “ordinary” people are prone. One aspect of the public conception of The Scientist that I think requires adjustment is the fact that scientists continue to grow, to mature, to evolve. Too often, it seems that once the Ph.D. is earned, the scientist becomes a static icon, unchanging, and is expected to Know All or at least is frozen into an unchanging assemblage of stereotypes. On some level, this seems to offer comfort—one of the things people tend to be bothered by is an admission of not knowing. Worse still is a change of mind, which is inevitable in the light of new evidence. But ordinary people can do both. A scientist is not supposed to.

This has led to unrealistic expectations, loss of trust, and the unfortunate “gaming” of science (never mind truth) in public policy. In part, this stems from a profound lack of understanding on the part of the public. In part, it emerges from the misuse of science as a political talking point.

Isaacson does an excellent job of taking the reader through the various aspects of a discovery, its initial reception, its development, its transition from pure research to useful tool, and the social and political impact along the way. And along with this, he explains just what that science is.

Jennifer Doudna is central to the unraveling of genetic codes and the inner workings of the templates of life. Basically, she became a nexus for many strands of research, each adding to the overall picture. Her work with French scientist Emmanuelle Charpentier ultimately earned them a shared Nobel Prize.

What they have developed is a tool by which the template for biological forms can be modified. Edited. This offers the possibility of eventually correcting genetic “errors” that produce diseases like cystic fibrosis, sickle cell anemia, Tay-Sachs, and many others. The drive to “decode” the human genome contained the hope and ambition to one day be able to deal with these things, which are different from pathogenic illnesses. But even in the case of viral and bacterial infections, the ability to address illnesses at a genetic level offers exciting possibilities—and in fact has been vital to the handling of the COVID-19 outbreaks. The speed and facility with which the scientific and health communities have been able to respond is in important ways attributable to Jennifer Doudna’s work.

There is drama, intrigue, fascinating people, and the makings of a good thriller in certain aspects of this story. But the most important thing is the profound humanization of a complex community and the people in and from it. Scientists are not fundamentally different from anyone else. Their interests may seem esoteric and the degree of concentration they bring to their passions may seem other-worldly at times, but in truth what they have is a deeply useful set of tools and the willingness to abide by the rules those tools require for sound use. What must be understood, and often is obscured by the dizzying aspects of the science itself, is their humanity and how they represent, often, the best possibilities of all of us. (Of course there are those who are not as good at what they do as they should be, those who are more concerned with fame or wealth than the work itself, those who are flawed in unfortunate ways—just like any other group of people in any other area of activity—but we should look to the best for our examples and not allow the worst to color our perceptions of the people doing amazing work.)

Finally, understanding something is the best way to stop being afraid of it. At the end of the day, that is the real gift scientists give us—they work to understand things previously hidden and unknown and thereby help the rest of us to stop being afraid.

Out of the Mists

The common assumption put forward by several decades of anthropology and associated fields concerning that vast fog known as Prehistory runs as follows: humans, after emerging from the crapshoot of evolution, roved the savannah in small bands, gathering and hunting and painfully inching their way toward a point where they began to make tools (other than spear points and such). Then came a long period of migration, scattered attempts at settlement, until, a critical population mass achieved, agriculture was developed, and very quickly came the abandonment of hunter-gatherer society, leading to regular towns, art, and gradually more impressive engineering feats to serve the expanding agro-economy. At some further point, all this became the foundation of nascent states, after which the whole thing rolled into the “historic” era (marked by the advent of record-keeping) and kings and empires and slavery, and so forth.

This is more or less the way it was presented to me back in school, and, I suspect, still pretty much the popular conception of prehistory.

The problem with this is that we are talking about roughly 200,000 years of that undifferentiated, featureless, unchanging landscape. Taken at face value, it says that human beings conducted themselves as essentially immutably “innocent” creatures, either incapable of or uninterested in doing anything more with themselves or their environment until they learned to plow a field and write things down. If, as the evidence suggests, modern Homo sapiens had been roaming around the planet for two hundred millennia, with all that “modern” implies, this raises the question of what “we” were doing all that time and why, all of a sudden, about 10,000 years ago, we started living entirely differently.

Put that way, there is no reasonable answer. It is on its face an absurd assumption.

One that is not supported by any of the evidence we actually have.

So why cling to the narrative?

In The Dawn Of Everything: A New History Of Humanity, authors David Graeber and David Wengrow explore exactly that question and in so doing turn over multiple apple carts, debunk many myths, and shake up the common assumptions about that vast and murky period. They begin with a look at Jean-Jacques Rousseau and the question of equality.

When we first embarked on this book, our intention was to seek new answers to questions about the origins of social inequality. It didn’t take long before we realized this simply wasn’t a very good approach. Framing human history in this way—which necessarily means assuming humanity once existed in an idyllic state, and that a specific point can be identified at which everything started to go wrong—made it almost impossible to ask any of the questions we felt were genuinely interesting.

They proceed then to reexamine as many assumptions as possible within the space of a reasonably sized book, to show that Rousseau’s apparent point in his Discourse On The Origins Of Inequality is a bit of a cheat—unless Rousseau was being absurd to a purpose. For instance, Graeber and Wengrow remind us (assuming we ever knew) that the so-called “indigenous critique” of European civilization that informed much of Enlightenment thinking was not an invention of the philosophes but a genuine critique delivered by Native Americans after they had witnessed European civilization firsthand (often as captives/slaves, sometimes as diplomats). The sources were credited by the philosophes themselves as being Native American, but later historians chose to ignore this to the point where it was forgotten and the natives were relegated to that pool of prehistoric humanity too “simple” to understand complex culture and socio-political structures.

From that point on, Graeber and Wengrow take nothing at face value and conduct a thorough reevaluation. If human beings have been physiologically “modern” for 200,000 years, it is ridiculous to assume they did not conduct themselves with as much sophistication and complexity as we do. Often, as it turns out, with strikingly different results.

The scope of the book is global. Between them, the authors cover archaeological finds from Central America to Turkey to Japan and points in between and carefully examine what is there to be seen and what it means in relation to our understanding of how communities function. It is an eye-opening tour.

Much here is speculative. What makes prehistory difficult is the lack of, well, history. Written history. All we have are the remnants. But with a clear eye, those remnants are quite expressive. One thing that emerges consistently is that our previous assumptions are wrong.

From the end of the last ice age till now, we have enough to trace humanity’s presence and draw conclusions about its progress. But for the most part we still cling to the simplistic story of “primitive” societies living subsistence existences until the point where it became possible to form what subsequently became great states—Egypt, Babylon, Rome, the Indus Cultures. The implication being that once we reached that level we never looked back and marched forward into the present building roughly the same kinds of civilizations. And that at some point we collectively began to realize that we had become in thrall to despotisms and began what we know as the battle for equality. We seldom question the progression.

But, Graeber and Wengrow ask, why don’t we question it? Because even within historic times, it just isn’t the case, at least not universally.

If anything is clear by now it’s this. Where we once assumed ‘civilization’ and ‘state’ to be conjoined entities that came down to us as a historical package (take it or leave it forever), what history now demonstrates is that these terms actually refer to complex amalgams of elements which have entirely different origins and which are currently in the process of drifting apart. Seen this way, to rethink the basic premises of social evolution is to rethink the very idea of politics itself.

What is revealed by their analysis is that the smooth trajectory of assumed historical progress is an oversimplified, biased gloss from too few perspectives. The reality—that which can be demonstrated with evidence and that which can then be surmised by constructive deduction—is far more complicated, complex, and frankly compelling. Part of the telos of those simplistic constructions is that all that has gone before inevitably led to now—to us. We are as we must be by decree of historic processes which are inevitable.

The truth is, what we are now is only one possibility of what we might have become.

And this is the meat and bone of Graeber and Wengrow’s argument—that to justify ourselves as we are it is better to paint the ancient past as a homogeneous, almost featureless whole. Had people twenty, thirty, or fifty thousand years ago not been the pastoral simpletons we’ve presented them to be, then where are the great kingdoms and empires, the technologies, the earthworks, the cities that would mark them as complex thinkers? While to a certain extent that is a not unimportant question, it overlooks examples that have left traces, even up to the present period, that fail to fit the expectations engendered by such a view. The decay of time certainly has something to do with the paucity of physical evidence, but what we do have is not so insignificant that the standard narrative has any claim to remain unchallenged.

While a good portion of The Dawn Of Everything is speculative, enough evidence and solid analysis is presented to more than justify such speculations, at the very least insofar as a challenge to our assumptions and a reconsideration of modern expectations. Quite a bit of non-Western critique was suppressed or ignored to help in building a picture of the past that supported the hegemony of the West’s self-importance. (Quite a lot of what became the political revolution of the United States came from indigenous sources, accepted wholesale by the philosophes and then subsequently forgotten. The thinking was sophisticated, philosophically trenchant, and necessary to challenge what had become the West’s standard view of itself.)

David Graeber passed away in 2020, at the age of 59. More volumes were to follow this one, according to his collaborator David Wengrow. One assumes many of the critiques that will inevitably emerge regarding this first book would be addressed in those books that follow, because clearly there was insufficient room in one volume to cover all the material available. We may see more, but what they produced here is one of those books designed to upset apple carts. There is no inevitability in history, tempting though such narratives are. In order to free ourselves of the chains of a presumed inevitable present, we must go back and reexamine the past and find those “missing” parts that demonstrate the possibilities and the promises of other roads. This is what we have in this book.

Why Read

In light of the last few years, the question bites. Indulge me in a venting plea.

In my experience, limited though it is, I have found that the better read a person is, the more likely they will be to cope with reality, to defend against the twisting delights of both conspiracy theory and pseudoscience, and to be less vulnerable to charlatanry.

Not always. Some deceptions come wrapped in marvelous packages that can appeal to the puzzle-solver in us all and present as aesthetically compelling. In my own life I have followed white rabbits in tweed down a number of holes, some part of me convinced that truth lay in some hidden recess along the way.

I have been relegated to many sidelines since childhood because of reading, sidelines which at the time seemed harsh and unfair, but in retrospect were actually relatively safe places. Time and space are necessary for a mind to develop. Exposure to stimulating material does not work its magic immediately, sometimes not even soon, but eventually all those books and stories and articles result in a set of pathways and memories and organizing concepts that allow for the skills to deal with what may otherwise be just confusion.

No, let me be more definite—“may” has little to do with it. People who read, in my experience, are generally more present, more conscious, more adaptable than people who only watch and subsequently go through life skimming a surface which too often becomes a mirror and allows them to ignore what is beneath. In fact, those surface presentations often depend on not knowing what underlies them, may actively resist analysis, and with few exceptions deceive by suggesting there is nothing more.

Not all. But it is also true that those not intended to deceive largely depend on an audience that reads to reveal their full meaning.

There are many studies about the physiological and cognitive benefits of reading, especially fiction. There is an increase in synaptic structure associated with regular reading. Memory improves. Your brain responds by providing better tools.

Then, of course, you have to apply the tools. For me, this makes fiction and, in a similar way, history indispensable. Reading other kinds of books, while important in many ways, can leave you unaware of irony, of conflict, of paradox, all of which are fundamental to the so-called Human Condition. We read novels to grapple with the contradictions of being human. We read fiction because in doing so we learn the value of Other Minds attempting to do this thing we all own as a birthright—living.

Occasionally we see a nod to this in popular entertainment. In the TV series Castle, Detective Beckett is presented as an exceptional and gifted detective. In the first episode we hear from one of her colleagues that he likes “a simple Jack killed Jill over Bill” rather than the “freaky” ones. Beckett responds, “Oh, but the freaky ones require more.” And then she challenges them: “Don’t you guys read?” As the series progresses we can see that she just brings more to the game, and in that first episode the difference is made explicit.

We undervalue reading, often while making a big deal about it. Writers become celebrities, usually once one or more of their books is made into a film. And their fans may well read everything they publish, but that’s not beneficial reading. Like anything else, if you do not expand your horizons, complicate your diet, move out of your comfort zone, you end up trapped in a self-referential, reaffirming loop that grows nothing.

We must read so our apprehension of the world is less frightening, amenable to recognition, and manageable. So that people are not so alien and culture not so forbidding. Certainly someone can read a great deal and still be unable to decipher the world, but I believe such people to be a minority, and most of us benefit from the increased clarity that comes from an ongoing encounter with Other Minds.

The greatest benefit comes from a catholic indulgence: read widely, daily—fiction, science, history, philosophy, memoir—because at some point you will find it all reinforcing, that insights gained in one place can be enriched and enlivened by another source. And somewhere along the way, we may find that we are no longer easily fooled.

The most valuable ability of late would seem to be this, the awareness to not be fooled.

I make no prediction that a sudden upsurge of deep reading would solve our problems. Humans can be contradictory, perverse creatures. But it seems obvious that an illiterate populace is an easily tricked, easily swayed populace. Given that those who are invested in people watching their shallow offerings rather than going off somewhere to read are generally those who would sell us shiny bits that delight and fail, it would be a good strategy to take up books and stop being led like myopic sheep.

But I have a rather more personal reason for urging people to set aside whatever prejudices they acquired in primary and secondary schools that turned them against reading-for-pleasure. When I set a book aside, as one must, and go out into the world, I would like to have meaningful contact with other people, and ignorance is a depressing barrier to that.

Why read? To be more. To hopefully be yourself. And possibly to be free.

Seeking Meaning In Sand

I have not yet seen the new film version of Dune. I may write about it after I do, although it is not the entire story. What I am interested in here is the ongoing obsession with the novel. This will be the third cinematic iteration. Famously, there are two uncompleted versions, one by Jodorowsky and another by Ridley Scott. We know how far the former came because there is a fascinating documentary about it, but as for Scott’s version there are mainly rumors and statements that he wanted to do one. Personally, I would have been interested to see that one—I very much like Ridley Scott’s palette: even those of his films that don’t quite work for other reasons I find wonderful to look at—and in some ways he has perhaps played around the edges of it through his Alien franchise. (The first film starts on a world that might have been Arrakis, the second is evocative of Giedi Prime, the others keep returning to desert worlds, in theme if not setting. And Ripley becomes a kind of ghola as she is resurrected again and again.)

What is it about the original novel that compels the ongoing obsession, not only of filmmakers, but of fans? (There would be no funding for the films if the audience were not so large and committed. That speaks to the book.)

The history of the novel is something of a publishing legend, like other groundbreaking books. Multiple rejections, ultimate publication, often in a limited way, and a growing audience over years. Dune was famously rejected something like 27 times before finally being taken up by a publisher better known for automobile manuals.

It was, however, serialized in one of the top science fiction magazines, Analog, so dedicated SF readers were the first to encounter it, and doubtless formed the primary audience. I remember reading the Ace paperback from the late Sixties. Its impact on me was almost too large to detail.

I was used to science fiction novels being under 200 pages—average then was 160. From the Golden Age forward you rarely found one more than 250 pages. Stranger In A Strange Land by Robert A. Heinlein was an outlier at 408 in its first hardcover incarnation. So here I find this massive book, more like the so-called classics I’d been reading—Dickens, Dumas, Dostoevsky, Tolstoy—crammed into the covers of a mass market paperback, which included a glossary and appendices, explanatory material (every bit as fictional as the main narrative). It felt important. I was 14, it was dense, I struggled through it. (It led to a profound teaching moment in how to read, which I’ve written about elsewhere.) I could feel my horizons expand, even though at the basic level of story it was no more or less fascinating than most other good science fiction novels I had read. But it opened possibilities for narrative depth.

A handful of other novels came out around that time that broke open the thriller format to which SF had been confined—John Brunner’s Stand On Zanzibar and The Sheep Look Up; Robert A. Heinlein’s The Moon Is A Harsh Mistress; Ursula K. Le Guin’s The Dispossessed and The Left Hand Of Darkness—along with the coming rage for trilogies (many of which were single narratives published economically in three volumes). By the mid-Seventies publishing had changed to accommodate a new idea of what an SF novel could be, including expanded length to allow for what has become known as World Building (a technique which in some instances supplanted more important aspects of fiction). Not all by itself, but certainly as a point of history, Dune helped make this possible by creating a market for fuller expositions and more detailed construction. This alone might make it significant.

But that alone would not have made it a perennial seller, almost constantly in print ever since. If Frank Herbert had written nothing else, Dune would have made his career.

It was followed by two more—Dune Messiah and Children of Dune—completing a cycle. That first trilogy stands as a unified work. The two sequels are plot-driven indulgences, but not superfluous. The second trilogy…publishing had discovered by then that science fiction could be best-selling fiction, and a frenzy of large advances and high-profile publications marked the late Seventies and early Eighties. Herbert’s publisher enabled him to indulge himself with a second trilogy that often leaves people puzzled. But it kept the spotlight on the primary work.

David Lynch’s movie enlarged the audience again. That film, by a director with a certain reputation for examining the macabre oddnesses of humanity, is a spectacular curiosity. It is a mixed bag of brilliance and weird choices.

Then came a modestly-budgeted miniseries on the Sci Fi Channel, which went on to include the second two novels. It did a much better job of telling Herbert’s story. The chief complaints seem to be the results of that budget (and that Sting did not reprise the role of Feyd-Rautha). It gets dismissed too readily, as if the world were waiting for the “real” cinema treatment.

Which we now, by all accounts, have.

As I say, I have not seen it yet. I want to address the book and its seeming tenacity.

One of the things Herbert did was lace his tale with wise-sounding profundities in the form of aphorisms and epigrams. Each chapter starts with a quote from some serious work by the presumed chronicler of the hero’s life. They sound like quotes from works like the I Ching or Sun Tzu’s Art of War. This was not a new trick when Herbert did it, but he was particularly adept at it in this book. It is a far future in which, presumably, philosophy has transformed along with everything else. The quasi-feudal politics and economics are given a veneer of newness this way, as if to signal that while it looks like something one would find in the 12th Century, it is not quite the same thing—but you have to take the author’s word for it, because it is the future. The quotes set an aesthetic tone that, among other things, allows us to assume something else is going on instead of just the same old historical thing. In science fiction, veneers matter—they work like orchestrations in a symphony, selecting the right instrument for the right phrase, coloring it. (Veneers should never be mistaken for the story or the theme, which is something unobservant critics do all the time.)

Seriousness established, every significant decision becomes inhabited by purpose, meaning, resonance, and a justification that raises the level of what we read almost to that of destiny, certainly of mythmaking. With this, the writing itself need not be spectacular, just functional.

There are passages in Dune that are breathtaking in what they describe. The ecological aspects of the novel, while in some ways absurd in terms of actual science, take on the same immanence as anything the characters possess. In a way, Dune is one of the first terraforming novels, embracing the idea that human action can transform an entire world. (A couple of years later, we see much more of this, often more pointedly, as in works like Le Guin’s The Word For World Is Forest—again, the novel opens up a field of possibilities, or at least prepares an audience for more of the same.)

But the characters are hard to relate to—this is a story about archetypes and aristocrats in conflict with emperors and churches. The ordinary people get lost amid the giant legs of the SF manifestations of Greek Heroes. We read this novel for the plot and world and the political revelations. We become engaged because this is in important ways a Lawrence of Arabia story—one toxically mixed with Faust. We read it because we are aware that gods and deserts change the world.

We read it because, as well, we are enamored of the idea of Enlightenment in a Pill.

Herbert was always working in the fields of mind-altering drugs—possibly his best and most relatable novel in this vein is The Santaroga Barrier—and with Melange, the Spice, he created the ultimate in mind-expanding temptations. Its use gives humanity (and others) the universe. Time and space can be brought to heel with it. Visions, prophecies, and clarity are on offer. But it is the ultimate Faustian bargain, for its loss will destroy everything.

It is aptly named. Melange, a mixture of often incongruous elements. A mess, if you will, but messes can evoke wonder, even seem beautiful.

At the heart of this Faustian conundrum are the Fremen, patterned after the Bedu of the Middle East. They are trapped on a world with profoundly limited resources and must be kept that way for the benefit of the rest of the universe. Not quite slaves, but certainly not masters of their own world. Freeing them courts disaster—because part of that freedom entails remaking their world, making it wet. Water, though, is poison to the giant worms that produce the Spice.

Trap after trap after trap populates the novel. Disaster looms. The plot compels.

And of course the relevance to our reality could not be plainer. The teetering sets of balances, all of them with ethical pitfalls, allow Dune to remain trenchant, relevant, challenging. Added to this is the clear connection to the Greek tragedians (especially in the second trilogy—I suggest boning up on Aeschylus and Euripides before trying them), which gives the book its ongoing frustration of clear, ethical resolution. (And cleverly he took the possibility of building machines that might aid people in their problem-solving off the table, by outlawing thinking machines. It’s all on us and what we bring to the game.)

A final thread woven through the book that seems to keep it constantly popular is that it is a coming-of-age story containing a biting critique of privilege. Whatever Paul might want to be for himself, he is born into a web of expectations that impose their demands from all sides, making any choice he might make impossible outside of a constructed destiny. He is the adolescent struggling to make sense of the world and find a way to live in it, thinking that if only he were god he could command everything to be rational, or at least amenable. Paul’s tragedy is that he in fact can become god—and then discovers that this is no solution, either.

How well this new movie deals with all this, I look forward to seeing. For the moment I simply wanted to examine some of the reasons this novel continues to find audiences and why so many filmmakers are drawn to it. The elements it contains transcend the limitations from which it suffers. But whatever the case, this is a novel that allows readers to find meaning—whether that meaning is in the novel or not.

Strange New Worlds

Fifty-five years ago a television show appeared that changed everything.

It didn’t seem like it at the time. It was clumsy, but for the time it was a marvel of production values. The scripts were occasionally tortured constructs, the characters stiff, the plots absurd. It lasted three seasons, got canceled, and drifted into the twilight zone of fondly-remembered might-have-beens.

Then fandom took over, kept it alive, and eventually it was revived. Not in the way of retreads, as those we see today—reboots that quite often, though with better production values, are not exactly new—but in a resumption. We’ve gotten used to some of this today, what with franchises switching from one network to another, evading cancellation. We’ve even gotten used to quality reboots.

But Star Trek was the first to do all this successfully, in several incarnations.

I recently finished viewing the third season of Star Trek: Discovery and then began a rewatch of the original series. It has become the thing to do to make fun—usually mild fun—of the original, especially Shatner’s over-acting, but also the inconsistency of the universe, the poor special effects, all the flaws that pretty much any television show back then suffered from. And yes, compared to now, the show lacks. But there is a remarkable familial consistency between them. In 1966 Gene Roddenberry helmed a work of fiction that came to exist well outside the confines of the screen. Most of the fare of the day only ever existed during its broadcast window and inside the square of the picture tube. The Federation, in other words, was real.

We’re used to this in written fiction—novels and short stories. World-building that offers the heft and texture of a real place is expected. Television was not like that. The ephemeral nature of the product may have contributed to the attitude that only so much work need be done to make what ended up on the screen serve for a half-hour or hour of viewing. Cancellation was right around the corner. Even those shows with unusual longevity usually relied on the viewers to fill in whatever extended aspects were needed. The Old West was a mythical place most people already believed in. Crime shows only needed the daily news to lend that kind of weight to the stories.

For science fiction, this was unprecedented on television. Star Trek offered the kind of substantive world that readers of science fiction had encountered for decades. Despite the awkwardness of some of the episodes, that was the thing that drew many of us. Almost from the first episode, we tuned in to a place different from our world that felt almost as real.

It was a remarkable achievement, one that made possible the best of SF tv that came after. The lesson was hard-learned and it took a few decades, but it was the important element.

As to the rest…Kirk, Spock, and McCoy, the Enterprise, Starfleet—none of it would have made any lasting impact without that world.

And about them. They reflected other trios of characters in other shows, most notably (to me, at least; others may have different examples) the principals in Gunsmoke: Matt Dillon, Festus, and Doc. And when you watch, really watch, the acting is superb. It had to be. They were required to convey “belonging” in a world quite alien to ours. Their actions had to seem natural for that context. They had to speak dialogue that would make no sense anywhere else. When McCoy waxes empathetic about the past barbarities of medicine, it conveys several things at once: the future of medicine, the sentiment attached to his profession, the history that has elapsed within the show’s frame of reference between then and now (hence providing actual historical context), not to mention McCoy’s heart and his attitude.

Even Shatner’s performances are less bombastic than the jokes would seem to suggest. The byplay between Kirk and Spock is rather remarkable.

And Nimoy…

One felt it possible to step through the screen and live there, because there would be a There to live in.

Once the franchise was revived, first in the films and then in a new series (Next Generation), the extent of that creation began to manifest more clearly. For 55 years now we have been exploring the Strange New Worlds of that universe. With each new series managing to be as impressive as it is, it becomes even clearer that Star Trek has become a dialogue generator. I mean that in the philosophical sense. It puts questions to us that need answers—not for then, not for the 23rd or 24th Centuries, but for Now. The philosophical challenges of the franchise have brought about a massively useful conversation. At the center of it is, perhaps, a simple question that may seem minor: what does it mean to be human? Yes, this is a core question in most if not all drama, but in the case of science fiction it takes on added weight because we find actual representations of different possibilities of Human. And in Star Trek we have a popular forum for that question, asked in that way, in a medium that reaches a much larger audience.

What we learn is that Human has no single, concrete definition—but whatever it is, it seems to be realer than anything else.

Exploring that question…well, that’s the real Five Year Mission, isn’t it? Therein we find the strange new worlds.

Clearly, it has not been, nor can it be, limited to just five years.

The Relevance of Science Fiction

Kingsley Amis, in his book on science fiction, named Frederik Pohl as possibly the best practitioner of the craft. For some inside the field, it was a curious choice, but over time it has become difficult to deny. Pohl had one of the longest careers in SF, working at one time or another in just about every aspect of the genre—writer, agent, editor, certainly promoter. His novel Gateway is still one of the most memorable and poignant reads ever produced in the field and his work as editor of Galaxy and If brought many superb writers in.

He was also one of the great collaborators. He worked with Jack Williamson, Thomas T. Thomas, Isaac Asimov, Lester Del Rey. But perhaps his best collaborations were with Cyril Kornbluth.

Especially The Space Merchants.

Much has been written about the so-called “predictive” qualities of science fiction. Those familiar with the field weary of this. The whole point of science fiction is speculation based on what we currently know. The anticipation of technologies is not meant to be specific, even though the first magazines dedicated to it (Hugo Gernsback’s Amazing Stories and Wonder Stories) quite explicitly intended to showcase gadgetry. By the time SF had grown into what we see today, this notion was viewed with chagrin and some impatience. Yes, spaceships are cool. Yes, mile-high buildings would be amazing. Yes, aliens and all that they imply.

But the point is to set up a different arrangement of conditions based on the idea of social, technological, and material change and then see how this affects people.

So we open a novel like The Space Merchants and almost at once, from our perspective, find the gimmickry of the setting amusing and/or embarrassing, because it was written in the 1950s and it shows. This is supposed to be about the 21st Century, after all. And what we find is something made up of parts of The Man In The Gray Flannel Suit, Brave New World, a touch of Captains Courageous, and The Manchurian Candidate. Advertising agencies run the world. It is an overpopulated planet, highly stratified, resources uncomfortably limited, with a propaganda machine run on brainwashing, narcotics, and a gleeful refusal to see anything wrong with any of it.

I will not here describe the plot, which is pretty much spy thriller-esque and moves the story along nicely. What matters here is the prediction. Not of the specifics of the scenario—that is exaggerated, pushed to an almost absurd extreme in service to the theme of the book, which is among those perhaps best characterized as in the “If This Goes On” variety.

Coming out of World War II, one of the underlying motivations informing politics and economics was a desire to make sure it never happened again. The world had beaten itself to a pulp. The political and social components of that disaster were much debated and quite naturally there was concern that it could happen again.

A number of things coincided to provide an apparent way through. First, the emergence of behavioral science, which sought to explain why people do what they do. Second, the joining of Madison Avenue advertising culture with politics (Eisenhower’s campaign was run by ad agencies while his opponent, Adlai Stevenson, rejected them out of hand). Third, the apparent victory of capitalism as the solution to all material problems (thrown into stark contrast by a similar attempt at dominance by the Soviet bloc). America came out of the war not only whole but in the de facto role of world savior.

To some extent, The Space Merchants is commentary on the embrace of capitalism as a kind of religion. That runs through the novel as a nerve-jangling given. The world built by ad agencies depends on the blind allegiance of consumers, which expresses itself in categorical denials of any other possible solution to what have, in the novel, become patently unmanageable global problems.

But not quite catastrophically unmanageable. It still seems to those in the upper layers to be fixable. Just push things a little more—for instance, by opening the planet Venus for colonization.

Reading it today creates a buzz of recognition. If one ignores the trappings of the scenario—the pedal-driven cabs, the “contract” marriages, the cheesy ad campaigns—one can see the lineaments of a future we have ourselves come to inhabit. The details are different but the essential gestalt is very much as Pohl and Kornbluth suggested it might be. Blind devotion to a capitalism that is more religion than tool, the easy acceptance of a class system that relegates people to poverty, the fervent belief that looming disasters are nothing of the kind and we don’t have to actually do anything about them.

Jill Lepore’s latest book, If Then, chronicles the rise in the Fifties of the factors which can easily be discerned in the background assumptions of The Space Merchants. The way in which, out of a desire to control the future and avoid ever having to deal with the kind of things that resulted in WWII, we have placed our hopes and energies in systems that have, frustratingly, become the stuff of 1950s cautionary tales. Looking out our collective windows, we see essentially the country, if not the world, run by Ad Men.

I do not wish to be too dire here. The resonances are far from one-to-one. But the work done in The Space Merchants suggests where the whole idea of predictive SF may come from. As always, it has little to do with the “stuff” and everything to do with people.

Recurrence and Renewal

William Gibson’s new novel, Agency, is a sequel to his superb The Peripheral, which is arguably one of the best three or four time travel novels ever written. Here he continues with several of the same characters, still exploring the peculiarities of the Stubs, and it is clear now that the matter at hand is alternate solutions to a set of problems faced in the present world.

Wilf Netherton, Ainsley Lowbeer, Ash, and Rainey intervene in a new stub that is on the brink of nuclear war. As the narrative unfolds, it becomes evident that this is not Our World. The 2016 presidential election did not go the way ours did, for one thing.

But the changes seem minor as far as they go.

At a certain point, though, none of these distinctions matter, because Gibson has tapped into the truth that we all live in our own stubs. Reality is composed of an enormous amount of shared background, but details vary across a variety of platforms—social, economic, cultural, educational, political, informational, geographical, and temperamental.

By separating them out as if they were physically distinct realities, Gibson permits an examination of the elements that comprise distinctive characteristics—with the possibility of corrective interference. In the case of the first novel, the stub was based on aspects of rural, post-agrarian southern culture. In this new novel, it is very much West Coast venture capitalist techie.

Verity has just been hired by a company called Tulpagenics to beta test an interactive piece of eyewear. Immediately, Gibson is playing a long game through naming. Verity, which is linked to truth, to verify, to, ultimately, reality, and Tulpa, a concept of spiritualism coming from the Tibetan sprulpa, meaning “emanation” or “manifestation.”

It seems simple enough. Verity, though, apparently has been chosen because she has been successfully avoiding media attention after her breakup with a billionaire entrepreneur named Stetson, who generally drew the attention of all the popular sources of celebrity quasi-news. Verity has been sleeping on the couch of a friend named Joe-Eddy, who in his own way is a highly resourceful independent…something. Her ability to stay invisible seems important to Tulpagenics for this field test.

The glasses, though…she becomes quickly acquainted with Eunice, who turns out to be an AI program of fairly unique characteristics. They begin to build rapport. In fact, Eunice becomes so important to Verity that—

Enter Wilf Netherton, Ainsley Lowbeer, and Connor from the last novel. The stubs, using the same sort of informational technology Lowbeer and Netherton avail themselves of, can interact. To recap: Connor is a veteran given purpose in The Peripheral as an operative who then becomes the chief of security (bodyguard) of his friend-elected-president, Leon. Connor is remarkable at remote operations—drones—and is enlisted here to assist Verity and Eunice in avoiding capture and death at the behest of the parent company of Tulpagenics, Cursion. (Cursion roughly means “running, to run.”)

This stub is edging close to nuclear war. Lowbeer and company are intent on bringing it back from that edge. Eunice may be instrumental to that. It is hinted at—strongly—that while the stubs are not part of the “main” continuum, events in them have an effect. Of course, there’s some question raised as to whether the London of 2136 is the main continuum, but that’s a question to be answered (perhaps) later.

Gibson’s narrative approach is fascinating. A series of otherwise ordinary-seeming actions around key moments of invention that accumulate to a climax that, in hindsight, feels right and inevitable but still comes as a surprise. Occasionally it seems that if you take any given paragraph out and examine it, there’s not much in it, but the wavefront generated in context is inexorable.  He has always presented as a “simple” writer, but this is a serious misjudgment.  And the long game he always indulges impresses in ways we least expect.

But one thing he is completely engaged with is the idea of emergent properties of intelligence. In Neuromancer the end-game was the creation-emergence of a fully autonomous A.I. In each of his fictive creations, there is this fascination and examination of what might loosely be termed Singularities (they aren’t, but the road leading to them feels the same), and in this current work he’s playing across continua while dealing with the same suite of notions about A.I. and pivot points and paradigm shifts.

It’s not that he’s writing about the same idea. It’s that the idea is so massively encompassing that one can almost say everything is about it.

In this formulation, the Singularity can be used to label any moment where enough different threads and forces converge to leverage a pronounced conceptual change. Before this moment, we knew the world one way. After it, we see things differently.

He achieved this revelation to great effect in his previous trilogy, which was not science fiction so much as science fiction-al.  It was entirely set in our present world, with only changes in emphasis about the technology and the ways in which it manifests and is manipulated, yielding a portrait of a paradigm shift in process.  He seems to be plowing the same fields in this present work, only from a determinedly SFnal position, that of a species of time travel which is based on the communication of information across continua. The effect, interestingly, is similar to what one might experience traveling from one segment of our global society to another, with the attendant culture shocks and privileged dispositions in play.

In this, Gibson shows himself to be one of the sharpest observers we have, whose work is subversively relevant. He understands how all this “development” impacts and has a genius for dramatizing emergent properties while spinning a fascinating yarn.

Sleepless In Present Time

Nancy Kress is one of those writers whose name, when it comes up in conversations about good science fiction, elicits knowing nods and smiles of appreciation, sometimes even among people who may not have read anything of hers. The name is known, and she has written work that influences.

In particular, a novel which can be regarded as a classic.

The word gets over-used and misapplied, but in the sense of meaning something of on-going value, with a tendency to remain relevant to present issues, and a reliably fascinating read, Beggars In Spain qualifies.

Let’s get the mechanics out of the way first. With regard to elegant sentences, smooth plotting, well-drawn characters, and thematic cohesion, this is as good as it gets in any genre. Published in 1993, the only thing that has “dated” is the actual timeframe in which it is set. This is a problem of most near-future SF. But here it intrudes so lightly that one may mentally move the frame forward. After a while, it ceases even to be a distraction.

As for the substance of the novel…

This is an excellent example of the kind of science fiction which is sometimes described as ideal—make one change and follow the consequences, rigorously and tenaciously. One change. One major speculative change.

I emphasize that “speculative” change because there are the usual kinds of speculations one expects in good SF. Changes in technology, changes in certain political arrangements, and so forth. This is the future, after all, it would be odd if something ordinary weren’t different. So we have a new kind of power source, Y energy, and therefore new distribution systems for it. Details.

The Change that matters, however, is singular and presented with an enviable plausibility. Gene editing has reached the point of on-demand modifications. What some people—ambitious people, hopeful people, people with means—opt for is to create children who do not require sleep.

The Sleepless, as they become known, are in this respect a variant with possibilities of becoming a separate species. But the immediate result is a growing resentment among “Sleepers” who realize quickly the pronounced competitive advantage the Sleepless will have. All things being equal, they will outperform the unmodified simply by virtue of an extra eight hours per day to work.

There are people who require far less sleep than most of us (some as little as an hour or two per twenty-four hour cycle) and then there is the terrible disorder Fatal Familial Insomnia, which deprives its victims of sleep completely, leading to a number of unfortunate consequences and, eventually, death. Sleep is essential. We have learned even since Kress first published this novel in 1993 how complex and essential sleep is to our health, but she posits the condition in such a way that one can ignore these downsides, at least for the purposes of the story. It is a genetic modification, which comes with other unintended “benefits” which figure into the plot.

What they chiefly lose, though, are dreams.

Which she introduces into the story in a fine development that adds to the overall thought experiment.

But the question running throughout is both philosophical and sociological. By creating the Sleepless, Kress opens the subject of prejudice and, while never actually using the word, eugenics in ways that allow for an examination of the process as it manifests.

The Sleepless are born into privilege. It’s an elective in vitro procedure, very expensive, so naturally only the wealthy will be able to afford it. This introduces the class nature of it and the first one to whom we are introduced—Leisha—is the result of her father’s desire to give her the advantages he imagines for her, so she can be even more successful than he is.  In a nice twist, Leisha becomes enamored of law—not in its economic sense but in its application for justice. She becomes the focus of the various arguments pro and con over the Sleepless and a champion for tolerance—on both sides.

As the debate heats up over the Sleepless, the economy is changing, leaving devastation behind in many places. This is not at all ahistorical. This happens. It’s happening now. But with the advent of the Sleepless, there is a source of blame and a cause to rally around. The hoped-for accommodation expressed by some on both sides of this genetic divide, while not ineffective, becomes compromised in the onrush of sadly predictable politics.

And then there is a further step taken with the potential to divide even the Sleepless against each other.

This is a finely-wrought, complex narrative about the ramifications of technological change and the social reaction to that change. Into the mix, Kress throws a couple of well-chewed economic arguments with which we are all familiar, in questions of “deserving” and socialism and boot-strap judgments that attempt to organize our ethical choices according to work, ability, and social responsibility. Kress is very good at arguing from both sides, lending plausibility to positions we can see as both tragically forceful and as straw men. As one reads, one knows it would play out this way.

The Sleepless are, in the larger sense, an example of what might be seen as unfair advantage in the hands of the few who can afford it. In reality, it could be anything: better access to information systems, travel options most do not have, entrée to persons or organizations barred to most, an unshared network, or simply technological enhancements.  Gaining and maintaining personal advantage in a competitive world is a constant and has always been, initially, a benefit of the privileged. By making it a genetic modification, Kress removes the illusion that what the Sleepless can do anyone can if only they had the opportunity. A certain equilibrium is maintained in our world by the surface tension of the assumption that, one day, we’ll all have “that” tool.

The only winning scenario in this is Change. Things will be different and those who can accept that and learn to live with it tend to have an advantage over those who can’t. Of course, it is never quite that simple, and the richness of Kress’s story is in the demonstration throughout of how not simple it is.