Book Magic

If you’d prefer to listen to this as an audio essay, please visit The Natural Curiosity Project or click here.

The hardest thing about writing a book isn’t coming up with the story, or inventing the complicated relationships that help define the characters, or making sure the story flows the way it’s supposed to. It isn’t the painstaking process of finding all the typos and misspellings and missing quotes, or fact-checking every tiny detail so that a reader who has it in for you discovers with chagrin that there’s little to criticize. Nope—it’s none of those, although those do require work.

The hardest thing about writing a novel is creating the one-paragraph synopsis that goes on the back cover. Think about it. The publisher says to the author, “Please take your 140,000-word, 468-page novel and describe it in 125 words or less, in a way that will cause a prospective reader to drool uncontrollably all the way to the checkout counter at the bookstore.”

Good luck with that. Like I said: Hard.  

I’m about to publish a new novel, my fifth, called “The Sound of Life.” My editors have gone through it with their editorial microscopes, identifying mistakes, errors, and omissions. My cadre of readers has gone through it, uncovering awkward dialogue, technical errors, and flow problems that I inevitably missed. The final manuscript is called “The Sound of Life v48F,” which means that the book went through 48 complete rewrites before I deemed it ready for publication—although there will be at least two more read-throughs before I give it the final go-ahead.

I’m proud of this book. It’s my 106th title (bad habit), and I felt a sense of letdown when I typed the last sentence and knew it was done. That’s never happened to me before. Because of the story that magically emerged from the creative mists before me, the wonderful characters I met along the way, and the journey they allowed me to join them on, when I typed the last word of the final sentence, I felt like I was pulling into the driveway after a long, memorable road trip. I needed a medicine for melancholy, because it was over.

Author Alice Munro wrote, “A good book makes you want to live in the story. A great book gives you no choice.” That’s how I felt with this one. And please understand, this isn’t my ego talking. I experienced something as I wrote this book that rarely happens, like seeing the mysterious and elusive “green flash” over the ocean at sunset. At some point along the creative journey, I realized that I was no longer writing the book: it was writing itself. My job changed from creative director to scribe. It was as if it were saying to me, “Here’s the keyboard. Try to keep up.”

Author M.L. Farrell said this about books:

“A book is not mere paper and words.

It is a door and a key.

It is a road and a journey.

It is a thousand new sights, sensations and sounds.

It holds friendships, experiences, and life lessons.

A book is an entire world.”

There’s so much truth in that. I’m at the point with this one where people are asking me what “The Sound of Life” is about, and now that I know, I’m excited to tell them. But as I describe the 56-foot boat that’s central to the story, the journey from the eastern Caribbean through the Panama Canal then up the coast to Northern California, the rich interactions among the characters, and the happenings in Peru that tie much of the narrative together, I realize somewhat sheepishly that every time I tell someone what the book’s about, I speak in the first person. Not ‘they,’ but ‘we.’ Well, sure—I was there. I was along for the ride. Why wouldn’t I speak in the first person?

Stephen King is a writer whom I admire greatly, for many reasons. “Books are a uniquely portable magic,” he once said. A uniquely portable magic. I think about the complexity, richness, excitement, laughter, and delicious food that’s captured between the covers of this book. I think about the immensely likable people and their relationships, around whom the story revolves. I think about the sights and sounds and smells and tastes they experience along the way. And I think about what it felt like when my characters, my good friends, got back on the boat and motored away, waving as they left me behind on the dock, en route to their next adventure.

A uniquely portable magic.

“The Sound of Life” will be released in December 2025.

The Wonderful, Terrible Gift of Science

*A note before you begin to read: This is a long post; if you’d rather listen to it, you can find it at the Natural Curiosity Project Podcast.

Part I

LIFE IS VISUAL, so I have an annoying tendency to illustrate everything—either literally, with a contrived graphic or photo, or through words. So: try to imagine a seven-sided polygon, the corners of which are labeled curiosity, knowledge, wisdom, insight, data, memory, and human will. Hovering over it, serving as a sort of conical apex, is time. 

Why these eight words? A lifetime of living with them, I suppose. I’m a sucker for curiosity; it drives me, gives my life purpose, and gives me a decent framework for learning and applying what I learn. Knowledge, wisdom, insight, and data are ingredients that arise from curiosity and that create learning. Are they a continuum? Is one required before the next? I think so, but that could just be because of how I define the words. Data, to me, is raw ore, a dimensionless precursor. When analyzed, which means when I consider it from multiple perspectives and differing contexts, it can yield insight—it lets me see beyond the obvious. Insight, then, can become knowledge when applied to real-world challenges, and knowledge, when well cared for and spread across the continuum of a life of learning, becomes wisdom. And all of that yields learning. And memory? Well, keep listening.

Here’s how my model came together and why I wrestle with it. 

Imagine an existence where our awareness of ‘the past’ does not exist, because our memory of any action disappears the instant that action takes place. In that world, a reality based on volatile memory, is ‘learning,’ perhaps defined as knowledge retention, possible? If every experience, every gathered bit of knowledge, disappears instantly, how do we create experience that leads to effective, wisdom-driven progress, to better responses the next time the same thing happens? Can there even be a next time in that odd scenario, or is everything that happens to us essentially happening for the first time, every time it happens?

Now, with that in mind, how do we define the act of learning? It’s more than just retention of critical data, the signals delivered via our five senses. If I burn myself by touching a hot stove, I learn not to do it again because I form and retain a cause-effect relationship between the hot stove, the act of touching it, and the pain the action creates. So, is ‘learning’ the process of applying retained memory that has been qualified in some way? After all, not all stoves are hot.

Sometime around 500 BC, the Greek playwright Aeschylus observed that “Memory is the mother of all wisdom.” If that’s the case, who are we if we have no memory? And I’m not just talking about ‘we’ as individuals. How about the retained memory of a group, a community, a society?

Is it our senses that give us the ability to create memory? If I have no senses, then I am not sentient. And if I am not sentient, then I can create no relationship with my environment, and therefore have no way to respond to that environment when it changes around me. And if that happens, am I actually alive? Is this what awareness is, comprehending a relationship between my sense-equipped self and the environment in which I exist? The biologist in me notes that even the simplest creatures on Earth, the single-celled Protozoa and Archaea, learn to respond predictably to differing stimuli.

But I will also observe that while single-celled organisms routinely ‘learn,’ many complex multi-celled organisms choose not to, even though they have the wherewithal to do so. Many of them currently live in Washington, DC. A lifetime of deliberate ignorance is a dangerous thing. Why, beyond the obvious? Because learning is a form of adaptation to a changing environment—call it a software update if you’re more comfortable with that. Would you sleep well at night, knowing that the antivirus software running on your computer is a version from 1988? I didn’t think so. So, why would you deliberately choose not to update your personal operating system, the one that runs in your head? This is a good time to heed the words often attributed to Charles Darwin: It is not the strongest that survive, nor the most intelligent, but those that are most adaptable to change. Homo sapiens, consider yourselves placed on notice.

Part II

RELATED TO THIS CONUNDRUM IS EPISTEMOLOGY—the philosophy that wrestles with the limits of knowledge. Those limits don’t come about because we’re lazy; they come about because of physics. 

From the chemistry and physics I studied in college, I learned that the convenient, simple diagram of an atom that began to appear in the 1950s is a myth. Electrons don’t orbit the nucleus of the atom in precise paths, like the moon orbiting the Earth or the Earth orbiting the Sun. Instead, they occupy regions determined by how much energy they have, which depends on their distance from the powerfully attractive nucleus. The closer they are, the more strongly they’re held by the electromagnetic force that holds atoms together. But as atoms get bigger, as they add positively charged protons and charge-less neutrons to the densely packed nucleus, and layer upon layer of negatively charged orbiting electrons to balance the nuclear charge, an interesting thing happens. As layers of electrons are added, the strength with which the outermost electrons are held by the nucleus decreases with distance, making them less ‘sticky,’ and the atom becomes less stable.

This might be a good time to make a visit to the Periodic Table of the Elements. Go pull up a copy and follow along.

Look over there in the bottom right corner. See all those elements with the strange names and big atomic numbers—Americium, Berkelium, Einsteinium, Lawrencium? Those are the so-called transuranium elements, and they’re not known for their stability: their huge, overstuffed nuclei decay radioactively, and their outermost electrons are only loosely held. If a distant electron is attracted away for whatever reason, it leaves behind an ion with a net positive charge, one that wants to get back to a stable, neutral state. That drive toward lower-energy, more disordered states runs through everything, from the Second Law of Thermodynamics and entropy, which we’ll discuss shortly, to the strange and wonderful field known as Quantum Mechanics.

This is not a lesson in chemistry or nuclear physics, but it’s important to know that those orbiting electrons are held within what physicists call orbitals, which are statistically defined energy constructs. We know, from the work done by scientists like Werner Heisenberg, who was a physicist long before he became a drug dealer, that an electron, based on how far it is from the nucleus and therefore how much energy it has, lies somewhere within an orbital. The orbitals, which can take on a variety of three-dimensional shapes that range from a single sphere to multiple pear-shaped spaces to a cluster of balloons, define atomic energy levels and are stacked and interleaved so that they surround the nucleus. The orbital that’s closest to the nucleus is called the 1s orbital, and it’s shaped like a sphere. In the case of Hydrogen, element number one in the Periodic Table, somewhere within that orbital is a single lonely electron. We don’t know precisely where it is within the 1s orbital at any particular moment; we just know that it’s somewhere within that mathematically defined sphere. This is the essence of the Heisenberg Uncertainty Principle: we can never know, with perfect precision, both where an electron is and where it’s going at any point in time. And we never will. We just know that statistically, it’s somewhere inside that spherical space.
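That “somewhere within a mathematically defined sphere” is not hand-waving; it’s computable. As a sketch (my own illustration, not part of the essay), the hydrogen 1s electron’s radial probability density is known exactly, and a few lines of Python show that at any instant the electron has only about a one-in-three chance of being found within one Bohr radius of the nucleus:

```python
import math

# Hydrogen 1s orbital: radial probability density P(r) = 4 r^2 exp(-2r),
# in units where the Bohr radius a0 = 1. Integrating from 0 to 1 gives
# the probability of finding the electron within one Bohr radius.

def radial_density(r):
    return 4.0 * r * r * math.exp(-2.0 * r)

def prob_within(radius, steps=100_000):
    # Simple trapezoidal integration of the density from 0 to radius.
    h = radius / steps
    total = 0.5 * (radial_density(0.0) + radial_density(radius))
    for i in range(1, steps):
        total += radial_density(i * h)
    return total * h

p = prob_within(1.0)
print(f"P(r <= a0) = {p:.3f}")  # about 0.323 -- roughly one chance in three
```

The other two-thirds of the probability lies farther out, in the long exponential tail of the density, which is exactly the statistical fuzziness the orbital picture describes.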

Which brings us back to epistemology, the field of science (or is it philosophy?) that tells us that we can never know all that there is to know, that there are defined limits to human knowledge. Here’s an example. We know beyond a shadow of a doubt that the very act of observing the path of an electron changes the trajectory of that electron, which means that we can never know what its original trajectory was before we started observing it. The statistical behavior that remains to us is described by a complex mathematical formula called Schrödinger’s Equation.

Look it up, study it, there will be a test. The formula, which won its creator, Erwin Schrödinger, the Nobel Prize in 1933, details the statistical behavior of a particle within a defined space, like an energy-bound atomic orbital. It’s considered the fundamental principle of quantum mechanics, the family of physics whose strangeness even Albert Einstein famously struggled to accept. In essence, we don’t know, we can’t know, what the state of a particle is at any given moment, which implies that the particle can exist, at least according to Schrödinger, in two different states simultaneously. This truth lies at the heart of the new technology called quantum computing. In traditional computing, a bit (Binary Digit) can have one or the other of two states: zero or one. But in quantum computing, we leave bits behind and transact things using Qubits (quantum bits), which can be zero, one, or both zero and one at the same time. Smoke ‘em if you got ‘em.
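The bit-versus-qubit distinction can be made concrete without any quantum hardware. Here’s a toy sketch in plain Python (my own illustration, not any real quantum library): a qubit is just a pair of complex amplitudes, and the measurement probabilities are the squared magnitudes of those amplitudes.

```python
import math

# A classical bit is 0 or 1, full stop. A qubit is a pair of complex
# amplitudes (alpha, beta) over the basis states |0> and |1>.
# On measurement it yields 0 with probability |alpha|^2 and
# 1 with probability |beta|^2; the two must sum to 1.

def measure_probs(alpha, beta):
    return abs(alpha) ** 2, abs(beta) ** 2

# A qubit prepared as |0> and pushed into equal superposition
# (what a Hadamard gate does): amplitudes 1/sqrt(2) each.
h = 1.0 / math.sqrt(2.0)
p0, p1 = measure_probs(h, h)
print(p0, p1)  # roughly 0.5 and 0.5: "both zero and one" until you look
```

Until the measurement happens, the qubit genuinely carries both amplitudes at once, which is the property quantum computers exploit.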

The world isn’t neat and tidy where it matters: it’s sloppy and ill-defined and statistical. As much as the work of Sir Isaac Newton described a physical world defined by clear laws of gravity, and velocity, and acceleration, and processes that follow clearly-defined, predictably linear outcomes, Schrödinger’s, Heisenberg’s, and Einstein’s works say, not so fast. At the atomic level, the world doesn’t work that way. 

I know—you’re lighting up those doobies as you read this. But this is the uncertainty, the necessary inviolable unknown that defines science. Let me say that again, because it’s important. Uncertainty Defines Science. It’s the way of the universe. Every scientific field of study that we put energy into, whether it’s chemistry, pharmacology, medicine, geology, engineering, genetics, or a host of others, is defined by the immutable Laws of Physics, which are governed by the necessary epistemological uncertainties laid down by people like Werner Heisenberg and Erwin Schrödinger, and wrestled with by no less a mind than Albert Einstein.

Part III

ONE OF MY FAVORITE T-SHIRTS SAYS,

I READ.

I KNOW SHIT.

I’m no physicist. Not by a long shot. But I do read, I did take Physics and Chemistry, and I was lucky enough to have gone to Berkeley, where a lot of this Weird Science was pioneered. I took organic chemistry from a guy who was awarded a Nobel Prize, co-discovered a string of new elements, and had one named after him (Glenn Seaborg), and botany from the guy who discovered how photosynthesis works and also had a Nobel Prize (Melvin Calvin). I know shit.

But the most important thing I learned and continue to learn, thanks to those grand masters of knowledge, is that uncertainty governs everything. So today, when I hear people criticizing scientists and science for not being perfect, for sometimes being wrong, for not getting everything right all the time, for not having all the answers, my blood boils, because they’re right, but for the wrong reasons. Science is always wrong—and right. Schrödinger would be pleased with this duality. It’s governed by the same principles that govern everything else in the universe. Science, which includes chemistry, pharmacology, medicine, geology, engineering, genetics, and all the other fields that the wackadoodle pseudo-evangelists so viciously criticized during the pandemic, and now continue to attack, can’t possibly be right all the time because the laws of the universe fundamentally prevent us from knowing everything we need to know to make that happen. Physics doesn’t come to us in a bento box wrapped in a ribbon. Never in the history of science has it ever once claimed to be right. It has only maintained that tomorrow it will be more right than it is today, and even more right the day after that.

That’s why scientists live and die by the scientific method, a process that aggressively and deliberately pokes and prods at every result, looking for weaknesses and discrepancies. Is it comfortable for the scientist whose work is being roughed up? Of course not. But it’s part of being a responsible scientist. The goal is not for the scientist to be right; the goal is for the science to be right. There’s a difference, and it matters.

This is science. The professionals who practice it, study it, probe it, spend their careers trying to understand the rules that govern it, don’t work in a world of absolutes that allow them to design buildings that won’t fail and drugs that will work one hundred percent of the time and to offer medical diagnoses that are always right and to predict violent weather with absolute certainty. No: they live and work in a fog of uncertainty, a fuzzy world that comes with no owner’s manual, yet with that truth before them, and accepting the fact that they can never know enough, they do miraculous things. They have taken us to the stars, created extraordinary energy sources, developed mind-numbingly complex genetic treatments and vaccines, and cured disease. They have created vast, seamless, globe-spanning communications systems, the first glimmer of artificial intelligence, and demonstrated beyond doubt that humans play a major role in the fact that our planet is getting warmer. They have identified the things that make us sick, and the things that keep us well. They have helped us define ourselves as a sentient species.

And, they are pilloried by large swaths of the population because they’re not one hundred percent right all the time, an unfair expectation placed on their shoulders by people who have no idea what the rules are under which they work on behalf of all of us. 

Here’s the thing, for all of you naysayers and armchair critics and nonbelievers out there: Just because you haven’t taken the time to do a little reading to learn about the science behind the things that you so vociferously criticize and deny, just because you choose deliberate ignorance over an updated mind, doesn’t make the science wrong. It does, however, make you lazy and stupid. I know shit because I read. You don’t know shit because you don’t. Take a lesson from that.

Part IV

THIS ALSO TIES INTO WHAT I BELIEVE to be the most important statement ever uttered by a sentient creature, and it begins at the liminal edges of epistemological thought: I am—the breathtaking moment of self-awareness. Does that happen the instant a switch flips and our senses are activated? If epistemology defines the inviolable limits of human knowledge, then what lies beyond those limits? Is human knowledge impeded at some point by a hard-stop electric fence that prevents us from pushing past the limits? Is there a ‘here be dragons’ sign on the other side of the fence, prohibiting us from going farther? I don’t think so. For some, that limit is the place where religion and faith take over the human psyche when the only thing that lies beyond our current knowledge is darkness. For others, it stands as a challenge: one more step moves us closer to…what, exactly?

A thinking person will experience a moment of elegance here, as they realize that there is no fundamental conflict between religious faith and hardcore science. The two can easily coexist without conflict. Why? Because uncertainty is alive and well in both. Arthur C. Clarke: Any sufficiently advanced technology is indistinguishable from magic.

Part V

THIS BRINGS ME TO TIME, and why it sits at the apex of my seven-sided cone. Does time as we know it only exist because of recallable human memory? Does our ability to conceive of the future exist only because, thanks to accessible memory, we perceive the difference between a beginning state and an end state, between where we are and where we were, and so recognize the difference between past and present, and that the present is the past’s future, but also the future’s past?

Part VI

SPANISH-AMERICAN WRITER AND PHILOSOPHER George Santayana is famous for having observed that ‘those who fail to heed the lessons of history are doomed to repeat them.’ It’s a failing that humans are spectacularly good at, as evidenced by another of Santayana’s aphorisms—that ‘only the dead have seen the end of war.’ I would observe that in the case of the first quote, ‘heed’ means ‘to learn from,’ not simply ‘to notice.’ But history, by definition, means learning from things that took place in the past, which means that if there is no awareness of the past, then learning is not possible. So, history, memory, and learning are, to steal from Douglas Adams, the author of The Hitchhiker’s Guide to the Galaxy, “inextricably intertwingled” (more on that phrase later). And if learning can’t happen, does that then mean that time, as we define it, stops? Does it become dimensionless? Is a timeless system the ultimate form of entropy, the tendency of systems to seek the maximum possible state of disorder, including static knowledge? Time, it seems, implies order, a logical sequence of events that cannot be changed. So, does entropy seek timelessness? Professor Einstein, white courtesy telephone, please.

The Greek word chronos describes time as a measurable quantity, as in, I only have so much time to get this done. Time is money. Only so much time in a day. 60 seconds per minute, 60 minutes per hour, 24 hours per day. But the Greeks have a second word, kairos, which refers to the quality of time, of making the most of the time you have, of savoring time, of using it to great effect. Chronos, it seems, is a linear and quantitative view of time; kairos is a qualitative one.

When I was a young teenager, I read a lot of science fiction. One story I read, a four-book series by novelist James Blish (who, with his wife, adapted the original Star Trek television episodes into the first Star Trek books), is the tale of Earth and its inhabitants in the far distant future. The planet’s natural resources have been depleted by human rapaciousness, so entire cities lift off from Earth using a form of anti-gravity technology called a graviton polarity generator, or spindizzy for short, and become independent competing entities floating in space.

In addition to the spindizzy technology, the floating cities have something called a stasis field, within which time does not exist. If someone is in imminent danger, they activate a stasis field that surrounds them, and since time doesn’t exist within the field, whatever or whoever is in it cannot be hurt or changed in any way by forces outside the field. It’s an interesting concept, which brings me to a related topic. 

One of my favorite animals, right up there with turtles and frogs, is the water bear, also called a tardigrade (and, charmingly by some, a moss piglet). They live in the microscopically tiny pools of water that collect on the dimpled surfaces of moss leaves, and when viewed under a microscope they look for all the world like tiny living gummy bears.

Tardigrades can undergo what is known as cryptobiosis, a physiological process by which the animal can protect itself from extreme conditions that would quickly kill any other organism. Basically, they allow all the water in their tiny bodies to completely evaporate, in the process turning themselves into dry, lifeless little husks called tuns. Water bears have been exposed to the extreme heat of volcanoes, the extreme cold of Antarctica, and intense nuclear radiation inside power plants; they have been placed outside on the front stoop of the International Space Station for days on end, then brought inside, with no apparent ill effects. Despite the research into their ability to survive such lethal environments, we still don’t really know how they do it. Uncertainty.

But maybe I do know. Perhaps they have their own little stasis field that they can turn on and off at will, in the process removing time as a factor in their lives. Time stops, and if life can’t exist without time, then they can’t be dead, can they? They become like Qubits, simultaneously zero and one, or like Schrödinger’s famous cat, simultaneously dead and alive.

Part VII

IN THE HITCHHIKER’S GUIDE TO THE GALAXY, Douglas Adams uses the phrase I mentioned earlier and that I long ago adopted as one of my teaching tropes. It’s a lovely phrase that just rolls off the tongue: “inextricably intertwingled.” It sounds like a wind chime when you say it out loud, and it makes audiences laugh when you use it to describe the interrelatedness of things. 

The phrase has been on my mind the last few days, because its meaning keeps peeking out from behind the words of the various things I’ve been reading. Over the last seven days I’ve read a bunch of books from widely different genres—fiction, biography, science fiction, history, philosophy, nature essays, and a few others that are hard to put into definitive buckets.

There are common threads that run through all of the books I read, and not because I choose them as some kind of a confirmationally-biased reading list (how could Loren Eiseley’s The Immense Journey, Arthur C. Clarke’s The Songs of Distant Earth, E. O. Wilson’s Tales from the Ant World, Malcolm Gladwell’s Revenge of the Tipping Point, Richard Feynman’s Surely You’re Joking, Mr. Feynman!, and Studs Terkel’s And They All Sang possibly be related, other than the fact that they’re books?). Nevertheless, I’m fascinated by how weirdly connected they are, despite being so very, very different.

Clarke, for example, writes a whole essay in The Songs of Distant Earth about teleology, a term I’ve known forever but have never bothered to look up. It means explaining a phenomenon in terms of its perceived purpose rather than its actual cause. For example, in the wilderness, lightning strikes routinely spark forest fires, which burn uncontrolled, in the process cleaning out undergrowth, reducing the large-scale fire hazard, but doing very little harm to the living trees, which are protected by their thick bark—unless they’re unhealthy, in which case they burn and fall, opening a hole in the canopy that allows sunlight to filter to the forest floor, feeding the seedlings that fight for their right to survive, leading to a healthier forest. So it would be easy to conclude that lightning exists to burn forests. But that’s a teleological conclusion, one that focuses on purpose rather than cause. Purpose implies intelligent design, which violates the scientific method because it’s subjective and speculative. Remember—there’s no owner’s manual.

The initial cause of lightning is wind. The vertical movement of wind that precedes a thunderstorm causes negatively charged particles to gather near the base of the cloud cover, and positively charged particles to gather near the top, creating an incalculably high energy differential between the two. But nature, as they say, abhors a vacuum, and one of the vacuums it detests is the accumulation of potential energy. Natural systems always seek a state of maximum entropy—the lowest available energy state, the highest state of disorder. I mentioned this earlier; it’s a physics thing, the Second Law of Thermodynamics. As the opposing charges in the cloud grow (and they are massive—anywhere from 10 to 300 million volts and up to 30,000 amps), their opposite states are inexorably drawn together, like opposing poles of a gigantic magnet (or the positively charged nuclei and negatively charged electrons of an atom), and two things can happen. The energy stored between the “poles” of this gigantic aerial magnet—or, if you prefer, battery—discharges within the cloud, causing what we sometimes call sheet lightning, a ripple of intense energy that flashes across the sky. Or, the massive negative charge in the base of the cloud can be attracted to positive charges on the surface of the Earth—tall buildings, antenna towers, trees, the occasional unfortunate person—and lightning happens.

It’s a full-circle entropic event. When a tree is struck and a fire starts, the architectural order that has been painstakingly put into place in the forest by nature is rent asunder. Weaker trees fall, tearing open windows in the canopy that allow sunlight to strike the forest floor. Beetles and fungi and slugs and mosses and bacteria and nematodes and rotifers consume the fallen trees, rendering them to essential elements that return to the soil and feed the healthy mature trees and the seedlings that now sprout in the beams of sunlight that strike them. The seedlings grow toward the sunlight; older trees become unhealthy and fall; order returns. Nature is satisfied. Causation, not purpose. Physics, not intelligent design. Unless, of course, physics is intelligent design. But we don’t know. Uncertainty.

E. O. Wilson spends time in more than one of his books talking about the fact that individuals will typically act selfishly in a social construct, but that groups of individuals in a community will almost always act selflessly, doing what’s right for the group. That, by the way, is the difference between modern, unregulated capitalism and what botany professor Robin Wall Kimmerer calls “the gift economy” in her wonderful little book, The Serviceberry. This is not some left-leaning, unicorn and rainbows fantasy: it’s a system in which wealth is not hoarded by individuals, but rather invested in and shared with others in a quid pro quo fashion, strengthening the network of relationships that societies must have to survive and flourish. Kimmerer cites the story of an anthropologist working with a group of indigenous people who enjoy a particularly successful hunt, but is puzzled by the fact that they now have a great deal of meat but nowhere to keep it cold so that it won’t spoil. “Where will you store it to keep it fresh for later?” the anthropologist asks. “I store it in my friends’ bellies,” the man replies, equally puzzled by the question. This society is based on trust, on knowing that the shared meat will be repaid in kind. It is a social structure based on strong bonds—kind of like atoms. Bonds create stability; isolated particles, like isolated people, are far less stable.

In fact, that’s reflected in many of the science fiction titles I read: that society’s advances come about because of the application of the common abundance of human knowledge and will. Individuals acting alone rarely get ahead to any significant degree, and if they do, it’s because of an invisible army working behind them. But the society moves ahead as a collective whole, with each member contributing. Will there be those who don’t contribute? Of course. It’s a function of uncertainty and the fact that we can never know with one hundred percent assurance how an individual within a group will behave. There will always be outliers, but their selfish influence is always neutralized by the selfless focus of the group. The behavior of the outlier does not define the behavior of the group. ‘One for one and none for all’ has never been a rallying call.

Part VIII

THIS ESSAY APPEARS TO WANDER, because (1) it wanders and (2) it connects things that don’t seem to be connected at all, but that clearly want to be. Learning doesn’t happen when we focus on the things; it happens when we focus on the connections between the things. The things are data; the connections create insight, which leads to knowledge, wisdom, action, a vector for change. Vector—another physics term. It refers to a quantity that has both direction and magnitude. The most powerful vector of all? Curiosity.

Science is the only tool we have. It’s an imperfect tool, but it gets better every time we use it. Like it or not, we live in a world, in a universe, that is defined by uncertainty. Science is the tool that helps us bound that uncertainty, define its hazy distant edges, make the unclear more clear, every day. Science is the crucible in which human knowledge of all things is forged. It’s only when we embrace that uncertainty, when we accept it as the rule of all things, when we revel in it and allow ourselves to be awed by it—and by the science-based system that allows us to constantly push back the darkness—that we begin to understand. Understand what, you say? Well, that’s the ultimate question, isn’t it?

The Wisdom of Loren Eiseley

One of my favorite writers is a man most people have never heard of. His name is Loren Eiseley, and he was a physical anthropologist and paleontologist at the University of Pennsylvania for over 30 years. As a young man, during the Great Depression, he was a ‘professional hobo,’ riding freight trains all over the United States, looking for work and the occasional adventure; his academic career came later. I’ve met few people who have read his books, yet few writers have affected me as much as he has.

Loren Eiseley in his office at the University of Pennsylvania Museum, May 12, 1960. Photo by Bernie Cleff, courtesy of the University of Pennsylvania Archives and Records Center.

I discovered Loren Eiseley when I was at Berkeley; a friend loaned me his book, All the Strange Hours: The Excavation of a Life. It’s mostly an autobiography, but it’s powerfully insightful about the world at large. He draws on his early experiences as a vagabond as much as he does as an academic, both of which yield a remarkable way of looking at the ancient and modern worlds.

I have all of his books, in both physical and ebook formats, and they’re among the few I never delete. I keep a list of quotes from Loren’s works in my phone, and I pull them up and read them every once in a while. Here are a few of my favorites. Remember, this guy is a hardcore scientist, although you’d never know it from what you’re about to read.

If there is magic on this planet, it is contained in water.

One does not meet oneself until one catches the reflection from an eye other than human.

The journey is difficult, immense. We will travel as far as we can, but we cannot in one lifetime see all that we would like to see or to learn all that we hunger to know.

If it should turn out that we have mishandled our own lives as several civilizations before us have done, it seems a pity that we should involve the violet and the tree frog in our departure.

When man becomes greater than nature, nature, which gave us birth, will respond.

This last one strikes me as particularly prescient.

Ray Bradbury, another of my all-time favorite writers, said that ‘Eiseley is every writer’s writer, and every human’s human. He’s one of us, yet most uncommon.’

More than anything else, Loren Eiseley was a gifted observer and storyteller. In All the Strange Hours, he writes about a chance encounter on a train. I’d like to share a bit of it with you.

“In the fall of 1936 I belatedly entered a crowded coach in New York. The train was an early-morning express to Philadelphia and what I had been doing in New York the previous day I no longer remember. The crowded car I do remember because there was only one seat left, and it was clearly evident why everyone who had boarded before me had chosen to sit elsewhere. The vacant seat was beside a huge and powerful man who seemed slumped in a drunken stupor. I was tired, I had once lived amongst rough company, and I had no intention of standing timidly in the aisle. The man did not look quarrelsome, just asleep. I sat down and minded my own business.

Eventually the conductor made his way down the length of the coach to our seats. I proceeded to yield up my ticket. Just as I was expecting the giant on my right to be nudged awake, he straightened up, whipped out his ticket and took on a sharp alertness, so sharp in fact, that I immediately developed the uncanny feeling that he had been holding that particular seat with a show of false drunkenness until the right party had taken it. When the conductor was gone, the big man turned to me with the glimmer of amusement in his eyes. “Stranger,” he appealed before I could return to my book, “tell me a story.” In all the years since, I have never once been addressed by that westernism “stranger” on a New York train. And never again upon the Pennsylvania Railroad has anyone asked me, like a pleading child, for a story. The man’s eyes were a deep fathomless blue with the serenity that only enormous physical power can give. People on trains out of New York tend to hide in their own thoughts. With this man it was impossible. I smiled back at him. ‘You look at me,’ I said, running an eye over his powerful frame, ‘as if you were the one to be telling me a story. I’m just an ordinary guy, but you, you look as if you have been places. Where did you get that double thumb?’

With the eye of a physical anthropologist, I had been drawn to some other characters than just his amazing body. He held up a great fist, looking upon it contemplatively as though for the first time.”

That’s just GREAT writing. Powerfully insightful, visual, and entertaining. It demonstrates Eiseley’s skill as a naturally curious storyteller, and the use of storytelling as an engagement technique. His willingness to talk with the odd guy in the next seat, to ask questions, to give the man the opportunity to talk, illustrates one of the most important powers of storytelling.

For most people, storytelling is a way to convey information to another person, or to a group. And while that’s certainly true, that’s not the most important gift of storytelling. The best reason to tell stories is to compel the other person to tell a story BACK. Think about the last time you were sitting with a group of friends, maybe sharing a glass of wine. People relax and get comfortable, and the stories begin. One person tells a story, while everyone else listens. When they finish, someone else responds: ‘Wow. That reminds me of the time that…’ and so it goes, around the group, with everyone sharing. 

When this happens, when the other person starts talking, this is your opportunity to STOP talking and listen—to really listen to the person. They’re sharing something personal with you, something that’s important and meaningful to them—which means that it should be important and meaningful to you, if you want to have any kind of relationship with that person. It’s a gift, so treat it accordingly. 

In West Texas, there’s an old expression: ‘Never miss a good chance to shut up.’ This is one of those times. By letting his seatmate talk, Loren Eiseley discovered amazing insights, not just about the man himself, but about his views of society and the world. The conversation goes on for many pages beyond what I quoted earlier, and it’s powerful stuff. So never underestimate the power of the story as an insight-gathering mechanism, as much as it is an opportunity to share what YOU have to say.

Here’s one last thing I want to mention. In the tenth episode of my podcast, The Natural Curiosity Project, I talked about a book I had recently read called ‘The Age of Wonder.’ It’s the story of the scientists of the Romantic Age (1798-1837) who made some of the most important discoveries of the time—people like Charles Babbage, William Herschel, Humphry Davy, Michael Faraday, and Mungo Park, scientists who had one thing in common: their best friends, partners, and spouses were, without exception, artists—poets and novelists, for the most part.

These were serious, mainstream, well-respected scientists. For example, Charles Babbage was a mathematician who was the father of modern computing (he designed the Difference Engine, a mechanical calculator with more than 25,000 parts). His close friend and collaborator was Ada Lovelace, the daughter of the poet Lord Byron, and a gifted writer and mathematician in her own right. William Herschel built the world’s first very large telescopes in England, and his discovery of Uranus electrified the Romantic poets; Keats’s famous image of a ‘watcher of the skies’ who sees ‘a new planet’ swim into his ken alludes to it. Humphry Davy was a chemist who discovered the medicinal properties of nitrous oxide. His closest friend was the poet and essayist Samuel Taylor Coleridge, who wrote Kubla Khan and The Rime of the Ancient Mariner.

In Xanadu did Kubla Khan, 

A stately pleasure-dome decree: 

Where Alph, the sacred river, ran, 

Through caverns measureless to man, 

Down to a sunless sea.

John Keats, a poet and the author of Ode on a Grecian Urn, was also a medical student whose scientific pursuits shaped his poetry. Mary Shelley is well known as the author of Frankenstein; her last name is Shelley because she was married to Percy Bysshe Shelley, another romantic poet and essayist:

I met a traveller from an antique land, 

Who said—“Two vast and trunkless legs of stone 

Stand in the desert. . . . Near them, on the sand, 

Half sunk a shattered visage lies, whose frown, 

And wrinkled lip, and sneer of cold command, 

Tell that its sculptor well those passions read 

Which yet survive, stamped on these lifeless things, 

The hand that mocked them, and the heart that fed; 

And on the pedestal, these words appear: 

My name is Ozymandias, King of Kings; 

Look on my Works, ye Mighty, and despair!

Shelley’s work was filled with and flavored by the wonders of science. 

So, you may be wondering if there’s a ‘so what’ coming any time soon. The answer is yes: Don’t you find it interesting that these scientists were all supported and influenced by their artistic friends, and vice-versa? What does that tell us about the importance of the linkage between science and the arts? Well, there’s a huge focus right now in schools on STEM—Science, Technology, Engineering, and Math. Now look: I’ll be the first to tell you that those are all important, but there are two letters missing: it should be STREAM. The ‘R’ is for ‘Reading,’ a necessary and critical skill, and the ‘A’ for ‘Arts’ needs to be in there as well, with as much emphasis and priority as the others. Anyone who doubts that should look to the lessons of history.

And Loren Eiseley—remember him? Where does he fit into this? Well, think about it. What made him such a gifted scientist was that, in addition to being a respected researcher, he was an accomplished essayist and poet. During his life he wrote nine books, hundreds of essays, and several collections of poetry, all centered on the wonders of the natural world. His philosophy, his approach to his profession, embodied the lessons of the Age of Wonder.

In one of his essays, ‘How Flowers Changed the World’ (which you’ll find in his book, ‘The Immense Journey’), Loren Eiseley had this to say:

“If our whole lives had not been spent in the midst of it, the natural world would astound us. The old, stiff, sky-reaching wooden world (he’s talking about trees here) changed into something that glowed here and there with strange colors, put out queer, unheard of fruits and little intricately carved seed cases, and, most important of all, produced concentrated foods in a way that the land had never seen before, or dreamed of back in the fish-eating, leaf-crunching days of the dinosaurs.”

Imagining the first human being who pondered the possibility of planting seeds, he writes: “In that moment, the golden towers of man, his swarming millions, his turning wheels, the vast learning of his packed libraries, would glimmer dimly there in the ancestor of wheat, a few seeds held in a muddy hand. Without the gift of flowers and the infinite diversity of their fruits, man and bird, if they had continued to exist at all, would be today unrecognizable. 

Archaeopteryx, the lizard-bird, might still be snapping at beetles on a sequoia limb; man might still be a nocturnal insectivore gnawing a roach in the dark. The weight of a petal has changed the face of the world and made it ours.” 

The poetic power of Loren’s science writing infuses the facts with human wonder. Here he is, writing about the stupefyingly boring topic of angiosperms, plants whose seeds are enclosed in a protective capsule, yet we’re mesmerized by the imagery his words create.

What a world. And it’s ours.

H.G. Wells and the World Brain

Photo by George Charles Beresford, black and white glossy print, 1920

Not long ago, I started doing something I always said I would do, but honestly never thought I’d actually get around to doing. Remember when you were in high school or college and your English teacher assigned you a book to read? And it wasn’t something fun like Hardy Boys or Tom Swift or Doc Savage (showing my age, here) or Little House on the Prairie. No, it was something BORING by Mark Twain or Charles Dickens or John Steinbeck. If you were like me, you faked it badly, or in college you might have run down to the bookstore to buy the summary of the book to make your fake a bit more believable. Either way, it rarely ended well.

I’m a writer and storyteller by trade—it’s who I am. And, because I’m a writer, I’m also an avid reader—and I mean, avid. I average about 140 books a year. So, a couple of years ago, I decided to start folding the classics into my normal mix of books, starting with Arthur Conan Doyle. At first, I was dreading it. But once I started reading and allowed myself to slow my mind and my reading cadence to match the pace of 19th-century writers, I was hooked from the first book. I blazed through them all, the entire Sherlock Holmes collection, and then moved on to Jules Verne and Mark Twain and H.G. Wells. Reading those books as an adult, with the benefit of a bit more life behind me, gave the stories the context that was missing when I was a kid.

By the way, I have to interrupt myself here to tell you a funny story. I’m a pretty fast reader—not speed-reading fast, but fast. I pretty much keep the same reading cadence in every book I read, unless I’m reading poetry or a book by someone whose work demands a slower pace. Some southern writers, like Rick Bragg, slow me down, but in an enjoyable way. But a typical book of two or three hundred pages I usually blast through in about three days.

Not long ago, I read David Attenborough’s First Life, a book about the earliest organisms on the planet. That book took me two weeks to read. And it wasn’t because I wasn’t regularly reading it, or because the book was complicated, or poetic. It was because David Attenborough is one of those wonderful writers who writes the way he speaks. Which means that as I was reading, I was hearing his voice, and my reading began to mimic the pace at which he speaks on all the BBC programs: These … extraordinary creatures … equipped … as they are … for life in the shallow, salty seas … of the Pre-Cambrian world … quickly became the hunted … as larger … more complex creatures … emerged … on the scene. 

I just couldn’t do it. I tried to read at a normal clip, and I stumbled and tripped over the words. It was pretty funny. It was also a great book.

Anyway, I just finished The Time Machine by H. G. Wells. I saw the movie as a kid, loved it, but the book was, as usual, quite different from the movie. I loved Wells’ writing, and it made me want to read more. So, I decided to read one of his lesser-known works, his Outline of History, a massive work of about 700 pages. And, as so often happens when I read something new, I had an epiphany.

Let me tell you a bit about Herbert George Wells. During the 1930s, he was one of the most famous people in the world. He was a novelist and a Hollywood celebrity, because several of his stories—The Invisible Man, Things to Come, and The Man Who Could Work Miracles—were made into movies (The Time Machine didn’t hit the screens until 1960). In 1938, Orson Welles reportedly caused mass panic when he broadcast a radio show based on Wells’s War of the Worlds, which added to his fame. That story has since been debunked, but the broadcast did alarm some listeners.

Wells studied biology under T.H. Huxley at the UK’s Royal College of Science. He was a teacher and science writer before he was a novelist. Huxley, who served as a mentor for Wells, was an English biologist who specialized in comparative anatomy, but he was best known as “Darwin’s Bulldog” because of his vocal support for Darwin’s theory of evolution. He also coined the term ‘agnosticism.’ “Agnosticism,” he explained, “is not a creed, but a method, the essence of which lies in the rigorous application of a single principle… the fundamental axiom of modern science… In matters of the intellect, follow your reason as far as it will take you, without regard to any other consideration… In matters of the intellect, do not pretend that conclusions are certain which are not demonstrated or demonstrable.” Pretty prescient words that need to be broadcast loudly today. Ask questions, and don’t accept a statement as truth until you know it is. That’s precisely why I started this series.

Sorry—I’m all over the place here. The Outline of History tells the story of humankind from the earliest days of civilization to the end of World War I—The Great War, The War to End All Wars. If only. 

From there, I went on to read The Work, Wealth and Happiness of Mankind, another of his lesser-known works. Both are interesting takes on history and sociology, and somewhere between them, Wells invents the World Wide Web. Really.

Here’s how he begins the concept: 

Before the present writer lie half a dozen books, and there are good indexes to three of them. He can pick up any one of these six books, refer quickly to a statement, verify a quotation, and go on writing. … Close at hand are two encyclopedias, a biographical dictionary, and other books of reference.

As a writer, Wells always had reference books on his desk that he used regularly. As he developed the concept that he came to call the World Brain, he wrote about the early scholars who lived during the time of the Library of Alexandria, the greatest center of learning and scholarship in the world at the time. It operated from the third century BC until 30 AD, an incredibly long time. Scholars could visit the Library, but they couldn’t take notes (there was no paper), and there were no indices or cross-references between documents. So, Wells came up with the idea of taking information to the people instead of the other way around, and of figuring out a way to create detailed cross-references—in effect, search capability—to make the vast stores of the world’s knowledge available, on demand, to everybody.

His idea was that the World Brain would be a single source of all of the knowledge contained in the world’s libraries, museums, and universities. He even came up with a system of classification, an information taxonomy, for all that knowledge.             

Sometime around 1937, with the War to End All Wars safely in the past, Wells began to realize that the world was once again on the brink of conflict. To his well-read and research-oriented mind, the reason was sheer ignorance: people were really (to steal a word from my daughter) sheeple, and because they were ignorant and chose to do nothing about that, they allowed themselves to be fooled into voting for nationalist, fascist governments. The World Brain, he figured, could solve this problem, by putting all the world’s knowledge into the hands of all its citizens, thus making them aware of what they should do to preserve the peace that they had fought so hard to achieve less than twenty years earlier. What he DIDN’T count on, of course, was that he was dealing with people—and the fact that you can lead a horse to water, but you can’t make him drink from the intelligence well. 

Nevertheless, he tried to raise the half a million pounds a year that he felt would be needed to run the project. He wrote about it, gave lectures, toured the United States, and had dinner with President Roosevelt, with whom he discussed the World Brain idea. He even met with scientists from Kodak who showed him their newest technology—the technology that ultimately became microfiche. But sadly, he couldn’t make it happen, and sure enough, World War II arrived.

Here’s how he summed up the value of the World Brain: 

The general public has still to realize how much has been done in this field and how many competent and disinterested men and women are giving themselves to this task. The time is close at hand when any student, in any part of the world, will be able to sit with his projector in his own study at his or her own convenience to examine any book, any document, in an exact replica. 

In other words…the World Wide Web. Imagine that.

World War II caused Wells to fall into a deep depression. But he had imagined civilization’s collapse long before: The Time Machine, written back in 1895, is, as far as I know, the first post-apocalyptic novel ever written. He describes the great green structure on the hill, made of beautiful porcelain but now falling down in ruins; I suspect he was thinking about the sacking and burning of the great Library of Alexandria when he wrote that part of the book.

Or, perhaps he was thinking of Percy Bysshe Shelley’s “Ozymandias”:

I met a traveller from an antique land,

Who said—“Two vast and trunkless legs of stone

Stand in the desert. . . . Near them, on the sand,

Half sunk a shattered visage lies, whose frown,

And wrinkled lip, and sneer of cold command,

Tell that its sculptor well those passions read

Which yet survive, stamped on these lifeless things,

The hand that mocked them, and the heart that fed;

And on the pedestal, these words appear:

My name is Ozymandias, King of Kings;

Look on my Works, ye Mighty, and despair!

Nothing beside remains. Round the decay

Of that colossal Wreck, boundless and bare

The lone and level sands stretch far away.”

Never underestimate the power of great literature. And never underestimate the power of curiosity when it’s unleashed on a problem. 

The Bad-Ass Librarians of Timbuktu

Over the course of the last three months, I’ve taught a writing workshop at our local library here in Vermont. My audience was people interested in becoming better writers. Interestingly, a significant proportion of them weren’t interested in getting published; they just wanted to be better at the craft of writing. Refreshing! 

I’m embarrassed to say that I hadn’t spent much time in our library since our kids were growing up. Being the lover of books that I am, my library became Amazon, as I assembled my own cherished library at home. It’s funny: I recently did a quick survey of our house and was pleased to discover that there are books in every single room—except for the dining room and bathrooms! Go figure.

Anyway, the library we had back when the kids were in school and the library we have today are worlds apart. It has expanded, both physically and in terms of what it offers. The Dorothy Alling Memorial Library, situated on the Williston town green in front of the Williston Central School and adjacent to the town gazebo where the town band (the Williston Wheezers) plays on the 4th of July, still has books, but now offers a computer room with instruction available for those looking to develop their digital skills; a massive media collection; Internet access; loads of learning programs; and well-attended after-school activities for kids.

But they’re not unique in this, as it turns out. According to information published in The Atlantic, 84% of libraries in the country offer some form of software training, while 90% teach basic Internet skills. In fact, in 2019, 130 million people enrolled in programs offered by their local libraries, including digital-literacy courses. In other words, libraries have gone from being passive repositories of dusty books to active educational institutions. And the value of the investment is returned handsomely: in Ohio, for every dollar spent on public libraries, the state received $5.48 in added economic value. Not a bad return on investment.

 These libraries have morphed into learning centers, digital community centers, and career hubs. Some libraries are partnering with local businesses to develop learning programs that will generate a steady flow of high-quality, skilled employees, ready to undertake work in the 21st century. 

 When was the last time you visited your local library? Check it out—it might surprise you. And if you have kids, make it a regular thing to visit with them. What this demonstrates, once again, is that knowledge really matters. It leads to the development of skills that create differentiation, opportunity, and hope. And where better to have that happen than the local public library? 

 And that’s why I want to tell you about a book I recently read. The name alone should hook you: The Bad-Ass Librarians of Timbuktu, by Joshua Hammer. It’s equal parts thriller, geography, history, and geopolitical intrigue. And, it’s all true. Here’s the story, without giving away the fun parts. Timbuktu (which means ‘Boctou’s well’ in the local dialect) has for centuries been a center of Islamic scholarship, an oasis of culture, knowledge and understanding in the center of Mali, a nation deep in the Sahara. 

 Abdel Kader Haidara, a minor government functionary in the 1980s, realized something one day: scattered across the Saharan sands of Mali there are tens of thousands of ancient manuscripts, some dating from the 5th century, all hand-illuminated, and all crumbling to paper dust because of heat, dry air, and termites. Stored in rotting chests or buried in the swirling sands of the Sahara, these books include early religious texts, medical treatises, political texts, manuals of early law, political treatises, personal journals of early explorers, accounts of travelers, and much, much more.

 Knowing the incalculable value of the knowledge captured in these books, Haidara set out on a quest that would make Don Quixote AND James Bond proud: to collect as many of them as possible, bring them to a world-class, centralized repository for restoration and digitization, thus preserving the wisdom of the ages. But there were some challenges: the restoration facility didn’t exist; and the books were mostly in the hands of families who didn’t trust the government (for good reason) and weren’t about to turn them over to a junior representative of that very same government.

 And then, there was the Al Qaeda problem.

Al Qaeda was sworn to destroy all vestiges of existing society and its historical foundations, and Haidara knew they would burn the books if they were found. So, he took on the incredibly hazardous task of preventing that from happening by mounting an enormous smuggling operation to move the books, all 350,000 of them, in secret, away from Al Qaeda.

You need to read this book—it’s a FANTASTIC story.

Apparently, reading and books matter. I have to agree.

Who Authors Really Are

Here’s a childhood question for you. And I should qualify that—for the most part I’m talking to people who were kids in the 60s, and who shared the books they read with their own children. Here’s the question: What do Carolyn Keene, Franklin W. Dixon, Kenneth Robeson, Laura Lee Hope, and Victor Appleton have in common? Hopefully some of those names resonate with you. The answer is that they’re all well-known authors to anyone who read Nancy Drew, the Hardy Boys, Doc Savage, the Bobbsey Twins, Tom Swift, and a few others. The other thing they have in common? None of them exist, and they never did. They’re all pseudonyms.

Authors have used pseudonyms for a long time, and for all kinds of reasons. Sometimes a new book fell way outside the genre they were known for, and they worried it might dilute their main literary brand. Sometimes they wrote controversial content and didn’t want it associated with their real names. Sometimes they had an important message to share, but because it ran counter to prevailing opinion, or was highly controversial, they chose to publish under a pseudonym. For example, Silence Dogood, Caelia Shortface, Martha Careful, Richard Saunders, Busy Body, Anthony Afterwit, Polly Baker, and Benevolus were all pseudonyms of none other than Benjamin Franklin, who was often at odds with prevailing politics. He was one of only a few early American authors who, as a man, wrote under female pseudonyms, and he usually did so to criticize the patriarchy. Another example is newspaper columnist Joe Klein, who wrote Primary Colors, the very controversial novel based on Bill Clinton’s 1992 campaign, under the name Anonymous.

Other good examples are the authors Aaron Wolfe, Anthony North, Brian Coffey, David Axton, Deanna Dwyer, John Hill, Leigh Nichols, Owen West, and Richard Paige, all pseudonyms used by none other than the blockbuster bestselling horror author Dean Koontz. He writes across many different genres, and his publishers were concerned early on that having his name associated with books from different genres might dilute his main fan base, so they convinced him to write under different names. He’s written well over 100 novels, so I guess he can be forgiven. And it must be working, because he and his wife live in a 14,000-square-foot home in Shady Canyon, the most exclusive gated community in Southern California.

Other well-known writers have used pseudonyms as well. Stephen King, for example, wrote under the name Richard Bachman. Others include Theodore Geisel, whom we know as Dr. Seuss; Samuel Clemens, who wrote as Mark Twain; Agatha Christie, who published romance novels as Mary Westmacott; Eric Blair, who wrote 1984 as George Orwell; Marguerite Annie Johnson, whose poetry graced us as the work of Maya Angelou; J.K. Rowling, who gave us Harry Potter and also writes crime novels as Robert Galbraith; and the well-known Snowqueens Icedragon, better known as E. L. James, who wrote the Fifty Shades series. Her real name is Erika Leonard. We also have J. D. Robb, who is the same person as Nora Roberts; Mother Goose, whose real name was Jeannette Walworth; and for anyone who enjoyed such terrific espionage books as Shibumi, The Eiger Sanction, The Loo Sanction, The Main, and The Summer of Katya, all written by the mysterious author Trevanian, we now know him to be the late Rodney William Whitaker, a well-respected film critic and the chair of the Department of Radio, TV and Film at the University of Texas at Austin.

Interesting, right? But there’s more to the pseudonym story. Not only do individual writers use pseudonyms, but so do entire publishing houses. One of the best known for this practice was the Stratemeyer Syndicate. Founded in the late 1800s, the company operated independently until 1984, when it was acquired by Simon & Schuster. From the beginning, Stratemeyer published adventure and mystery series for children, including Tom Swift, the Rover Boys, the Hardy Boys, the Bobbsey Twins, and quite a few others.

But they weren’t published under the names of the actual authors; they were published under what are called house names, which are owned by the publisher, not by the author. This allowed them to use a stable of writers to create content under a single name, thus reducing their dependence on a single creative individual.

Here are some examples.

Maxwell Grant was the name credited with writing the famous Shadow series, but he was actually five authors: Walter Gibson, Theodore Tinsley, Lester Dent, Bruce Elliott, and Dennis Lynds, who took turns writing the stories. Lester Dent also had a successful career as the author behind the Doc Savage series, published by Street and Smith, although some of the titles in the series were written by Philip José Farmer. The books were credited to the house name Kenneth Robeson.

Jerry West was credited as the author of the popular series The Happy Hollisters; the actual author was Andrew Svenson, who wrote all the books in the series.

Victor Appleton was another house name used by Stratemeyer, under which they published the Tom Swift series, one of my favorites when I was a kid—in fact, I just bought a whole collection of them. In actuality, many of them were written by writer and broadcaster Howard Garis, who used several pen names, including Laura Lee Hope for some of the Bobbsey Twins books, Clarence Young for the Motor Boys, Marion Davidson for the Camp Fire Girls series, and Lester Chadwick, under which he wrote a series called Baseball Joe. Interesting to me is that Garis also created a beloved character from my own childhood, Uncle Wiggily. During his long career, Garis wrote more than 15,000 Uncle Wiggily stories, which were published six times a week between 1910 and 1947.

So, there you have it—a peek behind the curtain at the seamy underbelly of the 20th century publishing industry. I had no idea.