Author Archives: drstevenshepard

About drstevenshepard

ABOUT

You can always bore yourself silly reading my bio, but here are a few highlights. I spent my early childhood in the American southwest, but when I was 13, my parents got transferred to Madrid, Spain, where we lived under Generalísimo Francisco Franco’s regime. Hmmm… from west Texas to a dictatorship—not much culture shock there. Anyway, as a result of that incredible experience, I fell in love with languages, and culture, and travel, so when I graduated high school I went to the University of California at Berkeley, where I majored in Romance Languages with a specialization in Spanish and minored in Marine Biology. When I graduated, fully armed to teach fish how to speak Spanish, I became a commercial diver and did that for five years before taking the obvious next step, which was working for Pacific Telephone as a network analyst in the San Francisco Bay Area. I did that for 11 years, and somewhere along the way I earned my master’s in international business from St. Mary’s College.

Not long after, I accepted a position with a telecom consultancy in Vermont, which is identical to California, but different. So, my wife Sabine and I moved our two kids across the country to join the new firm, Hill Associates, where I worked as an educator and writer for ten years, traveling constantly. After ten years with Hill, I left to start my own company. I wanted to do more writing because I had published my first telecom book a few years earlier, and it had become a legitimate bestseller; and I wanted to do more international work. So, in 2000, I started the Shepard Communications Group, and from 2000 until 2020, when the zombie apocalypse struck, I pretty much lived on airplanes, racking up more than three million airline miles and working in more than 100 countries. Somewhere along the way, I earned my PhD from a university in South Africa where I was teaching and consulting on a regular basis.
A few years ago, clients began asking me if I could create audio programs to help them tell their stories. They knew that I’d written more than 40 industry films, so I began to do audio work, which turned into voice-overs and Podcasts (see The Natural Curiosity Project, below), and which, in combination with my biology background and my ferocious love of the natural world, got me interested in recording the sounds of our non-human neighbors, the world they inhabit, and things I could do to help protect it and raise awareness of our impact on it. Hence, www.CritterChorus.com.

Sabine and I married in 1981, a few weeks before I started at the phone company in California. As I write this, it’s early 2023, meaning that she has put up with me for more than 41 years, which doesn’t include the five years that we dated. With the pandemic mostly behind us, I’ve decided that it’s time to make a professional pivot once again. I’m largely (not completely, but mostly) leaving my technology career behind to focus on five things: producing The Natural Curiosity Project; writing, with a strong focus on conservation and the role that technology can play to help protect the natural world; recording the sounds of nature; being a grandfather to our five little grandkids; and trying not to get in Sabine’s way so that she’ll keep me around a while longer.

BIO

Click Here to download a full bio (excellent if you have a sleeping disorder).

THE NATURAL CURIOSITY PROJECT

I started the Natural Curiosity Project Podcast for one reason: to drive curiosity. I believe that curiosity is our sixth sense, and I also believe that we don’t use it anywhere near often enough. There are countless stories out there that need to be told, so I decided to tell them. Sometimes I do audio essays about a topic that I think my listeners will find interesting, but more often, I interview people who have a great story to tell. Want some examples? Sure.
In one episode, I interviewed my best friend from high school, who is now a well-known actor. He explained how to read movie credits. What’s a best boy? Is there a worst boy? Is a gaffer’s job to jab actors with a long, pointed hook? I interviewed Dewitt Jones, a renowned former National Geographic photographer, who now travels the world with his extraordinary message to celebrate what’s right with the world, rather than wallow in what’s wrong with it. What great advice THAT is. I interviewed an elderly man I met in west Texas, Bud, whose hobby is tying messages to tumbleweeds, messages that ask whoever finds the note to send him an email telling him where it was found so that he can track how far it traveled (more than 40 miles in some cases).

I’ve interviewed filmmakers, audio engineers, pilots, hikers on the Appalachian Trail, artists, scientists, and more than a few wildlife sound recordists. The only thing that ties the episodes together? The only common theme? This is something you should be curious about, so pay attention! The Natural Curiosity Project is available wherever you get your Podcasts—have a listen. Thanks!

SOUND

In addition to producing The Natural Curiosity Project, I also record the sounds of the natural world. There are three types of sound on this planet: biophony, which includes all the sounds made by the non-human residents of our planet; geophony, which includes the sounds made by wind, rain, water running in a stream, thunder, earthquakes, the crackling of the aurora borealis, and countless other sources; and anthropophony, the sound—noise?—made by humans, which includes airplanes, cars, chain saws, and so on. There’s way, way too much of that last one. I believe that the critters that we share this planet with have something important to say, and it’s high time we paid attention, because our very existence is taking away their voices. Their voices are every bit as important as ours, and when theirs fall silent, ours will soon follow.
BOOKS

Writing is not something I do; writing is something I am. As I told a friend not too long ago, in terrible English, I can’t not write. Wow—a double negative. I write every day, not because I have to, but because I want to. I’ve been writing since I was ten years old, when I completed my first book, a magnificent 23-page homage to my hero at the time, pulp fiction star Doc Savage. Silly? Absolutely. But it’s where I started, and today I have 99 books on the market with three bestsellers so far, and hopefully more to come. In terms of subject matter, they cover the waterfront: technology, leadership, storytelling, photography, writing, wildlife sound recording, three novels, children’s books, biography…like I said, it’s the craft, not the subject matter. I just love to write. I’ve also written magazine and journal articles, screenplays, technical documentation (zzzzzzzz…), and audio and video scripts.

Book Magic

Book Magic

If you’d prefer to listen to this as an audio essay, please visit The Natural Curiosity Project or click here.

The hardest thing about writing a book isn’t coming up with the story, or inventing the complicated relationships that help define the characters, or making sure the story flows the way it’s supposed to. It isn’t the painstaking process of finding all the typos and misspellings and missing quotes, or fact-checking every tiny detail so that a reader who has it in for you discovers with chagrin that there’s little to criticize. Nope—it’s none of those, although those do require work.

The hardest thing about writing a novel is creating the one-paragraph synopsis that goes on the back cover. Think about it. The publisher says to the author, “Please take your 140,000-word, 468-page novel and describe it in 125 words or less, in a way that will cause a prospective reader to drool uncontrollably all the way to the checkout counter at the bookstore.”

Good luck with that. Like I said: Hard.  

I’m about to publish a new novel, my fifth, called “The Sound of Life.” My editors have gone through it with their editorial microscopes, identifying mistakes, errors and omissions. My cadre of readers have gone through it, uncovering awkward dialogue, technical errors, and flow problems that I inevitably missed. The final manuscript is called “The Sound of Life v48F,” which means that the book went through 48 complete rewrites before I deemed it ready for publication—although there will be at least two more read-throughs before I give it the final go-ahead.

I’m proud of this book. It’s my 106th title (bad habit), and I felt a sense of letdown when I typed the last sentence and knew it was done. That’s never happened to me before. Because of the story that magically emerged from the creative mists before me, the wonderful characters I met along the way, and the journey they allowed me to join them on, when I typed the last word of the final sentence, I felt like I was pulling into the driveway after a long, memorable road trip. I needed a medicine for melancholy, because it was over.

Author Alice Munro wrote, “A good book makes you want to live in the story. A great book gives you no choice.” That’s how I felt with this one. And please understand, this isn’t my ego talking. I experienced something as I wrote this book that rarely happens, like seeing the mysterious and elusive “green flash” over the ocean at sunset. At some point along the creative journey, I realized that I was no longer writing the book: it was writing itself. My job changed from creative director to scribe. It was like it was saying to me, ‘Here’s the keyboard. Try to keep up.’

Author M.L. Farrell said this about books:

“A book is not mere paper and words.

It is a door and a key.

It is a road and a journey.

It is a thousand new sights, sensations and sounds.

It holds friendships, experiences, and life lessons.

A book is an entire world.”

There’s so much truth in that. I’m at the point with this one where people are asking me what “The Sound of Life” is about, and now that I know, I’m excited to tell them. But as I describe the 56-foot boat that’s central to the story, the journey from the eastern Caribbean through the Panama Canal then up the coast to Northern California, the rich interactions among the characters, and the happenings in Peru that tie much of the narrative together, I realize somewhat sheepishly that every time I tell someone what the book’s about, I speak in the first person. Not ‘they,’ but ‘we.’ Well, sure—I was there. I was along for the ride. Why wouldn’t I speak in the first person?

Stephen King is a writer whom I admire greatly, for many reasons. “Books are a uniquely portable magic,” he once said. A uniquely portable magic. I think about the complexity, richness, excitement, laughter, and delicious food that’s captured between the covers of this book. I think about the immensely likable people and their relationships, around whom the story revolves. I think about the sights and sounds and smells and tastes they experience along the way. And I think about what it felt like when my characters, my good friends, got back on the boat and motored away, waving as they left me behind on the dock, en route to their next adventure.

A uniquely portable magic.

“The Sound of Life” will be released in December 2025.

The Sounds Below

I earned my NAUI Certification card—my C-card, as divers call it—in 1977, and proudly pocketed my Instructor card a year later.  As a newly-minted dive shop owner, I taught basic skills in the pool every weeknight, and on weekends I was either somewhere along California’s north coast taking new divers on their first free dive, or in Monterey for final class certification dives. The ocean has always fascinated me; like so many people, I watched, enraptured, as Jacques Cousteau and his team explored the undersea world. When I was a little boy, I pulled a pair of my underwear over my head so that one leg hole served as my face mask and pulled a pair of my dad’s socks onto my feet to serve as fins. I swam down the dark hallway, Jacques at my side. Once I was certified, the ocean became the center of my life, and that has never changed.

My first open water SCUBA dive was at Monterey Bay’s Cannery Row, back when it still had the ruin and wreckage of the old canneries strung along the beach where fancy hotels and restaurants stand today. With the clarity of poignant memory I remember pushing off the surf mat, raising the BC hose over my head, and descending below the calm surface into a world that I would come to love more than just about any other place on the planet. It is a place in which I am so inordinately comfortable that I once fell asleep lying on the bottom of Monterey Bay, my hands under my regulator as I watched life go on, tiny creatures crisscrossing the sandy bottom on their mysterious errands.

In consummate awe I dropped through the kelp on my way to the bottom during my first dive. As I descended, I brushed against the kelp blades, causing a shower of pea-size crabs, moon snails, nudibranchs and other creatures that before my descent had been in residence on the various levels of the Macrocystis. I would later teach my own students that at as much as a foot a day, giant kelp is one of the fastest growing plants on Earth, and that its flotation bladders are filled with enough carbon monoxide to kill a chicken in three minutes.

As I approached the sandy bottom on that first dive, I realized I had a problem. I was falling too quickly. I was a new diver, and buoyancy was not yet something I controlled subconsciously. Looking down as I approached the ocean floor, I had the overwhelming realization that no matter where I landed, whether on those rocks in front of me, or that patch of sea lettuce over there to my left, or on those old, eroded pipes from the canneries, or on the flat, sandy bottom over there, in the process of touching down I would crush countless lives. So profuse was the riot of living things that there wasn’t a square centimeter anywhere that didn’t have something living on it. 

Luckily, I was able to arrest my descent before I destroyed the community below me. I managed to go into a hover, where I stayed, unmoving, just taking it all in. My sense of wonder was so great that I lacked the ability to move. But the truth is that I didn’t want to move: I would have had to drain the tank on my back and three more like it before I saw every living thing on the patch of bottom directly beneath me. In fact, I was so motionless in the water column that my instructor came over to make sure I was okay.

As I floated, unmoving, something else crept into my consciousness: the sounds of the underwater domain. The bubbles from my exhalations. The mechanical hiss and click of my regulator. The far-away sound of a propeller frothing the ocean. A deep, unrecognizable rumble, something industrial, far away.

And then there were the clicks, trills, and bloops, the buzzing and scratching and chirping of ocean life. In other words, a cacophony, a joyous symphony, the countless voices of Monterey Bay. 

At night, the score changed. There were fewer human sounds and more natural sounds, mysterious and eerie. This became my favorite time to be in the ocean; night diving is profoundly magical. Once we sank to the bottom, turned off our lights, and allowed our eyes to acclimate to the darkness, we could see remarkably well. Every movement, every fin stroke, every turn of the head created a star-storm as the moving water caused bioluminescent plankton in the water to spark alight. Every passing seal or sea lion or otter drilled a contrail of glowing green through the black water like a living comet. This was nature’s alchemy at its best. 

And, there were sounds—so many sounds. I once did a night dive at the far end of the Monterey Coast Guard Pier where a huge colony of seals and sea lions congregates. Divers know that if they turn on their powerful dive lights during a night dive, their vision goes from a dim awareness of everything around them to brilliant awareness of whatever is illuminated by that narrow white beam directly in front of them, drilling a hole into the darkness. Night divers also know that for reasons known only to them, sea lions enjoy barreling down the light beam toward the diver, blowing bubbles and roaring like a freight train—then veering off into the darkness at the last moment before colliding with the now terrified diver. It has happened to me more times than I can remember, and it still scares the hell out of me when it does.

Twice over the years I heard the siren song of whales while night diving in Monterey; once I heard the telltale blast of sonar, presumably from a submarine somewhere outside the Bay. It was mildly terrifying, and it was more than a little painful. One night I found myself on the Cannery Row side of the Coast Guard Pier, not far from the sea lion incident I just described. Sensing movement beside me, I saw that three gigantic ocean sunfish, mola mola, easily eight feet from top to bottom, had unwittingly surrounded me. They meant no harm and were most likely oblivious to me. But with them came a sound, a combination of stomach rumble and the squeak of a hand rubbing a balloon. It was all around me, and it was loud. At first I thought it was air moving around inside their swim bladders, a common marine sound, but giant sunfish don’t have swim bladders. To this day, I have no idea what I was hearing, but I’ve never forgotten it. All I know is that when the sunfish disappeared into the depths of the Bay, the sound disappeared with them.

I have long been an avid photographer, both above the surface and below it. But as time went on, I began to pay more attention to what my ears were telling me than what my eyes were. I don’t know what caused that focal shift; perhaps it was the fundamental nature of the two senses. Not long ago, on a whim, I sat down with a calculator and my photo database and did a back-of-the-envelope calculation. It turns out that from the time I started shooting seriously until today, a period that covers just shy of 50 years, I shot approximately 500,000 images. Big number. Most of them I shot at a 250th of a second, my preferred shutter speed. That means that every 250 images I shot covered one second of Earth time. 500,000 images, then, translates to 2,000 seconds, which is just over 33 minutes. In other words, my nearly 50 years of serious, near-constant shooting captured a half-hour of my life. 
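For anyone who wants to check my back-of-the-envelope math, the calculation sketches out like this (the image count and shutter speed are the rough figures from the paragraph above, not exact numbers):

```python
# Rough figures: ~500,000 images, most shot at 1/250 of a second.
images = 500_000
seconds_per_image = 1 / 250

total_seconds = images * seconds_per_image  # accumulated exposure time
total_minutes = total_seconds / 60          # just over 33 minutes

print(f"{total_seconds:.0f} seconds, or {total_minutes:.1f} minutes")
# prints: 2000 seconds, or 33.3 minutes
```

Half a century of shooting, a half-hour of captured time.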

On the other hand, when I go out to record sound, I often sit for an hour or more with the recorder running, capturing a soundscape. During that time, I immerse myself in the environment and become part of it, something that’s impossible to do in a 250th of a second. With my camera I click and go, rarely lingering after the famous ‘moment it clicks’ to savor the entirety of what I just captured a tiny slice of. 

Photography is about capturing a still image, a single, frozen moment in time. But what in the world is a ‘still sound’? The answer, of course, is that there is no answer. The difference between a photograph and a sound recording, beyond the obvious, is time. A photograph captures a moment in time; a sound recording captures a moment over time. Photography is often described as a “run-and-gun” activity. But when I go out to record, that approach doesn’t work because sound recording by definition is immersive: I have to settle down in the environment, get my gear sorted, and be quiet by being still. If I’m still, I pay attention. And if I pay attention, I notice things. My awareness of my surroundings isn’t limited to what I see through the narrow viewfinder of a camera; it’s as broad as I choose to make it, and the longer I sit, the richer my awareness becomes.

Maybe it’s age-related. I’m older now than I was when I started photographing seriously; with age comes patience, and patience is a critical element of sound recording. Saint Augustine said, “The reward of patience is patience.” And it isn’t because I have more time now that I’m older; I have the same time now that I had when I was 21, a full 24 hours every single day. It’s a question of how I choose to use those 24 hours. Bernie Krause, writing in The Power of Tranquility in a Very Noisy World, said, “Heed the narratives expressed through the biophony. Our history is writ large within those stories. Be quiet. Listen. Be amazed.” 

Be quiet. Listen. Be amazed. Great advice for all of us.

Why I Write

I wrote my first novel, Inca Gold, Spanish Blood, in 2015. By the time I really started to work on it, I’d been a dedicated writer—meaning, I knew that writing was who I am, not what I do—for decades. By then I’d written not only books but countless magazine articles, essays, white papers, poetry, training manuals, and short stories. I’d read every book on writing I could find, and every book recommended by people who write books about writing. I had favorite authors across many genres, and I knew why they were favorites. I had attended writing workshops; I was in numerous writing groups; and I wrote constantly—not in the self-flagellant ‘force yourself to get up at 5 AM every morning and write for two hours before going to work’ way, but whenever the mood struck—which was nearly every day. Sometimes all I wrote was a paragraph, or a meaningful sentence; sometimes I wrote 40 or 50 pages. All that matters is that I wrote. 

I developed the Zen-like patience required to deal with the publishing world. I accepted the fact that the magic number for submitting an article or a manuscript or pretty much any new material to publishers is around 30: that’s the number of publishers you must submit to, on average, before one of them takes the bait.

And, I learned the secrets of getting noticed by an editor. I learned that the phrase “Submission Guidelines” is a lie. It should say, “Don’t even THINK about straying from these iron-clad, inviolable, unwavering, universally-applied rules for submitting your work to the publishing gods if you want anyone to even consider looking at your submission.” 

I developed a carefully curated Council of Druids, my personal cadre of editors, each of whom has the same fundamental characteristics: they’re voracious readers; they’re endlessly curious; and they’re willing to read what I write and provide detailed, brutally naked feedback. Do you know what’s less-than-useless to a writer? Someone who provides a crazed smile, two thumbs-up, and the word ‘awesome’ as their feedback to a written piece. Empty calories. My Druids, on the other hand, are never afraid to say, “Steve, with all the love in my heart, you need to drop this back into whatever swamp you dredged it out of, and here’s why.” In other words, they actually provide feedback that’s meaningful and that can be acted upon. And as much as it hurts sometimes, I carefully read and consider, and usually incorporate, every single comment. Their reading makes my writing better.

As a result of all this, I learned my way around the English language. I became grammatically proficient. I paid close attention and learned how dialogue works—and why it often doesn’t. I found myself reading about 140 books every year, and because of that I developed an extensive vocabulary and an awareness of when not to use polysyllabic words, just because I know them (thank you, Mr. Hemingway). I paid careful attention to structure and flow. I began to realize that genre is merely a suggestion: that some of the best books have elements of romance, science fiction, history, travel, global affairs, poetry, and politics, in spite of the label they’re given by the bookstore. 

I also trained myself to ignore the naysayers, the trolls who make it their mission to savage other people’s work because they can. They’re cowards, hiding behind the bastion of the Internet. Some reviewers give constructive or kind comments, and for those I’m grateful. But many don’t. Do NOT let their negative comments slow you down. You wrote a book, dammit. They didn’t. Ignore them for the miserable people they are.

I began to understand that I write so that others may read. When I drive my grandkids home after a day with my wife and me, I take the responsibility very seriously indeed. And when I take my readers on a journey, I take the responsibility no less seriously.

So, you can imagine how I felt when I found myself running into roadblock after roadblock as I tried to get a publisher to look at my novel. Here’s what was clattering around in my head, like a handful of marbles. I clearly knew how to write because I’d been doing it for a long time. I was published many times over by big, well-known houses, and I had two bestsellers to my name. I always met or exceeded deadlines. Yet time and again I submitted, and time and again I got back … nothing. Crickets. Even though I followed the submission rules, I didn’t even get rejection letters to add to my already impressive folder of same.

So, I called my editor at one of the big houses whom I had known for years and with whom I had created many successful books—and a genuine friendship. I explained my situation to him, knowing that he doesn’t publish fiction but hoping he could provide some insight. He did, and his response was blunt: 

“Steve, here’s what you’re facing. The fact that you have had major success in the non-fiction realm is meaningless to editors in the world of fiction. The firewall that exists between the two domains is so thick that it’s as if you have never written or been published at all.” 

And this was the clincher: “Your chances of getting this book published are roughly the same, whether you submit it or not.”

Bummer.

This glaring realization kicked off a new chapter in my writing. I ended up self-publishing the novel, and it did well. I then wrote a second, self-published it, and it became a number-one global bestseller on Amazon for a few weeks. I wrote two more, and they also did well—not bestsellers, but readers buy them and like them. And what I realized, and frankly, what I knew all along, was that in some ways, getting a book published was more important to me than writing one. That was a significant realization, and it changed how I think about why I write, because it was the wrong perspective for a writer. Yes, of course I want my work to be published, but first, I’m a writer. Writing is enormously creative; publishing is enormously mechanical. And when I write, I write for my readers and I take that responsibility seriously. But honestly, I write for myself. I write books that I would like to read. It makes me feel good. It challenges me, forces me to work hard to be better at it. 

As writers—all writers, regardless of genre—our goal should be to write books that people want to read, books that bring readers back for more after they’ve finished. We shouldn’t write for the likes, or the thumbs-ups; those are more empty calories. We write because we have something to say that matters. If we do that, our audiences will find us.

I’m currently writing sequels to two of my novels: Inca Gold, Spanish Blood, and Russet. Russet is my most recent work, so the characters and plot line are still fresh in my mind. But Inca Gold came out in 2016 and I had forgotten some of the story’s details, and I’m embarrassed to say, the names of some of the characters. So, I put on my reader hat, picked up the book, and read it, ignoring the fact that I was its author. And I mean, I really read it. And you know what? I liked it. A lot. It didn’t waste my time, and it made me want to read more. And that’s all the motivation I need to keep going.


Of Fens, Mires and Bogs

I just finished a terrific book called Following the Water by David Carroll. I’ve read all his books; he’s a New Hampshire-based naturalist who specializes in turtle ecology. That makes me smile, because there aren’t many animals that I like as much as turtles. Following the Water is a collection of reflections on his wanderings around the streams, ponds, forests and fields that surround his home.

I’ve spent most of my career in the technology domain, telecom mostly, so I’m very familiar with the acronyms and unique terminology that every field creates for itself. For example, I don’t play bridge, but I love to read the bridge column in the newspaper, just because I don’t have the foggiest idea what they’re talking about. Here’s an example:

In today’s deal the situation in three no-trump is complicated by South’s desire to keep West off-lead. Declarer will have seven top tricks once he has knocked out the heart ace, so must find two more tricks from somewhere. Fortunately, there are lots of extra chances: the spade finesse, an additional heart trick, and an extra club winner or more. The key, though, is for South to combine his chances in the right order.

Say what? The spade finesse and an additional heart trick? I have no clue what the author’s talking about, but reading the column is like watching a linguistic train wreck. I can’t stop myself.

So, it’s no surprise that Carroll’s book has its own words that address the needs of the aquatic ecologist. As he describes the place where water and land meet to create complex ecosystems that each produce their own unique collection of living things, he draws on a poetic collection of words to describe the hidden world that he’s devoted so much of his life to. What is so interesting to me is that as I read his book, one mysterious word leads to another, causing me to spend way too much time in the dictionary. 

As we follow Carroll through a dense tangle of willows, he describes it as a carr. A carr, it seems, is a bog or a fen where willow scrub has become well-established. That, of course, sent me back to the dictionary in search of bogs and fens (by the way, this was almost as much fun as actually getting muddy). A fen, it turns out, is one of six recognized types of wetland and one of two types of mire. The other is a bog. Fens tend to have neutral or alkaline waters, whereas bogs are acidic. A mire, sometimes called a quagmire, is the same as a peatland, except that peatlands can be dry, while mires are always wet. Mires are similar to swamps, but mires tend to be colonized by mosses and grasses, while swamps usually have a forest canopy over them.

Carroll also spends a lot of time describing vernal pools and the creatures that spawn in them. I love that term, vernal; it conjures something mysterious for me, a place of unknown creatures that rise from the depths at night. Think Dr. Seuss’ McElligot’s Pool. Anyway, vernal pools are temporary pools that provide habitat for specific species, although not fish, since the pools periodically dry out. They’re often teeming with things like tadpoles, water striders and whirligig beetles. They’re called ‘vernal’ because they’re at their deepest in the spring (the word comes from the Latin vernalis, the word for that season), and they’re typically found in low spots or depressions in grassland habitats.

Another word that comes up a lot is riparian. Riparian describes the transition zone that lies between the land and a river or stream that runs through it. Riparian areas are important, because they filter and purify water that runs off the land and enters the waterway. A biome, by the way, is a community of plants, animals or microorganisms that inhabit a particular climatic or geographic zone. So, a riverbank would be a riparian biome.

And what about the wetlands that Carroll refers to throughout the book? Well, a wetland is an area that’s saturated with water, permanently or seasonally, like the Everglades. Wetlands are standalone environments, but they can also include swamps, marshes, bogs, mangroves, carrs, pocosins [puh-CO-sin], and varzea [VAR-zea].

By the way, because you’re dying to know, a pocosin is a palustrine [PAL-e-streen] wetland with deep, acidic peat soils, sometimes called a shrub bog. Palustrine, incidentally, comes from the Latin word palus, which means swamp. Palustrine environments include marshes, swamps, bogs, fens, tundra, and flood plains.

And since we mentioned it, a varzea is a seasonally flooded woodland specific to Brazil’s Amazon rain forest. A marsh is a wetland dominated by herbaceous rather than woody plants – grasses, rushes and reeds, instead of shrubs and trees. It’s also a transition zone that’s marinated in stagnant, nutrient-rich water. By the way, swamps, like the Everglades, move water across their surfaces, while mires move water below the surface. Marsh plants tend to be submerged; mire plants are not.

Fens, swamps, mires and bogs: who would have thought there was so much diversity at the water’s edge?

Twilight Zone, Season 1, Episode 22: “The Monsters Are Due on Maple Street”

A small town in America, summer, 1959. Maple Street. An ice cream vendor pushes his cart up the sidewalk, ringing a bell; kids play stick ball in the street; a neighbor mows his grass with a push mower. Another lies under his car, tinkering with it. In the distance, a dog barks.

Suddenly, the power goes out—all power. Stoves and refrigerators stop working; the radio goes silent; cars won’t start. Neighbors gather in an uneasy group. They begin to speculate about what might be causing the outage, their voices growing strident as speculation turns to suspicion. Could it be the meteor that some of them heard pass overhead earlier?

While one man argues for a rational explanation—sunspots, perhaps—another points the finger at a neighbor who isn’t present, seizing on the man’s odd quirks as the explanation for the widespread loss of electricity. Then, inexplicably, power returns to a single car in a driveway, and it starts with a rumble.

“It’s space aliens,” says a young comic book-obsessed boy. “They come to earth disguised to look just like us, and blend in. They’re different, but no one can tell because they’re identical to the rest of us.”

And the man who owns the car that mysteriously starts and stops? He’s as mystified as the other neighbors, but because it’s his car engaging in inexplicable behavior—the engine roaring to life when there’s no one at the wheel—he’s to blame. He must be the alien.

In the end, as the town tears itself apart through self-created fear, the real aliens look down on the town from their cloaked ship. One of them says to the other (and they look as human as the people in the streets below), “The pattern is always the same. They pick the most dangerous enemy they can find, and it’s themselves. Their world is full of Maple Streets. We’ll go from one to the next and let them destroy each other.” 

Rod Serling wraps up the episode as only Rod Serling can do: 

The tools of conquest do not necessarily come with bombs or explosives or fallout. There are weapons that are simply thoughts, attitudes, prejudices, found only in the minds of men. For the record, prejudices can kill, and suspicions can destroy. And a thoughtless, frightened search for a scapegoat has a fallout all its own—for the children, and the children yet unborn. And the pity of it is, these things cannot be confined to the Twilight Zone.

I want every living person in the United States to watch this episode, and then think about current events. Clearly, Rod Serling was correct: These things cannot be confined to the fantasy of the Twilight Zone, where they belong.

The Complex Dance of Curiosity, Awe, and Wonder

Part 1

“It’s called a true binary.”

I was 13 years old, and I was standing with my childhood friends Bill Meadows, Peter Norris, and Gil DePaul in the frigid interior of the home-built observatory in Bill’s backyard. The four of us stood in a sloppy circle around the telescope, taking turns looking through the eyepiece and shivering in the late-night winter air.

I like to think that our collective friendship served as the model for the TV show, “Big Bang Theory,” because just like Leonard, Howard, Sheldon, and Raj, our world revolved around the wonder of science and was powered by our collective curiosity. The main difference was that in our cadre, the counterparts for Penny, Amy and Bernadette were conspicuously absent. Clearly, we had not yet been introduced to awe.

We loved electronics, and geology, and astronomy, and all the many offshoots of biology; we would often gather for electronic component swaps, or rock and mineral trades, or just to build things together or admire each other’s latest acquisitions of exotic reptiles or amphibians. At one point, my parents gave me a Heathkit electronics project board, pre-wired with capacitors and resistors and transistors and coils, each connected to little stainless-steel springs that allowed me to run jumpers between the components to wire the projects outlined in the manual. I will never forget the day I learned that by swapping between different components and by wiring the output to a variable resistor, I could make it play wildly oscillating sounds that would be great as the background music for a science fiction film. I had invented a version of the Moog Synthesizer, before anyone knew what that was.

I learned two of life’s important lessons from Bill Meadows: the immensity of the universe, and the immensity of personal grief. The first, my 13-year-old shoulders were prepared to carry; the second, not so much. One Christmas morning after all the gifts had been opened, I called Bill to see if he wanted to get together, probably to compare Christmas bounty. He couldn’t, he told me; his Mom had just died. Maybe tomorrow, he said, with infinite grace. I didn’t know how to process that level of profound loss, but he did, and the grace with which he carried the pain is something I still think about today.

As I said, we were the Big Bang Theory gang before there was a Big Bang Theory, and Bill was our Sheldon Cooper—not in the awkward, geeky way of the show’s lovable main character, but in the brilliant, quirky, knowledge-is-everything way of smart, driven, passionate people. He went on to become a gifted composer and musician, a semiconductor designer, and of course, a top-notch quasi-professional astronomer. We’re still very much in touch; recently, he guided us when Sabine bought me my own telescope. Yes, it’s true. She’s awesome.

Like teenage boys everywhere, a glimpse at a copy of Playboy was something to be whispered about for weeks, but the publication that really got our motors humming was the annual catalog from Edmund Scientific Company. Sad, I know, but have you ever SEEN a catalog from Edmund Scientific?

Bill, like the Edmund catalog, was an endless source of knowledge and information. I can still remember things I learned from him. Like, how many sides an icosahedron has (the answer is 20). What an ellipse is, and how to make one (slice the top off a cone at an angle). How to work a Foucault Tester. What ‘New General Catalog’ and ‘Messier Numbers’ mean (unique designators for star clusters, galaxies, and nebulae). Why it was appropriate to drool on a Questar Telescope if I ever found myself in the same room with one. 

I even remember the night my Mom was driving us to a long-forgotten presentation at the junior high school. As a car went by us at high speed, the sound rose and fell with its passing. In unison, Bill and I said, “Doppler Effect,” then we laughed.  But I was a bit awestruck. I was one with the dude.

Somewhere around 1967, Bill decided that something was missing in his backyard. Not a tomato garden, or a jungle gym, or a trampoline; not a picnic table, or a barbecue grill, or a weight set. No, this 13-year-old decided that what was missing, what would really round the place out, was an observatory. His Dad agreed, and they built one. We all helped a little bit here and there, but this thing was Bill’s baby. It looked like half of a shoebox with a giant tuna can sitting on top, and the whole thing sat at roofline level on top of four pieces of drilling pipe punched into bedrock. Coming up through a hole in the center of the floor was a fifth piece of pipe which ultimately became the telescope mount, isolated from the observatory structure so that our walking about didn’t vibrate the telescope during an observation. The top and side of the tuna can had a two-foot-wide slit that could be opened for viewing. Many were the nights that we had sleepovers at Bill’s house, curled up and freezing in the observatory as we focused the telescope on distant celestial objects, things Bill could casually name and describe from memory, having seen them many, many times with whatever telescope he used before he built the big one.

The big one: Edmund Scientific sold it all. But buy a ready-made telescope? Piffle, said Bill, or whatever the 1967 equivalent of piffle was in west Texas. Instead, he created a shopping list:

  • First, a large quartz mirror blank, which was 12 inches or so in diameter;
  • Assorted grits to hand-grind a parabolic surface into the blank;
  • A Foucault tester to ensure the mirror curvature was correct once the grinding was done;
  • The tube for the telescope body;
  • An adjustable mirror mount;
  • The eyepiece holder (focuser);
  • Assorted eyepieces;
  • An equatorial mount to attach the finished telescope to the center drilling pipe, with an electric star drive;
  • And of course, various accessories: counterweights, a spotting scope, and assorted mounting hardware.

We all claimed some of the credit for building that telescope because all of us spent time hand-grinding the blank under Bill’s watchful eyes. But make no mistake: it was Bill who built that thing. He ground and ground and ground, week after week after week, starting with a coarse abrasive grit and grinding pommel, then onto a finer grit, and then finer still, until he was working with red polishing rouge at the end. I remember his pink-stained fingers at school. School: it was so fitting that we attended the brand-new Robert H. Goddard Junior High School in Midland, Texas, complete with rockets mounted on stands out front. Goddard, who invented the modern liquid-fuel rocket, was long dead, but his wife came to visit the school not long after it opened. I still have her autograph.

It’s interesting to me that Goddard designed and launched his rockets near Roswell, New Mexico, where my maternal grandparents lived, and where …  well, you know.

Once Bill was done with the grinding and polishing, he shipped the mirror blank back to Edmund, and they put the mirrored surface on it and shipped it back, ready to be mounted in the telescope.

One of Bill’s goals was to do astrophotography. Keep in mind that this was 1968, and photography wasn’t what it is today. There was no such thing as a digital camera (mainly because there was no such thing yet, really, as digital anything), and there was no easy way to mount a standard camera on a telescope. So, Bill improvised in an extraordinary way. He took a one-gallon metal Prestone antifreeze can and cut the top off. He then coated the inside of the can with a very dark, matte black paint to eliminate reflections. In the middle of the bottom of the can he cut a two-inch hole, and there he mounted a T-connector, which would allow him to attach it to the eyepiece holder of the telescope.

Now came the genius part. Using tin snips, he cut and bent the open top of the can so that it had two flanges, one on each side, which would neatly and securely hold a sheet film carrier plate. The plate was about five by eight inches, and once it was in the “Prestone camera” and the environment was dark, he could slide out the cover that protected the sheet film from light, and the image of whatever was in the viewfinder would be splashed on the film. Minutes later, Bill would slide the cover back in, and after sending the film off to be developed, he’d have a long-exposure photograph. In fact, I still have a photograph he gave me of the Orion Nebula somewhere in my files, along with one of a long-forgotten star cluster.

It was cold in that observatory; a heater was out of the question, because the rippling heat waves escaping through the observatory’s viewing slit would blur the image—another thing I learned from Bill. So, cold it was.

We weren’t supposed to have the kinds of conversations we did at that age, but they made sense, which was why Bill’s explanation to all of us about what we were taking turns looking at was—well, normal. “A true binary star system,” he explained, “is two stars that are gravitationally bound together and therefore orbit each other.” (That’s in contrast to an optical double, two unrelated stars that merely appear close together from our vantage point.) I can still remember, all these years later, that we were looking at Sirius, sometimes known as the Dog Star, the single brightest star in the entire night sky. It’s part of the constellation Canis Major. “Sirius A is a bright star and Sirius B is a bit dimmer,” Bill told us, “but the ‘scope can resolve them.” Today, every time I look up and see Sirius, I think of Bill.

This essay is about the relationship between curiosity and awe and wonder, so let me ask you a question. When was the last time you can remember being genuinely curious about something, something new to you, something that made you curious enough to do a little reading or research about it, and to then be awed by it? Just yesterday, June 23rd, 2025, the very first images from the brand-new Vera C. Rubin Observatory in Chile were shared with the public. Within two days of its first scan of the night sky, the Rubin telescope discovered more than 2,000 new asteroids, and astronomers predict that over the next ten years it will capture images of 89,000 new near-Earth asteroids, 3.7 million new main-belt asteroids, 1,200 new objects between Jupiter and Neptune, and 32,000 new objects beyond Neptune. Doesn’t that make you just a little bit curious about what ELSE might be lurking out there? Doesn’t it make you feel a certain amount of awe and wonder, if for no other reason than the fact that humans have developed the scientific wherewithal to build this amazing machine?

Part 2

One of the first things I realized when I got my new telescope a few months ago and began to thaw out long-forgotten astronomy knowledge was that a telescope is a Time Machine. Here’s why.

The night sky is filled with countless observable objects, other than the moon and stars. For example, on a dark clear night, chances are very good that if you lie down on a lawn chair in your backyard and turn off the porch light, within 15 minutes you’ll see at least one Starlink satellite sweep past. If you time it right and look just after sunset, you’re likely to see the International Space Station pass overhead, the light from the setting sun reflecting off its solar and cooling panels. There’s even an app for your phone to track its location.

Then there are the natural celestial bodies. Depending on the time of year, it’s easy to spot other planets in our solar system with the naked eye, especially Mercury, Venus, Mars, Jupiter, and Saturn. They, like the Earth, orbit our sun, which is, of course, a star. It is one star in the galaxy known as the Milky Way, a collection of stars, planets, great clouds of gas, and dark matter, all bound together by gravity. The Milky Way is made up of somewhere between 100 and 400 billion stars. And remember, that’s a single galaxy. 

In the observable universe, meaning the parts of the universe that we can see from Earth with all our imaging technologies, there are between 200 billion and two trillion observable galaxies, each containing billions of stars.

So just to recap: the Earth orbits the Sun, which is one of 100 to 400 billion stars in the Milky Way Galaxy. But the Milky Way Galaxy is one of somewhere between 200 billion and two trillion galaxies in the observable universe. And the observable universe? According to reliable, informed sources—NASA and the Center for Astrophysics, run jointly by Harvard and the Smithsonian—only about five percent of the universe is ordinary matter, the stuff we can actually observe. The other 95 percent is dark matter and dark energy, unknown and unseen.

Starting to feel it yet? It’s called awe and wonder, and that itch you’re feeling? That’s curiosity.

Part 3

If you look to the north on any given spring evening, you’ll easily spot the Big Dipper, a recognizable part of the constellation Ursa Major—the great bear. Here’s an interesting fact for you: the Big Dipper isn’t itself a constellation. It’s an asterism, a familiar pattern of stars that isn’t one of the officially recognized constellations; the Big Dipper is an asterism within the more complicated constellation known as Ursa Major.

Take a look at a photo or drawing of the Big Dipper. It consists of four stars that form the “bowl” of the dipper, and three stars that make up the dipper’s curving “handle.”

The handle forms the beginning of a celestial arc, and if you extrapolate it you can “follow the arc to Arcturus,” a very bright star in the constellation Boötes. From Arcturus you can “speed on to Spica,” a fairly bright star in the constellation Virgo. You can do all of this with your naked eye.

Now: go back to the bowl of the Big Dipper. Draw an imaginary line from Megrez, the star where the handle attaches to the bowl, through Phecda, the star just below it that forms a corner of the bowl, and keep going to Regulus, the brightest star in the constellation Leo.

If you now draw a line between Spica and Regulus and look slightly above the midpoint of that line, you are staring at a region of space called the Realm of Galaxies.

I love that name; it sounds like a place that Han Solo would tell Chewie to navigate the Millennium Falcon to. Nowhere else in the visible sky is the concentration of galaxies as high as it is here. Within this space, for example, is the unimaginably huge Virgo Cluster of galaxies. How huge? Well, the local cluster (formally, the Local Group), to which our spiral-shaped Milky Way and Andromeda galaxies belong, contains a mere 40 or so galaxies. The Virgo Cluster has more than a thousand, but those thousand are packed into an area no bigger than that occupied by our own local cluster with its 40 galaxies. And remember, each of those galaxies is made up of billions of stars.

Galaxies are active, often destructive behemoths. When a small spiral galaxy like our own Milky Way gets too close to a larger one, things happen. The Large and Small Magellanic Clouds, which are members of our local cluster, used to be much closer to the Milky Way, but the Milky Way’s gravity stripped away many of those galaxies’ outer stars, creating distance between them and radically changing their galactic shapes. But the Milky Way hasn’t finished its current rampage: it’s now in the process of dismantling the Sagittarius Galaxy.

These things are also big—far bigger than we’re capable of imagining, as are the distances between them, which is why I said earlier that a telescope is a fully functional Time Machine. Andromeda, for example, is 220,000 light years across. You need a wide-angle eyepiece to look at it through a telescope. For context, consider this. The speed of light is a known constant—it never changes. Light travels at 186,000 miles per second, or about 670 million miles per hour. Think orbiting Earth’s equator 7-1/2 times every second. That means that in one year, light travels 5.88 trillion miles. We call that a light year. It’s not a measure of time; it’s a measure of distance. To fly from one end of Andromeda to the other would take 220,000 years, even at 186,000 miles per second. Pack a lunch.

When you look up at Andromeda, which is our closest galactic neighbor, you’re looking at an object that is two-and-a-half million light years away. What that means is that the light striking your eye has traveled 14 quintillion, 700 quadrillion miles to get to you. That’s ‘147’ followed by 17 zeroes. More importantly, it means that that light left Andromeda on its way to your eye two-and-a-half million years ago. Two-and-a-half million years ago: the Pleistocene epoch was in full swing; Earth’s polar ice caps were forming; mammoths and mastodons roamed North America; the Isthmus of Panama rose out of the sea, connecting two continents; the Paleolithic period began; and Homo habilis, the first protohumans, emerged.
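If you enjoy checking arithmetic like this, the figures in the last two paragraphs are easy to reproduce. Here’s a rough sketch in Python using the essay’s rounded values (186,000 miles per second; 2.5 million light years to Andromeda), so the results come out approximate rather than textbook-precise:

```python
# Rough check of the light-travel figures above, using the essay's
# rounded numbers rather than precise astronomical constants.

SPEED_OF_LIGHT_MPS = 186_000        # miles per second (rounded)
EARTH_EQUATOR_MI = 24_901           # Earth's equatorial circumference, miles
SECONDS_PER_YEAR = 365.25 * 24 * 3600

# Miles per hour: about 670 million
mph = SPEED_OF_LIGHT_MPS * 3600
print(f"Speed of light: {mph:,} mph")

# Laps around Earth's equator every second: about 7.5
laps = SPEED_OF_LIGHT_MPS / EARTH_EQUATOR_MI
print(f"Equator laps per second: {laps:.1f}")

# One light year in miles: about 5.88 trillion
light_year_mi = SPEED_OF_LIGHT_MPS * SECONDS_PER_YEAR
print(f"One light year: {light_year_mi:.2e} miles")

# Distance to Andromeda at 2.5 million light years: about 1.47e19 miles,
# i.e., 14 quintillion, 700 quadrillion -- '147' followed by 17 zeroes
andromeda_mi = 2_500_000 * light_year_mi
print(f"Distance to Andromeda: {andromeda_mi:.2e} miles")
```

Run it and the printed values line up with the essay’s numbers to within rounding.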

All that was happening when that light that just splashed onto your retina left its place of birth. And that’s the closest galaxy to us.

So, I’m compelled to ask: is Andromeda still there? Do we have any way of actually knowing? A lot can happen in two-and-a-half million years. And with the breathtakingly complicated telescopes we’re now placing in deep space—the original Hubble got us started, and the James Webb Space Telescope has picked up where it left off—we’re capturing infrared light that is 13.6 billion years old. The universe is 13.8 billion years old, which means that we’re getting close to seeing as much light as it’s possible to see from the formative edge of the observable universe—what’s known as the cosmic light horizon. Which, of course, raises the question: what lies beyond the edge?

Part 4

Curiosity, awe, and wonder are amazing things. They feed, nourish, encourage, and drive each other, and in turn, they drive us. I love this science stuff, especially when it hits us with knowledge that is beyond our ability to comprehend. For me, that’s when curiosity, awe and wonder really earn their keep. Because sometimes? Sometimes, they’re the only tools we have.

The Lessons of History

This essay contains an important story for the ages. Given current events, and the absolute truth that history does repeat, the lesson is plain, and chilling. 

One of my treasured possessions from the years I lived in Spain is a 16th-century manuscript. It’s a big book, about fifteen by twenty inches, and it contains around 40 hand-written and hand-illuminated pages. According to a somewhat mysterious note inserted between two of the pages, itself very old and its ink faded, “This book contains the responsive readings and Benedictions for all the Masses of all the Saturdays.”

My parents bought the book at a junk shop one Sunday morning in Madrid’s famous flea market, El Rastro. When asked how much the book cost, the shop owner picked it up, hefted it to assess its weight, shrugged his shoulders, and declared, “140 Pesetas.” About two dollars. Years later, they passed it on to me.

There’s nothing in the book that identifies its origins, other than its Catholic purpose. I’ve studied and researched it extensively, and spent countless hours with scholars of ancient manuscripts. Here’s what I know. The cover is most likely Spanish, as evidenced by the intricately tooled designs in the leather. The pattern is made up of rows of tiny rosettes, similar to covers from the same period, which were often inlaid with ivory and precious stones. The binding mimics the German style of the same period.

The contents are mid-16th-century. Handwriting toward the end of the book appears to be in the style of the early 18th century, which implies that the book remained in use until at least the 1700s.

The book is divided into sections by crude index tabs, hand-labeled and made of vellum, a far sturdier material than the high-quality rag paper of the manuscript itself. A careful examination of one of the pages under a special microscope designed for the purpose reveals a pattern of lines pressed into the surface, a consistent fraction of a millimeter apart, a result of the mold and deckle used in the paper manufacturing process. This line pattern helps confirm the date of the book.

The pages are hand-written in Latin, and the lines of text alternate with musical staves for the choir that chanted them.

A year or so ago, I decided that I wanted to know more about this strange and ancient book that fell into my hands. I wanted to know where it came from; who wrote it; what the ink was made from; where the paper was sourced; what church or cathedral it was used in; and what was going on in the world at the time. I wanted to know about socioeconomics, geopolitical happenings, and cultural mores. Was it used in a church that was abandoned due to declining attendance, its assets scattered? Was the book looted during the Spanish Civil War? I didn’t know, but I wanted to.

I started by looking into the time period just before the book was first created and used. I don’t know for sure and probably never will, but based on my own research and the insights of academics and scholars far more informed than I, the middle of the 15th-century seemed like a good place to start. 

During the mid-1400s, the late Middle Ages were ending, and the Renaissance, with its focus on the arts, music, and the humanities, was beginning. The Hundred Years’ War between England and France was finally drawing to a close, the Byzantine Empire fell to the Ottoman Turks with the capture of Constantinople in 1453, and Spain and Portugal were demonstrating their sea power, hunting for new trade routes to the rest of the world.

Equally important was the invention of the printing press in Europe, which arrived too late for my book, but had a profound impact, nonetheless, on the spread of global knowledge, insight, awareness, and ideas. 

In essence, the mid-15th century was a period of transformation, of turnover from one set of guiding principles to another. It was here, shortly after this moment of transition, that my manuscript book came into existence.

The Iberian Peninsula, which comprises Spain and Portugal, has been a multicultural melting pot for its entire existence. For centuries, Muslims, Jews, and Christians coexisted, each playing a role in the rich cultural development of what ultimately became modern Spain. Ten centuries ago, Muslims brought science, architecture, medicine, and extraordinary art, while the Jewish community developed the country’s economy and served as its powerful merchant class. The Christians provided administrative governance. In fact, when Alfonso X, also known as Alfonso the Wise, died in the latter half of the 13th century, it was inscribed on his tomb, at his order, that he was ‘King of the Three Religions.’ Even today it’s impossible NOT to see the influence of the three belief structures that characterized ancient Spain. Look at the Mezquita Cathedral in Córdoba, where a Catholic cathedral was built inside a mosque. The country was a palimpsest of contradictions, but it worked.

In the mid-1400s, things changed in a way that is eerily reminiscent of current events. Alonso de Ojeda, a Dominican friar from Seville who had the attention of the Catholic Kings, Fernando and Isabela, told Queen Isabela during an official visit to Seville that large numbers of Jews who had converted to Christianity were actually Christians in name only—that they in fact continued to practice what came to be known as crypto-Judaism. A study, written by the Archbishop of Seville and Tomás de Torquemada (himself said to descend from Jewish converts, and soon-to-be administrator of the Spanish Inquisition), offered the same conclusion. I don’t know if this was the first example of a conspiracy theory, but it certainly qualifies as one.

In response to these baseless claims, Fernando and Isabela requested a mandate from the Pope to establish an inquisition in Spain. The Pope agreed, and granted them permission to select a panel of priests to serve as Inquisitors.

In 1482 Fernando sought to take over the existing Papal Inquisition in the province of Aragon, which resulted in major resistance because it infringed on local rights. Relatives and friends of those accused complained of the brutality to the Pope, who wanted to maintain control of the Inquisition. The Pope wrote that “… in Aragon, Valencia, Mallorca and Catalonia, the Inquisition has for some time been moved not by zeal for the faith and the salvation of souls, but by lust for wealth, and that many true and faithful Christians, on the testimony of enemies … have without any legitimate proof been thrust into secular prisons, tortured and condemned as relapsed heretics, deprived of their goods and property and handed over to the secular arm to be executed, to the peril of souls, setting a pernicious example, and causing disgust to many.”

The Pope, whose position on the “new Christians” was far more tolerant than those of the Spanish Catholic Kings, tried to maintain control over the Inquisition to ensure that the punishments being meted out were appropriate and justly assigned. He issued a new order that stipulated a more tolerant approach to the practices of the Inquisition.

Fernando was outraged, arguing that no sensible pope would have published such a document. In May of 1482, he wrote a threatening letter to Rome, saying: “Take care therefore not to let the matter go further, and to revoke any concessions and entrust us with the care of this question.” In response, cowed by the power of the Spanish monarchy, the Pope changed his stance to full cooperation, and issued a new order in 1483 that appointed Torquemada as Inquisitor General of Aragon, Catalonia and Valencia, in the process creating a single entity to administer nationwide punishment without oversight. 

The first victims were burned at the stake in Aragon in 1484. Fierce opposition continued, protesting the loss of local autonomy. Meanwhile, the Pope withdrew all papal inquisitors from the region, handing total control to the Inquisitor General Torquemada, including the handling of all appeals. The Catholic Church abdicated its oversight, effectively washing its hands of the whole affair.

Keep in mind that the Jews represented the country’s merchant class—the artisans, shopkeepers, laborers, and craftspeople. In 1483, all Jews living in the province of Andalusia were expelled from the country. The Pope was troubled by this aggressive stance, but his protest fell on deaf ears because of political pressure from King Fernando, who threatened the Pope if he continued to question the actions of the Catholic Kings. The Pope backed down, and in short order Torquemada established additional arbitrary rules for persecution. One of them was that new courts could be established on an ad hoc basis as needed, with a thirty-day grace period for the accused to confess. And as for the accused, they were guilty until proven innocent based on such ludicrous things as the lack of chimney smoke coming from their homes, clear evidence that they were observing the Sabbath. The accused were allowed to confess and do penance, but if they relapsed—and all it took was the whispered word of an angry neighbor—they were executed. Those who had nothing to confess were tortured until they came up with something, anything, to make the pain stop. Then they were executed.

1492 is widely recognized as the year that Christopher Columbus received permission from the Catholic Kings to sail off to the New World in pursuit of untold riches that would add wealth to the Crown’s coffers. His voyages, often taught as brave forays into the unknown, were in fact expeditions of hegemonic terror.

The Catholic Kings gave Columbus, known in Spain as Cristóbal Colón, the title of admiral, viceroy, and governor of any land he discovered. And he was allowed to keep ten percent of any treasure he found, which motivated him greatly to do so—and by any horrific means necessary.

But 1492 is also studied by Spanish historians because of a less well-known but far more profound event: by royal decree, all of Spain’s remaining Jews were expelled that year. They left the country by the tens of thousands, taking with them what amounted to the entire merchant class of the country—and the economy that they made possible. As a result, Spain slid into a slow but inevitable economic collapse. The country found itself morally and economically bankrupt, its trade routes disrupted, its trading partners non-existent. Spain entered its own Dark Age, hopelessly crippled.

It doesn’t take a degree in Medieval Spanish History to do a little plug-and-play exercise here, replacing 15th-century names with names from the 21st, substituting one ethnic group for another, inserting a 15th-century excuse for an unspeakable action for one that is similarly vile from the 21st-century. 

I’ve quoted George Santayana a lot lately about the state of things, so I think I’ll end with a quote this time from Polish poet Stanislaw Lec: “When smashing monuments, save the pedestals—they always come in handy.”

The Research Myth

I recently had a conversation about technology’s impact on the availability and quality of information in the world today. It’s an argument I could make myself—that tech-based advances have resulted in access to more data and information. For example, before the invention of moveable type and the printing press, the only books that were available were chained to reading tables in Europe’s great cathedrals—they were that rare and that valuable. Of course, it was the information they contained that held the real value, an important lesson in today’s world, where books are banned from modern first-world library shelves because an ignorant cadre of adults decides that young people aren’t mature enough to read them—when it’s the adults who lack the maturity to face the fact that not everybody in this world thinks the same way they do, and that’s okay. But, I digress.

Image of chained books in Hereford Cathedral. Copyright Atlas Obscura.

When moveable type and the printing press arrived, book manuscripts no longer had to be copied by hand—they could be produced in large quantities at low cost, which meant that information could be made available to far more people than ever before. To the general population—at least, the literate ones—this was a form of freedom. But to those who wanted to maintain a world where books were printed once and kept chained to desks where only the privileged few (the clergy) could read them, the free availability of knowledge and information was terrifying. Apparently, it still is. Knowledge is, after all, the strongest form of power. How does that expression go again? Oh yeah: Freedom of the Press…Freedom of Expression…Freedom of Thought…Sorry; I digress. Again.

Fast-forward now through myriad generations of technology that broadened information’s reach: The broadsheet newspaper, delivered daily, sometimes in both morning and evening editions. The teletype. Radio. The telephone. Television. The satellite, which made global information-sharing a reality. High-speed photocopying. High-speed printing. The personal computer and desktop publishing software. Email. Instant Messaging and texting. And most recently, on-demand printing and self-publishing through applications like Kindle Direct, and of course, AI, through applications like ChatGPT. I should also mention the technology-based tools that have dramatically increased literacy around the world, in the process giving people the gift of reading, which comes in the form of countless downstream gifts.

The conversation I mentioned at the beginning of this essay took a funny turn when the person I was chatting with tried to convince me that access to modern technologies makes the information I can put my hands on today infinitely better and more accurate. I pushed back, arguing that technology is a gathering tool, like a fishing net. Yes, a bigger net can result in a bigger haul. But it also yields more bycatch, the stuff that gets thrown back. I don’t care about the information equivalents of suckerfish and slime eels that get caught in my net. I want the albacore, halibut, and swordfish. The problem is that my fishing net—my data-gathering tool—is indiscriminate. It gathers what it gathers, and it’s up to me to separate the good from the bad, the desirable from the undesirable.

What technology-based information-gathering does is make it easy to rapidly get to AN answer, not THE answer.

The truth is, I don’t have better research tools today than I had in the 70s when I was in college. Back then I had access to multiple libraries—the Berkeley campus alone had 27 of them. I could call on the all-powerful oracle known as the reference librarian. I had access to years of the Reader’s Guide to Periodical Literature. I had Who’s Who, an early version of Wikipedia; and of course, I had academic subject matter experts I could query. 

Technology like AI doesn’t create higher quality research results; what technology gives me is speed. As an undergraduate studying Romance Languages, I would often run across a word I didn’t know. I’d have to go to the dictionary, a physical book that weighed as much as a Prius, open it, make my way to the right page, and look up the word—a process that could take a minute or more. Today, I hover my finger over the word on the screen and in a few seconds I accomplish the same task. Is it a better answer? No; it’s exactly the same. It’s just faster. In an emergency room, speed matters. In a research project, not so much. In fact, in research, speed is often a liability.

Here’s the takeaway from this essay. Whether I use the manual tools that were available in 1972 (and I often still do, by the way), or Google Scholar, or some other digital information resource, the results are the same—not because of the tool, but because of how I use what the tool generates. I’ve often said in my writing workshops that “you can’t polish a turd, but you can roll it in glitter.” You may have written the first draft of an essay, selected a pleasing font, left- and right-justified the text, and added some lovely graphics, but it’s still a first draft—a PRETTY first draft, but a first draft, nonetheless. It isn’t anywhere near finished.

The same principle applies to research or any other kind of news or information-gathering activity. My widely cast net yields results, but some of those results are bycatch—information that’s irrelevant, dated, or just plain wrong. It doesn’t matter why it’s wrong; what matters is that it is. And this is where the human-in-the-loop becomes very important. I go through the collected data, casting aside the bycatch. What’s left is information. To that somewhat purified result I add a richness of experience, context, skepticism, and perspective. From that I generate insight, then knowledge, and ultimately, wisdom.

So again, technology provides a fast track to AN answer, but it doesn’t in any way guarantee that I’ve arrived at anything close to THE answer. Only the secret channels and dark passages and convoluted, illuminated labyrinths of the human brain can do that. 

So yeah, technology can be a marvelous tool. But it’s just a tool. The magic lies in the fleshware, not the hardware. Technology is only as good as the person wielding it. 

The Generational Blame Game

It’s a fundamental aspect of human nature, I believe, for each generation to criticize the generation that preceded it, often using them as a convenient scapegoat for all that’s wrong in the world. The current large target is my own generation, the Baby Boomers. I recently overheard a group of young people—mid-20s—complaining at length about their belief that the Boomers constitute a waste of flesh who never contributed much to society. Respectfully, I beg to differ; this is my response, along with a plea to ALL generations to think twice about how they characterize those who came before.

Millennials, sometimes called Gen-Y, and the Plurals, commonly referred to as Gen-Z, often blame Baby Boomers for the state of the world: the growing wealth imbalance, the violence and unpredictability of climate change, the multifaceted aftermath of COVID because of its impact on the supply chain, and the world’s growing political and cultural divisions—in essence, the world sucks and Boomers are to blame. They often proclaim Boomers to be a generation that contributed little of value to the world. This, of course, is a long-standing social convention: blame the old people, because they know not how dumb, useless and ineffective they are.

On the other hand, there’s a lot of admiration out there for the current Millennial über meisters of Silicon Valley—people like Mark Zuckerberg, Brian Chesky (AirBnB), Alexandr Wang (Scale AI), and Arash Ferdowsi (Dropbox). They deserve admiration for their accomplishments, but they didn’t create Silicon Valley—not by a long shot. The two generations that came before them did that.

But let’s consider the boring, stumbling, mistake-prone Boomers. You know them; they include such incompetent, non-contributing members of society as Bill Gates; the Steves, Jobs and Wozniak; Peggy Whitson, who recently retired as NASA’s chief astronaut; Larry Ellison, who founded Oracle; Oprah Winfrey, creator of a breathtakingly influential media empire; Marc Benioff, founder of Salesforce; Reid Hoffman, co-creator of LinkedIn; and Radia Perlman, creator of the Spanning Tree Protocol, the rule set that keeps the Ethernet networks tying together the Internet’s 25 billion computers, give or take a few hundred million, from strangling themselves in loops. And I won’t even bother to mention Tim Berners-Lee, the creator of the World Wide Web.

What a bunch of losers.

But there may be a reason for the dismissal of an entire generation’s contributions to the world that goes beyond the tradition of putting elders on a literal or figurative ice floe and shoving them off to sea. I find it interesting that the newest arrivals on the generational scene judge the value of a generation’s contributions based on the application that that generation created. All hail Facebook, X, Instagram, Uber, Amazon, AirBnB, Google, Tencent, AliBaba, TikTok, GitHub, and Instacart, the so-called platform companies. Those applications are the “public face” of massive and incomprehensibly complex technological underpinnings, yet rarely does anyone make time today for a scintilla of thought about what makes all of those coveted applications—ALL of them—work. In fact, none of them—NONE of them—would exist without two things: the myriad computers (including mobile devices) on which they execute, and the global network that gives them life and makes it possible for them to even exist.

The tail wags the dog here: without the network, these applications could not function. Want some proof? The only time the vast majority of people on the planet are even aware of the network’s existence is when it breaks, which is seldom. But when it does? When ice or wind bring down aerial transmission cables, when a car takes out a phone pole, when fire destroys critical infrastructure and people can’t mine their precious likes on Facebook, when there’s a long weekend and everybody is home downloading or gaming or watching and the network slows to a glacial crawl, technological Armageddon arrives. Heart palpitations, panting, sweating, and audible keening begin, as people punch futilely at the buttons on their devices. But consider this: the global telephone network has a guaranteed uptime of 99.999 percent. In the industry, that’s called five-nines of reliability. And what does that mean in English? It means that on average, the phone network—today, the Internet—is unavailable to any given user for a little over five minutes a year. In a standard year, there are 525,600 minutes. For about five of those every year, the network hiccups. Take a moment to think about that.
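The five-nines figure is simple arithmetic, and it’s worth checking for yourself. Here’s a quick sketch (the function and variable names are my own, purely for illustration):

```python
# Availability arithmetic: expected annual downtime for a given uptime level.
MINUTES_PER_YEAR = 365 * 24 * 60  # 525,600 minutes in a standard year

def downtime_minutes(availability: float) -> float:
    """Average minutes per year a service is unavailable at this availability."""
    return MINUTES_PER_YEAR * (1 - availability)

for label, availability in [("three nines", 0.999),
                            ("four nines", 0.9999),
                            ("five nines", 0.99999)]:
    print(f"{label}: about {downtime_minutes(availability):.2f} minutes of downtime per year")
```

Five nines works out to roughly 5.26 minutes a year, and each added nine cuts the allowable downtime by a factor of ten.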

When we think back on famous scientists and innovators, who comes to mind? Well, people like Alexander Graham Bell, of course, who invented the telephone, but who also invented the world’s first wireless telephone, called the photophone—and yes, it worked; or Thomas Edison, who became famous for the invention of the lightbulb, but actually invented many other things, and who was awarded 2,332 patents and founded 14 companies, including General Electric; the Wright Brothers, who flew successfully at Kitty Hawk; Watson and Crick, who discovered the DNA double helix and created a path to modern genetics and treatments for genetic disease; Bardeen, Brattain, and Shockley, unknown names to most people, but names attached to the three scientists at Bell Telephone Laboratories who invented the transistor; Philo T. Farnsworth, the creator of television; and Marie Curie, who did pioneering research on radioactivity. These are all famous names from the late 1800s all the way through the 1960s.

But then, there’s a great twenty-year leap to the 1980s, the time when Generation X came into its own. Movies were made about this generation, some of the best ever: Ferris Bueller’s Day Off. The Breakfast Club. Home Alone. Sixteen Candles. St. Elmo’s Fire. Clerks. The Lost Boys. The Karate Kid. Gen-X was a widely criticized generation, an ignored, under-appreciated, self-reliant, go-it-alone generation of entrepreneurs that includes Jeff Bezos of Amazon fame, Sheryl Sandberg of Facebook, Sergey Brin of Google, Meg Whitman of Hewlett-Packard, Travis Kalanick of Uber, and dare I say it, Elon Musk. All major contributors to the world’s technology pantheon, some as inventors, some as innovators. The power of the Internet to allow data aggregation and sharing made it possible for platform companies like Uber, eBay, Facebook and Google to exist. Those weren’t inventions, they were innovations (and to be sure, exceptional innovations!), built on top of pre-existing technologies.

Even the much-talked-about creations of Elon Musk aren’t inventions. Let’s look at StarLink, the SpaceX constellation of orbiting communication satellites. A satellite comprises radio technology to make it work; solar cells to power it; semiconductors to give it a functional brain; and lasers to allow each satellite to communicate with others. All of those technologies—ALL of them—were invented at Bell Labs in the 1940s and 1950s. In fact, the first communications satellite, Telstar, was created at Bell Labs and launched into orbit in 1962—more than 60 years ago—to broadcast television signals.

That 20-year leap between the 60s and the 80s conveniently ignores an entire generation and its contributions to the world—not just techno-geeks, but content and entertainment and media people who redefined our perception of the world. This was the time of the Baby Boomers, and while you may see us—yes, I am one—as an annoying group of people that you wish would just go away, you might want to take a moment to recognize the many ways my generation created the lifestyle enjoyed by Millennials and Gen-Z—and took steps to ensure that it would endure.

The thing about Boomer researchers, scientists, and innovators was that with very few exceptions, they were happy to work quietly behind the scenes. They didn’t do great big things exclusively for money or power; they did them because they were the right things to do, because they wanted to leave the world a better place for those who came later. And they did, in more ways than you can possibly imagine.

Let’s start with the inventions and innovations that made possible, among other things, the devices on which you watch, listen or read, and the content they deliver. I know I’ve already mentioned some of these people, but they deserve a few more words. 

Let’s start with the Steves—and no, I don’t mean me. I’m talking about Steve Wozniak and Steve Jobs, who did quite a few things before inventing the iconic Macintosh. Both were born in the 1950s and grew up in the San Francisco Bay Area, and met while they were summer interns at Hewlett-Packard. In 1977, seven years before the Mac, they introduced the world to the Apple II personal computer, which included color graphics, a sound card, expansion slots, and features that made it the first machine that came close to the capabilities of modern PCs. Later, they introduced what many called the “WIMP Interface,” for windows, icons, mice, and pointy fingers, the hallmarks of what later became the Mac operating system—and ultimately, Windows 95 and the generations of that OS that followed. Incidentally, the incredibly stable, highly dependable Macintosh operating system is based on UNIX, an operating system first designed and developed at—you guessed it—Bell Laboratories.

Next we have Sir Tim Berners-Lee, born in London in 1955. He grew up around computers, because his parents were mathematicians who worked on the Ferranti Mark I, the first computer in the world to be sold commercially. He became a software consultant for the CERN particle physics laboratory in Switzerland, which became famous as the home of the Large Hadron Collider, which physicists used to discover the Higgs boson in 2012.

While at CERN in the 1980s, Berners-Lee took on the challenge of organizing and linking all the sources of information that CERN scientists relied on—text, images, sound, and video—so that they would be easily accessible via the fast-growing network called the Internet. In the process he came up with the concept for what became the World Wide Web, which he laid out in a landmark proposal in 1989. Along the way he developed a markup language for creating web pages, called HTML, along with the first web browser, which he made available to everyone, free of charge, in 1991.

Most people think of the Internet and the World Wide Web as the same thing—but they aren’t. The Internet is the underlying transport infrastructure; the Web is an application—or, better said, a set of applications—that rides on top of that infrastructure and makes it useful to the entire world.

Next, let me introduce you to Ray Kurzweil, who decided he would be an inventor before he started elementary school. By the time he turned 15, he had built and programmed his own computer to compose music. After graduating from MIT with degrees in computer science and literature, he created a system that enabled computers to read text characters, regardless of the font.

Kurzweil invented many things, but he is perhaps best known for popularizing the concept of the Singularity, the moment when digital computers and the human brain merge and communicate directly with each other. It’s a fascinating idea. A good business PC easily operates at four billion cycles per second. The human brain, on the other hand, operates at about ten cycles per second. But: a digital PC has limited memory, whereas the human brain’s memory is essentially unlimited. So what happens if we combine the blindingly fast clock speed of a PC with the unlimited memory of the human brain? The Singularity. Cue the Twilight Zone music.

Now let me introduce you to Ajay Bhatt. Born in India, he received an undergrad degree in electrical engineering before emigrating to the U.S., where he earned a master’s degree in the same field, working on technology to power the Space Shuttle. After joining Intel in 1990, he had an epiphany while working on his PC one evening. What if, he wondered, peripheral devices could connect to a computer as easily as plugging an electrical cord into a wall socket? Not all that hard, he decided, and he and his colleagues invented the Universal Serial Bus, which we all know as USB.

And then we have one of my favorites, Bob Metcalfe. Another MIT grad with degrees in engineering and management, as well as a PhD from Harvard, he joined Xerox’s Palo Alto Research Center, better known as Xerox PARC, a well-respected facility that has been compared to the east coast’s Bell Labs. While he was there, Metcalfe and his colleagues developed a technique for cheaply and easily connecting computers so that they could share files at high speed. The technology that resulted is called Ethernet, the basis for nearly every connectivity solution in use today in modern computer networks, including WiFi. He went on to found 3Com Corporation, but for me, he will always be most famous for what has come to be known as Metcalfe’s Law: that the value of a mesh network, meaning a network in which every computer connects to every other computer in the network, increases as a function of the square of the number of devices that are attached. Want that in plain English? When a new computer loaded with data connects to a mesh network, the combined value of all that data and its shared access doesn’t increase in a linear way; it increases quadratically, because every new machine adds a link to every machine already there. Don’t believe it? Look at every one of the so-called platform companies that we discussed earlier: Apple’s App or music store, Uber, Amazon, every single social media company, and for that matter, the telephone network and the World Wide Web itself.
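The counting behind Metcalfe’s Law is easy to sketch: a full mesh of n devices has n(n−1)/2 possible pairwise links, so the link count, a rough proxy for the network’s value, grows with the square of n. A toy illustration (the function name is my own):

```python
# Metcalfe's Law, sketched: a full mesh of n devices contains n*(n-1)/2
# distinct pairwise links, so the link count (a proxy for network value)
# grows quadratically, not linearly, as devices join.

def pairwise_links(n: int) -> int:
    """Number of distinct device-to-device links in a full mesh of n nodes."""
    return n * (n - 1) // 2

for n in (2, 10, 100, 1000):
    print(f"{n:>5} devices -> {pairwise_links(n):>6} possible links")
```

Notice that doubling the number of devices roughly quadruples the number of links: 100 devices yield 4,950 possible links, while 200 yield 19,900.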

Dr. Robert Jarvik was a prodigy who invented a surgical stapler and other medical devices while he was still a teenager. But then he got serious. While he was an undergraduate student at the University of Utah in 1964, his father needed to have heart surgery. That ordeal influenced Jarvik to turn his curiosity, inventiveness and problem-solving skills—along with his medical degree—toward finding a method to keep patients with failing hearts alive until they could receive a transplant. While he wasn’t the first to develop an artificial heart, Jarvik’s 1982 creation, the Jarvik 7, was the first such device that could be implanted inside a person’s body. Today, Jarvik continues to work on a device that can serve as a permanent replacement organ.

Here’s another one, and this one fascinates me. Sookie Bang was born and raised in South Korea. She graduated from Seoul National University in 1974 and earned a Ph.D. in microbiology from the University of California at Davis in 1981. As a professor and researcher at the South Dakota School of Mines and Technology, her specialty is bioremediation—for example, using bacteria as an ingredient in a sealant to fix cracks caused by weathering and by freezing water that seeps into the concrete outer surfaces of buildings. Bang and her colleagues figured out how to speed up a naturally occurring process in which bacteria extract nitrogen from urea, which produces carbon dioxide and ammonia as byproducts. The CO2 and ammonia then react with water and calcium to form calcium carbonate, the chemical compound that we know as limestone. The patch created by the bacterial process seals the crack from the inside out and integrates with the porous concrete, repairing the crack. In essence, the concrete becomes self-healing.

Another Boomer name you need to know is Dean Kamen, who was born in Long Island, N.Y., in 1951. You may not know who he is, but I guarantee you know at least one of his inventions.

In the early 2000s, Kamen attracted media attention because investors were knocking each other over to be the first to fund “Project Ginger.” The project was highly secretive, but when the veil was finally lifted, the world was stunned to be introduced to the Segway Transporter. The device incorporates sophisticated electronics and a gyroscope that allow it to self-balance, and it moves, stops, and turns based on subtle changes in the driver’s posture. Today, the Segway’s progeny include the ubiquitous “hover boards” that every kid seems to have. But Kamen’s invention also led to the development of an extraordinary device that has changed the lives of thousands of people: a remarkable wheelchair that, thanks to its gyros, can convert from a standard four-wheel chair to a two-wheel chair, in the process lifting the occupant up to eye level with an adult. It can even climb stairs.

But Kamen was an inventor long before he created the Segway. While he was still a college student at Worcester Polytechnic Institute in 1972, he invented a wearable device called the ambulatory infusion pump. It changed the lives of diabetics, freeing them from having to worry about injecting themselves with insulin. The pump did it for them.

But he didn’t stop there. After creating the ambulatory infusion pump, Kamen went after a solution for patients with severe kidney disease who had to travel to dialysis centers for the treatments they needed to survive. He invented a portable machine that allowed patients to give themselves dialysis treatments at home, while sleeping. In 1993, it was named Medical Product of the Year.

The list goes on: flexible foot prostheses, artificial skin grafts, innovative battery designs, and plenty of others, all created by experienced, gifted innovators and inventors—and dare I say it, with a small bit of pride, Baby Boomers.

The truth is, every generation yields its own crop of gifted people who make important contributions to science, engineering, the arts, medicine, and society at large. But without the contributions of those who came before, nothing we enjoy today would exist. The Boomers stood on the shoulders of giants from the Greatest and Silent Generations, just as Gen-X, the Millennials and Gen-Z stand on Boomer shoulders, and just as the next generations to arrive will stand on theirs. It’s easy to criticize those who came before, but it’s also not much of a stretch to recognize that the current generations of any era wouldn’t be where they are or have what they have without them. So instead of looking for the failures of prior generations, maybe we all need to take a moment to recognize their successes—and how those successes benefit us. Of course, if you still want to blame the Boomers for the Internet, mobile telephony, and the commercial success of the global semiconductor industry that makes literally EVERYTHING work, I guess I’m good with that.