I wrote my first novel, Inca Gold, Spanish Blood, in 2015. By the time I really started to work on it, I’d been a dedicated writer—meaning, I knew that writing was who I am, not what I do—for decades. By then I’d written not only books but countless magazine articles, essays, white papers, poetry, training manuals, and short stories. I’d read every book on writing I could find, and every book recommended by people who write books about writing. I had favorite authors across many genres, and I knew why they were favorites. I had attended writing workshops; I was in numerous writing groups; and I wrote constantly—not in the self-flagellant ‘force yourself to get up at 5 AM every morning and write for two hours before going to work’ way, but whenever the mood struck—which was nearly every day. Sometimes all I wrote was a paragraph, or a meaningful sentence; sometimes I wrote 40 or 50 pages. All that matters is that I wrote.
I developed the Zen-like patience required to deal with the publishing world. I accepted the fact that the magic number for submitting an article or a manuscript or pretty much any new material to publishers is around 30, meaning, the number of publishers you must submit to, on average, before one of them takes the bait.
And, I learned the secrets of getting noticed by an editor. I learned that the phrase “Submission Guidelines” is a lie. It should say, “Don’t even THINK about straying from these iron-clad, inviolable, unwavering, universally-applied rules for submitting your work to the publishing gods if you want anyone to even consider looking at your submission.”
I developed a carefully curated Council of Druids, my personal cadre of editors, each of whom has the same fundamental characteristics: they’re voracious readers; they’re endlessly curious; and they’re willing to read what I write and provide detailed, brutally naked feedback. Do you know what’s less-than-useless to a writer? Someone who provides a crazed smile, two thumbs-up, and the word ‘awesome’ as their feedback to a written piece. Empty calories. My Druids, on the other hand, are never afraid to say, “Steve, with all the love in my heart, you need to drop this back into whatever swamp you dredged it out of, and here’s why.” In other words, they actually provide feedback that’s meaningful and that can be acted upon. And as much as it hurts sometimes, I carefully read and consider, and usually incorporate, every single comment. Their reading makes my writing better.
As a result of all this, I learned my way around the English language. I became grammatically proficient. I paid close attention and learned how dialogue works—and why it often doesn’t. I found myself reading about 140 books every year, and because of that I developed an extensive vocabulary and an awareness of when not to use polysyllabic words, just because I know them (thank you, Mr. Hemingway). I paid careful attention to structure and flow. I began to realize that genre is merely a suggestion: that some of the best books have elements of romance, science fiction, history, travel, global affairs, poetry, and politics, in spite of the label they’re given by the bookstore.
I also trained myself to ignore the naysayers, the trolls who make it their mission to savage other people’s work because they can. They’re cowards, hiding behind the bastion of the Internet. Some reviewers give constructive or kind comments, and for those I’m grateful. But many don’t. Do NOT let their negative comments slow you down. You wrote a book, dammit. They didn’t. Ignore them for the miserable people they are.
I began to understand that I write so that others may read. When I drive my grandkids home after a day with my wife and me, I take the responsibility very seriously indeed. And when I take my readers on a journey, I take the responsibility no less seriously.
So, you can imagine how I felt when I found myself running into roadblock after roadblock as I tried to get a publisher to look at my novel. Here’s what was clattering around in my head, like a handful of marbles. I clearly knew how to write because I’d been doing it for a long time. I was published many times over by big, well-known houses, and I had two bestsellers to my name. I always met or exceeded deadlines. Yet time and again I submitted, and time and again I got back … nothing. Crickets. Even though I followed the submission rules, I didn’t even get rejection letters to add to my already impressive folder of same.
So, I called my editor at one of the big houses whom I had known for years and with whom I had created many successful books—and a genuine friendship. I explained my situation to him, knowing that he doesn’t publish fiction but hoping he could provide some insight. He did, and his response was blunt:
“Steve, here’s what you’re facing. The fact that you have had major success in the non-fiction realm is meaningless to editors in the world of fiction. The firewall that exists between the two domains is so thick that it’s as if you have never written or been published at all.”
And this was the clincher: “Your chances of getting this book published are roughly the same, whether you submit it or not.”
Bummer.
This glaring realization kicked off a new chapter in my writing. I ended up self-publishing the novel, and it did well. I then wrote a second, self-published it, and it became a number-one global bestseller on Amazon for a few weeks. I wrote two more, and they also did well—not bestsellers, but readers buy them and like them. And what I realized, and frankly, what I knew all along, was that in some ways, getting a book published was more important to me than writing one. That was a significant realization, and it changed how I think about why I write, because it was the wrong perspective for a writer. Yes, of course I want my work to be published, but first, I’m a writer. Writing is enormously creative; publishing is enormously mechanical. And when I write, I write for my readers and I take that responsibility seriously. But honestly, I write for myself. I write books that I would like to read. It makes me feel good. It challenges me, forces me to work hard to be better at it.
As writers—all writers, regardless of genre—our goal should be to write books that people want to read, books that bring readers back for more after they’ve finished. We shouldn’t write for the likes, or the thumbs-ups; those are more empty calories. We write because we have something to say that matters. If we do that, our audiences will find us.
I’m currently writing sequels to two of my novels: Inca Gold, Spanish Blood, and Russet. Russet is my most recent work, so the characters and plot line are still fresh in my mind. But Inca Gold came out in 2016 and I had forgotten some of the story’s details, and I’m embarrassed to say, the names of some of the characters. So, I put on my reader hat, picked up the book, and read it, ignoring the fact that I was its author. And I mean, I really read it. And you know what? I liked it. A lot. It didn’t waste my time, and it made me want to read more. And that’s all the motivation I need to keep going.
A small town in America, summer, 1959. Maple Street. An ice cream vendor pushes his cart up the sidewalk, ringing a bell; kids play stick ball in the street; a neighbor mows his grass with a push mower. Another lies under his car, tinkering with it. In the distance, a dog barks.
Suddenly, the power goes out—all power. Stoves and refrigerators stop working; the radio goes silent; cars won’t start. Neighbors gather in an uneasy group. They begin to speculate about what might be causing the outage, their voices growing strident as speculation turns to suspicion. Could it be the meteor that some of them heard pass overhead earlier?
While one man argues for a rational explanation—sunspots, perhaps—another points the finger at a neighbor who isn’t present, using his odd quirks to irrationally explain the widespread lack of electricity. Then, inexplicably, power returns to a single car in a driveway, and it starts with a rumble.
“It’s space aliens,” says a young comic book-obsessed boy. “They come to earth disguised to look just like us, and blend in. They’re different, but no one can tell because they’re identical to the rest of us.”
And the man who owns the car that mysteriously starts and stops? He’s as mystified as the other neighbors, but because it’s his car engaging in inexplicable behavior—the engine roaring to life when there’s no one at the wheel—he’s to blame. He must be the alien.
In the end, as the town tears itself apart through self-created fear, the real aliens look down on the town from their cloaked ship. One of them says to the other (and they look as human as the people in the streets below), “The pattern is always the same. They pick the most dangerous enemy they can find, and it’s themselves. Their world is full of Maple Streets. We’ll go from one to the next and let them destroy each other.”
Rod Serling wraps up the episode as only Rod Serling can do:
The tools of conquest do not necessarily come with bombs or explosives or fallout. There are weapons that are simply thoughts, attitudes, prejudices, found only in the minds of men. For the record, prejudices can kill, and suspicions can destroy. And a thoughtless, frightened search for a scapegoat has a fallout all its own—for the children, and the children yet unborn. And the pity of it is, these things cannot be confined to the Twilight Zone.
I want every living person in the United States to watch this episode, and then think about current events. Clearly, Rod Serling was correct: These things cannot be confined to the fantasy of the Twilight Zone, where they belong.
I was 13 years old, and I was standing with my childhood friends Bill Meadows, Peter Norris, and Gil DePaul in the frigid interior of the home-built observatory in Bill’s backyard. The four of us stood in a sloppy circle around the telescope, taking turns looking through the eyepiece and shivering in the late-night winter air.
I like to think that our collective friendship served as the model for the TV show, “Big Bang Theory,” because just like Leonard, Howard, Sheldon, and Raj, our world revolved around the wonder of science and was powered by our collective curiosity. The main difference was that in our cadre, the counterparts for Penny, Amy and Bernadette were conspicuously absent. Clearly, we had not yet been introduced to awe.
We loved electronics, and geology, and astronomy, and all the many offshoots of biology; we would often gather for electronic component swaps, or rock and mineral trades, or just to build things together or admire each other’s latest acquisitions of exotic reptiles or amphibians. At one point, my parents gave me a Heathkit electronics project board, pre-wired with capacitors and resistors and transistors and coils, each connected to little stainless-steel springs that allowed me to run jumpers between the components to wire the projects outlined in the manual. I will never forget the day I learned that by swapping between different components and by wiring the output to a variable resistor, I could make it play wildly oscillating sounds that would be great as the background music for a science fiction film. I had invented a version of the Moog Synthesizer, before anyone knew what that was.
I learned two of life’s important lessons from Bill Meadows: the immensity of the universe, and the immensity of personal grief. The first, my 13-year-old shoulders were prepared to carry; the second, not so much. One Christmas morning after all the gifts had been opened, I called Bill to see if he wanted to get together, probably to compare Christmas bounty. He couldn’t, he told me; his Mom had just died. Maybe tomorrow, he said, with infinite grace. I didn’t know how to process that level of profound loss, but he did, and the grace with which he carried the pain is something I still think about today.
As I said, we were the Big Bang Theory gang before there was a Big Bang Theory, and Bill was our Sheldon Cooper—not in the awkward, geeky way of the show’s lovable main character, but in the brilliant, quirky, knowledge-is-everything way of smart, driven, passionate people. He went on to become a gifted composer and musician, a semiconductor designer, and of course, a top-notch quasi-professional astronomer. We’re still very much in touch; recently, he guided us when Sabine bought me my own telescope. Yes, it’s true. She’s awesome.
Like teenage boys everywhere, a glimpse at a copy of Playboy was something to be whispered about for weeks, but the publication that really got our motors humming was the annual catalog from Edmund Scientific Company. Sad, I know, but have you ever SEEN a catalog from Edmund Scientific?
Bill, like the Edmund catalog, was an endless source of knowledge and information. I can still remember things I learned from him. Like, how many sides an icosahedron has (the answer is 20). What an ellipse is, and how to make one (slice the top off a cone at an angle). How to work a Foucault Tester. What ‘New General Catalog’ and ‘Messier Numbers’ mean (unique designators for star clusters, galaxies, and nebulae). Why it was appropriate to drool on a Questar Telescope if I ever found myself in the same room with one.
I even remember the night my Mom was driving us to a long-forgotten presentation at the junior high school. As a car went by us at high speed, the sound rose and fell with its passing. In unison, Bill and I said, “Doppler Effect,” then we laughed. But I was a bit awestruck. I was one with the dude.
Somewhere around 1967, Bill decided that something was missing in his backyard. Not a tomato garden, or a jungle gym, or a trampoline; not a picnic table, or a barbecue grill, or a weight set. No, this 13-year-old decided that what was missing, what would really round the place out, was an observatory. His Dad agreed, and they built one. We all helped a little bit here and there, but this thing was Bill’s baby. It looked like half of a shoebox with a giant tuna can sitting on top, and the whole thing sat at roofline level on top of four pieces of drilling pipe punched into bedrock. Coming up through a hole in the center of the floor was a fifth piece of pipe which ultimately became the telescope mount, isolated from the observatory structure so that our walking about didn’t vibrate the telescope when it was focused on whatever it was focused on. The top and side of the tuna can had a two-foot-wide slit that could be opened for viewing. Many were the nights that we had sleepovers at Bill’s house, curled up and freezing in the observatory as we focused the telescope on distant celestial objects, things Bill could casually name and describe from memory, having seen them many, many times with whatever telescope he used before he built the big one.
The big one: Edmund Scientific sold it all. But buy a ready-made telescope? Piffle, said Bill, or whatever the 1967 equivalent of piffle was in west Texas. Instead, he created a shopping list:
First, a large quartz mirror blank, which was 12 inches or so in diameter;
Assorted grits to hand-grind a parabolic surface into the blank;
A Foucault tester to ensure the mirror curvature was correct once the grinding was done;
The tube for the telescope body;
An adjustable mirror mount;
The eyepiece holder;
Assorted eyepieces;
An equatorial mount to attach the finished telescope to the center drilling pipe, with an electric star drive;
And of course, various accessories: counterweights, a spotting scope, and assorted mounting hardware.
We all claimed some of the credit for building that telescope because all of us spent time hand-grinding the blank under Bill’s watchful eyes. But make no mistake: it was Bill who built that thing. He ground and ground and ground, week after week after week, starting with a coarse abrasive grit and grinding pommel, then onto a finer grit, and then finer still, until he was working with red polishing rouge at the end. I remember his pink-stained fingers at school. School: it was so fitting that we attended the brand-new Robert H. Goddard Junior High School in Midland, Texas, complete with rockets mounted on stands out front. Goddard, who invented the modern liquid-fuel rocket, was long dead, but his wife came to visit the school not long after it opened. I still have her autograph.
It’s interesting to me that Goddard designed and launched his rockets near Roswell, New Mexico, where my maternal grandparents lived, and where … well, you know.
Once Bill was done with the grinding and polishing, he shipped the mirror blank back to Edmund, and they put the mirrored surface on it and shipped it back, ready to be mounted in the telescope.
One of Bill’s goals was to do astrophotography. Keep in mind that this was 1968, and photography wasn’t what it is today. There was no such thing as a digital camera (mainly because there was no such thing yet, really, as digital anything), and there was no way to mount a standard camera on a telescope. So, Bill improvised in an extraordinary way. He took a one-gallon metal Prestone antifreeze can and cut the top off. He then flocked the inside of the can with a very dark, matte black paint to eliminate reflections. In the middle of the bottom of the can he cut a two-inch hole, and there he mounted a T-connector, which would allow him to attach it to the eyepiece holder of the telescope.
Now came the genius part. Using tin snips, he cut and bent the open top of the can so that it had two flanges, one on each side, which would neatly and securely hold a sheet film carrier plate. The plate was about five by eight inches, and once it was in the “Prestone camera” and the environment was dark, he could slide out the cover that protected the sheet film from light, and the image of whatever was in the viewfinder would be splashed on the film. Minutes later, Bill would slide the cover back in, and after sending it off to be developed, he’d have a long-exposure photograph. In fact, I still have a photograph he gave me of the Orion Nebula somewhere in my files, along with one of a long-forgotten star cluster.
It was cold in that observatory; a heater was out of the question, because the rippling heat waves escaping through the observatory’s viewing slit would blur the image—another thing I learned from Bill. So, cold it was.
We weren’t supposed to have the kinds of conversations we did at that age, but they made sense, which was why Bill’s explanation to all of us about what we were taking turns looking at was—well, normal. “A true binary star system,” he explained, “is two stars that are gravitationally bound together and therefore orbit each other.” I can still remember, all these years later, that we were looking at Sirius, sometimes known as the Dog Star, the single brightest star in the night sky. It’s part of the constellation Canis Major. “Sirius A is a bright star and Sirius B is a bit dimmer,” Bill told us, “but the ‘scope can resolve them.” Today, every time I look up and see Sirius, I think of Bill.
This essay is about the relationship between curiosity and awe and wonder, so let me ask you a question: when was the last time you remember being genuinely curious about something, something new to you, something that made you curious enough to do a little reading or research about whatever it was—and to then be awed by it? Just yesterday, June 23rd, 2025, the very first images from the brand-new Vera C. Rubin Observatory in Chile were shared with the public. Within two days of its first scan of the night sky, the Rubin telescope discovered more than 2,000 new asteroids, and astronomers predict that over the next ten years it will capture images of 89,000 new near-Earth asteroids, 3.7 million new main-belt asteroids, 1,200 new objects between Jupiter and Neptune, and 32,000 new objects beyond Neptune. Doesn’t that make you just a little bit curious about what ELSE might be lurking out there? Doesn’t it make you feel a certain amount of awe and wonder, if for no other reason than the fact that humans have developed the scientific wherewithal to build this amazing machine?
Part 2
One of the first things I realized when I got my new telescope a few months ago and began to thaw out long-forgotten astronomy knowledge, was that a telescope is a Time Machine. Here’s why.
The night sky is filled with countless observable objects, other than the moon and stars. For example, on a dark clear night, chances are very good that if you lie down on a lawn chair in your backyard and turn off the porch light, within 15 minutes you’ll see at least one Starlink satellite sweep past. If you time it right and look just after sunset, you’re likely to see the International Space Station pass overhead, the light from the setting sun reflecting off its solar and cooling panels. There’s even an app for your phone to track its location.
Then there are the natural celestial bodies. Depending on the time of year, it’s easy to spot other planets in our solar system with the naked eye, especially Mercury, Venus, Mars, Jupiter, and Saturn. They, like the Earth, orbit our sun, which is, of course, a star. It is one star in the galaxy known as the Milky Way, a collection of stars, planets, great clouds of gas, and dark matter, all bound together by gravity. The Milky Way is made up of somewhere between 100 and 400 billion stars. And remember, that’s a single galaxy.
In the observable universe, meaning the parts of the universe that we can see from Earth with all our imaging technologies, there are between 200 billion and two trillion observable galaxies, each containing billions of stars.
So just to recap: the Earth orbits the Sun, which is one of 100 to 400 billion stars in the Milky Way Galaxy. But the Milky Way Galaxy is one of somewhere between 200 billion and two trillion galaxies in the observable universe. And the observable universe? According to reliable, informed sources—NASA, the Center for Astrophysics at Harvard, and the Smithsonian—we can observe five percent of it. 95 percent of the universe remains unknown and unseen.
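If you want to feel that scale in numbers, here’s a quick back-of-the-envelope sketch in Python, using only the ranges quoted above and the admittedly loose assumption that the Milky Way is a typical galaxy:

```python
# Rough, order-of-magnitude arithmetic using the ranges quoted above.
# The only added assumption: the Milky Way is a roughly typical galaxy.

stars_per_galaxy = (100e9, 400e9)      # 100 to 400 billion stars (Milky Way)
galaxies = (200e9, 2e12)               # 200 billion to 2 trillion galaxies

low = stars_per_galaxy[0] * galaxies[0]
high = stars_per_galaxy[1] * galaxies[1]

print(f"Somewhere between {low:.0e} and {high:.0e} stars in the observable universe")
# Somewhere between 2e+22 and 8e+23 stars.
```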
Starting to feel it yet? It’s called awe and wonder, and that itch you’re feeling? That’s curiosity.
Part 3
If you look to the north on any given spring evening, you’ll easily spot the Big Dipper, a recognizable part of the constellation, Ursa Major—the great bear. Here’s an interesting fact for you: the Big Dipper isn’t a constellation. It’s an asterism, which is a pattern of stars in the sky that people have come to know. The Big Dipper is an asterism that’s part of the more complicated constellation known as Ursa Major.
Take a look at a photo or drawing of the Big Dipper. It consists of four stars that form the “bowl” of the dipper, and three stars that make up the dipper’s curving “handle.”
The handle forms the beginning of a celestial arc, and if you extrapolate it you can “follow the arc to Arcturus,” a very bright star in the constellation Boötes. From Arcturus you can “speed on to Spica,” a fairly bright star in the constellation Virgo. You can do all of this with your naked eye.
Now: go back to the bowl of the Big Dipper. Draw an imaginary line from Megrez, the star where the handle attaches to the bowl, through Phecda, the star just below it that forms a corner of the bowl, and keep going to Regulus, the brightest star in the constellation Leo.
If you now draw a line between Spica and Regulus and look slightly above the midpoint of that line, you are staring at a region of space called the Realm of Galaxies.
I love that name; it sounds like a place that Han Solo would tell Chewie to navigate the Millennium Falcon to. Nowhere else in the visible sky is the concentration of galaxies as high as it is here. Within this space, for example, is the unimaginably huge Virgo Cluster of galaxies. How huge? Well, the local cluster, to which our spiral-shaped Milky Way and Andromeda Galaxies belong, contains a mere 40 galaxies. The Virgo Cluster has more than a thousand, but those thousand are packed into an area no bigger than that occupied by our own local cluster with its 40 galaxies. And remember, each of those galaxies is made up of billions of stars.
Galaxies are active, often destructive behemoths. When a small spiral galaxy like our own Milky Way gets too close to a larger one, things happen. The Large and Small Magellanic Clouds, which are members of our local cluster, used to be much closer to the Milky Way, but the Milky Way’s gravity stripped away many of those galaxies’ outer stars, creating distance between them and radically changing their galactic shapes. But the Milky Way hasn’t finished its current rampage: it’s now in the process of dismantling the Sagittarius Galaxy.
These things are also big—far bigger than we’re capable of imagining, as are the distances between them, which is why I said earlier that a telescope is a fully functional Time Machine. Andromeda, for example, is 220,000 light years across. You need a wide-angle eyepiece to look at it through a telescope. For context, consider this. The speed of light is a known constant—it never changes. Light travels at 186,000 miles per second, or just over 671 million miles per hour. Think orbiting Earth’s equator 7-1/2 times every second. That means that in one year, light travels 5.88 trillion miles. We call that a light year. It’s not a measure of time; it’s a measure of distance. To fly from one end of Andromeda to the other would take 220,000 years, at 186,000 miles per second. Pack a lunch.
When you look up at Andromeda, which is our closest galactic neighbor, you’re looking at an object that is two-and-a-half million light years away. What that means is that the light striking your eye has traveled 14 quintillion, 700 quadrillion miles to get to you. That’s ‘147’ followed by 17 zeroes. More importantly, it means that that light left Andromeda on its way to your eye two-and-a-half million years ago. Two-and-a-half million years ago: the Pleistocene epoch was in full swing; Earth’s polar ice caps were forming; mammoths and mastodons roamed North America; the Isthmus of Panama rose out of the sea, connecting two continents; the Paleolithic period began; and Homo habilis, the first protohumans, emerged.
All that was happening when that light that just splashed onto your retina left its place of birth. And that’s the closest galaxy to us.
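If you’d like to check that arithmetic yourself, here’s a minimal sketch using nothing but the rounded figures quoted above:

```python
# Light-year and Andromeda-distance arithmetic, using the essay's rounded numbers.

miles_per_second = 186_000                    # speed of light, rounded
seconds_per_year = 60 * 60 * 24 * 365.25      # one year, in seconds

miles_per_light_year = miles_per_second * seconds_per_year
print(f"One light year ~ {miles_per_light_year:.2e} miles")   # ~5.87e+12, i.e. ~5.9 trillion

andromeda_light_years = 2_500_000             # distance to Andromeda
andromeda_miles = andromeda_light_years * miles_per_light_year
print(f"Andromeda      ~ {andromeda_miles:.2e} miles away")   # ~1.47e+19 -- '147' plus 17 zeroes
```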
So, I’m compelled to ask: is Andromeda still there? Do we have any way of actually knowing? A lot can happen in two-and-a-half million years. And with the breathtakingly complicated telescopes we’re now placing in deep space—the original Hubble got us started—the James Webb Space Telescope is capturing infrared light that is 13.6 billion years old. The universe is 13.8 billion years old, which means that we’re getting close to seeing as much light as it’s possible to see from the formative edge of the universe itself—what’s known as the cosmic event horizon. Which, of course, raises the question: what lies beyond the edge?
Part 4
Curiosity, awe, and wonder are amazing things. They feed, nourish, encourage, and drive each other, and in turn, they drive us. I love this science stuff, especially when it hits us with knowledge that is beyond our ability to comprehend. For me, that’s when curiosity, awe and wonder really earn their keep. Because sometimes? Sometimes, they’re the only tools we have.
I recently had a conversation about technology’s impact on the availability and quality of information in the world today. It’s an argument I could make myself—that tech-based advances have resulted in access to more data and information. For example, before the invention of moveable type and the printing press, the only books that were available were chained to reading tables in Europe’s great cathedrals—they were that rare and that valuable. Of course, it was the information they contained that held the real value, an important lesson in today’s world where books are banned from modern first world library shelves because an ignorant cadre of adults decides that young people aren’t mature enough to read them—when it’s the adults who lack the maturity to face the fact that not everybody thinks the same way they do in this world, and that’s okay. But, I digress.
Image of chained books in Hereford Cathedral. Copyright Atlas Obscura.
When moveable type and the printing press arrived, book manuscripts no longer had to be copied by hand—they could be produced in large quantities at low cost, which meant that information could be made available to far more people than ever before. To the general population—at least, the literate ones—this was a form of freedom. But to those who wanted to maintain a world where books were printed once and kept chained to desks where only the privileged few (the clergy) could read them, the free availability of knowledge and information was terrifying. Apparently, it still is. Knowledge is, after all, the strongest form of power. How does that expression go again? Oh yeah: Freedom of the Press…Freedom of Expression…Freedom of Thought…Sorry; I digress. Again.
Fast-forward now through myriad generations of technology that broadened information’s reach: The broadsheet newspaper, delivered daily, sometimes in both morning and evening editions. The teletype. Radio. The telephone. Television. The satellite, which made global information-sharing a reality. High-speed photocopying. High-speed printing. The personal computer and desktop publishing software. Email. Instant Messaging and texting. And most recently, on-demand printing and self-publishing through applications like Kindle Direct, and of course, AI, through applications like ChatGPT. I should also mention the technology-based tools that have dramatically increased literacy around the world, in the process giving people the gift of reading, which comes in the form of countless downstream gifts.
The conversation I mentioned at the beginning of this essay took a funny turn when the person I was chatting with tried to convince me that access to modern technologies makes the information I can put my hands on today infinitely better and more accurate. I pushed back, arguing that technology is a gathering tool, like a fishing net. Yes, a bigger net can result in a bigger haul. But it also yields more bycatch, the stuff that gets thrown back. I don’t care about the information equivalents of suckerfish and slime eels that get caught in my net. I want the albacore, halibut, and swordfish. The problem is that my fishing net—my data-gathering tool—is indiscriminate. It gathers what it gathers, and it’s up to me to separate the good from the bad, the desirable from the undesirable.
What technology-based information-gathering does is make it easy to rapidly get to AN answer, not THE answer.
The truth is, I don’t have better research tools today than I had in the 70s when I was in college. Back then I had access to multiple libraries—the Berkeley campus alone had 27 of them. I could call on the all-powerful oracle known as the reference librarian. I had access to years of the Reader’s Guide to Periodical Literature. I had Who’s Who, an early version of Wikipedia; and of course, I had academic subject matter experts I could query.
Technology like AI doesn’t create higher quality research results; what technology gives me is speed. As an undergraduate studying Romance Languages, I would often run across a word I didn’t know. I’d have to go to the dictionary, a physical book that weighed as much as a Prius, open it, make my way to the right page, and look up the word—a process that could take a minute or more. Today, I hover my finger over the word on the screen and in a few seconds I accomplish the same task. Is it a better answer? No; it’s exactly the same. It’s just faster. In an emergency room, speed matters. In a research project, not so much. In fact, in research, speed is often a liability.
Here’s the takeaway from this essay. Whether I use the manual tools that were available in 1972 (and I often still do, by the way), or Google Scholar, or some other digital information resource, the results are the same—not because of the tool, but because of how I use what the tool generates. I’ve often said in my writing workshops that “you can’t polish a turd, but you can roll it in glitter.” Just because you’ve written the first draft of an essay, selected a pleasing font, right and left-justified the text, and added some lovely graphics, it’s still a first draft—a PRETTY first draft, but a first draft, nonetheless. It isn’t anywhere near finished.
The same principle applies to research or any other kind of news or information-gathering activity. My widely cast net yields results, but some of those results are bycatch—information that’s irrelevant, dated, or just plain wrong. It doesn’t matter why it’s wrong; what matters is that it is. And this is where the human-in-the-loop becomes very important. I go through the collected data, casting aside the bycatch. What’s left is information. To that somewhat purified result I add a richness of experience, context, skepticism, and perspective. From that I generate insight, then knowledge, and ultimately, wisdom.
So again, technology provides a fast track to AN answer, but it doesn’t in any way guarantee that I’ve arrived at anything close to THE answer. Only the secret channels and dark passages and convoluted, illuminated labyrinths of the human brain can do that.
So yeah, technology can be a marvelous tool. But it’s just a tool. The magic lies in the fleshware, not the hardware. Technology is only as good as the person wielding it.
It’s a fundamental aspect of human nature, I believe, for each generation to criticize the generation that preceded it, often using them as a convenient scapegoat for all that’s wrong in the world. The current large target is my own generation, the Baby Boomers. I recently overheard a group of young people—mid-20s—complaining at length about their belief that the Boomers constitute a waste of flesh who never contributed much to society. Respectfully, I beg to differ; this is my response, along with a plea to ALL generations to think twice about how they characterize those who came before.
Millennials, sometimes called Gen-Y, and the Plurals, commonly referred to as Gen-Z, often blame Baby Boomers for the state of the world: the growing wealth imbalance, the violence and unpredictability of climate change, the multifaceted aftermath of COVID because of its impact on the supply chain, and the world’s growing political and cultural divisions—in essence, the world sucks and Boomers are to blame. They often proclaim Boomers to be a generation that contributed little of value to the world. This, of course, is a long-standing social convention: blame the old people, because they know not how dumb, useless and ineffective they are.
On the other hand, there’s a lot of admiration out there for the current Millennial über meisters of Silicon Valley—people like Mark Zuckerberg, Brian Chesky (AirBnB), Alexandr Wang (Scale AI), and Arash Ferdowsi (Dropbox). They deserve admiration for their accomplishments, but they didn’t create Silicon Valley—not by a long shot. The two generations that came before them did that.
But let’s consider the boring, stumbling, mistake-prone Boomers. You know them; they include such incompetent, non-contributing members of society as Bill Gates, the Steves, Jobs and Wozniak, Peggy Whitson, who recently retired as Chief Astronaut at NASA, Larry Ellison, who founded Oracle, Oprah Winfrey, creator of a breathtakingly influential media empire, Marc Benioff, founder of Salesforce, Reid Hoffman, co-creator of LinkedIn, and Radia Perlman, creator of the Spanning Tree Protocol, part of the plumbing that lets the networks connecting the Internet’s 25 billion computers, give or take a few hundred million, talk to one another. And I won’t even bother to mention Tim Berners-Lee, the creator of the World Wide Web.
What a bunch of losers.
But there may be a reason for the dismissal of an entire generation’s contributions to the world that goes beyond the tradition of putting elders on a literal or figurative ice floe and shoving them off to sea. I find it interesting that the newest arrivals on the generational scene judge the value of a generation’s contributions based on the application that that generation created. All hail Facebook, X, Instagram, Uber, Amazon, AirBnB, Google, Tencent, AliBaba, TikTok, GitHub, and Instacart, the so-called platform companies. Those applications are the “public face” of massive and incomprehensibly complex technological underpinnings, yet rarely does anyone make time today for a scintilla of thought about what makes all of those coveted applications—ALL of them—work. In fact, none of them—NONE of them—would exist without two things: the myriad computers (including mobile devices) on which they execute, and the global network that gives them life and makes it possible for them to even exist.
The tail wags the dog here: without the network, these applications could not function. Want some proof? The only time the vast majority of people on the planet are even aware of the network’s existence is when it breaks, which is seldom. But when it does? When ice or wind bring down aerial transmission cables, when a car takes out a phone pole, when fire destroys critical infrastructure and people can’t mine their precious likes on Facebook, when there’s a long weekend and everybody is home downloading or gaming or watching and the network slows to a glacial crawl, technological Armageddon arrives. Heart palpitations, panting, sweating, and audible keening begin, as people punch futilely at the buttons on their devices. But consider this: the global telephone network has a guaranteed uptime of 99.999 percent. In the industry, that’s called five-nines of reliability. And what does that mean in English? It means that on average, the phone network—today, the Internet—is unavailable to any given user for a little more than five minutes a year. In a standard year, there are 525,600 minutes. For about five of those every year, the network hiccups. Take a moment to think about that.
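Here’s that arithmetic spelled out, if you’d like to verify it: nothing but the five-nines figure and the number of minutes in a year.

```python
# Five-nines availability, converted into minutes of downtime per year.

availability = 0.99999                 # "five nines"
minutes_per_year = 365 * 24 * 60       # 525,600 minutes in a standard year

downtime = (1 - availability) * minutes_per_year
print(f"Allowed downtime: {downtime:.2f} minutes per year")    # ~5.26 minutes
```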
When we think back on famous scientists and innovators, who comes to mind? Well, people like Alexander Graham Bell, of course, who invented the telephone, but who also invented the world’s first wireless telephone, called the photophone—and yes, it worked; or Thomas Edison, who became famous for the invention of the lightbulb, but actually invented many other things, and who was awarded 2,332 patents and founded 14 companies, including General Electric; the Wright Brothers, who flew successfully at Kitty Hawk; Watson and Crick, who discovered the DNA double helix and created a path to modern genetics and treatments for genetic disease; Bardeen, Brattain and Shockley, unknown names to most people, but names attached to the three scientists at Bell Telephone Laboratories who invented the transistor; Philo T. Farnsworth, the creator of television; and Marie Curie, who did pioneering research on radioactivity. These are all famous names from the late 1800s all the way through the 1960s.
But then, there’s a great twenty-year leap to the 1980s, the time when Generation X came into its own. Movies were made about this generation, some of the best ever: Ferris Bueller’s Day Off. The Breakfast Club. Home Alone. Sixteen Candles. St. Elmo’s Fire. Clerks. The Lost Boys. Karate Kid. Gen-X was a widely criticized generation, an ignored, under-appreciated, self-reliant, go-it-alone generation of entrepreneurs that includes Jeff Bezos of Amazon fame, Sheryl Sandberg of Facebook, Sergey Brin of Google, Meg Whitman of Hewlett-Packard, Travis Kalanick of Uber, and dare I say it, Elon Musk. All major contributors to the world’s technology pantheon, some as inventors, some as innovators. The power of the Internet to allow data aggregation and sharing made it possible for platform companies like Uber, eBay, Facebook and Google to exist. Those weren’t inventions, they were innovations (and to be sure, exceptional innovations!), built on top of pre-existing technologies.
Even the much-talked-about creations of Elon Musk aren’t inventions. Let’s look at Starlink, the SpaceX constellation of orbiting communication satellites. A satellite comprises radio technology to make it work; solar cells to power it; semiconductors to give it a functional brain; and lasers to allow each satellite to communicate with others. All of those technologies—ALL of them—were invented at Bell Labs in the 1940s and ’50s. In fact, the first communications satellite, Telstar, was created at Bell Labs and launched into orbit in 1962—more than 60 years ago—to broadcast television signals.
That 20-year leap between the 60s and the 80s conveniently ignores an entire generation and its contributions to the world—not just techno-geeks, but content and entertainment and media people who redefined our perception of the world. This was the time of the Baby Boomers, and while you may see us—yes, I am one—as an annoying group of people that you wish would just go away, you might want to take a moment to recognize the many ways my generation created the lifestyle enjoyed by Millennials and Gen-Z—and took steps to ensure that it would endure.
The thing about Boomer researchers, scientists, and innovators was that with very few exceptions, they were happy to work quietly behind the scenes. They didn’t do great big things exclusively for money or power; they did them because they were the right things to do, because they wanted to leave the world a better place for those who came later. And they did, in more ways than you can possibly imagine.
Let’s start with the inventions and innovations that made possible, among other things, the devices on which you watch, listen or read, and the content they deliver. I know I’ve already mentioned some of these people, but they deserve a few more words.
Let’s start with the Steves—and no, I don’t mean me. I’m talking about Steve Wozniak and Steve Jobs who did quite a few things before inventing the iconic Macintosh. Both were born in the 1950s and grew up in the San Francisco Bay Area, and met while they were summer interns at Hewlett-Packard. In 1977, seven years before the Mac, they introduced the world to the Apple II personal computer, which included color graphics, a sound card, expansion slots, and features that made it the first machine that came close to the capabilities of modern PCs. Later, they introduced what many called the “WIMP Interface,” for windows, icons, mice, and pointy fingers, the hallmarks of what later became the Mac operating system—and ultimately, Windows 95 and the generations of that OS that followed. Incidentally, the incredibly stable, highly dependable Macintosh operating system is based on UNIX, an operating system first designed and developed at—you guessed it—Bell Laboratories.
Next we have Sir Tim Berners-Lee, born in London in 1955. He grew up around computers, because his parents were mathematicians who worked on the Ferranti Mark I, the first computer in the world to be sold commercially. He became a software consultant for the CERN Particle Physics Laboratory in Switzerland, which became famous for being the home of the Large Hadron Collider, which physicists used to discover the Higgs boson in 2012.
While at CERN in the 1980s, Berners-Lee took on the challenge of organizing and linking all the sources of information that CERN scientists relied on—text, images, sound, and video—so that they would be easily accessible via the newfangled network that had just emerged called the Internet. In the process he came up with the concept for what became the World Wide Web, which he laid out in a terrific research paper in 1989. Along the way he developed a markup language for creating web pages, called HTML, along with the first web browser, which he made available to everyone, free of charge, in 1991.
Most people think of the Internet and the World Wide Web as the same thing—but they aren’t. The Internet is the underlying transport infrastructure; the Web is an application that rides on top of that infrastructure, or better said, a set of applications, that make it useful to the entire world.
Next, let me introduce you to Ray Kurzweil, who decided he would be an inventor before he started elementary school. By the time he turned 15, he had built and programmed his own computer to compose music. After graduating from MIT with degrees in computer science and literature, he created a system that enabled computers to read text characters, regardless of the font.
Kurzweil invented many things, but he is perhaps best known for popularizing the concept of the Singularity, the moment when digital computers and the human brain merge and communicate directly with each other. It’s a fascinating idea. A good business PC easily operates at four billion cycles per second. The human brain, on the other hand, operates at about ten cycles per second. But: a digital PC has limited memory, whereas the human brain’s memory is essentially unlimited. So what happens if we combine the blindingly fast clock speed of a PC with the unlimited memory of the human brain? The Singularity. Cue the Twilight Zone music.
Now let me introduce you to Ajay Bhatt. Born in India, he received an undergrad degree in electrical engineering before immigrating to the U.S., where he earned a master’s degree in the same field, working on technology to power the Space Shuttle. After joining Intel in 1990, he had an epiphany while working on his PC one evening. What if, he wondered, peripheral devices could connect to a computer as easily as plugging an electrical cord into a wall socket? Not all that hard, he decided, and he and his colleagues invented the Universal Serial Bus, which we all know as USB.
And then we have one of my favorites, Bob Metcalfe. Another MIT grad with degrees in engineering and management as well as a PhD from Harvard, he joined Xerox’s Palo Alto Research Center, better known as Xerox PARC, a well-respected facility that has been compared to the east coast’s Bell Labs. While he was there, Metcalfe and his colleagues developed a technique for cheaply and easily connecting computers so that they can share files at high speed. The technology that resulted is called Ethernet, the basis for nearly every connectivity solution in use today in modern computer networks, including WiFi. He went on to found 3Com Corporation, but for me, he will always be most famous for what has come to be known as Metcalfe’s Law: that the value of a mesh network, meaning a network in which every computer connects to every other computer in the network, increases as a function of the square of the number of devices that are attached. Want that in plain English? When a new computer loaded with data connects to a mesh network, the combined value of all that data and its shared access doesn’t increase in a linear way; it grows with the square of the number of connected devices. Don’t believe it? Look at every one of the so-called platform companies that we discussed earlier: Apple’s App or music store, Uber, Amazon, every single social media company, and for that matter, the telephone network and the World Wide Web itself.
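If it helps to see the counting argument behind that law, here’s a toy sketch in Python. It isn’t Metcalfe’s own formulation, just the pairwise-link arithmetic that drives it:

```python
# Toy illustration of Metcalfe's Law: in a full mesh of n devices, the number
# of distinct device-to-device links is n*(n-1)/2, so potential value grows
# roughly with the square of n.

def pairwise_links(n: int) -> int:
    """Distinct links in a full mesh of n devices."""
    return n * (n - 1) // 2

for n in (2, 10, 100, 1_000):
    print(f"{n:>5} devices -> {pairwise_links(n):>7,} possible links")

# Ten times the devices yields roughly a hundred times the links:
# quadratic growth, not linear.
```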
Dr. Robert Jarvik was a prodigy who invented a surgical stapler and other medical devices while he was still a teenager. But then he got serious. While he was an undergraduate student at the University of Utah in 1964, his father needed to have heart surgery. That ordeal influenced Jarvik to turn his curiosity, inventiveness and problem-solving skills—along with his medical degree— toward finding a method to keep patients with failing hearts alive until they could receive a transplant. While he wasn’t the first to develop an artificial heart, Jarvik’s 1982 creation, the Jarvik 7, was the first such device that could be implanted inside a person’s body. Today, Jarvik continues to work on a device that can serve as a permanent replacement organ.
Here’s another one, and this one fascinates me. Sookie Bang was born and raised in South Korea. She graduated from Seoul National University in 1974 and earned a Ph.D. in microbiology from the University of California at Davis in 1981. As a professor and researcher at the South Dakota School of Mines and Technology, her specialty is bioremediation—for example, using bacteria as an ingredient in a sealant to fix cracks caused by weathering and by freezing water that seeps into the concrete outer surfaces of buildings. Bang and her colleagues figured out how to speed up a naturally occurring process in which bacteria extract nitrogen from urea, which produces carbon dioxide and ammonia as byproducts. The CO2 and ammonia then react with water and calcium to form calcium carbonate, the chemical compound that we know as limestone. The patch created by the bacterial process seals the crack from the inside out and integrates with the porous concrete, repairing the crack. In essence, the concrete becomes self-healing.
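For the chemically curious, here’s a simplified sketch of the reactions at work, the textbook version of microbially induced limestone formation rather than Bang’s exact formulation:

$$
\mathrm{CO(NH_2)_2 + H_2O \;\longrightarrow\; 2\,NH_3 + CO_2}
$$

The ammonia raises the pH of the water in the crack, the dissolved carbon dioxide becomes carbonate, and the carbonate combines with calcium:

$$
\mathrm{Ca^{2+} + CO_3^{2-} \;\longrightarrow\; CaCO_3}
$$

That final product, calcium carbonate, is the limestone that seals the crack.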
Another Boomer name you need to know is Dean Kamen, who was born in Long Island, N.Y., in 1951. You may not know who he is, but I guarantee you know at least one of his inventions.
In the early 2000s, Kamen attracted media attention because investors were knocking each other over to be the first to fund “Project Ginger.” The project was highly secretive, but when the veil was finally lifted, the world was stunned when it was introduced to the Segway Transporter. The device incorporates sophisticated electronics and a gyroscope that allow it to self-balance, and moves, stops and turns based on subtle changes in the driver’s posture. Today, the Segway’s progeny include the ubiquitous “hover boards” that every kid seems to have. But Kamen’s invention also led to the development of an extraordinary device that has changed the lives of thousands of people: a remarkable wheelchair that, thanks to its gyros, can convert from a standard four-wheel chair to a two-wheel chair, in the process lifting the occupant up to eye level with an adult. It can even climb stairs.
But Kamen was an inventor long before he created the Segway. While he was still a college student at Worcester Polytechnic Institute in 1972, he invented a wearable device called the ambulatory infusion pump. It changed the lives of diabetics, freeing them from having to worry about injecting themselves with insulin. The pump did it for them.
But he didn’t stop there. After creating the ambulatory infusion pump, Kamen went after a solution for patients with severe kidney disease who had to travel to dialysis centers for the treatments they needed to survive. He invented a portable machine that allowed patients to give themselves dialysis treatments at home, while sleeping. In 1993, it was named Medical Product of the Year.
The list goes on: flexible foot prostheses, artificial skin grafts, innovative battery designs, and plenty of others, all created by experienced, gifted innovators and inventors—and dare I say it, with a small bit of pride, Baby Boomers.
The truth is, every generation yields its own crop of gifted people who make important contributions to science, engineering, the arts, medicine, and society at-large. But without the contributions of those who came before, nothing we enjoy today would exist. The Boomers stood on the shoulders of giants from the Greatest and Silent Generations, just as Gen-X, the Millennials and Gen-Z stand on Boomer shoulders, and just as the next generations to arrive will stand on theirs. It’s easy to criticize those who came before, but it’s also not much of a stretch to recognize that the current generations of any era wouldn’t be where they are or have what they have without them. So instead of looking for the failures of prior generations, maybe we all need to take a moment to recognize their successes—and how those successes benefit us. Of course, if you still want to blame the Boomers for the Internet, mobile telephony, and the commercial success of the global semiconductor industry that makes literally EVERYTHING work, I guess I’m good with that.
*A note before you begin to read: This is a long post; if you’d rather listen to it, you can find it at the Natural Curiosity Project Podcast.
Part I
LIFE IS VISUAL, so I have an annoying tendency to illustrate everything—either literally, with a contrived graphic or photo, or through words. So: try to imagine a seven-sided polygon, the corners of which are labeled curiosity, knowledge, wisdom, insight, data, memory, and human will. Hovering over it, serving as a sort of conical apex, is time.
Why these eight words? A lifetime of living with them, I suppose. I’m a sucker for curiosity; it drives me, gives my life purpose, and gives me a decent framework for learning and applying what I learn. Knowledge, wisdom, insight, and data are ingredients that arise from curiosity and that create learning. Are they a continuum? Is one required before the next? I think so, but that could just be because of how I define the words. Data, to me, is raw ore, a dimensionless precursor. When analyzed, which means when I consider it from multiple perspectives and differing contexts, it can yield insight—it lets me see beyond the obvious. Insight, then, can become knowledge when applied to real-world challenges, and knowledge, when well cared for and spread across the continuum of a life of learning, becomes wisdom. And all of that yields learning. And memory? Well, keep listening.
Here’s how my model came together and why I wrestle with it.
Imagine an existence where our awareness of ‘the past’ does not exist, because our memory of any action disappears the instant that action takes place. In that world, a reality based on volatile memory, is ‘learning,’ perhaps defined as knowledge retention, possible? If every experience, every gathered bit of knowledge, disappears instantly, how do we create experience that leads to effective, wisdom-driven progress, to better responses the next time the same thing happens? Can there even be a next time in that odd scenario, or is everything that happens to us essentially happening for the first time, every time it happens?
Now, with that in mind, how do we define the act of learning? It’s more than just retention of critical data, the signals delivered via our five senses. If I burn myself by touching a hot stove, I learn not to do it again because I form and retain a cause-effect relationship between the hot stove, the act of touching it, and the pain the action creates. So, is ‘learning’ the process of applying retained memory that has been qualified in some way? After all, not all stoves are hot.
Sometime around 500 BC, the Greek playwright Aeschylus observed that “Memory is the mother of all wisdom.” If that’s the case, who are we if we have no memory? And I’m not just talking about ‘we’ as individuals. How about the retained memory of a group, a community, a society?
Is it our senses that give us the ability to create memory? If I have no senses, then I am not sentient. And if I am not sentient, then I can create no relationship with my environment, and therefore have no way to respond to that environment when it changes around me. And if that happens, am I actually alive? Is this what awareness is, comprehending a relationship between my sense-equipped self and the environment in which I exist? The biologist in me notes that even the simplest creatures on Earth, the single-celled Protozoa and Archaea, learn to respond predictably to differing stimuli.
But I will also observe that while single-celled organisms routinely ‘learn,’ many complex multi-celled organisms choose not to, even though they have the wherewithal to do so. Many of them currently live in Washington, DC. A lifetime of deliberate ignorance is a dangerous thing. Why, beyond the obvious? Because learning is a form of adaptation to a changing environment—call it a software update if you’re more comfortable with that. Would you sleep well at night, knowing that the antivirus software running on your computer is a version from 1988? I didn’t think so. So, why would you deliberately choose not to update your personal operating system, the one that runs in your head? This is a good time to heed the words often attributed to Charles Darwin: It is not the strongest that survive, nor the most intelligent, but those that are most adaptable to change. Homo sapiens, consider yourselves placed on notice.
Part II
RELATED TO THIS CONUNDRUM IS EPISTEMOLOGY—the philosophy that wrestles with the limits of knowledge. Those limits don’t come about because we’re lazy; they come about because of physics.
From the chemistry and physics I studied in college, I learned that the convenient, simple diagram of an atom that began to appear in the 1950s is a myth. Electrons don’t orbit the nucleus of the atom in precise paths, like the moon orbiting the Earth or the Earth orbiting the Sun. Instead, they occupy fuzzy regions determined by how much energy they have and by their distance from the powerfully attractive nucleus. The closer they are, the more strongly they’re held by the electromagnetic force, the same force that holds atoms, and everything made of atoms, together. But as atoms get bigger, as they add positively-charged protons and charge-less neutrons to the densely-packed nucleus, and layer upon layer of negatively charged electrons to balance the nuclear charge, an interesting thing happens. With each added layer, the nucleus’s grip on the outermost electrons weakens with distance, making those electrons less ‘sticky’ and more easily pulled away.
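For those who want to see the arithmetic behind ‘weakens with distance,’ here’s a quick back-of-the-envelope sketch in Python. It’s my own illustration using Coulomb’s law, not something lifted from a chemistry text, and it deliberately ignores the messier effects (like inner layers of electrons screening the nuclear charge), which I only flag in a comment.

```python
# A back-of-the-envelope sketch: Coulomb's law says the pull between the
# nucleus and an electron falls off with the square of the distance,
# F = k * q1 * q2 / r^2. Doubling or tripling the distance cuts the
# attraction to a quarter or a ninth -- and that's before the inner
# electron layers screen part of the nuclear charge (not modeled here).

K = 8.9875e9            # Coulomb constant, N*m^2/C^2
E_CHARGE = 1.602e-19    # elementary charge, C
BOHR_RADIUS = 5.29e-11  # meters, a handy yardstick for atomic distances

def attraction(distance_m: float, nuclear_charge: int = 1) -> float:
    """Electrostatic force (newtons) between a bare nucleus and one electron."""
    return K * (nuclear_charge * E_CHARGE) * E_CHARGE / distance_m**2

for multiple in (1, 2, 3, 5):
    force = attraction(multiple * BOHR_RADIUS)
    print(f"at {multiple} Bohr radius(es): {force:.2e} N")
# The force drops as 1/r^2: 1x, 1/4, 1/9, 1/25 of the innermost value.
```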
This might be a good time to make a visit to the Periodic Table of the Elements. Go pull up a copy and follow along.
Look over there in the bottom right corner. See all those elements with the strange names and big atomic numbers—Americium, Berkelium, Einsteinium, Lawrencium? Those are the so-called transuranium elements, and they’re not known for their stability (their oversized nuclei are radioactive, for one thing, but that’s another story). If one of their distant electrons is attracted away for whatever reason, what’s left is an atom with an imbalance: a net positive charge. That’s an unstable ion that wants to get back to a stable state, a tendency described by the Second Law of Thermodynamics and a measure of disorder called entropy, which we’ll discuss shortly. The statistical behavior of those loosely held electrons is also the heart of the strange and wonderful field known as Quantum Mechanics.
This is not a lesson in chemistry or nuclear physics, but it’s important to know that those electrons are held within what physicists call orbitals, which are statistically-defined energy constructs. We know, from the work done by scientists like Werner Heisenberg, who was a physicist long before he became a drug dealer, that an electron, based on how far it is from the nucleus and therefore how much energy it has, lies somewhere within an orbital. The orbitals, which can take on a variety of three-dimensional shapes that range from a single sphere to multiple pear-shaped spaces to a cluster of balloons, define atomic energy levels and are stacked and interleaved so that they surround the nucleus. So, the orbital that’s closest to the nucleus is called the 1s orbital, and it’s shaped like a sphere. In the case of Hydrogen, element number one in the Periodic Table, somewhere within that orbital is a single lonely electron. We don’t know precisely where it is within the 1s orbital at any particular moment; we just know that it’s somewhere within that mathematically-defined sphere. This is what the Heisenberg Uncertainty Principle is all about: we can never know both where an electron is and how fast, and in what direction, it’s moving at the same moment; the more precisely we pin down one, the fuzzier the other becomes. And we never will, because that’s a limit built into nature, not into our instruments. We just know that, statistically, the electron is somewhere inside that spherical space.
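If you’d like to see what ‘somewhere inside that sphere’ looks like in actual numbers, here’s a small, illustrative Python sketch built on the standard textbook formula for hydrogen’s 1s orbital. Treat it as a sketch of the idea, not a quantum mechanics course.

```python
import numpy as np

# The textbook 1s state of hydrogen: the electron has no fixed position, only
# a probability density. The radial probability P(r) = 4*pi*r^2 * |psi|^2
# tells you how likely you are to find the electron at distance r from the
# nucleus. It peaks at the Bohr radius (~0.0529 nm) -- the most probable
# distance, not a guaranteed one.

A0 = 0.0529  # Bohr radius, in nanometers

def radial_probability(r_nm: np.ndarray) -> np.ndarray:
    """Radial probability density for the hydrogen 1s state."""
    psi_squared = np.exp(-2.0 * r_nm / A0) / (np.pi * A0**3)
    return 4.0 * np.pi * r_nm**2 * psi_squared

r = np.linspace(0.001, 0.4, 4000)
p = radial_probability(r)
most_probable_r = r[np.argmax(p)]
print(f"Most probable electron distance: {most_probable_r:.4f} nm (the Bohr radius)")
# The curve never quite hits zero: there is always *some* chance of finding
# the electron farther out. That's the statistical fuzziness in a nutshell.
```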
Which brings us back to epistemology, the field of science (or is it philosophy?) that tells us that we can never know all that there is to know, that there are defined limits to human knowledge. Here’s an example. We know beyond a shadow of a doubt that the very act of observing the path of an electron changes the trajectory of that electron, which means that we can never know what its original trajectory was before we started observing it. The best we can do is describe the electron’s behavior statistically, and that is exactly what a complex mathematical formula called Schrödinger’s Equation does.
Look it up, study it, there will be a test. The formula, which won its creator, Erwin Schrödinger, the Nobel Prize in 1933, details the statistical behavior of a particle within a defined space, like an energy-bound atomic orbital. It’s considered the fundamental equation of quantum mechanics, the family of physics that Albert Einstein helped launch and then famously never quite trusted. In essence, we don’t know, we can’t know, what the state of a particle is at any given moment, which implies that the particle can exist, at least according to Schrödinger, in two different states, simultaneously. This truth lies at the heart of the new technology called quantum computing. In traditional computing, a bit (Binary Digit) can have one or the other of two states: zero or one. But in quantum computing, we leave bits behind and transact things using Qubits (quantum bits), which can be zero, one, or both zero and one at the same time. Smoke ‘em if you got ‘em.
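For the software-minded, here’s a toy Python sketch of that ‘zero, one, or both’ business. It simulates the arithmetic of a single idealized qubit; it isn’t real quantum hardware, and it isn’t drawn from any particular quantum-computing library.

```python
import numpy as np

# A classical bit is 0 or 1. A qubit is described by two complex amplitudes,
# one for |0> and one for |1>; the squared magnitudes of those amplitudes are
# the probabilities of seeing each outcome when you measure.

rng = np.random.default_rng(42)

# An equal superposition: "both zero and one at the same time,"
# until a measurement forces it to pick.
qubit = np.array([1 / np.sqrt(2), 1 / np.sqrt(2)], dtype=complex)

probabilities = np.abs(qubit) ** 2          # [0.5, 0.5]
samples = rng.choice([0, 1], size=10, p=probabilities)

print("P(0), P(1):", probabilities)
print("ten simulated measurements:", samples)
# Each line above just samples from the probabilities; in a real qubit,
# measuring would also collapse the state to the value you observed.
```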
The world isn’t neat and tidy where it matters: it’s sloppy and ill-defined and statistical. As much as the work of Sir Isaac Newton described a physical world defined by clear laws of gravity, and velocity, and acceleration, and processes that follow clearly-defined, predictable, deterministic outcomes, Schrödinger’s, Heisenberg’s, and Einstein’s works say, not so fast. At the atomic level, the world doesn’t work that way.
I know—you’re lighting up those doobies as you read this. But this is the uncertainty, the necessary inviolable unknown that defines science. Let me say that again, because it’s important. Uncertainty Defines Science. It’s the way of the universe. Every scientific field of study that we put energy into, whether it’s chemistry, pharmacology, medicine, geology, engineering, genetics, or a host of others, is defined by the immutable Laws of Physics, which are governed by the necessary epistemological uncertainties laid down by people like Werner Heisenberg and Erwin Schrödinger, and stress-tested, none too happily, by Albert Einstein.
Part III
ONE OF MY FAVORITE T-SHIRTS SAYS,
I READ.
I KNOW SHIT.
I’m no physicist. Not by a long shot. But I do read, I did take Physics and Chemistry, and I was lucky enough to have gone to Berkeley, where a lot of this Weird Science was pioneered. I took organic chemistry from a guy who was awarded a Nobel Prize, co-discovered more than a few elements, and had one of them named after him (Glenn Seaborg), and botany from the guy who worked out how photosynthesis turns carbon dioxide into sugar and also had a Nobel Prize (Melvin Calvin). I know shit.
But the most important thing I learned and continue to learn, thanks to those grand masters of knowledge, is that uncertainty governs everything. So today, when I hear people criticizing scientists and science for not being perfect, for sometimes being wrong, for not getting everything right all the time, for not having all the answers, my blood boils, because they’re right, but for the wrong reasons. Science is always wrong—and right. Schrödinger would be pleased with this duality. It’s governed by the same principles that govern everything else in the universe. Science, which includes chemistry, pharmacology, medicine, geology, engineering, genetics, and all the other fields that the wackadoodle pseudo-evangelists so viciously criticized during the pandemic, and now continue to attack, can’t possibly be right all the time because the laws of the universe fundamentally prevent us from knowing everything we need to know to make that happen. Physics doesn’t come to us in a bento box wrapped in a ribbon. Never in the history of science has it ever once claimed to be right. It has only maintained that tomorrow it will be more right than it is today, and even more right the day after that. That’s why scientists live and die by the scientific method, a process that aggressively and deliberately pokes and prods at every result, looking for weaknesses and discrepancies. Is it comfortable for the scientist whose work is being roughed up? Of course not. But it’s part of being a responsible scientist. The goal is not for the scientist to be right; the goal is for the science to be right. There’s a difference, and it matters.
This is science. The professionals who practice it, study it, probe it, spend their careers trying to understand the rules that govern it, don’t work in a world of absolutes that allows them to design buildings that won’t fail, develop drugs that work one hundred percent of the time, offer medical diagnoses that are always right, and predict violent weather with absolute certainty. No: they live and work in a fog of uncertainty, a fuzzy world that comes with no owner’s manual, yet with that truth before them, and accepting the fact that they can never know enough, they do miraculous things. They have taken us to the stars, created extraordinary energy sources, developed mind-numbingly complex genetic treatments and vaccines, and cured disease. They have created vast, seamless, globe-spanning communications systems and the first glimmers of artificial intelligence, and they have demonstrated beyond doubt that humans play a major role in the fact that our planet is getting warmer. They have identified the things that make us sick, and the things that keep us well. They have helped us define ourselves as a sentient species.
And, they are pilloried by large swaths of the population because they’re not one hundred percent right all the time, an unfair expectation placed on their shoulders by people who have no idea what the rules are under which they work on behalf of all of us.
Here’s the thing, for all of you naysayers and armchair critics and nonbelievers out there: Just because you haven’t taken the time to do a little reading to learn about the science behind the things that you so vociferously criticize and deny, just because you choose deliberate ignorance over an updated mind, doesn’t make the science wrong. It does, however, make you lazy and stupid. I know shit because I read. You don’t know shit because you don’t. Take a lesson from that.
Part IV
THIS ALSO TIES INTO WHAT I BELIEVE to be the most important statement ever uttered by a sentient creature, and it begins at the liminal edges of epistemological thought: I am—the breathtaking moment of self-awareness. Does that happen the instant a switch flips and our senses are activated? If epistemology defines the inviolable limits of human knowledge, then what lies beyond those limits? Is human knowledge impeded at some point by a hard-stop electric fence that prevents us from pushing past the limits? Is there a ‘here be dragons’ sign on the other side of the fence, warning us against going farther? I don’t think so. For some, that limit is the place where religion and faith take over the human psyche when the only thing that lies beyond our current knowledge is darkness. For others, it stands as a challenge: one more step moves us closer to…what, exactly?
A thinking person will experience a moment of elegance here, as they realize that there is no fundamental conflict between religious faith and hardcore science. The two can easily coexist without conflict. Why? Because uncertainty is alive and well in both. Arthur C. Clarke: Any sufficiently advanced technology is indistinguishable from magic.
Part V
THIS BRINGS ME TO TIME, and why it sits at the apex of my seven-sided cone. Does time as we know it only exist because of recallable human memory? Does our ability to conceive of the future only exist because, thanks to accessible memory and a perception of the difference between a beginning state and an end state, of where we are vs. where we were, we perceive the difference between past and present, and recognize that the present is the past’s future, but also the future’s past?
Part VI
SPANISH-AMERICAN WRITER AND PHILOSOPHER George Santayana is famous for having observed that ‘those who fail to heed the lessons of history are doomed to repeat them.’ It’s a failing that humans are spectacularly good at, as evidenced by another of Santayana’s aphorisms—that ‘only the dead have seen the end of war.’ I would observe that in the case of the first quote, ‘heed’ means ‘to learn from,’ not simply ‘to notice.’ But history, by definition, means learning from things that took place in the past, which means that if there is no awareness of the past, then learning is not possible. So, history, memory, and learning are, to steal from Douglas Adams, the author of The Hitchhiker’s Guide to the Galaxy, “inextricably intertwingled” (more on that phrase later). And if learning can’t happen, does that then mean that time, as we define it, stops? Does it become dimensionless? Is a timeless system the ultimate form of entropy, the tendency of systems to seek the maximum possible state of disorder, including static knowledge? Time, it seems, implies order, a logical sequence of events that cannot be changed. So, does entropy seek timelessness? Professor Einstein, white courtesy telephone, please.
The Greek word chronos defines time as a physical constant, as in, I only have so much time to get this done. Time is money. Only so much time in a day. 60 seconds per minute, 60 minutes per hour, 24 hours per day. But the Greeks have a second word, kairós, which refers to the quality of time, of making the most of the time you have, of savoring time, of using it to great effect. Chronos, it seems, is a linear and quantitative view of time; kairós is a qualitative version.
When I was a young teenager, I read a lot of science fiction. One story I read, a four-book series called Cities in Flight by novelist James Blish (who later, with his wife, adapted the original Star Trek television episodes into books), is the tale of Earth and its inhabitants in the far distant future. The planet’s natural resources have been depleted by human rapaciousness, so entire cities lift off from Earth using a form of anti-gravity technology called the graviton polarity generator, or spindizzy for short, and become independent competing entities floating in space.
In addition to the spindizzy technology, the floating cities have something called a stasis field, within which time does not exist. If someone is in imminent danger, they activate a stasis field that surrounds them, and since time doesn’t exist within the field, whatever or whoever is in it cannot be hurt or changed in any way by forces outside the field. It’s an interesting concept, which brings me to a related topic.
One of my favorite animals, right up there with turtles and frogs, is the water bear, also called a tardigrade (and, charmingly by some, a moss piglet). They live in the microscopically tiny pools of water that collect on the dimpled surfaces of moss leaves, and when viewed under a microscope look for all the world like tiny living gummy bears.
Tardigrades can undergo what is known as cryptobiosis, a physiological process by which the animal can protect itself from extreme conditions that would quickly kill any other organism. Basically, they allow all the water in their tiny bodies to completely evaporate, in the process turning themselves into dry, lifeless little husks that biologists call tuns. Water bears have been exposed to the extreme heat of volcanoes, the extreme cold of Antarctica, and intense nuclear radiation inside power plants; they have been placed outside on the front stoop of the International Space Station for days on end, then brought inside, with no apparent ill effects. Despite the research into their ability to survive such lethal environments, we still don’t really know how they do it. Uncertainty.
But maybe I do know. Perhaps they have their own little stasis field that they can turn on and off at will, in the process removing time as a factor in their lives. Time stops, and if life can’t exist without time, then they can’t be dead, can they? They become like Qubits, simultaneously zero and one, or like Schrödinger’s famous cat, simultaneously dead and alive.
Part VII
IN THE HITCHHIKER’S GUIDE TO THE GALAXY, Douglas Adams uses the phrase I mentioned earlier and that I long ago adopted as one of my teaching tropes. It’s a lovely phrase that just rolls off the tongue: “inextricably intertwingled.” It sounds like a wind chime when you say it out loud, and it makes audiences laugh when you use it to describe the interrelatedness of things.
The phrase has been on my mind the last few days, because its meaning keeps peeking out from behind the words of the various things I’ve been reading. Over the last seven days I’ve read a bunch of books from widely different genres—fiction, biography, science fiction, history, philosophy, nature essays, and a few others that are hard to put into definitive buckets.
There are common threads that run through all of the books I read, and not because I choose them as some kind of a confirmationally-biased reading list (how could Loren Eiseley’s The Immense Journey, Arthur C. Clarke’s The Songs of Distant Earth, E. O. Wilson’s Tales from the Ant World, Malcolm Gladwell’s Revenge of the Tipping Point, Richard Feynman’s Surely You’re Joking, Mr. Feynman!, and Studs Terkel’s And They All Sang possibly be related, other than the fact that they’re books?). Nevertheless, I’m fascinated by how weirdly connected they are, despite being so very, very different. Clarke, for example, devotes a whole passage in The Songs of Distant Earth to teleology, a term I’ve known forever but have never bothered to look up. It means explaining a phenomenon in terms of its perceived purpose rather than its cause, as if the purpose were its reason for occurring. For example, in the wilderness, lightning strikes routinely spark forest fires, which burn uncontrolled, in the process cleaning out undergrowth, reducing the large-scale fire hazard, but doing very little harm to the living trees, which are protected by their thick bark—unless they’re unhealthy, in which case they burn and fall, opening a hole in the canopy that allows sunlight to filter to the forest floor, feeding the seedlings that fight for their right to survive, leading to a healthier forest. So it would be easy to conclude that lightning exists to burn forests. But that’s a teleological conclusion that focuses on purpose rather than cause. Purpose implies intelligent design, which violates the scientific method because it’s subjective and speculative. Remember—there’s no owner’s manual.
The initial cause of lightning is wind. The vertical movement of wind that precedes a thunderstorm causes negatively charged particles to gather near the base of the cloud cover, and positively charged particles to gather near the top, creating an incalculably high energy differential between the two. But nature, as they say, abhors a vacuum, and one of the vacuums it detests is the accumulation of potential energy. Natural systems always run downhill toward lower energy and higher entropy, the greatest possible state of disorder. I mentioned this earlier; it’s a physics thing, the Second Law of Thermodynamics. As the opposing charges in the cloud grow (and they are massive—anywhere from 10 to 300 million volts and up to 30,000 amps), those opposite charges are inexorably drawn together, like opposing poles of a gigantic magnet (or the positively charged nuclei and negatively charged electrons of an atom), and two things can happen. The energy stored between the “poles” of this gigantic aerial magnet—or, if you prefer, battery—discharges within the cloud, causing what’s usually called sheet lightning (and, colloquially, heat lightning), a ripple of intense energy that flashes across the sky. Or, the massive negative charge in the base of the cloud can be attracted to positive charges on the surface of the Earth—tall buildings, antenna towers, trees, the occasional unfortunate person—and lightning happens.
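To put those ‘massive’ numbers into perspective, here’s a rough back-of-the-envelope calculation in Python. The voltage and current come from the figures above; the stroke duration is an assumption I’m adding purely for illustration, so treat the answer as an order-of-magnitude sketch rather than a meteorology-textbook value.

```python
# Rough sketch: how much energy is in a single lightning stroke?
# Voltage and current are from the ranges quoted above; the duration is an
# assumed figure (a return stroke lasts on the order of tens of microseconds).

voltage_volts = 100e6        # 100 million volts (middle of the 10-300 MV range)
current_amps = 30_000        # 30,000 amps
duration_seconds = 30e-6     # ~30 microseconds, assumed

power_watts = voltage_volts * current_amps          # instantaneous power
energy_joules = power_watts * duration_seconds      # energy in one stroke

print(f"Peak power: {power_watts:.1e} W")            # ~3e12 W (terawatts)
print(f"Energy per stroke: {energy_joules:.1e} J")   # ~9e7 J
print(f"...or roughly {energy_joules / 3.6e6:.0f} kilowatt-hours")  # ~25 kWh
```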
It’s a full-circle entropic event. When a tree is struck and a fire starts, the architectural order that has been painstakingly put into place in the forest by nature is rent asunder. Weaker trees fall, tearing open windows in the canopy that allow sunlight to strike the forest floor. Beetles and fungi and slugs and mosses and bacteria and nematodes and rotifers consume the fallen trees, rendering them to essential elements that return to the soil and feed the healthy mature trees and the seedlings that now sprout in the beams of sunlight that strike them. The seedlings grow toward the sunlight; older trees become unhealthy and fall; order returns. Nature is satisfied. Causation, not purpose. Physics, not intelligent design. Unless, of course, physics is intelligent design. But we don’t know. Uncertainty.
E. O. Wilson spends time in more than one of his books talking about the fact that individuals will typically act selfishly in a social construct, but that groups of individuals in a community will almost always act selflessly, doing what’s right for the group. That, by the way, is the difference between modern, unregulated capitalism and what botany professor Robin Wall Kimmerer calls “the gift economy” in her wonderful little book, The Serviceberry. This is not some left-leaning, unicorn and rainbows fantasy: it’s a system in which wealth is not hoarded by individuals, but rather invested in and shared with others in a reciprocal fashion, strengthening the network of relationships that societies must have to survive and flourish. Kimmerer cites the story of an anthropologist working with a group of indigenous people who have just enjoyed a particularly successful hunt. The anthropologist is puzzled: they now have a great deal of meat but nowhere to keep it cold so that it won’t spoil. “Where will you store it to keep it fresh for later?” the anthropologist asks. “I store it in my friends’ bellies,” the man replies, equally puzzled by the question. This society is based on trust, on knowing that the shared meat will be repaid in kind. It is a social structure based on strong bonds—kind of like atoms. Bonds create stability; lone, unbonded particles are far less stable.
In fact, that’s reflected in many of the science fiction titles I read: that society’s advances come about because of the application of the common abundance of human knowledge and will. Individuals acting alone rarely get ahead to any significant degree, and if they do, it’s because of an invisible army working behind them. But the society moves ahead as a collective whole, with each member contributing. Will there be those who don’t contribute? Of course. It’s a function of uncertainty and the fact that we can never know with one hundred percent assurance how an individual within a group will behave. There will always be outliers, but their selfish influence is always neutralized by the selfless focus of the group. The behavior of the outlier does not define the behavior of the group. ‘One for one and none for all’ has never been a rallying call.
Part VIII
THIS ESSAY APPEARS TO WANDER, because (1) it wanders and (2) it connects things that don’t seem to be connected at all, but that clearly want to be. Learning doesn’t happen when we focus on the things; it happens when we focus on the connections between the things. The things are data; the connections create insight, which leads to knowledge, wisdom, action, a vector for change. Vector—another physics term. It refers to a quantity that has both direction and magnitude. The most powerful vector of all? Curiosity.
Science is the only tool we have. It’s an imperfect tool, but it gets better every time we use it. Like it or not, we live in a world, in a universe, that is defined by uncertainty. Science is the tool that helps us bound that uncertainty, define its hazy distant edges, make the unclear more clear, every day. Science is the crucible in which human knowledge of all things is forged. It’s only when we embrace that uncertainty, when we accept it as the rule of all things, when we revel in it and allow ourselves to be awed by it—and by the science-based system that allows us to constantly push back the darkness—that we begin to understand. Understand what, you say? Well, that’s the ultimate question, isn’t it?
I’m standing on the front porch because a thunderstorm is passing through, and the sky is as dark and green as the back of a catfish. If there’s a more satisfying experience out there, I honestly don’t know what it is. The hiss of rain, the random chiming of leaves, downspouts, puddles, and flower pots as the raindrops fall, the crackle and crash of thunder—it’s nature’s best symphony. And the light—I’ve always believed that the light during a thunderstorm is something you can taste. It’s more than visible; thunderstorm light glows, from within, and it comes from everywhere and nowhere.
The best part of a thunderstorm, of course, is when it ends—not because it’s over, which I always regret, but because it leaves behind a scent trail, that amazing smell, the breath of the storm, that proves that it’s alive. That smell, which we usually call ozone, isn’t ozone at all, at least not totally. It’s a very different chemical compound that I’ll introduce you to in a minute. But first, because I brought it up, let me tell you a bit about ozone, because it is a pretty important chemical.
Ozone is a weird form of oxygen. Oxygen is normally a diatomic molecule, meaning that two oxygen atoms combine to form the gas that we breathe, O2. Ozone, on the other hand, is O3, a much less stable molecule.
Everybody knows about the ozone layer up there. Well, that layer exists because ultraviolet energy from space strikes the oxygen in the upper atmosphere, changing O2 to O3 and creating a layer or shell of ozone that does a very good job of shielding us from all that UV radiation that would otherwise fry us into little masses of melanoma. At least, it protects us until we do dumb human things, like release chlorofluorocarbons that chemically eat holes in the ozone layer and let all that nasty UV energy through.
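For the curious, the textbook version of how O2 becomes O3, known as the Chapman mechanism, happens in two steps; I’m sketching it here only because the paragraph above compresses it into one:

```latex
% The textbook two-step picture (the Chapman mechanism) -- my sketch, not a quote:
\begin{align*}
\mathrm{O_2} + h\nu &\longrightarrow \mathrm{O} + \mathrm{O}
  &&\text{(ultraviolet light splits an oxygen molecule into two atoms)}\\
\mathrm{O} + \mathrm{O_2} + M &\longrightarrow \mathrm{O_3} + M
  &&\text{(a freed atom joins another oxygen molecule; $M$ is any bystander molecule that carries off the excess energy)}
\end{align*}
```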
The ozone layer sits about 30 kilometers above the surface of the planet, and in spite of its name, the concentration of ozone up there is only about eight parts-per-million; the rest is just ordinary air, mostly nitrogen and regular oxygen. But it’s that oxygen that absorbs ultraviolet energy to become the ozone that protects the planet’s surface from most of the effects of harmful radiation. And while ozone has beneficial effects in the atmosphere, they’re not all that beneficial down here on earth. It’s known to reduce crop yields when there’s too much of it in the air at ground level, and because it’s such a powerful oxidant, it can be extremely irritating to noses, throats and lungs. It can also cause cracks in rubber and plastics, and in at least one study, it’s been shown to make arterial plaque, the fatty buildup that can lead to heart attack and stroke, worse. Talk about a love-hate relationship.
So, let’s talk about what we were originally discussing before I diverted us—and that was the wonderful smell that takes over everything after a rainstorm, that smell that makes us inhale deeply and feel good about life in general.
As it turns out, that smell doesn’t come from ozone—at least not exclusively. Ozone may be in the air if there was lightning during the rainstorm, but the chemical you’re mostly smelling is called Geosmin. You smell it after a rain, or in wet dirt that you’re digging up in the garden. The smell is so recognizable, and so wonderful, that it even has a name—Petrichor. The word comes from two Greek roots: petra, meaning stone, and ichor, the substance said to flow in the veins of the gods.
So, where does Geosmin come from? Well, it turns out that it’s created as a by-product when certain soil bacteria, chiefly the actinomycetes (including the prolific genus Streptomyces) and some cyanobacteria, have their way with organic material. As they break it down, Geosmin is released. So, it’s naturally occurring, and in fact contributes to the flavor of beets, spinach, lettuce, mushrooms, even that wonderful, earthy taste of catfish. Sometimes it can be overpowering when too much of it gets into water supplies, and while it isn’t harmful, it can temporarily give water an earthy, musty taste.
Here’s one last, interesting thing about Geosmin and its Petrichor aroma. Human noses are extremely sensitive to the smell of Petrichor, in fact, more sensitive to it than just about any other compound. We can detect it in concentrations of five parts per trillion. To put that into perspective, for the human nose to detect methanol, a fairly pungent alcohol, it has to be present in concentrations of a billion parts-per-trillion. That’s quite a difference. And why are we so amazingly sensitive to it? Well, some scientists believe that that sensitivity has been genetically selected, because it allowed our distant ancestors to find water, even in the driest places on earth. No wonder it smells so good—it helped keep us alive.
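To make ‘quite a difference’ concrete, here’s the arithmetic as a tiny Python sketch, using nothing but the two threshold figures quoted above.

```python
# The two detection thresholds quoted above, both in parts per trillion.
geosmin_threshold_ppt = 5
methanol_threshold_ppt = 1e9   # "a billion parts per trillion"

ratio = methanol_threshold_ppt / geosmin_threshold_ppt
print(f"Our noses are roughly {ratio:,.0f} times more sensitive to geosmin than to methanol")
# That works out to a factor of about 200 million.
```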
I’m a writer, which means that I’m also a serious reader. I like to say that writing is my craft; reading is my gym. And one author whose books have meant a lot to me—in fact, I’d consider him a mentor, even though we’ve never met—is a guy named John McPhee. If his books are any indication, he’s a ferociously curious guy. They all fall into the genre that I love, which is called creative nonfiction. It includes writers like William Least Heat-Moon, Bill Bryson, Annie Dillard, and of course, John McPhee. Creative nonfiction means writing about subjects that are real, but that incorporate storytelling into the narrative. In creative nonfiction, adjectives are legal.
I first ran across McPhee’s work when I took a writing workshop back in the 90s from William Least Heat-Moon, the inspiring author of one of my all-time favorite books, Blue Highways. One of John McPhee’s books, Coming Into the Country, was required reading for the workshop. It’s about homesteaders in Alaska, back in the days when the Alaska government would give land to people in exchange for their agreement to homestead it. Boring, you say? Well, consider the story of the guy who drove an old school bus up there. When he got reasonably close to the land he had acquired as part of his homesteading agreement, he parked the school bus, took a cutting torch to it, and cut off the top. He then turned the former top upside down like an overturned turtle’s shell, and drove the school bus-turned-convertible onto it. Once there, he welded the two together, attached a long shaft with a propeller on one end to the drive shaft of the school bus, shoved his contraption into the river, started the engine, and motored a few hundred miles to his newly acquired homestead. See what I mean? Story. It’s everything.
McPhee has written about a breathtaking range of topics. He wrote Annals of the Former World, in which he took a series of road trips across the United States with a geologist, looking at freeway roadcuts to understand the dynamic geology of North America, and in the process, writing a magnificent book about the geology of the continent. He wrote The Pine Barrens, the story of the great pine forests that cover most of southern New Jersey, and the people who live there. He wrote Uncommon Carriers, about the world of cargo carriers—all kinds—that form the basis of the global supply chain. He wrote Oranges, about the business of growing and selling them in Florida. He wrote Encounters with the Archdruid, about the interactions between conservationists and those they see as the enemy. And he wrote The Curve of Binding Energy, the story of Theodore Taylor, an early nuclear engineer who was also an anti-nuclear activist.
By the way, here’s a quote from Annals of the Former World, a book two-and-a-half inches thick, that shows what kind of a writer McPhee is: “If by some fiat I had to restrict all this writing to one sentence, this is the one I would choose: The summit of Mount Everest is marine limestone.” Think about that.
So far, John McPhee has written more than 30 books, and I’ve read them all. I can honestly say that each one has made me a measurably better writer and thinker. But the book that really stuck with me, more than any of the others, is called The Control of Nature. That book has been in my head a lot lately as I watch what’s going on in California specifically, with the damage caused by heavy rains and flooding, and in the country and the world in general, as climate change has its way with us.
The Control of Nature is divided into three sections: ‘Atchafalaya’; ‘Cooling the Lava’; and ‘Los Angeles Against the Mountains’. Each section tells a story of human hubris, of our largely futile efforts to make nature do something that nature doesn’t want to do—like changing the direction of the Mississippi River, or trying to redirect lava flows in places like Hawaii and Iceland away from population centers (Iceland pumped cold seawater onto one of its flows), or protecting Los Angeles infrastructure from damage caused by flooding by building flood canals, like the concrete-lined LA River. How’s that working out?
Some of you may remember a quote that I toss out a lot. It’s from Loren Eiseley, another of my favorite writers. Back in the 60s, Loren said, “When man becomes greater than nature, nature, which created us, will respond.” Well, she’s responding. And one of the lessons we can choose to learn from her response is that this is not a time for head-to-head combat. I used to tell my SCUBA diving students that it doesn’t matter how strong a swimmer you are, or how good a diver you are, the ocean is always stronger. The ocean will win, every time. So don’t even try. Discretion is the better part of valor, and to ignore that fact can be fatal.
As I said, this is not a time for head-to-head combat. Nature vs. Humanity cannot be a boxing match, because the outcome is predetermined, whether we like it or not. News flash: We don’t win this one. This is more a time for martial arts, in which we use our opponent’s weight and strength to work in our favor. Nature is telling us what to do, every day. We just seem to have a problem listening. ‘You’re not the boss of me,’ we say. ‘No, actually, you have that backward,’ nature says. ‘Here—let me demonstrate.’
The other flaw in the logic is that we have this tendency to think in terms of ‘us vs. nature,’ of ‘humans vs. the natural world,’ when in fact, we’re as much a part of the natural world as blue whales and chickadees and earthworms and slime molds. We just don’t act like it. By viewing ourselves as something apart from nature, as something better than or superior to nature, we invoke Loren Eiseley again. Nature is responding to our abuse, to our attempt to dominate, and her response is swift, sure, and painful.
So, what’s the alternative? The alternative is to shift our thinking from ‘us vs. nature’ to ‘us as an integral part of nature.’ Nice words. But, what do they mean? How do they become real, or actionable, as people like to say in the business world?
The answer is simpler than most people realize, although it requires deliberate action. There’s that word again—deliberate. The answer isn’t one great, big thing, because if that were the case, nothing would ever change. Here’s an example for the techies. Think about it: what’s more powerful, a single mainframe computer, or hundreds of personal computers or servers networked together? The answer, of course, is the latter. Here, though, instead of talking about computers, we’re talking about one-person efforts on behalf of the environment of which we are a part, efforts that, in aggregate, add up to enormously powerful results. The whole is greater than the sum of its parts. For example, if you live in a house, you probably have a yard, which means that you probably have grass, and shrubs, and trees, and flowering plants, and other things to make it look good. The problem is that most of those are non-native, which means that they’re not always good for local pollinators, like bees and moths and butterflies, or for other local wildlife, spiders included. But if each of us were to set aside an area in the back corner of the yard the size of a typical walk-in closet, say, eight feet by ten feet, that’s eighty square feet that can be allowed to grow wild with native plants, which provide habitat, including food, for native pollinators. I guarantee that if you go down to your local nursery, or Audubon Center, you can buy a shaker bottle full of native plant seeds that you can take and shake over your designated area.
Here’s another one. We often use broad-spectrum insecticides to get rid of insect pests, which they do very well. But those neonicotinoid-based compounds are indiscriminate—they also kill beneficial creatures like bees, butterflies, moths, and spiders, along with birds, reptiles, and amphibians, and potentially humans, if they leach into the water supply—and they do. So, why not switch to environmentally friendly compounds? They’re out there, and yes, they may cost a little bit more, but not enough to be a showstopper, especially when you consider the alternative. I don’t want to be yet another alarmist here—there are more than enough of them already—but consider this: pollinators aren’t a nice-to-have thing. Bees, moths, butterflies, and even some birds move pollen from flower to flower, a process that’s required for the flower to give rise to fruit. No bees, no pollination. No pollination, no fertilization. No fertilization, no fruits or vegetables. So think twice, please, about using that insecticide.
Other things? There are lots of them. Buy soaps and detergents in bulk, and refill the same bottle over and over, to reduce plastic consumption. Buy one of those showerheads that allow you to turn down the water pressure to a warm trickle when you don’t need the full force of the blast. An efficient showerhead still puts out about two to two-and-a-half gallons of water per minute, and over the course of a year of showering that really adds up, so any effort to conserve falls on the correct side of the environmental balance sheet. You don’t have to turn the shower off; just turn it down. It makes a huge difference.
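Here’s a rough Python sketch of what ‘really adds up’ means. The flow rate comes from the figure above; the shower length, the daily habit, and the ‘turned down’ fraction are assumptions I’m adding purely for illustration, not figures from any study.

```python
# Back-of-the-envelope: how much water does the shower habit move in a year?
flow_gpm = 2.5               # gallons per minute at full blast (figure from above)
shower_minutes = 8           # assumed average shower
showers_per_year = 365       # assumed one per day

full_blast_gallons = flow_gpm * shower_minutes * showers_per_year
print(f"Full blast all year: {full_blast_gallons:,.0f} gallons")   # ~7,300 gallons

# Now assume the water is turned down to a trickle (say 0.5 gpm) for half of
# each shower, while lathering up.
trickle_gpm = 0.5
mixed_gallons = (flow_gpm * shower_minutes / 2 + trickle_gpm * shower_minutes / 2) * showers_per_year
print(f"Turned down half the time: {mixed_gallons:,.0f} gallons")
print(f"Water saved: {full_blast_gallons - mixed_gallons:,.0f} gallons a year")  # ~2,900 gallons
```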
What else? Set the thermostat in winter one degree cooler and buy a sweater or that cool hoodie you’ve been jonesing for. There’s your excuse! Think before you get in the car to run that errand. Are you close enough to walk instead? I do it every day, a few miles each way, and I feel so much better for it.
Another thing you can do is buy as much locally produced food as you can. I’m about to write a whole series of essays on the role that technology can play to help the environment, but just consider this. California can no longer feed the nation. The state has depleted its deep groundwater aquifers to the point that the ground in the Central Valley is measurably sinking, and the drought is making it necessary for farmers to uproot fruit and nut trees and many crops, because of the great volumes of water they consume—water that’s no longer available, or if it is, it’s too salty to use. But even if California CAN ship produce across the country, we know that that takes its toll on the environment because of the trucks and planes required to do it, and freshness is a concern. We also know that there have been outbreaks of disease—salmonella and listeria—associated with large-scale farming.
Local produce, on the other hand, is much fresher, it tastes better, it’s safer, and it supports a local farmer. And yes, you’re probably going to pay a little more, but how much is your health worth?
I’m not channeling Chicken Little here. The sky isn’t falling, but it’s a lot lower than it used to be. And before the naysayers climb all over me, yes, I know that some of the current climate change effects we’re experiencing are happening as a matter of the natural course of things. But I also know, because the science overwhelmingly shows it, that we’re doing a lot of things that are making it worse, things that, through minor but deliberate efforts, we could change without a whole lot of personal impact. But there’s that ‘deliberate’ word again—meaning, let’s stop talking, and wringing our hands, and putting the bumper sticker on the car that says ‘save the bees,’ or wearing the ‘May the Forest Be With You’ T-shirt. Those are all fine. But a little real, deliberate action to go with them would go a very long way.
In other episodes, and in my leadership workshops, I often talk about the danger and ineffectiveness of slogan leadership—you know, putting up those motivational posters that show a crew of people on a misty river at sunrise, in a rowing scull, with the word ‘teamwork’ across the bottom. Or a person standing on top of a mountain, arms raised in celebration, silhouetted against the sunset, with the word ‘commitment’ across the bottom of the poster. That’s slogan leadership, and while the pictures are pretty, it’s a form of responsibility abdication. So, let’s not abdicate—let’s do. It shows the other corners of the natural world that we’re willing to make an effort to play well with others, and it sends the right message to our kids and grandkids.
We can’t control nature, but we can harness her awesome power to help clean up our act, like a martial arts master does against a stronger opponent. As someone who spends an awful lot of time in the natural world, I’d much rather have nature as my ally than my enemy. It’s a choice. And it’s our move.
Years ago, while still living in California, I began my writing career by submitting feature articles to local magazines in the San Francisco Bay Area. For some reason, I always gravitated toward offbeat subject matter, which apparently made my stories interesting – and desirable.
One day, at the request of my editor, I sat down to write a feature story about one of the local towns in our area. But as I started writing, it occurred to me that I really didn’t know what a feature story was, even though I’d been writing them for several years. Wikipedia, by the way, defines a feature story as a “human interest” story that is not typically tied to a recent news event. They usually discuss concepts or ideas that are specific to a particular market, and are often pretty detailed.
Anyway, I grabbed the dictionary off the shelf (this was years before the Web, and digital dictionaries were still a dream), and searched the Fs for ‘feature.’ I read the entry, satisfied my need to know, and as I started to close the book, that’s when I saw it. Directly across the gutter (that’s what they call the middle of the open book where two pages come together) was the word ‘feces.’
Now I’m a pretty curious guy, so I wasn’t going to let this go. Needless to say, I know what feces is, but what was really interesting were the words at the bottom of the definition. The first one said, ‘See scat.’ So I turned to the Ss and looked up scat, and it turned out to be the word that wildlife biologists use for animal droppings. But wait, as they say, there’s more. THAT definition told me to see also, Scatologist. (You’ve got to be kidding me). But I did. You guessed it—someone who studies, well, scat.
An owl pellet (scat) from a friend’s collection.
So I called the biology department at my undergraduate alma mater, the University of California at Berkeley. When somebody answered the phone, I asked, ‘Do you have a … scatologist on staff?’ ‘Of course,’ she replied, ‘let me connect you to Dan.’ The next thing I knew I was talking with Dan, a very interesting guy, so interesting, in fact, that the next weekend I was with him in the hills, collecting owl pellets and the droppings of other animals to determine such things as what they eat, what parasites they might have, how predation of certain species affects populations of others, and so on. It was FASCINATING.
Remember that what got me started down this rabbit hole was the search for feature, which led me to feces. Well, right underneath the suggestion that I also see scatologist, it said, see also, coprolite. This was a new word for me, so off to the Cs I went, in search of it.
My very own coprolite.
A coprolite is, and I’m not making this up, a fossilized dropping, in this case a dinosaur’s. A paleo-scat, as it were. I have one on my desk. OF COURSE I have one on my desk. Anyway, once again, I got on the phone, and this time I called the paleontology department at Berkeley, and soon found myself talking to a coprologist – yes, there is such a person. How do you explain THAT at a dinner party? Anyway, he agreed to meet with me, and once again I had one of those rare and wonderful days, learning just how fascinating the stuff is that came out of the north end of a south-bound dinosaur. He showed me how they slice the things on a very fine diamond saw and then examine them under a high-power microscope to identify the contents, just as the scatologist did with owl pellets and coyote scat.
Think about this for a moment. If I hadn’t allowed myself to fall prey to serendipity (Wikipedia defines it as “a happy accident” or “a pleasant surprise”), I never would have met those remarkable people, and never would have written what turned out to be one of the most popular articles I’ve ever written.
Another time, my wife and I were out walking the dogs in a field near our house. At one point, I turned around to check on the dogs and saw one of them rolling around on his back the way all dogs do when they find something disgustingly smelly. Sure enough, he had found the carcass of some recently dead animal, too far gone to identify but not so far gone that it didn’t smell disgusting. I dragged him home with my wife following about 30 feet behind and gave him the bath of baths to eliminate the smell. Anyway, once he smelled more or less like a dog again I felt that old curiosity coming on, so I went downstairs to my office and began to search Google for the source of that horrible smell that’s always present in dead things. And I found it.
In case you care.
The smell actually comes from two chemicals, both of which are so perfectly named that whoever named them clearly had a good time doing so. The first of them is called cadaverine; the second, putrescine. Can you think of better names for this stuff? Interestingly, putrescine is used industrially to make a form of nylon.
So what’s the point of this wandering tale? Storytellers are always looking for sources, and the question I get more often than any other is about the source of my stories. That question, of course, has lots of answers, but in many cases I find stories because I go looking for them while leaving my mind open to the power of serendipity. For this reason, I personally believe that the best thing about Wikipedia is the button on the left side of the home page that says, “Random Article.” I use it all the time, just to see where it takes me.
Curiosity is everything. I just wish there was more of it in the world.
Occasionally, I run across something that I just can’t ignore. Sunday morning was one of those times when my curiosity about the natural world just couldn’t be contained.
My wife Sabine and I had gone out for a walk. As we rounded the front of our house and passed under the canopy of an apple tree grove in our front yard, Sabine pointed at the mulch and said, “OK, that’s just gross. Some dog puked in the yard.” She was right: there was a big pile of yellow goo spread out on the mulch, about 20 inches across. It WAS pretty disgusting-looking, so I promised to clean it up when we got back.
When we did get back a few hours later, I grabbed a shovel to take care of the slime under the apple tree, but when I got out there I stopped dead in my tracks. Why? Because I SWEAR it was bigger, taller, and I kid you not, closer to the apple tree. In fact, there was now a blob of the yellow stuff on the side of one of the trees. Clearly this was not something that came out of a retching dog. But here’s the REALLY weird thing. When I went to pick some of it up with the shovel, a cloud of what looked like smoke erupted from it.
So I decided to leave it where it was. I went inside, grabbed my iPad, and searched for “yellow slime on mulch.” Instantly, I was rewarded with a photograph of my slime: Fuligo septica, otherwise known as Dog Vomit Slime Mold. Scientists must have a blast naming things—that has to be a high point when they discover something new. And by the way, it’s also called scrambled egg slime and flowers of tan. In Mexico, they do, in fact, scramble them like eggs and eat them. In Spanish, they’re called caca de luna, which means … well, caca is the Spanish word for what comes out of the north end of a south-bound dog, and luna means moon. So this is moon sh—well, you know. I’ve tried them in pueblos south of Mexico City, and they’re not bad—kind of nutty.
Anyway, slime molds are fascinating. They fall into a category called myxomycetes, which comes from two Greek words meaning “mucus fungus.” Yummy combination—that’s not much better than dog vomit. Anyway, the interesting thing about slime molds is that they pass through a developmental phase called a plasmodium. During the plasmodium phase, the cells that make up the organism rearrange themselves into a single, gigantic cell with millions of nuclei that can weigh as much as 45 pounds. They’re not plants, and they’re not animals—they’re something in between.
By the way, the smoke that came out of the thing when I nudged it with my shovel was a cloud of spores, on their way to propagate the species.
Here’s the other interesting thing about slime molds. They move. As in, they crawl. And how fast, you ask? Well, brace yourself: about an inch a day. That means that … never mind. You don’t want to go to sleep thinking about that. Just be sure to lock the door. In 1973, down in Dallas, people panicked when these things erupted in their gardens. They didn’t know what they were, and they thought it was an alien invasion. Of course, this WAS Dallas, and obviously there were too many people watching Invasion of the Body Snatchers that week.
I know that the vast majority of you couldn’t care less about slime molds, especially those that have ‘vomit’ and ‘mucus’ in their names. But you do have to admit that this is kind of interesting.
I feel like Egon Spengler in Ghostbusters: “I collect spores, molds, and fungus.”