The Research Myth

I recently had a conversation about technology’s impact on the availability and quality of information in the world today. It’s an argument I could make myself—that tech-based advances have resulted in access to more data and information. For example, before the invention of moveable type and the printing press, the only books that were available were chained to reading tables in Europe’s great cathedrals—they were that rare and that valuable. Of course, it was the information they contained that held the real value, an important lesson in today’s world where books are banned from modern first world library shelves because an ignorant cadre of adults decides that young people aren’t mature enough to read them—when it’s the adults who lack the maturity to face the fact that not everybody thinks the same way they do in this world, and that’s okay. But, I digress.  

Image of chained books in Hereford Cathedral. Copyright Atlas Obscura.

When moveable type and the printing press arrived, book manuscripts no longer had to be copied by hand—they could be produced in large quantities at low cost, which meant that information could be made available to far more people than ever before. To the general population—at least, the literate ones—this was a form of freedom. But to those who wanted to maintain a world where books were printed once and kept chained to desks where only the privileged few (the clergy) could read them, the free availability of knowledge and information was terrifying. Apparently, it still is. Knowledge is, after all, the strongest form of power. How does that expression go again? Oh yeah: Freedom of the Press…Freedom of Expression…Freedom of Thought…Sorry; I digress. Again.

Fast-forward now through myriad generations of technology that broadened information’s reach: The broadsheet newspaper, delivered daily, sometimes in both morning and evening editions. The teletype. Radio. The telephone. Television. The satellite, which made global information-sharing a reality. High-speed photocopying. High-speed printing. The personal computer and desktop publishing software. Email. Instant messaging and texting. And most recently, on-demand printing and self-publishing through applications like Kindle Direct, and of course, AI, through applications like ChatGPT. I should also mention the technology-based tools that have dramatically increased literacy around the world, giving people the gift of reading, a gift that brings countless others with it.

The conversation I mentioned at the beginning of this essay took a funny turn when the person I was chatting with tried to convince me that access to modern technologies makes the information I can put my hands on today infinitely better and more accurate. I pushed back, arguing that technology is a gathering tool, like a fishing net. Yes, a bigger net can result in a bigger haul. But it also yields more bycatch, the stuff that gets thrown back. I don’t care about the information equivalents of suckerfish and slime eels that get caught in my net. I want the albacore, halibut, and swordfish. The problem is that my fishing net—my data-gathering tool—is indiscriminate. It gathers what it gathers, and it’s up to me to separate the good from the bad, the desirable from the undesirable.

What technology-based information-gathering does is make it easy to rapidly get to AN answer, not THE answer.

The truth is, I don’t have better research tools today than I had in the 70s when I was in college. Back then I had access to multiple libraries—the Berkeley campus alone had 27 of them. I could call on the all-powerful oracle known as the reference librarian. I had access to years of the Reader’s Guide to Periodical Literature. I had Who’s Who, an early version of Wikipedia; and of course, I had academic subject matter experts I could query. 

Technology like AI doesn’t create higher quality research results; what technology gives me is speed. As an undergraduate studying Romance Languages, I would often run across a word I didn’t know. I’d have to go to the dictionary, a physical book that weighed as much as a Prius, open it, make my way to the right page, and look up the word—a process that could take a minute or more. Today, I hover my finger over the word on the screen and in a few seconds I accomplish the same task. Is it a better answer? No; it’s exactly the same. It’s just faster. In an emergency room, speed matters. In a research project, not so much. In fact, in research, speed is often a liability.

Here’s the takeaway from this essay. Whether I use the manual tools that were available in 1972 (and I often still do, by the way), or Google Scholar, or some other digital information resource, the results are the same—not because of the tool, but because of how I use what the tool generates. I’ve often said in my writing workshops that “you can’t polish a turd, but you can roll it in glitter.” You may have written the first draft of an essay, selected a pleasing font, justified the text left and right, and added some lovely graphics, but it’s still a first draft—a PRETTY first draft, but a first draft, nonetheless. It isn’t anywhere near finished.

The same principle applies to research or any other kind of news or information-gathering activity. My widely cast net yields results, but some of those results are bycatch—information that’s irrelevant, dated, or just plain wrong. It doesn’t matter why it’s wrong; what matters is that it is. And this is where the human-in-the-loop becomes very important. I go through the collected data, casting aside the bycatch. What’s left is information. To that somewhat purified result I add a richness of experience, context, skepticism, and perspective. From that I generate insight, then knowledge, and ultimately, wisdom.

So again, technology provides a fast track to AN answer, but it doesn’t in any way guarantee that I’ve arrived at anything close to THE answer. Only the secret channels and dark passages and convoluted, illuminated labyrinths of the human brain can do that. 

So yeah, technology can be a marvelous tool. But it’s just a tool. The magic lies in the fleshware, not the hardware. Technology is only as good as the person wielding it. 

The Generational Blame Game

It’s a fundamental aspect of human nature, I believe, for each generation to criticize the generation that preceded it, often using them as a convenient scapegoat for all that’s wrong in the world. The current large target is my own generation, the Baby Boomers. I recently overheard a group of young people—mid-20s—complaining at length about their belief that the Boomers constitute a waste of flesh who never contributed much to society. Respectfully, I beg to differ; this is my response, along with a plea to ALL generations to think twice about how they characterize those who came before.

Millennials, sometimes called Gen-Y, and the Plurals, commonly referred to as Gen-Z, often blame Baby Boomers for the state of the world: the growing wealth imbalance, the violence and unpredictability of climate change, the multifaceted aftermath of COVID and its impact on the supply chain, and the world’s growing political and cultural divisions—in essence, the world sucks and Boomers are to blame. They often proclaim Boomers to be a generation that contributed little of value to the world. This, of course, is a long-standing social convention: blame the old people, who apparently don’t even realize how dumb, useless, and ineffective they are.

On the other hand, there’s a lot of admiration out there for the current Millennial über meisters of Silicon Valley—people like Mark Zuckerberg, Brian Chesky (AirBnB), Alexandr Wang (Scale AI), and Arash Ferdowsi (Dropbox). They deserve admiration for their accomplishments, but they didn’t create Silicon Valley—not by a long shot. The two generations that came before them did that.

But let’s consider the boring, stumbling, mistake-prone Boomers. You know them; they include such incompetent, non-contributing members of society as Bill Gates, the Steves, Jobs and Wozniak, Peggy Whitson, who recently retired as Chief Astronaut at NASA, Larry Ellison, who founded Oracle, Oprah Winfrey, creator of a breathtakingly influential media empire, Marc Benioff, founder of Salesforce, Reid Hoffman, co-founder of LinkedIn, and Radia Perlman, creator of the Spanning Tree Protocol, the rule set that keeps the Ethernet networks connecting the Internet’s 25 billion devices, give or take a few hundred million, from tying themselves in loops. And I won’t even bother to mention Tim Berners-Lee, the creator of the World Wide Web.

What a bunch of losers.

But there may be a reason for the dismissal of an entire generation’s contributions to the world that goes beyond the tradition of putting elders on a literal or figurative ice floe and shoving them off to sea. I find it interesting that the newest arrivals on the generational scene judge the value of a generation’s contributions based on the applications that generation created. All hail Facebook, X, Instagram, Uber, Amazon, AirBnB, Google, Tencent, AliBaba, TikTok, GitHub, and Instacart, the so-called platform companies. Those applications are the “public face” of massive and incomprehensibly complex technological underpinnings, yet rarely does anyone make time today for a scintilla of thought about what makes all of those coveted applications—ALL of them—work. In fact, none of them—NONE of them—would exist without two things: the myriad computers (including mobile devices) on which they execute, and the global network that gives them life and makes it possible for them to even exist.

The tail wags the dog here: without the network, these applications could not function. Want some proof? The only time the vast majority of people on the planet are even aware of the network’s existence is when it breaks, which is seldom. But when it does? When ice or wind bring down aerial transmission cables, when a car takes out a phone pole, when fire destroys critical infrastructure and people can’t mine their precious likes on Facebook, when there’s a long weekend and everybody is home downloading or gaming or watching and the network slows to a glacial crawl, technological Armageddon arrives. Heart palpitations, panting, sweating, and audible keening begin, as people punch futilely at the buttons on their devices. But consider this: the global telephone network has a guaranteed uptime of 99.999 percent. In the industry, that’s called five-nines of reliability. And what does that mean in English? It means that on average, the phone network—today, the Internet—is unavailable to any given user for a little over five minutes a year. In a standard year, there are 525,600 minutes. For about five of those every year, the network hiccups. Take a moment to think about that.
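If you want to check that arithmetic yourself, here is a quick back-of-the-envelope sketch (plain Python, purely illustrative):

```python
# Minutes of downtime per year implied by a given availability ("number of nines").
MINUTES_PER_YEAR = 365 * 24 * 60  # 525,600 minutes in a standard year

def downtime_minutes(availability: float) -> float:
    """Expected minutes of downtime per year at a given availability."""
    return MINUTES_PER_YEAR * (1 - availability)

for label, availability in [("three nines", 0.999),
                            ("four nines", 0.9999),
                            ("five nines", 0.99999)]:
    print(f"{label}: {downtime_minutes(availability):.2f} minutes of downtime per year")

# five nines works out to about 5.26 minutes of downtime per year
```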

When we think back on famous scientists and innovators, who comes to mind? Well, people like Alexander Graham Bell, of course, who invented the telephone, but who also invented the world’s first wireless telephone, called the photophone—and yes, it worked; or Thomas Edison, who became famous for the invention of the lightbulb, but actually invented many other things, and who was awarded 2,332 patents and founded 14 companies, including General Electric; the Wright Brothers, who flew successfully at Kitty Hawk; Watson and Crick, who discovered the DNA double helix and created a path to modern genetics and treatments for genetic disease; Bardeen, Brattain and Shockley, unknown names to most people, but names attached to the three scientists at Bell Telephone Laboratories who invented the transistor; Philo T. Farnsworth, the creator of television; and Marie Curie, who did pioneering research on radioactivity. These are all famous names from the late 1800s all the way through the 1960s.

But then, there’s a great twenty-year leap to the 1980s, the time when Generation X came into its own. Movies were made about this generation, some of the best ever: Ferris Bueller’s Day Off. The Breakfast Club. Home Alone. Sixteen Candles. St. Elmo’s Fire. Clerks. The Lost Boys. The Karate Kid. Gen-X was a widely criticized generation, an ignored, under-appreciated, self-reliant, go-it-alone generation of entrepreneurs that includes Jeff Bezos of Amazon fame, Sheryl Sandberg of Facebook, Sergey Brin of Google, Meg Whitman of Hewlett-Packard, Travis Kalanick of Uber, and dare I say it, Elon Musk. All major contributors to the world’s technology pantheon, some as inventors, some as innovators. The power of the Internet to allow data aggregation and sharing made it possible for platform companies like Uber, eBay, Facebook and Google to exist. Those weren’t inventions, they were innovations (and to be sure, exceptional innovations!), built on top of pre-existing technologies.

Even the much-talked-about creations of Elon Musk aren’t inventions. Let’s look at Starlink, the SpaceX constellation of orbiting communication satellites. A satellite comprises radio technology to make it work; solar cells to power it; semiconductors to give it a functional brain; and lasers to allow each satellite to communicate with others. All of those technologies—ALL of them—were pioneered or perfected at Bell Labs in the 1940s and 1950s. In fact, Telstar, the first satellite to relay live television signals across the Atlantic, was created at Bell Labs and launched into orbit in 1962—more than 60 years ago.

That 20-year leap between the 60s and the 80s conveniently ignores an entire generation and its contributions to the world—not just techno-geeks, but content and entertainment and media people who redefined our perception of the world. This was the time of the Baby Boomers, and while you may see us—yes, I am one—as an annoying group of people that you wish would just go away, you might want to take a moment to recognize the many ways my generation created the lifestyle enjoyed by Millennials and Gen-Z—and took steps to ensure that it would endure.

The thing about Boomer researchers, scientists, and innovators was that with very few exceptions, they were happy to work quietly behind the scenes. They didn’t do great big things exclusively for money or power; they did them because they were the right things to do, because they wanted to leave the world a better place for those who came later. And they did, in more ways than you can possibly imagine.

Let’s start with the inventions and innovations that made possible, among other things, the devices on which you watch, listen or read, and the content they deliver. I know I’ve already mentioned some of these people, but they deserve a few more words. 

Let’s start with the Steves—and no, I don’t mean me. I’m talking about Steve Wozniak and Steve Jobs, who did quite a few things before inventing the iconic Macintosh. Both were born in the 1950s and grew up in the San Francisco Bay Area, and met while they were summer interns at Hewlett-Packard. In 1977, seven years before the Mac, they introduced the world to the Apple II personal computer, which included color graphics, sound, expansion slots, and features that made it the first machine that came close to the capabilities of modern PCs. Later, they introduced what many called the “WIMP Interface,” for windows, icons, menus, and pointers, the hallmarks of what later became the Mac operating system—and ultimately, Windows 95 and the generations of that OS that followed. Incidentally, the incredibly stable, highly dependable Macintosh operating system of today is based on UNIX, an operating system first designed and developed at—you guessed it—Bell Laboratories.

Next we have Sir Tim Berners-Lee, born in London in 1955. He grew up around computers, because his parents were mathematicians who worked on the Ferranti Mark I, the first computer in the world to be sold commercially. He became a software consultant for the CERN Particle Physics Laboratory in Switzerland, which became famous for being the home of the Large Hadron Collider, the machine physicists used to discover the Higgs boson in 2012.

While at CERN in the 1980s, Berners-Lee took on the challenge of organizing and linking all the sources of information that CERN scientists relied on—text, images, sound, and video—so that they would be easily accessible via the newfangled network that had just emerged called the Internet. In the process he came up with the concept for what became the World Wide Web, which he laid out in a proposal he wrote in 1989. Along the way he developed a markup language for creating web pages, called HTML, along with the first web browser, which he made available to everyone, free of charge, in 1991.

Most people think of the Internet and the World Wide Web as the same thing—but they aren’t. The Internet is the underlying transport infrastructure; the Web is an application that rides on top of that infrastructure, or better said, a set of applications, that make it useful to the entire world. 

Next, let me introduce you to Ray Kurzweil, who decided he would be an inventor before he started elementary school. By the time he turned 15, he had built and programmed his own computer to compose music. After graduating from MIT with degrees in computer science and literature, he created a system that enabled computers to read text characters, regardless of the font.

Kurzweil invented many things, but he is perhaps best known for popularizing the concept of the Singularity, the moment when digital computers and the human brain merge and communicate directly with each other. It’s a fascinating idea. A good business PC easily operates at four billion cycles per second. The human brain, on the other hand, operates at about ten cycles per second. But: a digital PC has limited memory, whereas the human brain’s memory is essentially unlimited. So what happens if we combine the blindingly fast clock speed of a PC with the unlimited memory of the human brain? The Singularity. Cue the Twilight Zone music.

Now let me introduce you to Ajay Bhatt. Born in India, he received an undergrad degree in electrical engineering before immigrating to the U.S., where he earned a master’s degree in the same field, working on technology to power the Space Shuttle. After joining Intel in 1990, he had an epiphany while working on his PC one evening. What if, he wondered, peripheral devices could connect to a computer as easily as plugging an electrical cord into a wall socket? Not all that hard, he decided, and he and his colleagues invented the Universal Serial Bus, which we all know as USB.

And then we have one of my favorites, Bob Metcalfe. Another MIT grad with degrees in engineering and management as well as a PhD from Harvard, he joined Xerox’s Palo Alto Research Center, better known as Xerox PARC, a well-respected facility that has been compared to the East Coast’s Bell Labs. While he was there, Metcalfe and his colleagues developed a technique for cheaply and easily connecting computers so that they could share files at high speed. The technology that resulted is called Ethernet, the basis for nearly every connectivity solution in use today in modern computer networks, including WiFi. He went on to found 3Com Corporation, but for me, he will always be most famous for what has come to be known as Metcalfe’s Law: that the value of a mesh network, meaning a network in which every computer can connect to every other computer in the network, increases as a function of the square of the number of devices that are attached. Want that in plain English? When a new computer loaded with data connects to a mesh network, the combined value of all that data and its shared access doesn’t increase in a linear way; it grows with the square of the number of connected machines. Don’t believe it? Look at every one of the so-called platform companies that we discussed earlier: Apple’s App or music store, Uber, Amazon, every single social media company, and for that matter, the telephone network and the World Wide Web itself.
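For the numerically inclined, here is a tiny sketch, my own illustration rather than anything from Metcalfe himself, of how quickly those connections pile up:

```python
# Metcalfe's Law sketch: in a network where any device can reach any other,
# the number of distinct device-to-device links is n*(n-1)/2, so the rough
# "value" of the network grows with the square of the device count.

def possible_connections(n: int) -> int:
    """Distinct pairwise links in a fully connected network of n devices."""
    return n * (n - 1) // 2

for n in (2, 10, 100, 1000):
    print(f"{n:>5} devices -> {possible_connections(n):>8,} possible connections")

# Doubling the number of devices roughly quadruples the number of connections:
# quadratic growth, not linear.
```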

Dr. Robert Jarvik was a prodigy who invented a surgical stapler and other medical devices while he was still a teenager. But then he got serious. While he was an undergraduate student in 1964, his father needed to have heart surgery. That ordeal influenced Jarvik to turn his curiosity, inventiveness and problem-solving skills—along with the medical degree he would later earn at the University of Utah—toward finding a method to keep patients with failing hearts alive until they could receive a transplant. While he wasn’t the first to develop an artificial heart, Jarvik’s 1982 creation, the Jarvik-7, was the first artificial heart to be permanently implanted in a human patient. Today, Jarvik continues to work on a device that can serve as a permanent replacement organ.

Here’s another one, and this one fascinates me. Sookie Bang was born and raised in South Korea. She graduated from Seoul National University in 1974 and earned a Ph.D. in microbiology from the University of California at Davis in 1981. As a professor and researcher at the South Dakota School of Mines and Technology, her specialty is bioremediation—for example, using bacteria as an ingredient in a sealant to fix cracks caused by weathering and by freezing water that seeps into the concrete outer surfaces of buildings. Bang and her colleagues figured out how to speed up a naturally occurring process in which bacteria break down urea, producing carbon dioxide and ammonia as byproducts. The CO2 and ammonia then react with water and calcium to form calcium carbonate, the chemical compound that we know as limestone. The patch created by the bacterial process seals the crack from the inside out and integrates with the porous concrete, repairing the crack. In essence, the concrete becomes self-healing.
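For the chemically curious, the net reactions look roughly like this (a simplified sketch; the real pathway runs through a few intermediate steps I’m glossing over):

$$\mathrm{CO(NH_2)_2 + 2\,H_2O \;\longrightarrow\; 2\,NH_3 + H_2CO_3}$$

$$\mathrm{Ca^{2+} + CO_3^{2-} \;\longrightarrow\; CaCO_3\ (limestone)}$$

The ammonia raises the pH of the water sitting in the crack, the dissolved carbon dioxide supplies the carbonate, and calcium ions in the mix do the rest, precipitating out as limestone inside the crack.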

Another Boomer name you need to know is Dean Kamen, who was born in Long Island, N.Y., in 1951. You may not know who he is, but I guarantee you know at least one of his inventions.

In the early 2000s, Kamen attracted media attention because investors were knocking each other over to be the first to fund “Project Ginger.” The project was highly secretive, but when the veil was finally lifted, the world got its first, stunned look at the Segway Transporter. The device incorporates sophisticated electronics and gyroscopes that allow it to self-balance, and it moves, stops and turns based on subtle changes in the rider’s posture. Today, the Segway’s progeny include the ubiquitous “hover boards” that every kid seems to have. The same self-balancing technology also underpins an extraordinary device that has changed the lives of thousands of people: a remarkable wheelchair that, thanks to its gyros, can convert from a standard four-wheel chair to a two-wheel chair, in the process lifting the occupant up to eye level with an adult. It can even climb stairs.

 But Kamen was an inventor long before he created the Segway. While he was still a college student at Worcester Polytechnic Institute in 1972, he invented a wearable device called the ambulatory infusion pump. It changed the lives of diabetics, freeing them from having to worry about injecting themselves with insulin. The pump did it for them.

But he didn’t stop there. After creating the ambulatory infusion pump, Kamen went after a solution for patients with severe kidney disease who had to travel to dialysis centers for the treatments they needed to survive. He invented a portable machine that allowed patients to give themselves dialysis treatments at home, while sleeping. In 1993, it was named Medical Product of the Year.

The list goes on: flexible foot prostheses, artificial skin grafts, innovative battery designs, and plenty of others, all created by experienced, gifted innovators and inventors—and dare I say it, with a small bit of pride, Baby Boomers.

The truth is, every generation yields its own crop of gifted people who make important  contributions to science, engineering, the arts, medicine, and society at-large. But without the contributions of those who came before, nothing we enjoy today would exist. The Boomers stood on the shoulders of giants from the Greatest and Silent Generations, just as Gen-X, the Millennials and Gen-Z stand on Boomer shoulders, and just as the next generations to arrive will stand on theirs. It’s easy to criticize those who came before, but it’s also not much of a stretch to recognize that the current generations of any era wouldn’t be where they are or have what they have without them. So instead of looking for the failures of prior generations, maybe we all need to take a moment to recognize their successes—and how those successes benefit us. Of course, if you still want to blame the Boomers for the Internet, mobile telephony, and the commercial success of the global semiconductor industry that makes literally EVERYTHING work, I guess I’m good with that.

The Wonderful, Terrible Gift of Science

*A note before you begin to read: This is a long post; if you’d rather listen to it, you can find it at the Natural Curiosity Project Podcast.

Part I

LIFE IS VISUAL, so I have an annoying tendency to illustrate everything—either literally, with a contrived graphic or photo, or through words. So: try to imagine a seven-sided polygon, the corners of which are labeled curiosity, knowledge, wisdom, insight, data, memory, and human will. Hovering over it, serving as a sort of conical apex, is time. 

Why these eight words? A lifetime of living with them, I suppose. I’m a sucker for curiosity; it drives me, gives my life purpose, and gives me a decent framework for learning and applying what I learn. Knowledge, wisdom, insight, and data are ingredients that arise from curiosity and that create learning. Are they a continuum? Is one required before the next? I think so, but that could just be because of how I define the words. Data, to me, is raw ore, a dimensionless precursor. When analyzed, which means when I consider it from multiple perspectives and differing contexts, it can yield insight—it lets me see beyond the obvious. Insight, then, can become knowledge when applied to real-world challenges, and knowledge, when well cared for and spread across the continuum of a life of learning, becomes wisdom. And all of that yields learning. And memory? Well, keep listening.

Here’s how my model came together and why I wrestle with it. 

Imagine an existence where our awareness of ‘the past’ does not exist, because our memory of any action disappears the instant that action takes place. In that world, a reality based on volatile memory, is ‘learning,’ perhaps defined as knowledge retention, possible? If every experience, every gathered bit of knowledge, disappears instantly, how do we create experience that leads to effective, wisdom-driven progress, to better responses the next time the same thing happens? Can there even be a next time in that odd scenario, or is everything that happens to us essentially happening for the first time, every time it happens?

Now, with that in mind, how do we define the act of learning? It’s more than just retention of critical data, the signals delivered via our five senses. If I burn myself by touching a hot stove, I learn not to do it again because I form and retain a cause-effect relationship between the hot stove, the act of touching it, and the pain the action creates. So, is ‘learning’ the process of applying retained memory that has been qualified in some way? After all, not all stoves are hot.

Sometime around 500 BC, the Greek playwright Aeschylus observed that “Memory is the mother of all wisdom.” If that’s the case, who are we if we have no memory? And I’m not just talking about ‘we’ as individuals. How about the retained memory of a group, a community, a society?

Is it our senses that give us the ability to create memory? If I have no senses, then I am not sentient. And if I am not sentient, then I can create no relationship with my environment, and therefore have no way to respond to that environment when it changes around me. And if that happens, am I actually alive? Is this what awareness is, comprehending a relationship between my sense-equipped self and the environment in which I exist? The biologist in me notes that even the simplest creatures on Earth, the single-celled Protozoa and Archaea, learn to respond predictably to differing stimuli.

But I will also observe that while single-celled organisms routinely ‘learn,’ many complex multi-celled organisms choose not to, even though they have the wherewithal to do so. Many of them currently live in Washington, DC. A lifetime of deliberate ignorance is a dangerous thing. Why, beyond the obvious? Because learning is a form of adaptation to a changing environment—call it a software update if you’re more comfortable with that. Would you sleep well at night, knowing that the antivirus software running on your computer is a version from 1988? I didn’t think so. So, why would you deliberately choose not to update your personal operating system, the one that runs in your head? This is a good time to heed the words commonly attributed to Charles Darwin: It is not the strongest that survive, nor the most intelligent, but those that are most adaptable to change. Homo sapiens, consider yourselves placed on notice.

Part II

RELATED TO THIS CONUNDRUM IS EPISTEMOLOGY—the philosophy that wrestles with the limits of knowledge. Those limits don’t come about because we’re lazy; they come about because of physics. 

From the chemistry and physics I studied in college, I learned that the convenient, simple diagram of an atom that began to appear in the 1950s is a myth. Electrons don’t orbit the nucleus of the atom in precise paths, like the moon orbiting the Earth or the Earth orbiting the Sun. They orbit according to how much energy they have, based on their distance from the powerfully attractive nucleus. The closer they are, the stronger they’re held by the electromagnetic force that holds atoms—and most of everyday matter—together. But as atoms get bigger, as they add positively-charged protons and charge-less neutrons in the densely-packed nucleus, and layer upon layer of negatively charged orbiting electrons to balance the nuclear charge, an interesting thing happens. As layers of electrons are added, the strength with which the outermost electrons are held by the nucleus decreases with distance, making them less ‘sticky,’ and the element becomes less stable.

This might be a good time to make a visit to the Periodic Table of the Elements. Go pull up a copy and follow along.

Look over there in the bottom right corner. See all those elements with the strange names and big atomic numbers—Americium, Berkelium, Einsteinium, Lawrencium? Those are the so-called transuranium elements, and they’re not known for their stability. If one of those distant outer electrons is pulled away for whatever reason, what’s left is an atom with an imbalance—a net positive charge. That’s an unstable ion that wants to get back to a stable state, a tendency tied to the Second Law of Thermodynamics and a process called entropy, which we’ll discuss shortly. It’s also where the strange and wonderful field known as Quantum Mechanics comes in.

This is not a lesson in chemistry or nuclear physics, but it’s important to know that those orbiting electrons are held within what physicists call orbitals, which are statistically-defined energy constructs. We know, from the work done by scientists like Werner Heisenberg, who was a physicist long before he became a drug dealer, that an electron, based on how far it is from the nucleus and therefore how much energy it has, lies somewhere within an orbital. The orbitals, which can take on a variety of three-dimensional shapes that range from a single sphere to multiple pear-shaped spaces to a cluster of balloons, define atomic energy levels and are stacked and interleaved so that they surround the nucleus. The orbital that’s closest to the nucleus is called the 1s orbital, and it’s shaped like a sphere. In the case of Hydrogen, element number one in the Periodic Table, somewhere within that orbital is a single lonely electron. We don’t know precisely where it is within the 1s orbital at any particular moment; we just know that it’s somewhere within that mathematically-defined sphere. This is what the Heisenberg Uncertainty Principle is all about: we can never pin down, at the same moment, exactly where an electron is and exactly where it’s going. And we never will. We just know that, statistically, it’s somewhere inside that spherical space.
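For those who want it in symbols, the usual textbook statement of Heisenberg’s principle is this, where $\Delta x$ is the uncertainty in position, $\Delta p$ the uncertainty in momentum, and $\hbar$ the reduced Planck constant:

$$\Delta x \,\Delta p \;\ge\; \frac{\hbar}{2}$$

Squeeze one of those uncertainties toward zero and the other balloons; nature won’t let you have both.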

Which brings us back to epistemology, the field of science (or is it philosophy?) that tells us that we can never know all that there is to know, that there are defined limits to human knowledge. Here’s an example. We know beyond a shadow of a doubt that the very act of observing the path of an electron changes the trajectory of that electron, which means that we can never know what its original trajectory was before we started observing it. The probabilistic behavior behind all of this is captured in a complex mathematical formula called Schrödinger’s Equation.

Look it up, study it, there will be a test. The formula, which won its creator, Erwin Schrödinger, the Nobel Prize in 1933, details the statistical behavior of a particle within a defined space, like an energy-bound atomic orbital. It’s considered the fundamental equation of quantum mechanics, the family of physics whose foundations Albert Einstein helped lay. In essence, we don’t know, we can’t know, what the state of a particle is at any given moment, which implies that the particle can exist, at least according to Schrödinger, in two different states, simultaneously. This truth lies at the heart of the new technology called quantum computing. In traditional computing, a bit (Binary Digit) can have one or the other of two states: zero or one. But in quantum computing, we leave bits behind and compute with qubits (quantum bits), which can be zero, one, or both zero and one at the same time. Smoke ‘em if you got ‘em.
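And if you do go look it up, here is roughly what you will find: the standard time-dependent form of the equation, and the textbook way of writing a qubit’s both-at-once state ($\Psi$ is the wave function, $\hat{H}$ the energy operator, and $\alpha$, $\beta$ are probability amplitudes):

$$i\hbar\,\frac{\partial \Psi}{\partial t} = \hat{H}\,\Psi$$

$$|\psi\rangle = \alpha\,|0\rangle + \beta\,|1\rangle, \qquad |\alpha|^2 + |\beta|^2 = 1$$

Measure the qubit and you get 0 with probability $|\alpha|^2$ or 1 with probability $|\beta|^2$; until you measure, it genuinely carries both.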

The world isn’t neat and tidy where it matters: it’s sloppy and ill-defined and statistical. As much as the work of Sir Isaac Newton described a physical world defined by clear laws of gravity, and velocity, and acceleration, and processes that follow clearly-defined, predictably linear outcomes, Schrödinger’s, Heisenberg’s, and Einstein’s works say, not so fast. At the atomic level, the world doesn’t work that way. 

I know—you’re lighting up those doobies as you read this. But this is the uncertainty, the necessary inviolable unknown that defines science. Let me say that again, because it’s important. Uncertainty Defines Science. It’s the way of the universe. Every scientific field of study that we put energy into, whether it’s chemistry, pharmacology, medicine, geology, engineering, genetics, or a host of others, is defined by the immutable Laws of Physics, which are governed by the necessary epistemological uncertainties laid down by people like Werner Heisenberg and Erwin Schrödinger, and wrestled with by no less a mind than Albert Einstein.

Part III

ONE OF MY FAVORITE T-SHIRTS SAYS,

I READ.

I KNOW SHIT.

I’m no physicist. Not by a long shot. But I do read, I did take Physics and Chemistry, and I was lucky enough to have gone to Berkeley, where a lot of this Weird Science was pioneered. I took organic chemistry from a guy who was awarded a Nobel Prize and helped discover more than a few elements, including the one named after him (Glenn Seaborg), and botany from the guy who discovered how photosynthesis works and also had a Nobel Prize (Melvin Calvin). I know shit.

But the most important thing I learned and continue to learn, thanks to those grand masters of knowledge, is that uncertainty governs everything. So today, when I hear people criticizing scientists and science for not being perfect, for sometimes being wrong, for not getting everything right all the time, for not having all the answers, my blood boils, because they’re right, but for the wrong reasons. Science is always wrong—and right. Schrödinger would be pleased with this duality. It’s governed by the same principles that govern everything else in the universe. Science, which includes chemistry, pharmacology, medicine, geology, engineering, genetics, and all the other fields that the wackadoodle pseudo-evangelists so viciously criticized during the pandemic, and now continue to attack, can’t possibly be right all the time because the laws of the universe fundamentally prevent us from knowing everything we need to know to make that happen. Physics doesn’t come to us in a bento box wrapped in a ribbon. Never in the history of science has it ever once claimed to be right. It has only maintained that tomorrow it will be more right than it is today, and even more right the day after that. That’s why scientists live and die by the scientific method, a process that aggressively and deliberately pokes and prods at every result, looking for weaknesses and discrepancies. Is it comfortable for the scientist whose work is being roughed up? Of course not. But it’s part of being a responsible scientist. The goal is not for the scientist to be right; the goal is for the science to be right. There’s a difference, and it matters.

This is science. The professionals who practice it, study it, probe it, spend their careers trying to understand the rules that govern it, don’t work in a world of absolutes that allow them to design buildings that won’t fail and drugs that will work one hundred percent of the time and to offer medical diagnoses that are always right and to predict violent weather with absolute certainty. No: they live and work in a fog of uncertainty, a fuzzy world that comes with no owner’s manual, yet with that truth before them, and accepting the fact that they can never know enough, they do miraculous things. They have taken us to the stars, created extraordinary energy sources, developed mind-numbingly complex genetic treatments and vaccines, and cured disease. They have created vast, seamless, globe-spanning communications systems, the first glimmer of artificial intelligence, and demonstrated beyond doubt that humans play a major role in the fact that our planet is getting warmer. They have identified the things that make us sick, and the things that keep us well. They have helped us define ourselves as a sentient species.

And, they are pilloried by large swaths of the population because they’re not one hundred percent right all the time, an unfair expectation placed on their shoulders by people who have no idea what the rules are under which they work on behalf of all of us. 

Here’s the thing, for all of you naysayers and armchair critics and nonbelievers out there: Just because you haven’t taken the time to do a little reading to learn about the science behind the things that you so vociferously criticize and deny, just because you choose deliberate ignorance over an updated mind, doesn’t make the science wrong. It does, however, make you lazy and stupid. I know shit because I read. You don’t know shit because you don’t. Take a lesson from that.

Part IV

THIS ALSO TIES INTO WHAT I BELIEVE to be the most important statement ever uttered by a sentient creature, and it begins at the liminal edges of epistemological thought: I am—the breathtaking moment of self-awareness. Does that happen the instant a switch flips and our senses are activated? If epistemology defines the inviolable limits of human knowledge, then what lies beyond those limits? Is human knowledge impeded at some point by a hard-stop electric fence that prevents us from pushing past the limits? Is there a ‘there be dragons here’ sign on the other side of the fence, prohibiting us from going farther? I don’t think so. For some, that limit is the place where religion and faith take over the human psyche when the only thing that lies beyond our current knowledge is darkness. For others, it stands as a challenge: one more step moves us closer to…what, exactly?

A thinking person will experience a moment of elegance here, as they realize that there is no fundamental conflict between religious faith and hardcore science. The two can easily coexist without conflict. Why? Because uncertainty is alive and well in both. Arthur C. Clarke: Any sufficiently advanced technology is indistinguishable from magic.

Part V

THIS BRINGS ME TO TIME, and why it sits at the apex of my seven-sided cone. Does time as we know it only exist because of recallable human memory? Does our ability to conceive of the future only exist because, thanks to accessible memory and a perception of the difference between a beginning state and an end state,  of where we are vs. where we were, we perceive the difference between past and present, and a recognition that the present is the past’s future, but also the future’s past?

Part VI

SPANISH-AMERICAN WRITER AND PHILOSOPHER George Santayana is famous for having observed that ‘those who fail to heed the lessons of history are doomed to repeat them.’ It’s a failing that humans are spectacularly good at, as evidenced by another of Santayana’s aphorisms—that ‘only the dead have seen the end of war.’ I would observe that in the case of the first quote, ‘heed’ means ‘to learn from,’ not simply ‘to notice.’ But history, by definition, means learning from things that took place in the past, which means that if there is no awareness of the past, then learning is not possible. So, history, memory, and learning are, to steal from Douglas Adams, the author of The Hitchhiker’s Guide to the Galaxy, “inextricably intertwingled” (more on that phrase later). And if learning can’t happen, does that then mean that time, as we define it, stops? Does it become dimensionless? Is a timeless system the ultimate form of entropy, the tendency of systems to seek the maximum possible state of disorder, including static knowledge? Time, it seems, implies order, a logical sequence of events that cannot be changed. So, does entropy seek timelessness? Professor Einstein, white courtesy telephone, please.

The Greek word chronos defines time as a measurable quantity, as in, I only have so much time to get this done. Time is money. Only so much time in a day. 60 seconds per minute, 60 minutes per hour, 24 hours per day. But the Greeks have a second word, kairós, which refers to the quality of time, of making the most of the time you have, of savoring time, of using it to great effect. Chronos, it seems, is a linear and quantitative view of time; kairós is a qualitative version.

When I was a young teenager, I read a lot of science fiction. One story I read, a four-book series by novelist James Blish (who, with his wife, adapted the original Star Trek television episodes into books), is the tale of Earth and its inhabitants in the far distant future. The planet’s natural resources have been depleted by human rapaciousness, so entire cities lift off from Earth using a form of anti-gravity technology called the graviton polarity generator, or spindizzy for short, and become independent competing entities floating in space.

In addition to the spindizzy technology, the floating cities have something called a stasis field, within which time does not exist. If someone is in imminent danger, they activate a stasis field that surrounds them, and since time doesn’t exist within the field, whatever or whoever is in it cannot be hurt or changed in any way by forces outside the field. It’s an interesting concept, which brings me to a related topic. 

One of my favorite animals, right up there with turtles and frogs, is the water bear, also called a  tardigrade (and, charmingly by some, a moss piglet). They live in the microscopically tiny pools of water that collect on the dimpled surfaces of moss leaves, and when viewed under a microscope look for all the world like tiny living gummy bears. 

Tardigrades can undergo what is known as cryptobiosis, a physiological process by which the animal can protect itself from extreme conditions that would quickly kill any other organism. Basically, they allow all the water in their tiny bodies to completely evaporate, in the process turning themselves into dry, lifeless little husks. They become what biologists call tuns. Water bears have been exposed to the extreme heat of volcanos, the extreme cold of Antarctica, and intense nuclear radiation inside power plants; they have been placed outside on the front stoop of the International Space Station for days on end, then brought inside, with no apparent ill effects. Despite the research into their ability to survive such lethal environments, we still don’t really know how they do it. Uncertainty.

But maybe I do know. Perhaps they have their own little stasis field that they can turn on and off at will, in the process removing time as a factor in their lives. Time stops, and if life can’t exist without time, then they can’t be dead, can they? They become like Qubits, simultaneously zero and one, or like Schrödinger’s famous cat, simultaneously dead and alive.

Part VII

IN THE HITCHHIKER’S GUIDE TO THE GALAXY, Douglas Adams uses the phrase I mentioned earlier and that I long ago adopted as one of my teaching tropes. It’s a lovely phrase that just rolls off the tongue: “inextricably intertwingled.” It sounds like a wind chime when you say it out loud, and it makes audiences laugh when you use it to describe the interrelatedness of things. 

The phrase has been on my mind the last few days, because its meaning keeps peeking out from behind the words of the various things I’ve been reading. Over the last seven days I’ve read a bunch of books from widely different genres—fiction, biography, science fiction, history, philosophy, nature essays, and a few others that are hard to put into definitive buckets.

There are common threads that run through all of the books I read, and not because I choose them as some kind of a confirmationally-biased reading list (how could Loren Eiseley’s The Immense Journey, Arthur C. Clarke’s The Songs of Distant Earth, E. O. Wilson’s Tales from the Ant World, Malcolm Gladwell’s Revenge of the Tipping Point, Richard Feynman’s Surely You’re Joking, Mr. Feynman!, and Studs Terkel’s And They All Sang possibly be related, other than the fact that they’re books?). Nevertheless, I’m fascinated by how weirdly connected they are, despite being so very, very different. Clarke, for example, writes a whole essay in The Songs of Distant Earth about teleology, a term I’ve known forever but have never bothered to look up. It means explaining a phenomenon by its perceived purpose rather than by its underlying cause. For example, in the wilderness, lightning strikes routinely spark forest fires, which burn uncontrolled, in the process cleaning out undergrowth, reducing the large-scale fire hazard, but doing very little harm to the living trees, which are protected by their thick bark—unless they’re unhealthy, in which case they burn and fall, opening a hole in the canopy that allows sunlight to filter to the forest floor, feeding the seedlings that fight for their right to survive, leading to a healthier forest. So it would be easy to conclude that lightning exists to burn forests. But that’s a teleological conclusion that focuses on purpose rather than cause. Purpose implies intelligent design, which violates the scientific method because it’s subjective and speculative. Remember—there’s no owner’s manual.

The initial cause of lightning is wind. The vertical movement of wind that precedes a thunderstorm causes negatively charged particles to gather near the base of the cloud cover, and positively charged particles to gather near the top, creating an enormous energy differential between the two. But nature, as they say, abhors a vacuum, and one of the vacuums it detests is the accumulation of potential energy. Natural systems always seek a state of entropy—the lowest possible energy state, the highest state of disorder. I mentioned this earlier; it’s a physics thing, the Second Law of Thermodynamics. As the opposing charges in the cloud grow (and they are massive—anywhere from 10 to 300 million volts and up to 30,000 amps), their opposite states are inexorably drawn together, like opposing poles of a gigantic magnet (or the positively charged nuclei and negatively charged electrons of an atom), and two things can happen. The energy stored between the “poles” of this gigantic aerial magnet—or, if you prefer, battery—discharges within the cloud, causing what we sometimes call sheet lightning, a ripple of intense energy that flashes across the sky. Or, the massive negative charge in the base of the cloud can be attracted to positive charges on the surface of the Earth—tall buildings, antenna towers, trees, the occasional unfortunate person—and lightning happens.

It’s a full-circle entropic event. When a tree is struck and a fire starts, the architectural order that has been painstakingly put into place in the forest by nature is rent asunder. Weaker trees fall, tearing open windows in the canopy that allow sunlight to strike the forest floor. Beetles and fungi and slugs and mosses and bacteria and nematodes and rotifers consume the fallen trees, rendering them to essential elements that return to the soil and feed the healthy mature trees and the seedlings that now sprout in the beams of sunlight that strike them. The seedlings grow toward the sunlight; older trees become unhealthy and fall; order returns. Nature is satisfied. Causation, not purpose. Physics, not intelligent design. Unless, of course, physics is intelligent design. But we don’t know. Uncertainty.

E. O. Wilson spends time in more than one of his books talking about the fact that individuals will typically act selfishly in a social construct, but that groups of individuals in a community will almost always act selflessly, doing what’s right for the group. That, by the way, is the difference between modern, unregulated capitalism and what botany professor Robin Wall Kimmerer calls “the gift economy” in her wonderful little book, The Serviceberry. This is not some left-leaning, unicorns-and-rainbows fantasy: it’s a system in which wealth is not hoarded by individuals, but rather invested in and shared with others in a quid pro quo fashion, strengthening the network of relationships that societies must have to survive and flourish. Kimmerer cites the story of an anthropologist working with a group of indigenous people who have just enjoyed a particularly successful hunt; the anthropologist is puzzled that they now have a great deal of meat but nowhere to keep it cold so that it won’t spoil. “Where will you store it to keep it fresh for later?” the anthropologist asks. “I store it in my friends’ bellies,” the man replies, equally puzzled by the question. This society is based on trust, on knowing that the shared meat will be repaid in kind. It is a social structure based on strong bonds—kind of like atoms. Bonds create stability; individual particles do the opposite, because they’re less stable.

In fact, that’s reflected in many of the science fiction titles I read: that society’s advances come about because of the application of the common abundance of human knowledge and will. Individuals acting alone rarely get ahead to any significant degree, and if they do, it’s because of an invisible army working behind them. But the society moves ahead as a collective whole, with each member contributing. Will there be those who don’t contribute? Of course. It’s a function of uncertainty and the fact that we can never know with one hundred percent assurance how an individual within a group will behave. There will always be outliers, but their selfish influence is always neutralized by the selfless focus of the group. The behavior of the outlier does not define the behavior of the group. ‘One for one and none for all’ has never been a rallying call.

Part VIII

THIS ESSAY APPEARS TO WANDER, because (1) it wanders and (2) it connects things that don’t seem to be connected at all, but that clearly want to be. Learning doesn’t happen when we focus on the things; it happens when we focus on the connections between the things. The things are data; the connections create insight, which leads to knowledge, wisdom, action, a vector for change. Vector—another physics term. It refers to a quantity that has both direction and magnitude. The most powerful vector of all? Curiosity.

Science is the only tool we have. It’s an imperfect tool, but it gets better every time we use it. Like it or not, we live in a world, in a universe, that is defined by uncertainty. Science is the tool that helps us bound that uncertainty, define its hazy distant edges, make the unclear more clear, every day. Science is the crucible in which human knowledge of all things is forged. It’s only when we embrace that uncertainty, when we accept it as the rule of all things, when we revel in it and allow ourselves to be awed by it—and by the science-based system that allows us to constantly push back the darkness—that we begin to understand. Understand what, you say? Well, that’s the ultimate question, isn’t it?

When in the Course of Human Events

When in the Course of human events it becomes necessary for one people to dissolve the political bands which have connected them with another and to assume among the powers of the earth, the separate and equal station to which the Laws of Nature and of Nature’s God entitle them, a decent respect to the opinions of mankind requires that they should declare the causes which impel them to the separation.

We hold these truths to be self-evident, that all men are created equal, that they are endowed by their Creator with certain unalienable Rights, that among these are Life, Liberty and the pursuit of Happiness. — That to secure these rights, Governments are instituted among Men, deriving their just powers from the consent of the governed, — That whenever any Form of Government becomes destructive of these ends, it is the Right of the People to alter or to abolish it, and to institute new Government, laying its foundation on such principles and organizing its powers in such form, as to them shall seem most likely to effect their Safety and Happiness. Prudence, indeed, will dictate that Governments long established should not be changed for light and transient causes; and accordingly all experience hath shewn that mankind are more disposed to suffer, while evils are sufferable than to right themselves by abolishing the forms to which they are accustomed. But when a long train of abuses and usurpations, pursuing invariably the same Object evinces a design to reduce them under absolute Despotism, it is their right, it is their duty, to throw off such Government, and to provide new Guards for their future security. 

Those, of course, are Thomas Jefferson’s opening lines of the U.S. Declaration of Independence. I’ll be referring back to them throughout this essay, so keep them close at hand. 

I am not a political scientist, nor am I a historian or sociologist. Here’s what I am: well-educated, with an undergrad degree from UC Berkeley in Spanish, and a minor in Biology; a Masters from St. Mary’s in International Business; and a Doctorate from the Da Vinci Institute in South Africa, where I studied technology and its sociological impacts across the world. I am well read, averaging 140 books per year, including everything from fiction of all kinds, to poetry, history, geography, travel, narrative essay, biography, technology, children’s books, and biology. I’m well-traveled: I spent my teen years in Francisco Franco’s Spain, I’ve lived and worked in more than 100 countries, and one of my favorite genres to read is the travel essay, which gives me insights into places I haven’t had the opportunity to visit. Finally, I’m a professional writer, speaker, and educator, with more than 100 books and hundreds of articles and white papers on the market.

Here’s how this all sugars off (that’s a Vermont term, by the way, that refers to how we boil maple sap to create syrup). First, I’m ferociously curious. I live to ask the question, ‘Why?’ I’m not satisfied simply knowing what something does, or even how. That quality served me well throughout my career as a consulting analyst to corporate executives, who wanted help with their strategic decision-making processes. I’m not satisfied with something because somebody said it—I want to know why, and I want to know that it’s true. That takes work; it means I have to check my sources and dig into the facts before I accept a conclusion. If more people were willing to do that simple thing, to exercise their right, obligation, and responsibility to be healthily skeptical, to respond to a ‘stated fact’ with, “Are you sure about that? I’m just gonna check one more source to verify,” the fake news issue wouldn’t be an issue. Like it or not, believe it or not, the Earth is not flat, vaccines work and do not cause autism, we really have been to the moon, the universe is expanding, and evolution is a fact, not a theory. Science is science for a reason: because by its very definition, the things it proposes have been exhaustively verified through a rigorous, competitive process of validation. It’s not opinion: it’s fact. Period. Furthermore, news is precisely that—news. It isn’t opinion. Yet in the minds of many, the two are seen as one and the same, and far too many people are willing to just accept what they hear or read without question as an undeniable truth. THAT is an abdication of responsibility as a citizen in a free country. It’s an extreme form of laziness, laziness of the worst possible kind.

When we moved to Spain in 1968, we were indoctrinated in the rules of the expatriate road. Do NOT make public comments about the government. Do NOT criticize any public figure. Remember, you’re living under a dictatorship. Be wary of the police; they are not your friends. There were two television channels, one sporadically broadcasting soap operas and cartoons for the kids for about three hours a day, the other what we called the “All Franco, All the Time” channel—hours and hours of the Generalísimo standing on a stage, waving his arms.

Let me be clear: I loved growing up in Europe; Spain will forever be in my blood. The experience played a large part in making me who I am today, a person entranced by languages, diverse cultures, strange foods, and the allure of travel. But it also put me in a place where I developed the intellectual wherewithal to critically compare the USA to other countries, especially when I started traveling extensively for a living.

America’s involvement in Vietnam was just starting to wind down when I started college. Like so many young people, I was critical of our involvement, because there was no logical reason whatsoever that I could discern for our presence there, certainly no tangible return that was worth the loss of life that that ugly war created. But I remained an ardent supporter of the United States, the Shining City on a Hill, in spite of my disagreement about Southeast Asia.

Years later, I became what I am today—writer, teacher, audio producer, photographer, speaker, observer of the world. I’ve worked all over the planet and have had the pleasure and honor to experience more countries, cultures, linguistic rabbit holes, ways of life, and food than most people will ever see. For that I am truly, deeply grateful.

But it hasn’t always been good. I’ve spent time in countries ruled by totalitarian regimes, seeing how people who have no other choice must live, and feeling slightly embarrassed by the fact that I have the choice—the choice—not to live that way. In China, in Tiananmen Square, I was stopped by police and questioned aggressively for an hour because I was carrying a professional-looking camera. In that same country, I was told by the chipper hotel desk clerk when I checked in that I had to ‘register’ my laptop and mobile phone because, as a non-Chinese, I could be bringing in and distributing subversive materials that could be detrimental to the state. In Venezuela, my client would not allow me to go anywhere by myself, and assigned me a round-the-clock bodyguard to keep me out of trouble. In Yugoslavia, while riding in a car on the highway, I was frantically hushed by the other people in the car because they were afraid that my question about life under the current regime might be overheard by people outside the car. I listened and tried to understand the logic of a Russian man, who, when I took him (at his request) to a grocery store in California to see what it was like, stopped halfway down the coffee aisle, turned to me, and asked, “So many coffees! Why don’t they just pick the best one and give us that one?” It took me a few minutes to wrap my head around what he was really saying, and when I did, my hair stood on end. Why would I want them, whoever ‘they’ is, picking my coffee? And in Africa and Australia, and frankly, parts of the American South, I watched as institutionalized racism turned my stomach. In Australia, I got into a cab, and soon after out of a cab, when the driver began spewing racial epithets and talking about the new Abo bars he had installed on his car. Many Australian cars have pipe bumpers on the front that they call “Roo Bars,” referring to the fact that they are designed to keep kangaroos, when struck by the car, from damaging it. ‘Abo’ refers to Aboriginal people—you understand why I got out of the cab. The man was a pig.

So you can imagine why I am hypersensitive to such behaviors, especially when I encounter them at home. There’s a lot to criticize about the United States. Racism, sexism, and ageism are alive and well in America and, by some estimations, once again getting worse. There is a growing income gap, driven by the overzealous forces of capitalism and by manipulation of the rules by certain sectors of society, and it is tearing at the very fabric of the nation. Educationally, we are in a tailspin, and the perceived value of education for its own sake, and of its profound impact on the future of the country, is at an all-time low. For all the rhetoric to the contrary, the skilled trades are still looked down on and often described as ‘what you do if you can’t get a real job.’ What short-sighted, hypocritical garbage. Education, in all its many forms, isn’t a barrier to progress; it’s a gateway that makes it possible.

Politically, we’ve never been more polarized. Some months ago, I had a conversation with a well-educated man—I emphasize that, well-educated—in the deep south, who took exception to something I said about the polarized nature of American politics. So, I invited him to have a conversation. 

“What do you believe?” I asked him.

“I’m a Republican,” he replied. 

“That’s not a belief—that’s a club you belong to,” I pushed back. He couldn’t get past that. So, I tried to make it easier. 

“Look—I’m going to give you a series of questions; answer any one of them. Here we go: Tell me one thing that we could do in this country to fix the education system, or healthcare, or the economy, or infrastructure, or political gridlock, or the widening economic divide.”

He was unable to answer. But he reiterated his position as a Republican three times. 

This is part of the problem. In the 60s and 70s, the chant that was often heard or seen on bumper stickers was, “My country, right or wrong.” Today, it seems to be, “My party, or my candidate, right or wrong.” And this is where I have a fundamental problem. ‘Country’ and ‘government’ are two very different things that cannot and should not be conflated.

In the United States, we have a government divided into three branches to ensure checks and balances, to prevent any one branch from becoming more powerful than the other two. And we have a two-party system because the two parties are ideologically different. One conservatively stresses small government, big business, and a culture of pulling yourself up by your bootstraps. I applaud that, when it’s possible.

The other party advocates for larger, more involved government, expanded social programs, and a more liberal approach to success. Interesting word, liberal. The dictionary defines it as ‘someone willing to respect or accept behavior or opinions different from their own, someone open to new ideas.’ And ‘someone’ can be an institution as much as it refers to an individual. The definition goes on to say that ‘liberal relates to or denotes a political and social philosophy that promotes individual rights, civil liberties, democracy, and free enterprise.’ By those definitions, every person in this country should proudly claim to be liberal, if we are committed as a nation to moving forward, not backward.

Somewhere in the territory between the ideologies of our political parties lies the fundamental essence of democratic freedom. Today, however, there is a massive, unfathomably wide gap between them, driven by political zealotry, greed, and government representatives who have forgotten that public service was never intended to be a career, an opportunity to feather one’s own nest. Government is not a business—and yet, based on the money that changes hands, and the extraordinary influence it wields over decisions that affect the governed, it is.  

And yet: I support American Democracy, the so-called American Experiment, because I’ve seen the other side. I know what happens when totalitarianism is allowed to flourish, eroding individual freedoms, crushing the hope of women and minorities, destroying entire swaths of regional and national economies, stifling individual and organizational innovation, forcing businesses to flee to more open countries, slapping down the will of the people, and shuttering the media. 

During Donald Trump’s first term, I wrote an essay in response to many of his actions, one that I ended with this statement:

I never express political arguments on a public forum, but for this, I make an exception. As someone who grew up in a country run by a dictator, and has traveled and worked in more than 100 countries, many of them run by despots and autocrats whose police harassed me because I carried a camera, required me to register my phone and laptop because I might engage in subversive activities, and suppressed the rights of their people to have a basic, fulfilling life and denied them a voice over their own destiny, I say ENOUGH. I can tolerate a lot, but this decision on Donald Trump’s part to ignore and openly criticize what we stand for as a free people and as a democratic nation goes far beyond ‘a misstep.’ This is not politically motivated on my part: I am motivated by indignation, anger, disappointment, and shame. I am tired of having to spend the first half-hour of every class I teach outside of this country, trying to explain the actions of this pompous fool who pretends to represent our country. ENOUGH. ENOUGH. ENOUGH.

That paragraph talks about what happens when totalitarianism and one-person rule are allowed to become the law of the land. It describes Russia, North Korea, China, Turkey, Venezuela, Myanmar, the Philippines, and others. 

Now, cast an eye on the United States. Single-handed, unilateral decisions, in the interests of big business, are sweeping away vast swaths of public wild lands and National Park and Monument holdings. The current president and his appointees are giving a voice to extreme right-wing ultra-nationalists and white supremacists, destroying years of civil rights work. Women’s individual reproductive rights are being taken away, thanks to conservative appointments and ill-thought-out court decisions. News flash: a woman’s body is hers and hers alone to govern, and governments cannot and should not legislate morality. That model is already taken: it’s called Saudi Arabia, Iran, Iraq, Afghanistan.

And what of the incipient trade wars looming on the horizon? Yes, there may be reasons to engage in tough conversations with our economic allies about trade imbalances, but waging a tariff-based trade war is not the answer. Here’s what we know from economic history going back to 15th-century China, when it was the dominant economic force on the planet. Global competition keeps the price of many goods down, which is good for everybody—and which is severely undercut in a tariff war. Free trade allows access to a wide range of services and goods, which tariffs diminish. Many of the gains of protectionism are short-lived and counter-productive; in fact, periods of protectionism have a historical habit of ending in economic downturn, most notably the Great Depression of the 1930s.

Closer to home, and more relevant in today’s world, when trade barriers go up, jobs that rely on the Internet disappear, as the barriers to the free movement of capital and labor get higher. Companies that are protected from outside competition may flourish in the short term, but are invariably less efficient in the longer term. 

Truth: The only winning move is not to play.

Not long ago, I was in Northern California and southeastern Oregon, and I got into a conversation with a farmer who runs an enormous operation—thousands and thousands of acres. I asked him how things were going, given the talk of tariffs and such. He told me that tariffs were the least of his concerns, although they were concerns. His biggest issue was that his entire workforce had disappeared, for fear of immigration enforcement coming down on them. And, despite all claims to the contrary, he couldn’t find local people willing to do the work that his previously Hispanic workforce was willing to do. He told me that he was down 80% of his staff, and that that was common across all the farms in the area. His solution? “Easy,” he told me. “Since Trump’s immigration policies have made my workforce disappear, I can’t operate my farm. So, I’m moving my farm to Mexico. The country is giving me tax breaks, so it’s a great deal.”

Great deal indeed. If the workers can’t come to the farm, the farm goes to the workers. And, of course, any products shipped out of Mexico to the United States will be classified as agricultural imports, and will therefore be taxed at a higher rate—which means higher prices at the grocery store. Very smart.

Finally, I have to speak out on behalf of the Press. I believe fervently that the single most important Freedom listed in the Bill of Rights is the first one: ‘Congress shall make no law respecting an establishment of religion, or prohibiting the free exercise thereof; or abridging the freedom of speech, or of the press; or the right of the people peaceably to assemble, and to petition the Government for a redress of grievances.’ One thing that makes our system of government as good as it is, is that the press has the right, the obligation, and the responsibility to question government at every turn. That’s its job. When I hear our current president taking potshots at the Fourth Estate, it chills my blood. If you don’t want the press questioning your actions, then don’t engage in controversial actions that attract their attention. And by the way, be happy that you live in a country where the press has the right to do precisely that—and doesn’t serve as a marketing arm of the government. Again: that’s called Iran, or Russia, or North Korea, or Yemen, or Albania. Are those the countries we want to be lumped in with? The free Press serves as our collective societal conscience, and today we need it more than ever.

So yes—our government is not perfect, by any measure. It has warts, ugly parts, and is prone to mistakes. But it also has an obligation and responsibility to ultimately do the right thing for the people of this country. Go back to those opening lines:

We hold these truths to be self-evident, that all men are created equal, that they are endowed by their Creator with certain unalienable Rights, that among these are Life, Liberty and the pursuit of Happiness. — That to secure these rights, Governments are instituted among Men, deriving their just powers from the consent of the governed. From the consent of the governed—not the other way around. 

This is not about blame: it’s about responsibility. It is not a partisan issue; it is a People of the United States issue, and ‘people’ includes those that we, the governed, place in office to serve us—not the other way around. As I told one person who commented on a different but related post, this challenge is not political—it’s genital. It’s time for the people of this country to grow a set, put on our big boy pants, and do the hard work of being responsible by reminding Washington, through our voices and actions at the polls, that this country is better than its government, and that the government serves the will of the people. I’ve spent too much of my life seeing firsthand what the alternative looks like in less-privileged countries: we must not and will not allow despotism or nationalism to define who we are. We’re better than that.

I close with this. Not long ago I read Brené Brown’s book, Braving the Wilderness. In it, she suggests four actions that would go a long way toward helping us get through this dysfunctional, angry, blame-ridden period. 

People are hard to hate close up. Move in.

Speak truth to bullshit. Be civil.

Hold hands. With strangers.

Strong back. Soft front. Wild heart. 

Those four statements are profound, and they define, as clearly as anything I’ve ever read, the soul of America. It’s time to get back to that, to the Shining City on the Hill, the model of strength, kindness, reason, and diplomacy that much of the world has historically held up as the model of global decorum. Washington, grow the hell up and start acting like there are grownups in the room. Politicians, from both sides, start doing your damn jobs, the ones you were elected to do. You might want to keep in mind the words of US Supreme Court Justice Robert Jackson, who in 1950 said, “It is not the function of our government to keep the citizen from falling into error; it is the function of the citizen to keep the government from falling into error.” And citizens? Enough with the doom-speak; enough with the hand-wringing. Speak up, think, be curious, and act. It is your right, and it is your responsibility. We owe this to our children, and we owe this to the world.

This is life. There is no easy button, and there never has been. It’s time we stopped looking for one.

Indifference

‘The only thing I owe you is my utter indifference.’ 

I heard Dennis Miller say that on his show, years ago. It stuck with me, and in the last few weeks it’s come back into my memory. Recent events in my life have made me think long and hard about who I am, what I am, and how I am. Let me explain.

I don’t care what color you are. I don’t care who you love. I don’t care what beliefs help you get through the day. I don’t care if you are skinny, fat, old, young, rich, poor, feeble, or sharp. I don’t care what you studied, where you studied, why you studied, or IF you studied. I’m indifferent to these things because they don’t—matter.

Here’s what I do care about.

Kindness. Your ability and willingness to engage with others. Your level of curiosity (more is better). Your interest in things outside yourself. Your story. Your family. The things that make you laugh, smile, cry, and despair. These are the things that make you human.

Let me tell you a story. When I was 13 years old, my family moved to Spain, thanks to a job transfer. Considering that we moved to cosmopolitan, European Madrid from Midland, Texas, in the heart of the oil (and prejudice)-soaked Permian Basin, the adjustment was—jarring. But I dealt with it—I adjusted—I went native, as seasoned (and, perhaps, jaundiced) expats say. But it didn’t happen without help. 

We rented a house in a small village a few miles west of downtown Madrid, a cozy little pueblo called Aravaca. To call it a house was a gross understatement: it was a house in the same way that Costco is a ‘shop.’ It had nine bedrooms, five bathrooms, a beautiful garden with a pool, two kitchens, and Loli.

Loli, at right, modeling a Flamenco dress; her parents, at left.

Loli worked as a domestic—a maid, I suppose—for the family that preceded us in the house, and she saved us. When we rented the house, it was just assumed that she was part of the package, and thankfully, she was. That was 50 years ago; I still see Loli every time I go to Spain on business. She’s a few years older than I am, and as much a part of my family as my parents and brothers are. She’s my sister.

One winter day, we accepted an invitation from Loli to join her at her parents’ home for coffee and dessert. They lived in an even tinier town beyond ours called Majadahonda, a scrubby little pueblo that looked like a Star Wars outpost town—no paved streets, cattle and sheep running around, ancient Spanish women dressed all in black, the sign of a lost husband. Light snow was falling; it was December, and it was very cold.

We parked the car and climbed a half-completed brick staircase on the outside of an equally incomplete building that led to their home. It comprised two rooms: a kitchen and dining room about eight feet on a side, and a bedroom and bath about the same size. In the center of the kitchen was a small round table with a heavy felt table cloth that reached all the way to the floor. Under the table was a heavy metal brazier, filled with burning coal; this was what heated the home. We were instructed to sit at the table, and wrap the table cloth around our legs to stay warm.

My prejudices began to surface—I felt them rise, like the tide. These people were so poor—they had nothing. The only things hanging on their bare, whitewashed walls were a large crucifix, and a slightly crooked photograph of Generalísimo Franco. I felt embarrassed, awkward, out-of-place. I didn’t know how to—BE.

We spoke enough Spanish to carry on a halting conversation with our hosts, but most of what we exchanged were smiles, and hand gestures, and a tremendous amount of laughter. I had no idea what they were talking about most of the time, but I don’t think I’ve ever in my life had a more fun day. These people were poor, they spoke no English, but they were kind, and they were inclusive.

Soon, neighbors began to arrive because they wanted to meet us, and with them, a cornucopia of food. An entire Serrano ham came through the door, the entire leg of a pig, air cured, strongly flavored, delicious. Strings of chorizo and lomo and morcilla and salchichón sausages, rich with paprika and garlic and savory fat. Bowls of fried and marinated anchovies. Olives and peppers. Mushrooms, sautéed in olive oil and garlic. Bags of French bread, torn apart to soak up the leavings on the plate. And a universe of cheeses from all corners of Spain. 

We ate until we were full, and then we ate some more. Desserts arrived, mysterious and unknown and incredibly tasty. And then, the music started.

Spain is a musical country. Spaniards are wired with arpeggios; 16th notes flow through their veins, and their hearts beat to the staccato attack of a Flamenco dancer’s shoes. And so it was that spontaneous singing began to break out. One person would begin to wail, that sad, lonely sound that makes me think of foghorns and that is completely unique to Spanish love songs, and everyone else would join in, clapping in syncopated rhythm as the music progressed. As each song approached its final chorus, a voice would begin a different song, and the group would switch over, seamlessly. I did not understand the words, but the music, the rhythm, the emotion, spoke to me. I was entranced.

And it was at that moment that I became aware of a deep shifting in my heart, or perhaps in my soul, a feeling that I can recall to this day, with crystalline clarity. I was changing, fundamentally. My preconceptions about poverty and the measure of a person’s worth shattered, and were remade that day. Our nine-bedroom house, with its landscaped garden and pool that I had bragged about to my friends in letters, was meaningless. These people, these wonderful, warm, giving, caring, connected people, were far richer than I would ever be. 

The Dalai Lama once said, ‘My religion is very simple. My religion is kindness.’ That’s the most profound thing I’ve ever heard a religious figure say.

So, let me go back to my original thought, the one that transported us to a family gathering in Majadahonda. That experience, and countless others that I was honored to be part of during my time in Spain, changed the way I look at the world. I don’t care about those superficial, unimportant, physical and metaphysical things that surround you, and I expect the same indifference of you. But: I also expect you to seek kindness in me, and to expect my interest in those deeply human things that truly make you who you are.

Look, I’m not naïve. I’ve been around long enough to have witnessed acts of human cruelty that defy my ability to rationalize them. I watch, as more and more people in the world today try to define themselves by the things that they surround themselves with, rather than by the things that lie inside them. I shake my head as we glorify actors and sports figures and call them society’s game changers, yet we pay little attention to teachers, scientists, activists, aid workers, and artists, the REAL game changers.

On the other hand, I’ve also seen breathtaking examples of human kindness. I’ve seen ordinary people engage in acts of bravery that, in wartime, would have earned them a medal. I’ve seen art and listened to music and read literature that made me cry with unfettered emotion, and that made me feel that we humans, for all our faults, still have redeeming qualities.

So, this is my pledge, to myself as much as to others. I will strive to be more aware. I will think before I open my mouth. And I will try, very hard, to understand that the way I experience the world is vastly different from the way many others do.

The Smell of a Rainstorm

I’m standing on the front porch because a thunderstorm is passing through, and the sky is as dark and green as the back of a catfish. If there’s a more satisfying experience out there, I honestly don’t know what it is. The hiss of rain, the random chiming of leaves, downspouts, puddles, and flower pots as the raindrops fall, the crackle and crash of thunder—it’s nature’s best symphony. And the light—I’ve always believed that the light during a thunderstorm is something you can taste. It’s more than visible; thunderstorm light glows, from within, and it comes from everywhere and nowhere. 

The best part of a thunderstorm, of course, is when it ends—not because it’s over, which I always regret, but because it leaves behind a scent trail, that amazing smell, the breath of the storm, that proves that it’s alive. That smell, which we usually call ozone, isn’t ozone at all, at least not totally. It’s a very different chemical compound that I’ll introduce you to in a minute. But first, because I brought it up, let me tell you a bit about ozone, because it is a pretty important chemical.

Ozone is a weird form of oxygen. Oxygen is normally a diatomic molecule, meaning that two oxygen atoms combine to form the gas that we breathe, O2. Ozone, on the other hand, is O3, a much less stable molecule.

Everybody knows about the ozone layer up there. Well, that layer exists because ultraviolet radiation from the sun strikes the oxygen in the upper atmosphere, changing O2 to O3 and creating a layer or shell of ozone that does a very good job of shielding us from all that UV radiation that would otherwise fry us into little masses of melanoma. At least, it protects us until we do dumb human things, like release chlorofluorocarbons that chemically eat holes in the ozone layer and let all that nasty UV energy through.
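
For the chemistry-minded, here’s the simplified version of how that happens (the fuller story, known to atmospheric chemists as the Chapman cycle, has a few more steps):

O2 + ultraviolet photon → O + O
O + O2 → O3

High-energy sunlight splits an ordinary oxygen molecule into two loose atoms, and each of those atoms can then latch onto another O2 molecule to form O3, which is ozone; the ozone, in turn, absorbs still more UV, which is the shielding effect I just described.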

The ozone layer sits about 30 kilometers above the surface of the planet, and in spite of its name, the concentration of ozone up there is only about eight parts-per-million; the rest is ordinary air, mostly nitrogen and regular oxygen. But it’s that oxygen that absorbs ultraviolet energy to become the ozone that protects the planet’s surface from most of the effects of harmful radiation. And while ozone has beneficial effects high in the atmosphere, it’s not all that beneficial down here on earth. It’s known to reduce crop yields when there’s too much of it in the air at ground level, and because it’s such a powerful oxidant, it can be extremely irritating to noses, throats, and lungs. It can also cause cracks in rubber and plastics, and in at least one study, it’s been shown to make arterial plaque, the fatty buildup that can lead to heart attack and stroke, worse. Talk about a love-hate relationship.

So, let’s talk about what we were originally discussing before I diverted us—and that was the wonderful smell that takes over everything after a rainstorm, that smell that makes us inhale deeply and feel good about life in general.

As it turns out, that smell doesn’t come from ozone—at least not exclusively. Ozone may be in the air if there was lightning during the rainstorm, but the chemical you’re mostly smelling is called Geosmin. You smell it after a rain, or in wet dirt that you’re digging up in the garden. The smell is so recognizable, and so wonderful, that it even has a name—Petrichor. It comes from two Greek roots: petra, meaning stone, and ichor, the fluid said to flow in the veins of the gods.

So, where does Geosmin come from? Well, it turns out that it’s created as a by-product when certain soil microbes, chiefly actinomycete bacteria such as Streptomyces, along with some cyanobacteria, have their way with organic material. As they break it down, Geosmin is released. So, it’s naturally occurring, and in fact contributes to the flavor of beets, spinach, lettuce, mushrooms, even that wonderful, earthy taste of catfish. Sometimes it can be overpowering when too much of it gets into water supplies, and while it isn’t harmful, it can temporarily give water a bitter taste.

Here’s one last, interesting thing about Geosmin and its Petrichor aroma. Human noses are extremely sensitive to the smell of Petrichor, in fact, more sensitive to it than just about any other compound. We can detect it in concentrations of five parts per trillion. To put that into perspective, for the human nose to detect methanol, a fairly pungent alcohol, it has to be present in concentrations of a billion parts-per-trillion. That’s quite a difference. And why are we so amazingly sensitive to it? Well, some scientists believe that that sensitivity has been genetically selected, because it allowed our distant ancestors to find water, even in the driest places on earth. No wonder it smells so good—it helped keep us alive.
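
Just to make “quite a difference” concrete, take the two thresholds above and do the division:

1,000,000,000 parts per trillion ÷ 5 parts per trillion = 200,000,000

By those numbers, our noses are on the order of two hundred million times more sensitive to Geosmin than they are to methanol.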

The Control of Nature

I’m a writer, which means that I’m also a serious reader. I like to say that writing is my craft; reading is my gym. And one author whose books have meant a lot to me—in fact, I’d consider him a mentor, even though we’ve never met—is a guy named John McPhee. If his books are any indication, he’s a ferociously curious guy. They all fall into the genre that I love, which is called creative nonfiction. It includes writers like William Least Heat-Moon, Bill Bryson, Annie Dillard, and of course, John McPhee. Creative nonfiction means writing about subjects that are real, but that incorporate storytelling into the narrative. In creative nonfiction, adjectives are legal.

I first ran across McPhee’s work when I took a writing workshop back in the 90s from William Least Heat-Moon, the inspiring author of one of my all-time favorite books, Blue Highways. One of John McPhee’s books, Coming Into the Country, was required reading for the workshop. It’s about homesteaders in Alaska, back in the days when the Alaska government would give land to people in exchange for their agreement to homestead it. Boring, you say? Well, consider the story of the guy who drove an old school bus up there. When he got reasonably close to the land he had acquired as part of his homesteading agreement, he parked the school bus, took a cutting torch to it, and cut off the top. He then turned the former top upside down like an overturned turtle’s shell, and drove the school bus-turned-convertible onto it. Once there, he welded the two together, attached a long shaft with a propeller on one end to the drive shaft of the school bus, shoved his contraption into the river, started the engine, and motored a few hundred miles to his newly acquired homestead. See what I mean? Story. It’s everything.

McPhee has written about a breathtaking range of topics. He wrote Annals of the Former World, in which he took a series of road trips across the United States with a geologist, looking at freeway roadcuts to understand the dynamic geology of North America, and in the process, writing a magnificent book about the geology of the continent. He wrote The Pine Barrens, the story of the great pine forests that cover most of southern New Jersey, and the people who live there. He wrote Uncommon Carriers, about the world of cargo carriers—all kinds—that form the basis of the global supply chain. He wrote Oranges, about the business of growing and selling them in Florida. He wrote Encounters with the Archdruid, about the interactions between conservationists and those they see as the enemy. And he wrote The Curve of Binding Energy, the story of Theodore Taylor, an early nuclear engineer who was also an anti-nuclear activist. 

By the way, here’s a quote from Annals of the Former World (a book, incidentally, that is two-and-a-half inches thick) that shows what kind of a writer McPhee is: “If by some fiat I had to restrict all this writing to one sentence, this is the one I would choose: The summit of Mount Everest is marine limestone.” Think about that.

So far, John McPhee has written more than 30 books, and I’ve read them all. I can honestly say that each one has made me a measurably better writer and thinker. But the book that really stuck with me, more than any of the others, is called The Control of Nature. That book has been in my head a lot lately as I watch what’s going on in California specifically, with the damage caused by heavy rains and flooding, and in the country and the world in general, as climate change has its way with us.

The Control of Nature is divided into three sections: ‘Atchafalaya’; ‘Cooling the Lava’; and ‘Los Angeles Against the Mountains’. Each section tells a story of human hubris, of our largely futile efforts to make nature do something that nature doesn’t want to do—like keeping the Mississippi River from changing course, or trying to redirect lava flows in places like Hawaii and Iceland away from population centers (Iceland pumped cold seawater onto one of its flows), or protecting Los Angeles infrastructure from flood damage by building flood canals, like the cement-bound LA River. How’s that working out?

Some of you may remember a quote that I toss out a lot. It’s from Loren Eiseley, another of my favorite writers. Back in the 60s, Loren said, “When man becomes greater than nature, nature, which created us, will respond.” Well, she’s responding. And one of the lessons we can choose to learn from her response is that this is not a time for head-to-head combat. I used to tell my SCUBA diving students that it doesn’t matter how strong a swimmer you are, or how good a diver you are, the ocean is always stronger. The ocean will win, every time. So don’t even try. Discretion is the better part of valor, and to ignore that fact can be fatal. 

As I said, this is not a time for head-to-head combat. Nature vs. Humanity cannot be a boxing match, because the outcome is predetermined, whether we like it or not. News flash: We don’t win this one. This is more a time for martial arts, in which we use our opponent’s weight and strength to work in our favor. Nature is telling us what to do, every day. We just seem to have a problem listening. ‘You’re not the boss of me,’ we say. ‘No, actually, you have that backward,’ nature says. ‘Here—let me demonstrate.’ 

The other flaw in the logic is that we have this tendency to think in terms of ‘us vs. nature,’ of ‘humans vs. the natural world,’ when in fact, we’re as much a part of the natural world as blue whales and chickadees and earthworms and slime molds. We just don’t act like it. By viewing ourselves as something apart from nature, as something better than or superior to nature, we invoke Loren Eiseley again. Nature is responding to our abuse, to our attempt to dominate, and her response is swift, sure, and painful.

So, what’s the alternative? The alternative is to shift our thinking from ‘us vs. nature’ to ‘us as an integral part of nature.’ Nice words. But, what do they mean? How do they become real, or actionable, as people like to say in the business world?

The answer is simpler than most people realize, although it requires deliberate action. Keep that word, deliberate, in mind. The answer isn’t one great, big thing, because if that were the case, nothing would ever change. Here’s an example for the techies. Think about it: what’s more powerful, a single mainframe computer, or hundreds of personal computers or servers networked together? The answer, of course, is the latter. Instead of computers, though, we’re talking about one-person efforts on behalf of the environment of which we are a part, efforts that, in aggregate, add up to enormously powerful results. The whole is greater than the sum of its parts. For example, if you live in a house, you probably have a yard, which means that you probably have grass, and shrubs, and trees, and flowering plants, and other things to make it look good. The problem is that most of those are non-native, which means that they’re not always good for local pollinators, like bees and moths and butterflies, or for other local wildlife. But if each of us were to set aside an area in the back corner of the yard the size of a typical walk-in closet, say, eight feet by ten feet, that’s eighty square feet that can be allowed to grow wild with local plants, which provide habitat, including food, for native pollinators. I guarantee that if you go down to your local nursery, or Audubon Center, you can buy a shaker bottle full of local plant seeds that you can take and shake over your designated area.

Here’s another one. We often use broad-spectrum insecticides to get rid of insect pests, which they do very well. But those neonicotinoid-based compounds are indiscriminate—they also kill beneficial insects like bees, butterflies, and moths, along with spiders, birds, reptiles, and amphibians, and potentially humans, if they leach into the water supply—and they do. So, why not switch to environmentally friendly compounds? They’re out there, and yes, they may cost a little bit more, but not enough to be a showstopper, especially when you consider the alternative. I don’t want to be yet another alarmist here—there are more than enough of them already—but consider this: pollinators aren’t a nice-to-have thing. Bees, moths, butterflies, and even some birds move pollen from flower to flower, a process that’s required for the flower to give rise to fruit. No bees, no pollination. No pollination, no fertilization. No fertilization, no fruits or vegetables. So think twice, please, about using that insecticide.

Other things? There are lots of them. Buy soaps and detergents in bulk, and refill the same bottle over and over, to reduce plastic consumption. Buy one of those showerheads that allow you to turn down the water pressure to a warm trickle when you don’t need the full force of the blast. Even an efficient showerhead puts out about two to two-and-a-half gallons of water per minute, which over the course of a year of showering really adds up, so any effort to conserve falls on the correct side of the environmental balance sheet. You don’t have to turn the shower off; just turn it down. It makes a huge difference.
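
For a rough sense of scale (the eight-minute daily shower here is purely my illustrative assumption, not a measurement), the arithmetic looks like this:

2.5 gallons per minute × 8 minutes per day × 365 days ≈ 7,300 gallons per year

Turn that blast down to a trickle for even half of each shower and the savings can run into the thousands of gallons a year.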

What else? Set the thermostat in winter one degree cooler and buy a sweater or that cool hoodie you’ve been jonesing for. There’s your excuse! Think before you get in the car to run that errand. Are you close enough to walk instead? I do it every day, a few miles each way, and I feel so much better for it.

Another thing you can do is buy as much locally produced food as you can. I’m about to write a whole series of essays on the role that technology can play in helping the environment, but just consider this. California can no longer feed the nation. The state has depleted its deep-water aquifers to the point that the ground in the Central Valley is measurably sinking, and the drought is making it necessary for farmers to uproot fruit and nut trees and many crops, because of the great volumes of water they consume—water that’s no longer available, or if it is, it’s too salty to use. But even if California CAN ship produce across the country, we know that doing so takes its toll on the environment because of the trucks and planes required, and freshness is a concern. We also know that there have been outbreaks of disease—salmonella and listeria—associated with large-scale farming.

Local produce, on the other hand, is much fresher, it tastes better, it’s safer, and it supports a local farmer. And yes, you’re probably going to pay a little more, but how much is your health worth? 

I’m not channeling Chicken Little here. The sky isn’t falling, but it’s a lot lower than it used to be. And before the naysayers climb all over me, yes, I know that some of the climate change effects we’re currently experiencing are happening as a matter of the natural course of things. But I also know, because the science proves it, that we’re doing a lot of things that are making it worse, things that, through minor but deliberate efforts, we could change without a whole lot of personal impact. There’s that ‘deliberate’ word again—meaning, let’s stop talking, and wringing our hands, and putting the bumper sticker on the car that says ‘save the bees,’ or wearing the ‘May the Forest Be With You’ T-shirt. Those are all fine. But a little effort, minimal but deliberate, would go a very long way.

In other episodes, and in my leadership workshops, I often talk about the danger and ineffectiveness of slogan leadership—you know, putting up those motivational posters that show a crew of people on a misty river at sunrise, in a rowing scull, with the word ‘teamwork’ across the bottom. Or a person standing on top of a mountain, arms raised in celebration, silhouetted against the sunset, with the word ‘commitment’ across the bottom of the poster. That’s slogan leadership, and while the pictures are pretty, it’s a form of responsibility abdication. So, let’s not abdicate—let’s do. It shows the other corners of the natural world that we’re willing to make an effort to play well with others, and it sends the right message to our kids and grandkids. 

We can’t control nature, but we can harness her awesome power to help clean up our act, like a martial arts master does against a stronger opponent. As someone who spends an awful lot of time in the natural world, I’d much rather have nature as my ally than my enemy. It’s a choice. And it’s our move.

The Power of Connections

Years ago, while still living in California, I began my writing career by submitting feature articles to local magazines in the San Francisco Bay Area. For some reason, I always gravitated toward offbeat subject matter, which apparently made my stories interesting – and desirable.  

One day, at the request of my editor, I sat down to write a feature story about one of the local towns in our area. But as I started writing, it occurred to me that I really didn’t know what a feature story was, even though I’d been writing them for several years. Wikipedia, by the way, defines a feature story as a “human interest” story that is not typically tied to a recent news event. They usually discuss concepts or ideas that are specific to a particular market, and are often pretty detailed. 

Anyway, I grabbed the dictionary off the shelf (this was years before the Web, and digital dictionaries were still a dream), and searched the Fs for ‘feature.’ I read the entry and satisfied my need to know, and as I started to close the book, that’s when I saw it. Directly across the gutter (that’s what they call the middle of the open book where two pages come together) was the word ‘feces.’

Now I’m a pretty curious guy, so I wasn’t going to let this go. Needless to say, I know what feces is, but what was really interesting were the words at the bottom of the definition. The first one said, ‘See scat.’ So I turned to the Ss and looked up scat, and it turned out to be the word that wildlife biologists use for animal droppings. But wait, as they say, there’s more. THAT definition told me to see also, Scatologist. (You’ve got to be kidding me). But I did. You guessed it—someone who studies, well, scat. 

An owl pellet (scat) from a friend’s collection.

So I called the biology department at my undergraduate alma mater, the University of California at Berkeley. When somebody answered the phone, I asked, ‘Do you have a … scatologist on staff?’ Of course, she replied, let me connect you to Dan. The next thing I knew I was talking with Dan, a very interesting guy, so interesting, in fact, that the next weekend I was with him in the hills, collecting owl pellets  and the droppings of other animals to determine such things as what they eat, what parasites they might have, how predation of certain species affects populations of others, and so on. It was FASCINATING. 

Remember that what got me started down this rabbit hole was the search for feature, which led me to feces. Well, right underneath the suggestion that I also see scatologist, it said, see also, coprolite. This was a new word for me, so off to the Cs I went, in search of it.

My very own coprolite.

A coprolite is, and I’m not making this up, a fossilized dropping, in this case from a dinosaur. A paleo-scat, as it were. I have one on my desk. OF COURSE I have one on my desk. Anyway, once again, I got on the phone, and this time I called the paleontology department at Berkeley, and soon found myself talking to a coprologist – yes, there is such a person. How do you explain THAT at a dinner party? Anyway, he agreed to meet with me, and once again I had one of those rare and wonderful days, learning just how fascinating the stuff is that came out of the north end of a south-bound dinosaur. He showed me how they slice the things on a very fine diamond saw and then examine them under a high-power microscope to identify the contents, just as the scatologist did with owl pellets and coyote scat.

Think about this for a moment. If I hadn’t allowed myself to fall prey to serendipity (Wikipedia defines it as “a happy accident” or “a pleasant surprise”), I never would have met those remarkable people, and never would have written what turned out to be one of the most popular articles I’ve ever written.

 Another time, my wife and I were out walking the dogs in a field near our house. At one point, I turned around to check on the dogs and saw one of them rolling around on his back the way all dogs do when they find something disgustingly smelly. Sure enough, he had found the carcass of some recently dead animal, too far gone to identify but not so far gone that it didn’t smell disgusting. I dragged him home with my wife following about 30 feet behind and gave him the bath of baths to eliminate the smell. Anyway, once he smelled more or less like a dog again I felt that old curiosity coming on, so I went downstairs to my office and began to search Google for the source of that horrible smell that’s always present in dead things. And I found it. 

In case you care.

The smell actually comes from two chemicals, both of which are so perfectly named that whoever named them clearly had a good time doing so. The first of them is called cadaverine; the second, putrescine. Can you think of better names for this stuff? Interestingly, putrescine is used industrially to make a form of nylon.

So what’s the point of this wandering tale? Storytellers are always looking for sources, and the question I get more often than any other is about the source of my stories. The question, of course, has lots of answers, but in many cases I find stories because I go looking for them while leaving my mind open to the power of serendipity. For this reason, I personally believe that the best thing about Wikipedia is the button on the left side of the home page that says, “Random Article.” I use it all the time, just to see where it takes me.

Curiosity is everything. I just wish there was more of it in the world.

Candle in the Darkness

The quotes in this essay were excerpted from Carl Sagan’s The Demon-Haunted World: Science as a Candle in the Dark.

It is not the function of our government to keep the citizen from falling into error; it is the function of the citizen to keep the government from falling into error. 

US Supreme Court Justice Robert Jackson, 1950.

I HAVE ALWAYS BELIEVED that every child, by the time they are 13 years old or so, should have three specific and non-negotiable skills. They should be able to read well; they should have a decent understanding of their individual, inalienable rights, especially freedom of speech and the sanctity of a free press; and they should understand the scientific method and how it works.

Why these three things? Because they are the foundational elements of a community-centric, free society. The ability to read represents more than access to great literature, to worlds of fancy and fantasy and horror and drama; it offers more than the ability to travel through time and space, more than the opportunity to meet a host of unforgettable characters. All good things, those. But reading also lies at the heart of critical thinking and healthy skepticism and the ability to develop a cogent and convincing argument. It is a catalyst of confidence, and an essential element of inspired leadership. It is an enabler of diplomacy and reason and governance. And reading is an essential tool for individual relevance, growth, and an assured future.

Freedom of speech and a free, non-aligned press are, more than anything else, the most powerful protectors of true democracy that we have. They are stronger than armies, more powerful than bloviating oratory, and if protected and allowed to do their jobs, they are the single greatest threat to autocracy and demagoguery.

And the scientific method? I don’t mean ‘something done in sort of a scientific way’; I mean THE scientific method, the rigorous six-step process that has been around since the seventeenth century and has guided every legitimate scientific undertaking since.

Like the other two, it’s essential. Here’s how it works. 

First, I make an observation or ask a curiosity-driven question. For example, how do bats manage to catch insects on a pitch-black night and not fly into obstacles? 

Second, I research the topic as exhaustively as I can, and that includes personally observing whatever phenomenon it is I want to research. If there is an answer in the literature, and it has already been proven correct, then I don’t have to continue any further—I have my answer. But if I don’t, I gather as much information as I can to help me design a process to come up with an answer.

Third, based on my research, I develop a hypothesis: “I believe that bats must have organs in their eyes that allow them to see perfectly in the dark, like night vision goggles.”

In step four, my task is to design an experiment to test my hypothesis. I capture a dozen bats (I need a dozen or more to ensure that I don’t base my research on a single bat that by chance has some kind of genetic mutation that allows it to see in the dark). In the lab, I carefully blindfold the bats and release them into an obstacle (and insect)-filled laboratory that is completely dark.

Next, I analyze the data. “After 25 identical experiments in which I released the blindfolded bats into the obstacle-filled dark space, each time radically rearranging the obstacles to ensure unpredictability and to avoid the possibility that the bats manage to memorize the layout of the obstacles in the darkened laboratory, the data reveals two facts: (1) not once did a bat ever collide with an obstacle, and (2) all the bats were well-fed and the insect population was significantly depleted during each experimental test period.”

In the final step, I report my conclusions. “Based on the data that I was able to reproducibly generate with my experimental setup, I conclude that eyesight, regardless of wavelength sensitivity, has nothing to do with Myotis lucifugus’ ability to capture prey while avoiding randomly-placed obstacles, because even completely blindfolded, the bats fed themselves and avoided every obstacle in total darkness.”

Notice that all I did in this process was to eliminate one hypothesis—I didn’t answer the question. I know how they DON’T avoid obstacles and catch dinner (by seeing them), but I still have no idea how they DO avoid obstacles and catch dinner. And that brings us full-circle, back to the beginning. Another question: I have concluded that little brown bats do not rely on vision to catch prey and avoid obstacles in the dark. How, then, DO they catch prey and avoid obstacles in the dark?

Back to my research, this time with a focus that perhaps eliminates vision-oriented outcomes. A different question, perhaps: are there other senses that bats might use to avoid obstacles and locate prey besides sight? Smell might work for insect prey, but it won’t work with big foam rubber obstacles. Touch can probably be discounted, because by the time the fast-moving bat is close enough to feel the obstacle or their prey, it’s too late. What about hearing? Back to step three. “I hypothesize that bats rely on some form of sound to locate their prey and avoid obstacles in a controlled dark environment.” I design a second experiment: I take my 12 bats and fill their ears with soundproof foam to eliminate their ability to hear. I also dust each bat with harmless chalk dust, each bat a different color. I then repeat the experiment. When the experiment period ends, I turn on the lights, and lo and behold, the surfaces of the white foam blocks that served as obstacles are covered with chalky impact locations, and the insect population has not been diminished at all. 

Next, just to be sure, I create what’s called a control group. I remove the ear plugs from six of the 12 bats and clean the chalk dust from their bodies. I then dust the bats that still have ear plugs with red chalk dust, and the bats without ear plugs with yellow chalk dust. I then clean and rearrange the obstacles, and repeat the experiment. This time, when I turn on the lights, I find that the foam barriers are covered with red blotches, and not a single yellow one. I conclude that hearing, not vision, is key to the little brown bat’s ability to navigate and capture prey. But what I still don’t know is, what sound? The barriers make no sound, yet the bats that can hear somehow avoid them. And the insects may or may not make sound, but even if they did, would that be enough information to answer the question? The answer, of course, is no.

I won’t take this little exercise any further, other than to note that to answer the question of how bats avoid obstacles and capture prey, I would have to figure out a way to hear whatever sound the bats are relying on, determine whether the bats or the prey are emitting it, and at what frequency, since to the naked ear we hear nothing in the lab environment during the experiments.

So, back to the scientific method and why it’s one of the three critical skills. The iterative process I’ve just described is a good example of the scientific method, with one important piece left out. Let’s say I ultimately determine (correctly) that my bats are emitting ultrasonic clicks that bounce off of obstacles, including prey, allowing the bats to echolocate themselves relative to their environment, thus avoiding collisions while also managing to feed themselves. I perform the experiments several times over, each time carefully noting the results. I then publish a paper that lays out my conclusions, with each step of the scientific method carefully documented: I started with this hypothesis; I made these observations in the field and did this review of the available literature on the subject; I crafted an experiment that allowed me to test my hypothesis; I used control subjects (with and without earplugs) to further test my hypothesis; I carefully analyzed the data I collected; then, based on my analysis, I came up with a conclusion that I believe to be true. Here’s where the piece I haven’t mentioned yet comes into play.

I publish my results as a scientific paper in a scholarly journal. Other scientists read the paper, and then do everything in their power to prove me wrong. It isn’t just their job; it is their responsibility as legitimate scientists, in the same way that it is my responsibility to attempt to poke holes in the results of my colleagues’ findings when they publish their own papers. It’s based on the sacred belief that it’s not about the scientist being right; it’s about the science being right. Is it fun? Do scientists enjoy having their work questioned and pulled apart and quite often proven wrong? Of course not! But they understand that if that doesn’t happen as part of the vetting process, without the scientific method, then science becomes a valueless sham, a card trick, and we’d still be in the Dark Ages, running our fingers along sheep entrails and trepanning people’s skulls to release bad humors and rubbing the lumps on those same people’s heads to predict the future. The sun, subservient body that it is, would still be orbiting the earth, which we’d still believe to be flat. And we’d still be convinced that fossils in the ground were carefully hand-placed 6,000 years ago by a mysterious deity, like a parent hiding Easter eggs. 

***

“Science is more than a body of knowledge; it is a way of thinking.” 

I took that quote from Carl Sagan’s book, The Demon-Haunted World: Science as a Candle in the Dark. It’s an important message, especially today as knowledge seems to be falling out of favor, as educational effort is supplanted by social media ratings and ravings, and as science as a guiding light is characterized as ‘wrong’ (at worst) or ‘merely a suggestion’ (at best). 

Sagan wrote this book in 1996, nearly 30 years ago. Just for the sake of perspective, that’s the year the Palm Pilot, the DVD, and the first USB interface came out. It’s also the year that a pair of Stanford graduate students launched a project called BackRub. It evolved into another project you may have heard of, called Google.

My point is that around that same time, 30 years ago, Carl Sagan wrote this:

I have a foreboding of an America in my children’s or grandchildren’s time—when the United States is a service and information economy; when nearly all the key manufacturing industries have slipped away to other countries; when awesome technological powers are in the hands of a very few, and no one representing the public interest can even grasp the issues; when the people have lost the ability to set their own agendas or knowledgeably question those in authority; when, clutching our crystals and nervously consulting our horoscopes, our critical faculties in decline, unable to distinguish between what feels good and what’s true, we slide, almost without noticing, back into superstition and darkness.

And if that hasn’t started the hackles rising on your back, Sagan continues:

The dumbing down of America is most evident in the slow decay of substantive content in the enormously influential media, the 30-second sound bites (now down to 10 seconds or less), lowest common denominator programming, credulous presentations on pseudoscience and superstition, but especially a kind of celebration of ignorance. As I write, the number-one videocassette rental in America is the movie Dumb and Dumber. “Beavis and Butthead” remain popular (and influential) with young TV viewers. The plain lesson is that study and learning—not just of science, but of anything—are avoidable, even undesirable.

And if that isn’t enough, he finishes that paragraph with this:

We’ve arranged a global civilization in which most crucial elements—transportation, communications, and all other industries; agriculture, medicine, education, entertainment, protecting the environment; and even the key democratic institution of voting—profoundly depend on science and technology. We have also arranged things so that almost no one understands science and technology. This is a prescription for disaster. We might get away with it for a while, but sooner or later this combustible mixture of ignorance and power is going to blow up in our faces.

Science isn’t perfect, and isn’t designed to be, as I hope I explained at the beginning of this essay. But it’s the best tool we have.

If you’re not familiar with Carl Sagan, let me help. He was an astronomer and science communicator, along the lines of Neil deGrasse Tyson and Bill Nye. He led the effort to assemble the content inscribed on the Golden Record attached to the Voyager spacecraft, and his many books include the novel Contact, on which the movie of the same name, starring Jodie Foster and Matthew McConaughey, is based.

Sagan took the subtitle of his book from a work published in 1656 called A Candle in the Dark. Written by Thomas Ady, the book attacks the witch hunts that were so common at the time as a scam “to delude the people,” baseless attacks on individuals who were believed to have the power to make people sick and to change the weather. Sagan again:

For much of our history, we were so fearful of the outside world, with its unpredictable dangers, that we gladly embraced anything that promised to soften or explain away the terror. Science is an attempt, largely successful, to understand the world, to get a grip on things, to get hold of ourselves, to steer a safe course. Microbiology and meteorology now explain what only a few centuries ago was considered sufficient cause to burn women to death.

As I have said numerous times in episodes of The Natural Curiosity Project, science has never claimed to be right, only that it will be more right tomorrow, and even more right the day after that. Every day yields wonders and insights in the quest to understand how the world works. Yet today, today, science is attacked by those who refuse to understand what it represents. It’s one thing for scientists to attack their own work using the tenets of the scientific method; it’s another thing entirely for others to attack it without merit. As Sagan says,

Not every branch of science can foretell the future—paleontology can’t—but many can and with stunning accuracy. If you want to know when the next eclipse of the Sun will be, you might try magicians or mystics, but you’ll do much better with scientists. They will tell you where on Earth to stand, when you have to be there, and whether it will be a partial eclipse, a total eclipse, or an annular eclipse. They can routinely predict a solar eclipse, to the minute, a millennium in advance. You can go to the witch doctor to lift the spell that causes your pernicious anemia, or you can take vitamin B12. If you want to save your child from polio, you can pray or you can inoculate. If you’re interested in the sex of your unborn child, you can consult plumb-bob danglers all you want (left-right, a boy; forward-back, a girl—or maybe it’s the other way around), but they’ll be right, on average, only one time in two. If you want real accuracy (here, 99 percent accuracy), try amniocentesis and sonograms. Try science.

Toward the end of his book, Sagan has this to say about education:

Education on the value of free speech and the other freedoms reserved by the Bill of Rights, about what happens when you don’t have them, and about how to exercise and protect them, should be an essential prerequisite for being an American citizen—or indeed a citizen of any nation, the more so to the degree that such rights remain unprotected. 

If we can’t think for ourselves, if we’re unwilling to question authority, then we’re just putty in the hands of those in power. But if the citizens are educated and form their own opinions, then those in power work for us. In every country, we should be teaching our children the scientific method and the reasons for a Bill of Rights. With it comes a certain decency, humility and community spirit. In the demon-haunted world that we inhabit by virtue of being human, this may be all that stands between us and the enveloping darkness.

Science is real, and it is as accurate as anything can possibly be BECAUSE it is designed to be ferociously self-critical. What if our political system worked the same way? Wow—what an amazing thing THAT would be!

It warms my heart to see that none other than Carl Sagan believed in the importance of the three skills with which I started this essay. Yet caution is called for. As we enter yet another period of political uncertainty and divisiveness, perhaps it is fitting that I end with this line from Sagan’s first chapter:

The candle flame gutters. Its little pool of light trembles. Darkness gathers. The demons begin to stir.

Alien Invasion

Occasionally, I run across something that I just can’t ignore. Sunday morning was one of those times when my curiosity about the natural world just couldn’t be contained.

My wife Sabine and I had gone out for a walk. As we rounded the front of our house and passed under the canopy of the apple tree grove in our front yard, Sabine pointed at the mulch and said, “OK, that’s just gross. Some dog puked in the yard.” She was right: there was a big pile of yellow goo spread out on the mulch, about 20 inches across. It WAS pretty disgusting-looking, so I promised to clean it up when we got back.

When we did get back a few hours later, I grabbed a shovel to take care of the slime under the apple tree, but when I got out there I stopped dead in my tracks. Why? Because I SWEAR it was bigger, taller, and I kid you not, closer to the apple tree. In fact, there was now a blob of the yellow stuff on the side of one of the trees. Clearly this was not something that came out of a retching dog. But here’s the REALLY weird thing. When I went to pick some of it up with the shovel, a cloud of what looked like smoke erupted from it.

So I decided to leave it where it was. I went inside, grabbed my iPad, and searched for “yellow slime on mulch.” Instantly, I was rewarded with a photograph of my slime: Fuligo septica, otherwise known as Dog Vomit Slime Mold. Scientists must have a blast naming things—that has to be a high point when they discover something new. And by the way, it’s also called scrambled egg slime and flowers of tan. In Mexico, they do, in fact, scramble them like eggs and eat them. In Spanish, they’re called caca de luna, which means … well, caca is the Spanish word for what comes out of the north end of a south-bound dog, and luna means moon. So this is moon sh—well, you know. I’ve tried them in pueblos south of Mexico City, and they’re not bad—kind of nutty.

Anyway, slime molds are fascinating. They fall into a category called myxomycetes, which comes from two Greek words meaning “mucus fungus.” Yummy combination—that’s not much better than dog vomit. The interesting thing about slime molds is that they pass through a developmental phase called a plasmodium. During the plasmodium phase, the cells that make up the organism rearrange themselves into a single, gigantic cell with millions of nuclei, one that can weigh as much as 45 pounds. They’re not plants, and they’re not animals—they’re something in between.

By the way, the smoke that came out of the thing when I nudged it with my shovel was a cloud of spores, on their way to propagate the species.

Here’s the other interesting thing about slime molds. They move. As in, they crawl.  And how fast, you ask? Well, brace yourself: about an inch a day. That means that … never mind. You don’t want to go to sleep thinking about that. Just be sure to lock the door. In 1973, down in Dallas, people panicked when these things erupted in their gardens. They didn’t know what they were, and they thought it was an alien invasion. Of course, this WAS Dallas, and obviously there were too many people watching Invasion of the Body Snatchers that week.

I know that the vast majority of you couldn’t care less about slime molds, especially those that have ‘vomit’ and ‘mucus’ in their names. But you do have to admit that this is kind of interesting.

I feel like Egon Spengler in Ghostbusters: “I collect spores, molds, and fungus.”