
The Research Myth

I recently had a conversation about technology’s impact on the availability and quality of information in the world today. It’s an argument I could make myself—that tech-based advances have resulted in access to more data and information. For example, before the invention of moveable type and the printing press, the only books that were available were chained to reading tables in Europe’s great cathedrals—they were that rare and that valuable. Of course, it was the information they contained that held the real value, an important lesson in today’s world where books are banned from modern first world library shelves because an ignorant cadre of adults decides that young people aren’t mature enough to read them—when it’s the adults who lack the maturity to face the fact that not everybody thinks the same way they do in this world, and that’s okay. But, I digress.  

Image of chained books in Hereford Cathedral. Copyright Atlas Obscura.

When moveable type and the printing press arrived, book manuscripts no longer had to be copied by hand—they could be produced in large quantities at low cost, which meant that information could be made available to far more people than ever before. To the general population—at least, the literate ones—this was a form of freedom. But to those who wanted to maintain a world where books were printed once and kept chained to desks where only the privileged few (the clergy) could read them, the free availability of knowledge and information was terrifying. Apparently, it still is. Knowledge is, after all, the strongest form of power. How does that expression go again? Oh yeah: Freedom of the Press…Freedom of Expression…Freedom of Thought…Sorry; I digress. Again.

Fast-forward now through myriad generations of technology that broadened information’s reach: The broadsheet newspaper, delivered daily, sometimes in both morning and evening editions. The teletype. Radio. The telephone. Television. The satellite, which made global information-sharing a reality. High-speed photocopying. High-speed printing. The personal computer and desktop publishing software. Email. Instant messaging and texting. And most recently, on-demand printing and self-publishing through applications like Kindle Direct, and of course, AI, through applications like ChatGPT. I should also mention the technology-based tools that have dramatically increased literacy around the world, in the process giving people the gift of reading, along with the countless downstream gifts that come with it.

The conversation I mentioned at the beginning of this essay took a funny turn when the person I was chatting with tried to convince me that access to modern technologies makes the information I can put my hands on today infinitely better and more accurate. I pushed back, arguing that technology is a gathering tool, like a fishing net. Yes, a bigger net can result in a bigger haul. But it also yields more bycatch, the stuff that gets thrown back. I don’t care about the information equivalents of suckerfish and slime eels that get caught in my net. I want the albacore, halibut, and swordfish. The problem is that my fishing net—my data-gathering tool—is indiscriminate. It gathers what it gathers, and it’s up to me to separate the good from the bad, the desirable from the undesirable.

What technology-based information-gathering does is make it easy to rapidly get to AN answer, not THE answer.

The truth is, I don’t have better research tools today than I had in the 70s when I was in college. Back then I had access to multiple libraries—the Berkeley campus alone had 27 of them. I could call on the all-powerful oracle known as the reference librarian. I had access to years of the Reader’s Guide to Periodical Literature. I had Who’s Who, a sort of analog forerunner of Wikipedia; and of course, I had academic subject-matter experts I could query.

Technology like AI doesn’t create higher quality research results; what technology gives me is speed. As an undergraduate studying Romance Languages, I would often run across a word I didn’t know. I’d have to go to the dictionary, a physical book that weighed as much as a Prius, open it, make my way to the right page, and look up the word—a process that could take a minute or more. Today, I hover my finger over the word on the screen and in a few seconds I accomplish the same task. Is it a better answer? No; it’s exactly the same. It’s just faster. In an emergency room, speed matters. In a research project, not so much. In fact, in research, speed is often a liability.

Here’s the takeaway from this essay. Whether I use the manual tools that were available in 1972 (and I often still do, by the way), or Google Scholar, or some other digital information resource, the results are the same—not because of the tool, but because of how I use what the tool generates. I’ve often said in my writing workshops that “you can’t polish a turd, but you can roll it in glitter.” Just because you’ve written the first draft of an essay, selected a pleasing font, right- and left-justified the text, and added some lovely graphics, it’s still a first draft—a PRETTY first draft, but a first draft, nonetheless. It isn’t anywhere near finished.

The same principle applies to research or any other kind of news or information-gathering activity. My widely cast net yields results, but some of those results are bycatch—information that’s irrelevant, dated, or just plain wrong. It doesn’t matter why it’s wrong; what matters is that it is. And this is where the human-in-the-loop becomes very important. I go through the collected data, casting aside the bycatch. What’s left is information. To that somewhat purified result I add a richness of experience, context, skepticism, and perspective. From that I generate insight, then knowledge, and ultimately, wisdom.

So again, technology provides a fast track to AN answer, but it doesn’t in any way guarantee that I’ve arrived at anything close to THE answer. Only the secret channels and dark passages and convoluted, illuminated labyrinths of the human brain can do that. 

So yeah, technology can be a marvelous tool. But it’s just a tool. The magic lies in the fleshware, not the hardware. Technology is only as good as the person wielding it. 

The Generational Blame Game

It’s a fundamental aspect of human nature, I believe, for each generation to criticize the generation that preceded it, often using it as a convenient scapegoat for all that’s wrong in the world. The biggest target at the moment is my own generation, the Baby Boomers. I recently overheard a group of young people—mid-20s—complaining at length that the Boomers are a waste of flesh who never contributed much to society. Respectfully, I beg to differ; this is my response, along with a plea to ALL generations to think twice about how they characterize those who came before.

Millennials, sometimes called Gen-Y, and the Plurals, commonly referred to as Gen-Z, often blame Baby Boomers for the state of the world: the growing wealth imbalance, the violence and unpredictability of climate change, the multifaceted aftermath of COVID, including its impact on the supply chain, and the world’s growing political and cultural divisions—in essence, the world sucks and Boomers are to blame. They often proclaim Boomers to be a generation that contributed little of value to the world. This, of course, is a long-standing social convention: blame the old people, because they know not how dumb, useless and ineffective they are.

On the other hand, there’s a lot of admiration out there for the current Millennial über meisters of Silicon Valley—people like Mark Zuckerberg, Brian Chesky (AirBnB), Alexandr Wang (Scale AI), and Arash Ferdowsi (Dropbox). They deserve admiration for their accomplishments, but they didn’t create Silicon Valley—not by a long shot. The two generations that came before them did that.

But let’s consider the boring, stumbling, mistake-prone Boomers. You know them; they include such incompetent, non-contributing members of society as Bill Gates; the Steves, Jobs and Wozniak; Peggy Whitson, who recently retired as NASA’s Chief Astronaut; Larry Ellison, who founded Oracle; Oprah Winfrey, creator of a breathtakingly influential media empire; Marc Benioff, founder of Salesforce; Reid Hoffman, co-founder of LinkedIn; and Radia Perlman, inventor of the Spanning Tree Protocol, part of the rule set that keeps the 25 billion devices on the Internet, give or take a few hundred million, talking to one another. And I won’t even bother to mention Tim Berners-Lee, the creator of the World Wide Web.

What a bunch of losers.

But there may be a reason for the dismissal of an entire generation’s contributions to the world that goes beyond the tradition of putting elders on a literal or figurative ice floe and shoving them off to sea. I find it interesting that the newest arrivals on the generational scene judge the value of a generation’s contributions based on the application that that generation created. All hail Facebook, X, Instagram, Uber, Amazon, AirBnB, Google, Tencent, AliBaba, TikTok, GitHub, and Instacart, the so-called platform companies. Those applications are the “public face” of massive and incomprehensibly complex technological underpinnings, yet rarely does anyone make time today for a scintilla of thought about what makes all of those coveted applications—ALL of them—work. In fact, none of them—NONE of them—would exist without two things: the myriad computers (including mobile devices) on which they execute, and the global network that gives them life and makes it possible for them to even exist.

The tail wags the dog here: without the network, these applications could not function. Want some proof? The only time the vast majority of people on the planet are even aware of the network’s existence is when it breaks, which is seldom. But when it does? When ice or wind bring down aerial transmission cables, when a car takes out a phone pole, when fire destroys critical infrastructure and people can’t mine their precious likes on Facebook, when there’s a long weekend and everybody is home downloading or gaming or watching and the network slows to a glacial crawl, technological Armageddon arrives. Heart palpitations, panting, sweating, and audible keening begin, as people punch futilely at the buttons on their devices. But consider this: the global telephone network has a guaranteed uptime of 99.999 percent. In the industry, that’s called five-nines of reliability. And what does that mean in English? It means that on average, the phone network—today, the Internet—is unavailable to any given user for a little more than five minutes a year. In a standard year, there are 525,600 minutes. For about five of those every year, the network hiccups. Take a moment to think about that.
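If you want to check that figure yourself, the arithmetic is simple enough to fit in a few lines. Here is a quick back-of-the-envelope sketch, written in Python only to show the math:

```python
# Back-of-the-envelope check on "five nines": how many minutes of downtime
# does 99.999 percent availability actually allow in a year?
minutes_per_year = 365 * 24 * 60          # 525,600 minutes in a standard year
availability = 0.99999                    # "five nines" of uptime
allowed_downtime = minutes_per_year * (1 - availability)
print(f"Allowed downtime: {allowed_downtime:.2f} minutes per year")  # about 5.26
```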

When we think back on famous scientists and innovators, who comes to mind? Well, people like Alexander Graham Bell, of course, who invented the telephone, but who also invented the world’s first wireless telephone, called the photophone—and yes, it worked; or Thomas Edison, who became famous for the invention of the lightbulb, but actually invented many other things, was awarded 2,332 patents worldwide, and founded 14 companies, including General Electric; the Wright Brothers, who flew successfully at Kitty Hawk; Watson and Crick, who discovered the DNA double helix and created a path to modern genetics and treatments for genetic disease; Bardeen, Brattain, and Shockley, unknown names to most people, but the three scientists at Bell Telephone Laboratories who invented the transistor; Philo T. Farnsworth, the creator of electronic television; and Marie Curie, who did pioneering research on radioactivity. These are all famous names from the late 1800s all the way through the 1960s.

But then there’s a great twenty-year leap to the 1980s, the time when Generation X came into its own. Movies were made about this generation, some of the best ever: Ferris Bueller’s Day Off. The Breakfast Club. Home Alone. Sixteen Candles. St. Elmo’s Fire. Clerks. The Lost Boys. The Karate Kid. Gen-X was a widely criticized generation, an ignored, under-appreciated, self-reliant, go-it-alone generation of entrepreneurs that includes Jeff Bezos of Amazon fame, Sheryl Sandberg of Facebook, Sergey Brin of Google, Meg Whitman of Hewlett-Packard, Travis Kalanick of Uber, and dare I say it, Elon Musk. All major contributors to the world’s technology pantheon, some as inventors, some as innovators. The power of the Internet to allow data aggregation and sharing made it possible for platform companies like Uber, eBay, Facebook and Google to exist. Those weren’t inventions, they were innovations (and to be sure, exceptional innovations!), built on top of pre-existing technologies.

Even the much-talked-about creations of Elon Musk aren’t inventions. Let’s look at Starlink, the SpaceX constellation of orbiting communication satellites. A satellite relies on radio technology to make it work; solar cells to power it; semiconductors to give it a functional brain; and lasers to allow each satellite to communicate with the others. Nearly all of those technologies trace back to Bell Labs in the 1940s and 1950s: the transistor arrived in 1947, the silicon solar cell in 1954, and the foundational work that led to the laser a few years after that. In fact, the pioneering communications satellite Telstar was created at Bell Labs and launched into orbit in 1962—more than 60 years ago—to relay television signals across the Atlantic.

That 20-year leap between the 60s and the 80s conveniently ignores an entire generation and its contributions to the world—not just techno-geeks, but content and entertainment and media people who redefined our perception of the world. This was the time of the Baby Boomers, and while you may see us—yes, I am one—as an annoying group of people that you wish would just go away, you might want to take a moment to recognize the many ways my generation created the lifestyle enjoyed by Millennials and Gen-Z—and took steps to ensure that it would endure.

The thing about Boomer researchers, scientists, and innovators was that with very few exceptions, they were happy to work quietly behind the scenes. They didn’t do great big things exclusively for money or power; they did them because they were the right things to do, because they wanted to leave the world a better place for those who came later. And they did, in more ways than you can possibly imagine.

Let’s start with the inventions and innovations that made possible, among other things, the devices on which you watch, listen or read, and the content they deliver. I know I’ve already mentioned some of these people, but they deserve a few more words. 

First, the Steves—and no, I don’t mean me. I’m talking about Steve Wozniak and Steve Jobs, who did quite a few things before inventing the iconic Macintosh. Both were born in the 1950s, grew up in the San Francisco Bay Area, and met while working summers at Hewlett-Packard. In 1977, seven years before the Mac, they introduced the world to the Apple II personal computer, which included color graphics, built-in sound, expansion slots, and features that made it the first machine to come close to the capabilities of modern PCs. Later, they introduced what came to be called the “WIMP” interface—windows, icons, mouse, and pointer—the hallmarks of what later became the Mac operating system, and ultimately, Windows 95 and the generations of that OS that followed. Incidentally, the incredibly stable, highly dependable Macintosh operating system is based on UNIX, an operating system first designed and developed at—you guessed it—Bell Laboratories.

Next we have Sir Tim Berners-Lee, born in London in 1955. He grew up around computers, because his parents were mathematicians who worked on the Ferranti Mark I, the first computer in the world to be sold commercially. He became a software consultant for CERN, the particle physics laboratory in Switzerland that later became famous as the home of the Large Hadron Collider, which physicists used to discover the Higgs boson.

While at CERN in the 1980s, Berners-Lee took on the challenge of organizing and linking all the sources of information that CERN scientists relied on—text, images, sound, and video—so that they would be easily accessible over the newfangled network known as the Internet. In the process he came up with the concept for what became the World Wide Web, which he laid out in a now-famous 1989 proposal paper. Along the way he developed a markup language for creating web pages, called HTML, along with the first web browser, which he made available to everyone, free of charge, in 1991.

Most people think of the Internet and the World Wide Web as the same thing—but they aren’t. The Internet is the underlying transport infrastructure; the Web is an application (or, better said, a set of applications) that rides on top of that infrastructure and makes it useful to the entire world.
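If you like seeing that distinction made concrete, here is a small sketch (Python, standard library only) that makes the layering visible: the Internet’s transport machinery, TCP over IP, carries the bytes, and the Web’s protocol, HTTP, is just one application speaking over that connection. The example.com address is simply a well-known test domain, chosen here for illustration.

```python
# A minimal illustration of the layering described above: the Internet (TCP/IP)
# moves the bytes; the Web (HTTP) is one application riding on top of it.
import socket

host = "example.com"
with socket.create_connection((host, 80)) as conn:   # Internet layer: a TCP connection
    request = f"GET / HTTP/1.1\r\nHost: {host}\r\nConnection: close\r\n\r\n"
    conn.sendall(request.encode("ascii"))            # Web layer: an HTTP request
    response = b""
    while True:
        chunk = conn.recv(4096)
        if not chunk:
            break
        response += chunk

print(response.split(b"\r\n")[0].decode())           # e.g. "HTTP/1.1 200 OK"
```

Strip away the HTTP text and the connection still works; strip away the connection and there is no Web.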

Next, let me introduce you to Ray Kurzweil, who decided he would be an inventor before he started elementary school. By the time he turned 15, he had built and programmed his own computer to compose music. After graduating from MIT with degrees in computer science and literature, he created a system that enabled computers to read text characters, regardless of the font.

Kurzweil invented many things, but he is perhaps best known for popularizing the concept of the Singularity, the moment when digital computers and the human brain merge and communicate directly with each other. It’s a fascinating idea. A good business PC easily operates at four billion cycles per second. The human brain’s neurons, on the other hand, fire at roughly ten times per second. But: a digital PC has limited memory, whereas the human brain’s memory is essentially unlimited. So what happens if we combine the blindingly fast clock speed of a PC with the unlimited memory of the human brain? The Singularity. Cue the Twilight Zone music.

Now let me introduce you to Ajay Bhatt. Born in India, he received an undergraduate degree in electrical engineering before immigrating to the U.S., where he earned a master’s degree in the same field, working on technology to power the Space Shuttle. After joining Intel in 1990, he had an epiphany while working on his PC one evening. What if, he wondered, peripheral devices could connect to a computer as easily as plugging an electrical cord into a wall socket? Not all that hard, he decided, and he and his colleagues invented the Universal Serial Bus, which we all know as USB.

And then we have one of my favorites, Bob Metcalfe. Another MIT grad, with degrees in engineering and management as well as a PhD from Harvard, he joined Xerox’s Palo Alto Research Center, better known as Xerox PARC, a well-respected facility that has been compared to the east coast’s Bell Labs. While he was there, Metcalfe and his colleagues developed a technique for cheaply and easily connecting computers so that they could share files at high speed. The technology that resulted is called Ethernet, the basis for nearly every connectivity solution in use today in modern computer networks, including WiFi. He went on to found 3Com Corporation, but for me, he will always be most famous for what has come to be known as Metcalfe’s Law: that the value of a mesh network, meaning a network in which every computer can connect to every other computer in the network, increases as a function of the square of the number of devices that are attached. Want that in plain English? When a new computer loaded with data connects to a mesh network, the combined value of all that data and its shared access doesn’t increase in a linear way; it grows with the square of the number of connected devices. Don’t believe it? Look at every one of the so-called platform companies that we discussed earlier: Apple’s App Store and music store, Uber, Amazon, every single social media company, and for that matter, the telephone network and the World Wide Web itself.
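For the curious, here is what that quadratic growth looks like in practice: a tiny Python sketch that simply counts the possible device-to-device connections as a network grows.

```python
# Metcalfe's Law in miniature: with n devices that can all reach one another,
# the number of possible pairwise connections is n * (n - 1) / 2, so potential
# value grows with the square of n rather than linearly.
def pairwise_connections(n: int) -> int:
    """Distinct device-to-device links in a fully connected network of n devices."""
    return n * (n - 1) // 2

for n in (2, 10, 100, 1_000):
    print(f"{n:>5} devices -> {pairwise_connections(n):>8,} possible connections")
# Ten times the devices yields roughly a hundred times the connections.
```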

Dr. Robert Jarvik was a prodigy who invented a surgical stapler and other medical devices while he was still a teenager. But then he got serious. While he was an undergraduate student at the University of Utah in 1964, his father needed to have heart surgery. That ordeal influenced Jarvik to turn his curiosity, inventiveness, and problem-solving skills—along with his medical degree—toward finding a method to keep patients with failing hearts alive until they could receive a transplant. While he wasn’t the first to develop an artificial heart, Jarvik’s 1982 creation, the Jarvik 7, was the first artificial heart implanted in a patient as a permanent replacement. Today, Jarvik continues to work on a device that can serve as a permanent replacement organ.

Here’s another one, and this one fascinates me. Sookie Bang was born and raised in South Korea. She graduated from Seoul National University in 1974 and earned a Ph.D. in microbiology from the University of California at Davis in 1981. As a professor and researcher at the South Dakota School of Mines and Technology, her specialty is bioremediation—for example, using bacteria as an ingredient in a sealant to fix cracks caused by weathering and by freezing water that seeps into the concrete outer surfaces of buildings. Bang and her colleagues figured out how to speed up a naturally occurring process in which bacteria extract nitrogen from urea, which produces carbon dioxide and ammonia as byproducts. The CO2 and ammonia then react with water and calcium to form calcium carbonate, the chemical compound that we know as limestone. The patch created by the bacterial process seals the crack from the inside out and integrates with the porous concrete, repairing the crack. In essence, the concrete becomes self-healing.
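For the chemically inclined, the process Bang’s team accelerates is usually written out something like this (a simplified sketch of the urea-hydrolysis chemistry described above, not her exact formulation):

```latex
% Simplified sketch of microbially induced limestone formation:
% bacterial urease splits urea; the byproducts form carbonate in water;
% the carbonate then locks up calcium as calcium carbonate (limestone).
\begin{align*}
\mathrm{CO(NH_2)_2} + 2\,\mathrm{H_2O} &\xrightarrow{\text{urease}} 2\,\mathrm{NH_3} + \mathrm{H_2CO_3}\\
\mathrm{H_2CO_3} &\rightleftharpoons 2\,\mathrm{H^+} + \mathrm{CO_3^{2-}}\\
\mathrm{Ca^{2+}} + \mathrm{CO_3^{2-}} &\longrightarrow \mathrm{CaCO_3}\downarrow
\end{align*}
```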

Another Boomer name you need to know is Dean Kamen, who was born in Long Island, N.Y., in 1951. You may not know who he is, but I guarantee you know at least one of his inventions.

In the early 2000s, Kamen attracted media attention because investors were knocking each other over to be the first to fund “Project Ginger.” The project was highly secretive, but when the veil was finally lifted, the world was stunned to be introduced to the Segway Transporter. The device incorporates sophisticated electronics and gyroscopes that allow it to self-balance, and it moves, stops, and turns based on subtle changes in the rider’s posture. Today, the Segway’s progeny include the ubiquitous “hover boards” that every kid seems to have. But Kamen’s invention also led to the development of an extraordinary device that has changed the lives of thousands of people: a remarkable wheelchair that, thanks to its gyros, can convert from a standard four-wheel chair to a two-wheel chair, in the process lifting the occupant up to eye level with a standing adult. It can even climb stairs.

 But Kamen was an inventor long before he created the Segway. While he was still a college student at Worcester Polytechnic Institute in 1972, he invented a wearable device called the ambulatory infusion pump. It changed the lives of diabetics, freeing them from having to worry about injecting themselves with insulin. The pump did it for them.

But he didn’t stop there. After creating the ambulatory infusion pump, Kamen went after a solution for patients with severe kidney disease who had to travel to dialysis centers for the treatments they needed to survive. He invented a portable machine that allowed patients to give themselves dialysis treatments at home, while sleeping. In 1993, it was named Medical Product of the Year.

The list goes on: flexible foot prostheses, artificial skin grafts, innovative battery designs, and plenty of others, all created by experienced, gifted innovators and inventors—and dare I say it, with a small bit of pride, Baby Boomers.

The truth is, every generation yields its own crop of gifted people who make important contributions to science, engineering, the arts, medicine, and society at large. But without the contributions of those who came before, nothing we enjoy today would exist. The Boomers stood on the shoulders of giants from the Greatest and Silent Generations, just as Gen-X, the Millennials and Gen-Z stand on Boomer shoulders, and just as the next generations to arrive will stand on theirs. It’s easy to criticize those who came before, but it’s also not much of a stretch to recognize that the current generations of any era wouldn’t be where they are or have what they have without them. So instead of looking for the failures of prior generations, maybe we all need to take a moment to recognize their successes—and how those successes benefit us. Of course, if you still want to blame the Boomers for the Internet, mobile telephony, and the commercial success of the global semiconductor industry that makes literally EVERYTHING work, I guess I’m good with that.

RIPPED FROM THE HEADLINES!!!

THE BOOK AMAZON REFUSED TO MARKET!

Okay, enough with the histrionics. Although, histrionics or not, it’s true. For almost a year, Amazon refused to let me run a marketing campaign for my novel, Brightstar, because they deemed it too controversial. Why? Apparently, because Russia is the ‘bad guy’ in the novel. That’s not true: Putin is the bad guy in the novel, a title he deserves. The irony is that while Amazon’s army of AI evaluators, guided by their Byzantine algorithms, was deciding that my novel was unkind to Putin and therefore ineligible for its own paid marketing campaign, one of their most heavily advertised products was the latest Jack Ryan series on Amazon Prime, which was set in Russia and did plenty of Russia-bashing. Of course, they had the enormously talented John Krasinski. I … didn’t.

But this is not about sour grapes or Amazon bashing (it isn’t even about Russia bashing). This is about the role that technology increasingly plays in our world, and the fact that while its value is undeniable, it deserves to be questioned before it’s implemented.

I’ll start with the book. It’s a great story: even without Amazon’s help, it has sold well. And to Amazon’s credit, yesterday, right after I launched the marketing campaign for Russet, I tried again to launch a campaign for Brightstar, and lo and behold, they allowed me to do so. Not sure what changed, but the campaign is now active.

Here’s the point I want to make. I tried for a week to speak to a human at Amazon about their refusal to allow me to advertise the book. But the decision to give Brightstar a thumbs-down for a marketing campaign, a campaign that I have to pay Amazon for, was made (apparently) by one or more AI instances without the benefit of a human in the loop. After a week of trying to get somebody on the phone to explain to me why I was ineligible for Amazon’s marketing services, I gave up and went elsewhere. “Elsewhere” turned out to be a very effective choice, and the book sold very well. It still does.

So, why am I telling you this? To sell books, of course, but there’s another reason, one that’s more important. For 43 years, I worked full-time in the technology, media and telecom industry: more than a decade in the telephone industry, network analysis and IT mostly, then ten years as a senior consultant with an advisory professional services company, then 24 years on my own as a consulting analyst to companies striving to understand the implications of technological change for their businesses. I did this work all over the world, in more than 100 countries. What I discovered in all those years of focusing on the contact point between people and technology is that technology is a game-changer. I have watched in humble awe as it catalyzed education, reinvented healthcare, made government more transparent, forced a shift in power from the few to the many, grew local, regional, and national economies, empowered individuals, and created hope—so much hope. Here are some examples.

I sat on the ground with a group of educators in the shade of acacia trees and watched as the kids from a local rural school unpacked the bright green laptops they had been given by the One Laptop Per Child Project. The adults were largely mystified by the machines, choosing instead to immerse themselves in their mobile phones. Within a half hour, the kids had created social media accounts and were online, chatting with people all over the world. By the end of the day, the machines were old news; the kids had become experts.

I watched in awe and with no small number of tears as an elderly woman in a different African village was handed a mobile device for the first time and told to push a particular number on the screen. Within seconds, she was videoconferencing with her son, whom she had not seen or communicated with directly in ten years. He left the village to get work in the city; the arrival of mobile connectivity and solar charging stations in her village made it possible for her to routinely speak with distant family members.

In Ghana, in west Africa, an organization I had the opportunity to work with decided to tackle one of the country’s greatest challenges: adult literacy. Without literacy a person can’t take a driver’s test, can’t read road signs, can’t read a map, can’t read medical prescriptions, can’t help their children with their homework, can’t fill out a job application, can’t read loan documents, can’t read a services contract. In Ghana, large swaths of people may not be able to read, and the remote villages may not have running water, or sewer, or electricity, but everyone has a mobile phone—everyone. So, the folks I got to know came up with an idea: let’s send reading lessons as text messages to peoples’ phones. They did. The result? A climb from complete illiteracy to a grade eight reading level in eight weeks.

In one of Southern Africa’s slums, I was invited into a rural clinic by a healthcare organization I was working with. The clinic was a metal shipping container that had been divided into two rooms, one twice the size of the other. The larger room served as the waiting room, exam room, diagnostic center, and prescription dispensary. The smaller room was a full-blown surgical suite. I was invited to sit in while a patient had her gall bladder removed. The procedure took 40 minutes from open to close; they sent her home that afternoon with a bottle of aspirin, four tiny puncture wounds in her belly from the surgical tools, and four band-aids. Nothing magical about this story until I tell you that the procedure was performed by a team of surgeons who were located at a hospital in Maryland, 7,500 miles away, using a robotic surgery machine. The machine was connected on each end to an optical network that provided the bandwidth necessary to perform the procedure remotely.

One of the most poignant photographs I saw during the Arab Spring uprising was of a group of teenagers running past a low brick wall on the perimeter of Tahrir Square in Cairo. The wall was splashed with graffiti, French words that said, “Thank you, Facebook. Thank you, Twitter, for our freedom.” I’m no fan of social media—I believe it has largely become a corrosive and destructive force in modern society—but during Arab Spring, it gave voice to those who for so long had not had one.

Finally, a personal note about the role that technology has played in the lives of so many. The first time I went to South Africa to do work for the small university that became such a big part of my life, I had been there for three days when the founder and chair of the school told me that they had a graduation taking place the next evening and asked if I would please be their commencement speaker.

“Tomorrow? Sure … I think,” I fumbled out a response. Not much notice for a commencement speech.

So, I prepped and got ready, fully prepared to say all the appropriate things. The next evening, we all filed into the auditorium in the standard processional to the familiar tune of Pomp and Circumstance, and sat in the front row. The graduates sat behind us, resplendent in their caps and gowns. One by one they stood when their names were called and climbed to the stage, where they were presented with their rolled certificates.

In the audience, tears flowed on the faces of the gathered family members. What an amazing thing this was: their child was graduating from a university program. 

What I haven’t told you is that these students were not graduating with two-year or four-year degrees, nor were they graduate students. They were employees of various South African companies who had attended and were graduating from a one-week Microsoft Project course. Sounds silly, doesn’t it, to wear caps and gowns and march in a processional? It’s not. It wasn’t all that long ago that these students, all black, were denied access to education in general and would never have had the opportunity to graduate from ANY kind of program, degreed or otherwise, much less from one offered at a highly regarded university.

I won’t bore you with the post-graduation gathering, or the emotional, heartfelt tributes I heard for the next few hours, or the number of hugs I got from graduates, or how humbled and lucky I felt just to be part of the ceremony, but I will say this: technology, whether it’s telecom connectivity, or telemedicine, or the extensive tentacles of the Web, or videoconferencing, or a company’s need to train its employees on the use of a project management application, changes lives. It makes us better people. It gives us the velocity and acceleration we need to move forward, always forward. It can be one of the most powerful eliminators of social and economic inequality ever created. Technology can be, in the truest sense of the word, awesome—as in, awe-inspiring.

But to be fair, tech also has its dark side. Computers and mobile devices tear at the fabric of community, all-too-often forcing us into fearful and paranoid communities of one, obsessed with fear of missing out and not being good enough, smart enough, thin enough, pretty enough, or connected enough. Social media then pits these one-person communities against each other, emphasizing our scant differences while minimizing just how similar we really are. It’s a tragic addition to our reality, it’s destructive, and it’s dangerous.

Artificial Intelligence, the latest innovation to be added to the technological pantheon, is, like all technologies, an amazing thing with enormous potential. But it also has the potential to make us complacent and lazy, convincing us that it can fully replace human capability when in fact it can’t—not even close. It causes us to develop blind spots, makes us believe that good enough is good enough and that the status quo is as good as it gets. Meanwhile, ubiquitous, near-seamless broadband connectivity enrolls us all in a cult of speed, driving us to worship velocity rather than being part of a community of goodness and richness and caring for each other.

As I said at the beginning of this essay, I spent my entire professional career in the technology world, which means that I am no stranger to it. It also means that I appreciate what it does for us in all its many forms, and am sometimes awed by its breathtaking complexity, carefully hidden from view by those who developed it. But I also bear a sense of healthy skepticism about technology because of its potential to do us harm. I can quickly assemble a new piece of furniture with a screwdriver, but I can also stab somebody in the eye with it. Alfred Nobel invented dynamite as a way to cut roadways and tunnel through mountains and accelerate the pace of human infrastructure development; he was deeply saddened when it became a central component of mass warfare. AI can revolutionize healthcare, engineering, and the arts, but as we’re now seeing, when co-opted by ne’er-do-wells, it can be turned into a destructive weapon with great effect.

Two lessons emerge from this essay, one of them admittedly selfish (I’d like people to read Brightstar because it’s a GREAT story with a GREAT ending). The first is an observation that I believe we must all keep in mind: technology in and of itself is always—ALWAYS—a sideshow attraction until it is put to good use by a human. Robots, for example, like AI, cannot begin to replace human capability and capacity, but they can augment it. Remote video is wonderful, but it will never, ever replace a handshake and a conversation over a cup of coffee. I love email, but quite a few of my friends and I exchange hand-written letters several times a year, the receipt of which makes me feel good for days after opening the envelope. Fear not, humans: technology augments us, not the other way around. Never lose sight of that.

My second observation is that I wrote Brightstar to show what happens when an innovative new technology, wielded by people who have their heads screwed on right, wreaks havoc on totalitarian, despotic regimes that would oppress their own people in the name of power-grabbing. The description on the back cover says it all:

Jason and Nicky are much closer than most brothers—they are best friends, growing up in a military household, moving constantly, with an alcoholic, abusive father and a caring mother who tries to shield them from their father’s demons. When Nicky dies in a freak accident, Jason is devastated. He ultimately recovers and joins a company that has developed a remarkable radio-based communication technology called Brightstar that, when deployed, will become one of the most powerful allies to freedom and one of the greatest threats to totalitarianism the world has ever seen.

When a natural disaster gives the company the chance to deploy their new technology to save countless lives, another opportunity unexpectedly arises. Regime change is underway in Russia, and the challenger to Putin sees Brightstar as the lever he needs to bring about hopeful change in the country. It becomes Jason’s job to deploy it—in the face of an incumbent regime that will deny its installation at all costs. 

The book ends with a shattering, unexpected conclusion that will stop readers in their tracks and make them beg for a sequel.

The Brightstar technology does one thing very well: it catalyzes the democratization of information. In other words, the more people know, the better they can separate the facts from the fiction that define an issue. And the better informed they are, the better the decisions they can make, decisions that have a positive impact on themselves and their communities.

That, you see, is the power of technology. When it’s used to move society toward the future, and to shine a light on the things that would hold us back, it serves us as it should. But when it’s used as a bludgeon to obscure the beauty of our attainable future, to move us backward, to create divisiveness by falsely showing us how different we are in our wants and needs and desires, rather than how similar we really are, it does damage. Our job is to prevent that from happening. And that is what Brightstar is about.

By the way, if you’re interested in the Brightstar technology, I wrote a short essay about how it might actually work. I’m happy to share it with you. And if you’d like to read the book, check it out here: