Automation and the Danger of Lost Knowledge

In my previous post I pondered the quality of machine-generated data, cautioning that even though it overcomes some of the inherent errors of human-generated data, it’s not immune to data quality issues.  Nicholas Carr, in his recent article All Can Be Lost: The Risk of Putting Our Knowledge in the Hands of Machines, pondered the quality of our increasingly frequent decision to put so many things into the hands of our automatons.

While autopilot, for example, has improved flight safety over the years, many aviation and automation experts have concluded that the overuse of automation erodes pilots’ expertise and dulls their reflexes.  “Automation has become so sophisticated,” Carr wrote, “what pilots spend a lot of time doing is monitoring screens and keying in data.  They’ve become, it’s not much of an exaggeration to say, computer operators.”  As one expert cited by Carr put it: “We’re forgetting how to fly.”

Carr’s article also examined the impact of automation on medical diagnosis, surgery, stock trading, accounting, navigation, and driving, as well as the fundamental notion underlying those and arguably most, if not all, human activities—the dissemination and retention of knowledge.

“Automation, for all its benefits,” Carr cautioned, “can take a toll on the performance and talents of those who rely on it.  It alters how we act, how we learn, and what we know.  The choices we make, or fail to make, about which tasks we hand off to machines shape our lives and the place we make for ourselves in the world.  That has always been true, but in recent years, as the locus of labor-saving technology has shifted from machinery to software, automation has become ever more pervasive, even as its workings have become more hidden from us.  Seeking convenience, speed, and efficiency, we rush to off-load work to computers without reflecting on what we might be sacrificing as a result.”

The Knowledge You No Longer Need to Have

In his book Where Good Ideas Come From: The Natural History of Innovation, Steven Johnson wrote about the benefits of stacked platforms—layers upon layers that we can build upon without needing to know how any of those layers work.  Although he explained that this paradigm long predates computers and underpins many other fields of human endeavor, information technology provides the easiest contemporary examples.  One is Twitter, which leverages the Web, itself a multi-layered platform, as well as the SMS mobile communications platform and the GPS satellite system, among others.  Twitter could be quickly developed and deployed, in large part, because it could stand on the shoulders of existing (and mostly open source) information technology giants.

“In a funny way,” Johnson noted, “the real benefit of stacked platforms lies in the knowledge you no longer need to have.  You don’t need to know how to send signals to satellites or parse geo-data to send that tweet circulating the Web’s ecosystem.”
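To make the idea of stacked platforms a little more concrete, here is a minimal, purely illustrative sketch in Python.  None of these function names correspond to a real Twitter, SMS, or GPS API; they are hypothetical stand-ins meant only to show how each layer hides the one beneath it, so the code at the top needs no knowledge of satellites, geo-parsing, or transport protocols.

```python
# Illustrative only: every name below is hypothetical, not a real API.
# Each layer wraps the one beneath it, hiding its details from the caller.

def gps_fix() -> tuple[float, float]:
    """Lowest layer: stands in for talking to satellites for coordinates."""
    return (40.7128, -74.0060)  # hard-coded placeholder for a real GPS reading

def geocode(lat: float, lon: float) -> str:
    """Middle layer: turns raw coordinates into a human-readable place name."""
    return "New York, NY"  # a real service would parse geo-data here

def web_post(payload: dict) -> None:
    """Transport layer: stands in for handing the message to the Web."""
    print(f"POST /statuses {payload}")

def tweet(text: str) -> None:
    """Top layer: the only function the application author needs to know."""
    lat, lon = gps_fix()
    web_post({"text": text, "place": geocode(lat, lon)})

tweet("Standing on the shoulders of platforms")
```

The point of the sketch is Johnson’s point: the author of tweet() benefits precisely from the knowledge they no longer need to have about the layers underneath.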

While I agree with Johnson, I also find it more than a little disconcerting.  The danger of the lost knowledge enabled by the Web’s ecosystem, and by other advances in information technology, is that it leaves us at the mercy of the people, corporations, and machines that retain the power of that knowledge.

Is Automation the Undoing of Knowledge?

“Knowing demands doing,” wrote Carr.  “One of the most remarkable things about us is also one of the easiest to overlook: each time we collide with the real, we deepen our understanding of the world and become more fully a part of it.  While we’re wrestling with a difficult task, we may be motivated by an anticipation of the ends of our labor, but it’s the work itself—the means—that makes us who we are.”

“Computer automation severs the ends from the means,” concluded Carr.  “It makes getting what we want easier, but it distances us from the work of knowing.  As we transform ourselves into creatures of the screen, we face an existential question: Does our essence still lie in what we know, or are we now content to be defined by what we want?  If we don’t grapple with that question ourselves, our gadgets will be happy to answer it for us.”

Machine-generated data is not the same thing as machine-generated knowledge, just as automation is not the same thing as artificial intelligence.  Still, if we consider how much of our knowledge, and how much of our responsibility for doing something with it, we are turning over to machines through automation, it might give us pause, and perhaps a reason to hit the pause button, if not the stop button, in some cases.
