Finally free of the imperative of manuscript editing, I am actually reading. Nicholas Carr’s new book, The Glass Cage, is a worthy sequel to The Shallows. The earlier book was a brilliant account of the neuroscience of our brains on the internet… as opposed to, say, reading. (Yes, this is a blog post, blah blah…) The current book explores the automation of processes of all sorts, from factory work to self-driving cars to the decision-support software employed by doctors and lawyers.

Carr’s books are attractive because he avoids turning them into polemics for one side or the other of these questions. He doesn’t think automation is inherently bad (Frankenstein) or inherently good (the techno-futurists); indeed, he offers a nice history showing that excitement about machines and anxiety about them have gone hand in hand from the beginning. His books are really about understanding something thoroughly.

But he draws two lessons. One, Carr is adept at noting how “this time it’s different.” In The Shallows, he persuasively makes the case that the internet is not just another in a string of “media” advances, from writing to the printing press to the telegraph to the radio. The combination of the actual processes (and limits) involved in using a technology and the physical capacities (and limits) of the human person shapes what a given medium can mean and be for us. The internet combines an extraordinarily rapid pace of inflow with virtually unlimited storage capacity; reading offers neither. In The Glass Cage, he sets out to show that the current wave of automation is different because of its capacity to mimic not just human physical processes but human thought processes. One of the book’s key claims is that the ability to mimic processes is not the same as replicating the processes themselves: Watson doesn’t answer a Jeopardy question the way a human does, nor does “Doctor Algorithm” go about a diagnosis the way a doctor does. In some ways, the ability to process massive amounts of data via algorithms and probabilities is great; in other ways, it is very different from human thought and action, and it introduces a different set of “errors.”

Two, Carr’s tone is that of someone who concludes, “Automation is dangerous, but it can be a gift if we figure out how to use it wisely.” He’s a moderate, but because he’s so interested in these processes, he recognizes not just the beneficial results of technologies but the actual tasks at which a technology excels. If we didn’t end up serving the technology, Carr seems to say, we could make the technology serve us. Most importantly, in both books there is a consistent message that technology designed to assist a skill actually ends up replacing and eroding that skill. Thus, the best use of the internet or of automation would be “some.” In particular, Carr highlights how automation makes us dumber and how it degrades the subjective character of our work. He describes these problems in significant detail and without oversimplification.

Yet what becomes obvious from these books is that we as a society almost completely lack the moral language necessary to draw such conclusions. “Some” is not a force in public discussion. We seem to like our morality served up clear-cut, no matter which side we are on. Moral outrage is about violations of “basic rights” or “God’s law.” Even when faced with a concept like inequality, we are tempted to cast it in this mold, and when it doesn’t fit well, we go little further in getting at what the problem really is. Bill Gates can sing the praises of technology, but then warn of the dangers of artificial intelligence and insist we should be more concerned. (Notably, the voices coming from Google, Facebook, and the like are far less “concerned” and far more techno-futurist.)

The missing language is a language of limits, modesty, appropriateness, reasonableness, and the like. It is the traditional language of the virtue of temperance, leavened by prudence and wisdom. Carr’s book illustrates the need for such a language, but also the difficulty of finding it. We’d like to imagine that technology will threaten us in a Frankenstein or RoboCop sort of way. But what if technology’s power to destroy us is more like the slow, sure creep of the high-functioning alcoholic’s problems than the dramatic conflict of the destructive or homeless drunkard?

There is a whole array of problems we face in our culture – campus drinking and sexual assault (where Dartmouth’s recent announcement of a campus ban on hard alcohol is an intriguing step), consumerism and economic excess, structural issues in the economy, family breakdown, climate change (2014 was the hottest year on record, and the price of gas won’t change that), and yes, smartphone use – where we need to recover the language of limits, modesty, and restraint. In practice, of course, many people observe limits. But where is our public, shared moral language for them?


David Cloutier is an associate professor of theology at the Catholic University of America and the author of Walking God’s Earth: The Environment and Catholic Faith.
