
    Nicholas Carr’s ‘Glass Cage’: Automation Will Hurt Society in Long Run

    by Jenny Shank
    November 11, 2014
    Nicholas Carr has taken aim at things like the Roomba. Photo by Juliette Culver and used here with Creative Commons license.

    Writer Nicholas Carr has earned his reputation as one of the premier contemporary critics of technological utopianism through articles such as “Is Google Making Us Stupid?” which he published in The Atlantic in 2008 and lucid, insightful books including the 2011 Pulitzer Prize finalist “The Shallows: What the Internet Is Doing to Our Brains” and his new “The Glass Cage: Automation and Us.” Some might accuse him of technophobia, but he isn’t advocating that everyone smash their smartphones. Instead, he urges a balanced, thoughtful approach toward our engagement with technology rather than a blind acceptance that each new gizmo will enhance our work and our lives, even if our bosses or friends tell us it will. I recently interviewed Carr via email about “The Glass Cage” and how some recent technological advances might be making us dumber, less skilled, less happy, and less moral.

    Q&A

    Nicholas Carr in 2008. Photo by Sandy Fleischmann and used with Creative Commons license.


    "I think we need to change our attitude toward labor and workers. If we’re going to have a healthy society, we need to venerate workers a little more and capitalists a little less" -Nicholas Carr

    Shank: It seems to me that one idea underlying both “The Glass Cage” and “The Shallows” is that humanity has recently undergone an unprecedented expansion in technology, and that some problems have arisen because many people tend to accept any new technology as being automatically beneficial, without questioning it. Do you think there will come a point when more people begin to question and reject or selectively use automation technology?


    Carr: I wouldn’t necessarily say that the current explosion in technology is unprecedented. If you look at the advances that were under way 100 years ago, with electrification, drugs and medical technologies, telecommunications, and the automobile, I think you could make a case that progress today has actually narrowed. But it’s true that computers, smartphones, and software are very quickly and very deeply changing the way we act and think as well as the texture and pace of our lives. The speed with which this is playing out has made it very hard to step back and take a critical view of the changes. We’ve all been caught up in a whirlwind.

    Still, I do sense that more people are coming to have doubts about where we’re headed and in particular about our rush to automate subtle skills that are essential not only to the economy but to people’s sense of personal achievement and fulfillment. So I think we will see more questioning of the pace and direction of automation over the next few years. What’s hard to predict is whether the resistance, or even questioning, will become strong enough to counter the enormous momentum that’s propelling computerization forward. I guess I’m not optimistic there, but I’m not despairing, either.

    Shank: You mentioned in a recent talk at Google that you started out as a technology journalist reviewing or commenting on the features of new tech products and companies, and like many tech journalists you didn’t question whether the new technology was even necessary or beneficial. We expect the companies that sell these products to hype them, but is there also a tendency for people who aren’t employed by tech companies, such as journalists or even users, to go along with the line that most every technological advance is beneficial?


    Carr: I think we humans are naturally enthusiastic about new gadgets and gizmos — they can be amazing, after all — but that bias, while completely understandable, does make us much more susceptible to blindly buying into the marketing messages streaming out of Silicon Valley. More disturbing to me is the way that the news media has become an amplifier of technology marketing, giving exhaustive coverage to even fairly routine product announcements. I mean, when you look at how carefully companies like Apple and Amazon and Facebook and Google choreograph their “media events,” and how reliably the media plays its appointed role, it’s hard not to feel a little repulsed. The only precedent that I can think of is the coverage of Hollywood — but now smartphones and apps are the celebrities.

    Shank: Your books have been widely acclaimed by critics — “The Shallows” was a finalist for the Pulitzer Prize — but it seems that when reviewers do criticize them, their main argument is that you are just an “anti-technology” doomsayer. How do you answer that criticism?

    Carr: I’m pretty much immune to it at this point. A lot of people worship technology – it’s become kind of central to their sense of self – so any criticism of technology provokes a knee-jerk reaction from them. It’s easier to scream “Luddite” or “technophobe” than to think. Most people, though, are not so defensive or blinkered – in giving my talk at Google, for instance, I found most everyone there very willing to consider my point of view – and so I try to aim my work at the open-minded rather than worry too much about the closed-minded.

    Is A Vacuuming Robot Amoral?

    Shank: “The Glass Cage” made explicit for me a number of problems with automation that I had been vaguely worried about. But one thing that I had never worried about until reading “The Glass Cage” was the morality of the Roomba. You write, “Roomba makes no distinction between a dust bunny and an insect.” Why is it so easy to overlook the fact, as I did, that when a Roomba vacuums indiscriminately, it’s following a moral code?

    Carr: It’s easier not to think about it, frankly. The workings of automated machines often raise tricky moral questions. We tend to ignore those gray areas in order to enjoy the conveniences the machines provide without suffering any guilt. But I don’t think we’re going to be able to remain blind to the moral complexities raised by robots and other autonomous machines much longer. As soon as you allow robots, or software programs, to act freely in the world, they’re going to run up against ethically fraught situations and face hard choices that can’t be resolved through statistical models. That will be true of self-driving cars, self-flying drones, and battlefield robots, just as it’s already true, on a lesser scale, with automated vacuum cleaners and lawnmowers. We’re going to have to figure out how to give machines moral codes even if it’s not something we want to think about.

    Does Automation Erode the Skills of White Collar Workers?

    Shank: Most people know that automation has eliminated a lot of factory jobs, but “The Glass Cage” points out that it’s eliminating or reducing the tasks of white-collar jobs, too. Do you think white-collar workers are less likely to notice or protest the changes to their jobs that automation brings than blue-collar workers are?

    Carr: Not necessarily. A lot of doctors have been very vocal about the drawbacks of automated record-keeping and diagnostic systems. They see the software as intruding into their relationship with patients and circumscribing their own autonomy. And I talked to some architects who are deliberately reducing their dependence on computers and software and going back to manual sketching and model-building. At least in some professions, white-collar workers probably have more freedom to negotiate their relationship with technology than most factory workers have had. Still, you do see a lot of areas in which employers, including hospitals and businesses, are requiring professionals to use highly automated software even when it dulls their skills and makes their work less interesting. The desire for immediate efficiency gains often trumps all other considerations.

    Shank: One example of this white-collar job shift that you mention is how most radiologists now use automated guides to point out areas for further investigation on mammogram results, which helps them catch certain types of cancers, but causes them to miss others that they might once have caught, because the human tendency is to scrutinize the highlighted area, ignoring others. Why don’t radiologists just look at the results without an electronic aid first, and then use the automated checker second? Are any of them going against the grain and doing this sort of thing?

    Carr: Yes, I think the software aids are being used in both ways today – as primary analytical systems, which guide the radiologists from the start, and as secondary or backup systems, which offer advice after the professional has performed his or her own analysis. Most of the work I’ve seen from human-factors researchers suggests that the latter approach is the better one – that you get better results when you first give experienced professionals time to work through a problem on their own, and only then have software make suggestions or provide prompts. But that approach also adds a little inefficiency to the process – it tends to take a little more time – and so even here you have a lot of pressure to bring the software in immediately in order to save a few minutes and a few pennies. Sometimes it’s the hospital bean-counters that call the shots, not the doctors.

    People Actually Are Happier When They’re Working Hard

    Shank: I happened to be reading “Farmer Boy” by Laura Ingalls Wilder to my daughter while I was also reading “The Glass Cage,” and I was struck by a scene in which Almanzo, the farmer boy of the title, asks his dad why they slowly thresh the crops by hand on stormy days in winter, rather than hiring the threshing machine that would finish the job quickly. His dad says, “All it saves is time, son. And what good is time, with nothing to do? You want to sit and twiddle your thumbs, all these stormy winter days?” This book is set in the 1800s, when it was possible for a family farmer to make that kind of quality-of-life-over-efficiency choice. Is it possible to make such a choice in the modern world?

    Carr: That’s a lovely passage – I wish I’d been aware of it while writing the book. It would have served well as an epigraph. It gets to the heart of one of my central arguments: that technology in general and automation in particular shape our experience of life and hence our sense of engagement and fulfillment. We’re often too quick to believe that if we’re “freed up” from hard work, we’ll enjoy life more, but the opposite often turns out to be true. When things become too easy for us, we become self-absorbed and anxious. True freedom comes from accomplishing hard things, from being busy at some meaningful task in the real world. As I write in the book, “Automation often frees us from that which makes us feel free.” I do think that, for economic and employment reasons, it’s becoming harder for people to resist labor-saving technology. At the same time, though, we’re seeing young people getting involved in small-scale agriculture and various handicrafts, often using more traditional, less automated tools. So all is not lost. Resistance is not futile.

    Shank: Could, contrary to what almost everyone seems to believe, some inefficiency be good for people, good for business, and good for the country?

    Carr: Learning requires inefficiency. You’ll never develop any interesting talent if you never experience friction, if you never have to work painstakingly and slowly through a hard challenge. Friendships and other close relationships, too, entail inefficiency — contrary to what Mark Zuckerberg might tell you. So, yes, the pursuit of immediate efficiency gains may come at a high cost in the long run, for individuals and for society.

    Shifting Society’s Emphasis from Capitalists to Workers

    Shank: You write, “To ensure society’s well-being in the future, we may need to place limits on automation. We may have to shift our view of progress, putting the emphasis on social and personal flourishing rather than technological advancement.” What are some ways that we could do this?

    Carr: One practical way that I discuss is changing our approach to designing and programming automated software and systems. Right now, the dominant approach is what’s called “technology-centered automation,” the goal of which is to hand over as much work and responsibility to computers as possible. A better approach, I think, is “human-centered automation,” which views the person and the computer as being partners. The software is programmed to keep the human deeply engaged in the work rather than cut off from the work.

    More broadly, I think we need to change our attitude toward labor and workers. If we’re going to have a healthy society, we need to venerate workers a little more and capitalists a little less. That may sound radical, but it’s an attitude that characterized society in much of the last century, when working people were respected and had opportunities and the middle class was thriving.

    Jenny Shank’s novel, The Ringer, won the High Plains Book Award. Her stories, essays, satire and reviews have appeared in The Guardian, TheAtlantic.com, McSweeney’s, and the Dallas Morning News.

    Tagged: automation books nicholas carr robots technology the glass cage

    4 responses to “Nicholas Carr’s ‘Glass Cage’: Automation Will Hurt Society in Long Run”

    1. Adventure49 says:

      Technology is neutral. It is neither good nor bad. It is how we choose to use the technology that raises issues of morality. Automating drudgery allows people more free time to pursue other interests. The problem arises when there is a dearth of other interests. Unfortunately, our education system is oriented towards the utilitarian and not the intellectual development of people.

      A second issue is that automation has the potential of concentrating wealth. If the free time that an individual acquires from automation results in their expending all of that time on survival, there is no gain.

      • Osman Koroglu says:

        Technology is not neutral. According to Andrew Feenberg: “technology is a product of social processes and political struggles.” Codes are decisive. Designers have goals and they design accordingly. Most of the time we do not notice the underlying idea and think that technology is neutral.

        • Adventure49 says:

          Technology is the result of innovation, and it is not until that technology is put to use that “social processes and political struggles” have their effect. When I write code or design a system using technologies, it is then that issues of morality enter. Knowing how to achieve an end and working to achieve that end are not the same.

          • betterthanaboyfriend says:

            As a coder, you may enjoy Martin Ford’s book (Ford is a computer scientist and software engineer himself), which looks at automation from the side of its economic destruction rather than a blanket animosity toward the machine. It isn’t neutral as a capital investment when it directly competes with the labor pool.

            Carr isn’t a neo-Luddite so much as an observer of the staggering pace at which certain automation processes decimate complex jobs. Even coding will come under fire, via template-based website builders, as we fanatically push for more people to learn to code. So, like a college degree, it will saturate and become less employable.

  • Who We Are

    MediaShift is the premier destination for insight and analysis at the intersection of media and technology. The MediaShift network includes MediaShift, EducationShift, MetricShift and Idea Lab, as well as workshops and weekend hackathons, email newsletters, a weekly podcast and a series of DigitalEd online trainings.

    About MediaShift »
    Contact us »
    Sponsor MediaShift »
    MediaShift Newsletters »

    Follow us on Social Media

    @MediaShiftorg
    @Mediatwit
    @MediaShiftPod
    Facebook.com/MediaShift