The last book I read in 2020, before the pandemic rolled in and my mental life detonated, was Uncanny Valley, Anna Wiener's memoir about her years working in the Bay Area startup world. What stood out to me, and to many other readers at the time, was how the author described Silicon Valley's relationship with "friction." "Friction" is a term used in the tech industry for anything that slows down or impedes a user's adoption, use, or purchase of a product or service. The idea started off as a design principle: a commitment to streamline processes, simplify interfaces, and generally make products easier to use. But, as Wiener points out, the "reduction of friction" evolved from a design principle into a general philosophy of life. What Silicon Valley was trying to build was "a world freed of decision-making, the unnecessary friction of human behavior, where everything—whittled down to the fastest, simplest, sleekest version of itself—could be optimized, prioritized, monetized, and controlled."
What struck me about this conception of a "fetishized life without friction" was less that it encapsulated a certain ideological stance visible everywhere in the tech industry—though it did. What actually got me was how neatly and precisely it described my own relationship with computers. People often talk about the addictive and attention-draining properties of the Internet, but when I think about my disordered patterns of computer use, the computer seems less like a drug dealer and more like an avoidance machine. This "fetishized life without friction" might describe an existence with all the minor inconveniences of life removed. But for me it spoke to a desire, a compulsion, to push away all sources of mental friction, a way of avoiding, automating away, or outsourcing all the things that fill me with anxiety or existential dread.
About a year before I read the book, I had been diagnosed with obsessive-compulsive disorder. One school of thought about the condition stresses the importance of something called “experiential avoidance,” a tendency to try to escape or avoid situations, thoughts, sensory phenomena, or feelings that are distressing or “triggering.” In this school of thought, the ritual compulsions typical of OCD (the classic case is handwashing) are ways of pushing those thoughts away and reassuring oneself. But to the extent that these rituals "work," they are only temporarily efficacious. Over time, this coping behavior paradoxically heightens and intensifies the original anxiety. Since the fear is never directly addressed, it tends to grow in strength, and the rituals involved in defusing that fear grow more frequent, more elaborate, and eventually become entirely incapacitating.
"Incapacitating" is exactly how I would describe my relationship to computers, especially since the start of the pandemic. "Friction" for me looks like uncertainty, partial knowledge, ambiguity, disorder, or anxiety. The ecosystem of note taking software and influencers that has emerged since 2020 offer a great example of what I am talking about. Writing is a task which forces one to confront ambiguity, uncertainty, and one’s own limits directly, so it’s not surprising that a coterie of avoidance entrepreneurs sprung up around it. The goal of note taking systems, almost without exception, is to provide a smooth, frictionless environment for making connections between disparate topics. As your “second brain” develops, the ability—and indeed the desire—to create a dense latticework of connections increases. The problem for me, at least, is that I don’t find the work of making connections to be difficult. If anything, my “first brain” tends to overproduce these kinds of thoughts.
What it does provide, however, is a sandbox for infinite varieties of “avoidant labor.” If your writing anxiety focuses, as mine frequently does, on the possibility of “accidental plagiarism,” well, you can tend to your “Zettelkasten,” and ensure that everything you ever highlighted is properly cited and tagged. If you have a more generic form of writing anxiety, why not spend a few hours linking your notes? In both cases, you have an act adjacent to writing, one which relieves some of the tension of not writing while actively avoiding the task itself. But it’s not just note-taking apps; in the place of “writing” you might add fears like “talking on the phone,” “grocery shopping,” “cooking,” or simply being left alone with your internal monologue. For every one of these fearsome thoughts or feelings, there is a tool, a service, an instructional video, or a Python package ready with a prepackaged ritual to temporarily defuse, but never actually address, these problems.
I’ve come to think of this landscape of services, tools, and content as “The Avoidant Web.” It is made up of little online ecological niches, invisible to many users, hosting business models, productivity experts, and services that thrive on certain widely shared anxieties, phobias, and—more than anything—on a lack of confidence. On the avoidant web, problems are never solved, but they can be pushed out of your mind.
Nick Seaver’s work on algorithmic culture can help us see what is going on here. In his book Computing Taste, he shows how engineers design systems to "hook" people into using their services and keep them engaged. To do this, Seaver makes a surprising theoretical move: placing these algorithms alongside the physical traps used to capture animals. They function in much the same way animal traps do: by anticipating and exploiting a user’s behavior. In this way the systems Silicon Valley uses to ensnare us, Seaver notes, walk a fine line between coercion and persuasion. A mouse may walk into a trap in search of cheese, but to describe this as a purely free choice amounts to a kind of willful ignorance.
The problem is that, when thinking about “traps,” we normally imagine only one kind of trap and—by extension—one kind of quarry. Over the last decade, "addiction" has become something like a master concept for understanding our disordered relationships with our machines. In a recent piece called "Desire, Dopamine, and the Internet," L.M. Sacasas laments the hegemonic hold that discussions of addiction and dopamine have on our understanding of online life and our relationships with our devices. He worries not only that this framing provides an oversimplified view of how we relate to digital media, but also that it recapitulates and launders the impoverished behaviorist thinking of Silicon Valley. As Sacasas sees it, "dopamine culture" reduces humans "to the status of a machine, readily programmable by the manipulation of stimuli." I think he's right, and I think the problem goes deeper. Technology critics have wholeheartedly adopted the frameworks of those they aim to criticize, which makes seeing the problem in a different way functionally impossible. To put it in Seaver’s terms, they’ve understood one kind of trap, and assumed they all work the same way.
For the next several posts, I want to build out a different way of looking at these problems, a framework for understanding computers as “avoidance machines.” I think this shift in perspective is useful, not because I think it’s a universal mode of engaging with them, but because I think the lens of avoidance can help us see much about the culture of computing that the frames of “addiction,” “attention,” and “distraction” have missed. It’ll be a symptomatic reading for sure, but one which I think has a lot to tell us about the ways individuals, institutions, and governments use computers, and (if I’m right) about the future of artificial intelligence.