We are, as an advanced twenty-first century American society, increasingly devoid of awe. There is little that can surprise or excite us; we are weary of the gadgets, wary of artificial intelligence, and prepared to legislate, if possible, the poison of social media out of our lives. We can’t quit our smartphones, but there’s a consensus that all of this twitching and flinching and blue light can’t be good for us. Once, the tech was wondrous: the automobile, the radio, the television, the fax machine, the personal computer, the internet to meld it all, to give us an eternity of knowledge at a keystroke. If Thoreau could bemoan the encroachment of the locomotive on nature, we could agree, as time dragged on, that there was plenty that was ameliorative about industrialization, as long as rapacious capitalism could be held in check. Modernity vaulted us forward. The successive technological revolutions of the nineteenth and twentieth centuries made diamonds, it seemed, out of dust.
What now? We are coming to the close of what the writer Sam Kahn has called the Software age. It began, roughly, in the 1970s, when American industrial capacity began to fade, formerly great cities were hollowed out, and the personal computing revolution arrived to take us out of our analog world. The new computing behemoths, like Apple, grew even wealthier than General Motors and U.S. Steel but employed far fewer people. Hardware was out and software was in. If the factories continued to close, gutting Cleveland and Gary and Detroit and Buffalo, at least Americans could draw comfort from the rise of Silicon Valley. That’s what would pass for industrial capacity—far fewer jobs, and supreme wealth for those with the singular skillsets to take advantage of the Software age, but at least we might be impressed. We liked, and then required, computers in our homes. We didn’t know our cellular phones could be supercomputers—who imagined infinite search and a GPS in a pocket?—but we embraced what Steve Jobs and Samsung offered us. Google made encyclopedias irrelevant and Facebook joined us to friends and strangers alike; media was social now, and that’s how we were going to live.
It’s easy to forget how optimistic Americans were, in the 2000s and 2010s, about their new technological fruits. If airplanes weren’t getting faster and we were no closer to bringing astronauts to Mars, rapid software evolutions could, in the short term, offer their own thrills. Consider the span of time between 2007 and 2011. I choose these years because I attended college then; they also, conveniently, mapped perfectly onto the iPhone explosion. Steve Jobs, with four years to live and on the way to sanctification, introduced the iPhone in January 2007, before my last semester of high school. I went to college with a flip phone, as did everyone I knew. I didn’t understand why anyone would want more from a cellphone than texting and a calculator. A GPS was for an automobile. Social media—just Facebook then—was confined to my laptop, which I kept in my dorm or ferried to the library. The idea of needing to check Facebook on my phone seemed ludicrous. And yet, halfway through my college years, iPhones proliferated. I clung to my flip phone. I liked tactile texting, I didn’t care about grainy photos, and I carried, even then, an iconoclastic streak. I still had my flip phone in 2011, upon graduation. I started my first newspaper job with one, sending out tweets through a web browser loaded onto the postage-stamp screen. And finally, for the sake of my career—and the new demands of the 2010s—I caved and bought an Android. That was my limp form of rebellion against Apple.
The smartphone did work wonders. It was, for its time, radically different, and offered genuine possibilities for knowledge discovery and connection. Compare the giddiness of 2011, the early peak of the smartphone years—and the heart of the Software era—with where we are now. Most experiences with the internet and technology are degraded. When software works, it’s addictive or even depressive, and when it doesn’t—or when it strains, against reason, to create a world in which human beings would rather not tread—it resembles the ghostland of the Metaverse or the impotence of the Apple Vision Pro. Apple no longer meaningfully innovates. Neither does Mark Zuckerberg’s Meta, which coasts on its timely Instagram acquisition and slops up that platform, too. Threads utterly failed. Elon Musk throttled the written word on Twitter. The search function of Google, overwhelmed by ads and now A.I., is demonstrably worse at producing reliable answers to queries. Parts of daily existence that were once fairly seamless, like pointing to food on a physical menu or buying a ticket for a basketball game, now come with needless technological roadblocks.
This, in one sense, is new. The twentieth-century complaints about fresh technologies were that they would divorce us from nature and make us too reliant on their wonders. Cities swallowed up farmland. Automobiles made us less likely to take healthy walks. The mass production of consumer goods made us materialistic and unable to appreciate craft. We were softer, perhaps lazier, but we were living longer—and life, inarguably, was eased. What about today’s technologies? A decade ago, the pocket supercomputers were new. The screens held promise. Now we gaze upon the tech giants and see a menace, and we are learning that they have no new revolutions secreted away. A.I., maybe—but A.I. will not measure up to the I.T. revolution, to early Software, and that in turn never measured up to electricity, urban sanitation, and modern medicine. Rapid economic and technological growth, as the macroeconomist Robert J. Gordon has argued, may have been a one-off. Stagnation is coming. Or it’s already here.
The last burst of the Software age, in the 2000s and 2010s, was accompanied by a particular cultural stagnation we’ve come, in retrospect, to recognize quite clearly. The technological revolutions of the twentieth century fueled, or at least did not undercut, cultural leaps forward. Literary modernism, surrealism, and Dada arrived with radio and cinema. The television age gave way to New Hollywood, and the greatest run of cinema America has ever known. Literature, at midcentury, flourished, as did popular music. It can be debated, exhaustingly and fruitlessly, whether the literature and pop music of the 1960s, 70s, and 80s were better than today’s—I would say yes, but I am open to the counterpoint—but what can’t be batted away is the sheer diversity of the efforts: racial, cultural, ideological, and stylistic. The top of the charts, on any given week in this 30-year stretch of the postwar era, could be occupied by rock, Motown, folk, disco, soul, R&B, hip hop, or a solo superstar. The mainstream, or macroculture, fed off the countercultural currents, allowing for innovation at the mass level. Literature was no less multifarious. Novelists as disparate as Samuel Delany, Thomas Pynchon, Toni Morrison, Philip Roth, and Don DeLillo could find large audiences and have the support of significant publishers. Major magazines like Esquire, Vogue, Mademoiselle, and the Saturday Evening Post incubated stylistic innovators like Joan Didion and Tom Wolfe, and a big city tabloid could make a celebrity out of Jimmy Breslin.
Today, the macroculture withers. The causes are complex. What Americans are left with, as their global soft power reaches new and dizzying heights—travel to Europe and see how the locals there live under the bootheels of American cultural exports—is a decade and a half of forgettable dross. There were great movies released in the 2010s. Great books were occasionally published. There were even albums of consequence. Consider, though, what dominated. The Marvel Cinematic Universe swallowed up Hollywood, and sequels, reboots, and unambitious retreads became the default. It was rare for the largest movie studios to take risks, or even to imagine new intellectual property. Television fared better, from the prestige era to the decade of fast cash that came with the rise of Netflix. But the streamers, having immolated a reliable twentieth-century business model, are now in retreat, as Wall Street loses its appetite, in an era of high interest rates, for speculative investment. On the pop music front, there is a greater awareness of the paucity of options—rock is dying, bands themselves are mostly gone, hip hop is fading, and most pop music lacks melodic complexity, designed chiefly to game the Spotify algorithm. The goddess of this age, Taylor Swift, may very well be a spent creative force, and her dominance, as time passes, will earn new scrutiny, the way a talented but not generational tennis player might have thrived through a lack of competition. Swift has not been a sonic or stylistic innovator; her fame is as great as Madonna’s or Michael Jackson’s or Paul McCartney’s, but her durable impact on the culture is far fainter. She will be a billionaire and tour sold-out arenas for the rest of her life. What she won’t possess is more ineffable—the ability to change how we think about music and performance itself. She has made nothing new. Courtney Love may have said it best: “She might be a safe space for girls, and she’s probably the Madonna of now, but she’s not interesting as an artist.”
The Software age has not been a boon to literature. The “internet” novels, as Rhian Sasseen pointed out in the Baffler recently, are tedious, rote, and aesthetically dull. Autofiction, a close cousin, has done little but produce a bevy of W.G. Sebald imitators who meander about their cradles of affluence and cough up trite reflections on the political moment. These are all, in the words of Henry Oliver, “discourse” novels that barely exist as literature at all—they are arguments, memes, and glorified tweets patched together in a bid to tell us how we live today, all the while sacrificing characterization, lush interiority, and style. Many writers have strived to imitate the deadpan prose of early Bret Easton Ellis, forgetting Ellis’ chief dictum: a novel does not succeed without style. Ellis did not write discourse novels. He was not trying to hem his books within the bounds of the political and cultural debates of the well-educated. The discourse novels are not memorable. They pass in and out of the body like vapor. My great lament, reading Lauren Oyler’s Fake Accounts, is that it was so plainly unequal to her literary criticism, which found a way, for a period, to distinguish itself. Her essays always left an imprint. I finished her novel and cannot say, if pressed, that I truly read it, because so little seeped inside, because it was, at its core, barely about people or events. This is not an argument for plot. Two of my favorite writers, Virginia Woolf and Henry Miller, shed the mechanics of plot entirely, and hungered to represent reality on the page without the straitjacket of narrative. They succeeded because they wrote like few before or after them; their prose shimmered and burned, and they plunged deep into the canyons of human consciousness. “Let us again pretend that life is a solid substance, shaped like a globe, which we turn about in our fingers,” Woolf wrote in The Waves. “Let us pretend that we can make out a plain and logical story, so that when one matter is despatched—love for instance—we go on, in an orderly manner, to the next.” This is language, in meaning and aesthetics, that can endure. It is not held prisoner by the discourse.
Humanity is not finished with technological revolutions. America, still an empire, is not either. There’s no telling, though, what follows the Software age, if anything at all. There is no promise, in our short lifetimes, of anything as revelatory as electricity or even penicillin. What may not stagnate, in perpetuity, is culture. In music, movies, and literature, there is an awareness of the mainstream’s failure. There is discomfort with the macroculture. The romantic turn in culture may drive more people offline and into pursuits of greater creativity and complexity. Substack has reinvigorated nonfiction writing and opened new paths for novelists. Popular music might regain its verve, and there could be, in time, a new counterculture. There is ambition and hunger for it. The MCU is dying. Retreads will persist, but there is new proof that movies like Civil War, unbound from intellectual property, can succeed. Let’s see what comes next.