The twenty-first century began, definitively, on September 11th, 2001. As time hurries us away from that cataclysm and more generations are born with no memory of two towers smoldering and collapsing on live television, it becomes apparent that no single moment so horrifically and neatly inaugurated a new era. The twentieth century knew mass catastrophe—World Wars, two atomic bombs, Vietnam—but far less of the mass visual spectacle, with television virtually nonexistent for the century's first half and reaching maturity only in its second. News arrived in measured increments, in the morning and at night, either frozen in paper or tightly arranged for delivery during a widely trusted broadcast. At the time of the destruction of the Twin Towers, Fox News was just five years old. Social media did not exist. The internet itself was mostly a suggestion; plenty of Americans, outside the office at least, still didn't bother with email or even a personal computer. Cellphones were not required for daily life, and smartphones were science fiction. To enjoy most entertainment was to live by appointment: a movie in theaters for a prescribed duration, a popular television show at eight p.m. on a Thursday, once a week. If you missed them, there was little recourse—a cultural repository, if you were lucky, was something called a VHS tape. "The future belongs to crowds," Don DeLillo wrote in 1991, closing out the prologue of his prophetic novel Mao II. This was partially true. The question would become, in subsequent years, what kinds of crowds and how, exactly, they'd congregate. Few at the dawn of the twenty-first century could truly guess.
Decades are never marked so neatly. The ethos of the 1960s didn't suddenly vanish on January 1st, 1970, just as the 70s didn't evacuate the culture, politically and stylistically, at the end of 1979. The 1990s were very much in the air at the time of George W. Bush's deeply controversial 2000 victory. And so it was with the 2000s, which dragged well beyond New Year's Day 2010. These divisions are ultimately arbitrary, but they can, with the right analysis and context, come into view. Now in the thick of the 2020s, pandemic-ravaged and internet-addled, we can bring what came before us into much clearer focus. If the twenty-first century cannot be said to have really begun until hijackers seized control of four American aircraft and slaughtered almost 3,000 people, the 2000s, too, did not begin until that day. If the birth of the 2000s was September 11th, when could the decade, the era, be declared over? Many will have their arguments. Many have validity. Here, one moment stands out above all others: the golden escalator. June 16th, 2015 was the day the 2000s died.
If other decades will indefinitely enjoy greater pop cultural footprints—greaser hair, tie-dye, leisure suits, and Ray-Bans can power Halloween nights for the rest of the century—few will define modern American history more than the stretch of time from September 11th, 2001 to the day a failing reality television star named Donald J. Trump decided, at last, he'd run for president of the United States. Within these 13 years and nine months came the political, cultural, and technological shifts that would mark the fall of the old world and the beginning of something unsettlingly new. Within this time flickered the very last lights of the twentieth century. The 1990s, meanwhile, represented the end of something else: a world that could still be said to exist in analog, when whole industries born of the postwar boom could still claim a degree of supremacy in the wealthiest nation on Earth. Record labels were flush. Newspapers were profitable businesses. Glossy magazines raked in cash and set the pace of the culture. Diffusion and disunion were coming. The old industries just didn't know it yet.
In the weeks after September 11th, 90 percent of the nation approved of Bush as president. It was the highest figure ever recorded by Gallup. Never again will a president be so popular for so sustained a stretch. And never again, in all likelihood, will the American people be so fully united around a singular cause: waging disastrous war overseas, in Iraq and Afghanistan. The early 2000s gave the United States one of its final moments of life before rabid polarization, union before a permanent disunion. This is not a value judgment—jingoism is a danger—but merely an acknowledgment of the enormous shifts in technology and media that would soon be afoot. Americans would be sorting themselves into their permanent camps, red and blue, polarizing along geographic, educational, and income lines. But first time would have to pass. Events would have to happen. In Bush, there were glimmers of the future. The son of President George H.W. Bush positioned himself as a "compassionate conservative" but indulged in the culture warfare that would become the preoccupation of his party for the next decade and beyond. In 2004, the year he won re-election, Bush sought to improve his standing by fiercely supporting a constitutional amendment to ban same-sex marriage. "America's a free society which limits the role of government in the lives of our citizens," Bush said. "This commitment of freedom, however, does not require the redefinition of one of our most basic social institutions. Our government should respect every person and protect the institution of marriage." That year, he became the last Republican to win the popular vote.
The second, and final, mass event of the era was the election of Barack Obama as president of the United States. Bush won by an absurdly small margin but became wildly popular for a brief period because of a terrorist attack too terrifying, spectacular, and deranged to have been foreshadowed by any popular movie or novel of the twentieth century. Just as no president will ever achieve a 90 percent approval rating in a Gallup poll again, no president is likely to arrive in office with the degree of optimism, excitement, and hope that Obama enjoyed. Obama notched 53 percent of the popular vote, a strong but not historically dominant number—Richard Nixon broke 60 percent against George McGovern in 1972—yet managed a decisive Electoral College victory that, with the passage of time, appears all the more impressive and unlikely, for a Democrat at least, to be repeated. Rural white states like Indiana and Iowa enthusiastically supported the first Black president. The night of Obama's victory, massive celebrations erupted across the country. Red America may have fumed, but most were willing to concede that history had been made and history, for a vanishing moment, should be acknowledged in full. "Senator Obama has achieved a great thing, for himself and for his country. I applaud him for it," said John McCain, the Republican senator Obama defeated. "In a contest as long and difficult as this campaign has been, his success alone commands my respect for his ability and perseverance. But that he managed to do so by inspiring the hopes of so many millions of Americans who had once wrongly believed that they had little at stake or little influence in the election of an American president is something I deeply admire and commend him for achieving." These were still twentieth century words, of a time when fewer politicians conceived of the Republican-Democratic divide in existential terms. McCain was not going to join the future. Just two years later, a Republican of the new century would make his intentions plain. "The single most important thing we want to achieve is for President Obama to be a one-term president," said Mitch McConnell, the future Senate majority leader.
Empire and post-Empire: the terms were coined by the novelist Bret Easton Ellis in 2011 to describe the meaning of Charlie Sheen's latest meltdown. Ellis, the American Psycho author, was interested chiefly in celebrity culture, and in how Sheen had exposed the hollowness and frailty of the twentieth century's pretensions—so-called Empire. Empire, stretching from the postwar period to the early 2000s, demanded a degree of faux-reverence and kitsch that the new wave of celebrities and internet personalities would no longer stomach. Seemingly inspired by Gore Vidal, the novelist and essayist who penetrated deep into the twentieth century American psyche, Ellis offered up a useful cleaving of the eras. Empire is best understood as the world where the internet either didn't exist or didn't matter. Post-Empire is everything after. The 2000s were the transition from Empire to post-Empire, those seismic 13 years and nine months. In that time, the future—in all its wonder and horror—was staggering into being. Donald Trump, a creature of Empire, would insidiously master the rules of post-Empire politicking to a remarkable degree.
In Empire, the underpinnings of capitalism were easy enough to understand. When a company made money, it succeeded. Only profits mattered. There was no General Motors, Kodak, or U.S. Steel without a steady profit margin. Money existed; money was inarguable. In the 2000s, as Empire eroded, this iron law of capitalism—make more money than you lose, earn more cash to expand—was broken in ways unimaginable to the oligarchs of the old world. Some corporations still, to a point, bowed to the old ways. Facebook, the social networking behemoth, became profitable by selling advertisers access to the personal information of its users. Facebook's 2012 public offering was a success. Even as the company, rebranded as Meta in the 2020s, seemingly wheezes into the future—it is now, increasingly, the preferred social media of Baby Boomers and Generation X—it can argue, with its relatively limited overhead, that there will be a way to preserve a positive cash flow. Amazon lost money for years but also reached profitability. Neither company, though, was in desperate need of making more money than it lost, because both had long achieved the greatest marker of post-Empire success: scale. There is a certain irony that in an era when mass events were rapidly becoming scarce, corporations themselves were reaching a degree of size and penetration never before seen in human history. There were fewer and fewer mass television shows and politicians, but there were, in Facebook and Amazon and Apple and Google, corporations that governed almost every waking moment of American life. Their founders were titanic heroes and villains, as Rockefeller and Carnegie were, but their reach was far greater, their surveillance nearly complete. The smaller internet companies of the post-Empire era dreamed in such scale and pitched it readily to a wealthy investor class that saw, in a time of limited inflation and minuscule interest rates, many risks worth taking.
Perhaps no company was more emblematic of the post-Empire order than Uber, even if the Big Four could always claim more influence; most companies beyond the Big Four practiced some version of Ubernomics, chasing scale at all costs. Founded in 2009, during Obama's first year in office, Uber thrived on the premise that most middle-class people will always long for the trappings of the wealthy: private transportation, anytime and anywhere, in a black car. Empire travel was the yellow taxi cab, summoned from the side of the road or over a telephone line, with predictable pricing but sometimes unreliable transport. Taxi drivers didn't always like going to certain neighborhoods; some were undeniably racist. In the parlance of the late 2000s, the industry was ripe for disruption, and Uber would do it with shocking ease, flooding every major city in America with a seemingly inexhaustible supply of vehicles. During Empire—or before 2007, really—the Uber model would have been impossible. There was nothing special about ordering a cab from a desktop computer to take you wherever you wanted; a phone call would suffice. What made Uber the company of the future was the smartphone. Perhaps no single piece of technology in modern history became so ubiquitous so quickly, so de rigueur for a functional life. For generations, first-world inhabitants had no cellphones at all and then, at the end of Empire, they purchased them to make phone calls beyond the confines of the home or office. Unveiled in January 2007, the iPhone would change all of that: a supercomputer with formidable surveillance capabilities that could perform a staggering range of daily tasks and entertain indefinitely. Apple's iPhone would compete with Google's Android, erasing the BlackBerry interregnum. With smartphones arriving in most American households by the early 2010s, Uber could flourish. The phone would bring the car, day or night, rain or shine, and go anywhere. There would be more Ubers than taxis, and the Ubers would be cheaper. During Empire, such cheapness would have been unsustainable. Cars cost money, drivers are employees who cost money, and fares can't fall too low. But what if the drivers weren't employees at all? And what if outside capital could guarantee rapid expansion, even if there was no conceivable mathematical way to turn a profit? This was the venture capital model, performed to its utmost limit by Uber.
The mandate was simple: become popular. Money would flow later. The easy credit that followed the 2008 economic crash made it possible. Uber lost billions and it did not matter. Uber cars clogged city streets and created an unprecedented level of congestion during a climate crisis; it did not matter. Taxi drivers, losing their livelihoods to a company that did not abide by preexisting regulations, began to kill themselves; for a long time, it did not matter. What mattered was scale and image. Uber was extremely popular because its rides were priced at an artificially low level, and capital from investors could continue to flow regardless of whether the business model was tethered to logic. Logic was for Empire. And in that, there would emerge discomfiting parallels between Uber and the man who would decisively end one period of history, Trump. In the twentieth century, Trump had reaped the rewards of a naïve tabloid culture obsessed with celebrity and caricature, feeding gossip "scoops" to desperate journalists, appearing on television when there was no better way to reach a mass audience, and flagrantly lying about his wealth. America had no shortage of such grifters of personality, but Trump reigned supreme among them. Trump built his brand by lying often enough that some of it seemed true. Investors followed suit long enough to sear Trump into the American consciousness. By the time they wised up, it was too late. As Trump became, in 2016, the apotheosis of rank materialism and a certain revanchism in the political arena, Uber was among Silicon Valley's greatest and most absurd offerings to the American public, filling a need many never knew they had with a business model that was, for many years, nothing short of fantasy.
The 2000s gave way to fracturing, to siloing, to the ready genesis of alternate realities. The fervid American landscape was always ripe for conspiracy. From JFK to the moon landing and beyond, there has been no shortage of evidence-free theories that capture, and never let go of, the imagination. Do the Jews control everything? The Reptilians? What was really found at Roswell? In the age of Empire, theorists had to huddle among themselves, communicating through periodicals, mailing lists, and physical meetings in physical spaces. Organizing was effort. Knowledge, both real and false, was effort. The internet could erase such effort and engender new communities overnight. In the 1990s, this did not really happen because the tools were not there. Websites stood alone, clunky and pixelated and undesirable. Major national and international newspapers sneered at the concept of placing information there. A London Times editor, Simon Jenkins, predicted that the internet "will strut an hour upon the stage, and then take its place in the ranks of the lesser media." Staring down the internet of 1996 or 1997, this was believable. Logging on from home meant clogging up a valuable telephone line. Progress came, at first, as the speed of loading a webpage increased. Ease of access mattered most. And then, to finish Empire off for good, came social media. Facebook, from a Harvard dormitory, in 2004. YouTube in 2005. Twitter in 2006. Instagram in 2010. It is important to remember how enthusiastic and optimistic most of the mainstream press and elite cultural and political institutions were, for a period in the 2000s and 2010s, about these developments. Obama's 2008 victory was, for a time, bound up triumphantly with Facebook's rise. The 24-year-old Facebook co-founder Chris Hughes was a mastermind of Obama's "web blitzkrieg," in the words of the left-leaning Guardian, which noted that the young social media giant had "overwhelmingly been pro-Obama virtual territory." (Hughes would emerge several years later as the owner of The New Republic, trying and failing to reinvent an Empire institution for the post-Empire age.) The arrogance of the Silicon Valley class seemed warranted: they were setting the terms for tomorrow. Only unmitigated good could come from the rise of social media. It would bring people together, fuel popular uprisings in autocratic countries, and spread enlightenment to all corners of a darkened world. "Social media would redistribute power and set people free, and users would determine their own destinies," Anna Wiener wrote in Uncanny Valley, her memoir of Silicon Valley's excess. "Deeply rooted authoritarian governments were no match for design thinking and PHP applications. The founders pointed to Cairo. They pointed to Moscow. They pointed to Tunisia. They side-eyed Zuccotti Park."
With the advent and popularization of the smartphone, the new tech hegemons were right on time, their products ready to exist in perpetuity on handheld supercomputers. But the elder millennials and young Gen Xers who conceived of the technology couldn't quite fathom what they were doing. They were not Einsteins or Oppenheimers; they understood, only dimly, what they were unleashing. The timing would ultimately be crushing: new forms of personalized, consciousness-absorbing media were unleashed upon a rapidly polarizing body politic—the Democratic and Republican parties would, inevitably, sort themselves ideologically, ending the curious twentieth century experiment of very liberal Republicans and very conservative Democrats—and drove it even further apart, just as the more reliable Empire news sources were blown to pieces by their failing business models and the 2008 economic crash.
In the 13 years and nine months between the destruction of the World Trade Center and the rise of Donald Trump, grassroots political movements acquired new organizational structures, aims, and flavors. They became uniquely internet-driven, to both their strength and their detriment. They were, above all, genuine expressions of rage and grief at the overwhelming political and societal failures of the 2000s. First came Occupy Wall Street, and then Black Lives Matter—if each was infused with great optimism and belief in the power of mass protest to change America, each was also characterized by a tremendous pessimism. They stared down into the maw of the new American century and found rampant racism and income inequality. Structuralism, borrowed from the last century, took on new import. Structures were everywhere and they were rotten.
Occupy Wall Street could have existed without the new internet, but it would flourish and fizzle on the internet's terms. In the wake of the worst economic meltdown since the Great Depression, the Canadian anti-consumerist magazine Adbusters initiated a call for protest. To call attention to rampant income inequality in the United States, protesters wielded the phrase "we are the 99 percent," first popularized on a Tumblr page of the same name. The protest would be staged at Zuccotti Park, a public-private space near Wall Street in downtown Manhattan, and it would be equal parts symbol, spectacle, and policy fight. In Obama's first term, the banks and financiers that had fueled the financial crisis were not held accountable, and policy mandarins in the administration had pushed for too little stimulus spending to return the economy to more stable footing. Unemployment and underemployment, particularly among the working class under the age of 30, had been a persistent problem. It was 2011, another pivot point: the optimism of Obama's victory had curdled while new technologies, once viewed with unbridled enthusiasm, were beginning to be discussed more skeptically. But excitement would persist because Facebook and Twitter, in particular, were the tools of protest. Social media allowed the calls to gather and march to spread with unprecedented efficiency. In a matter of days, the occupation of Wall Street was on, and encampments would spread throughout the United States. Hundreds slept in Zuccotti Park nightly for months on end. Unabashedly leftist politics were on the rise again, even fashionable. Desperation had resuscitated socialism. And in Occupy's disdain of hierarchy were traces of anarchism and the horizontal decentralization of the internet; the libertarian streak of the old-line tech evangelists, intentionally or not, seemed to pulse through the new protest movement. The movement would prove influential in the years to come—Bernie Sanders and Bill de Blasio, the future mayor of New York, were held aloft by Occupy politics—and the savviest veterans of Zuccotti Park would take on more electoral aims, helping to swell the Democratic Socialists of America, the primary vehicle for American socialism, which boomed in 2016 and 2017. But there would be no Michael Harringtons, Tom Haydens, or Stokely Carmichaels minted through Occupy. The movement itself faded within a year, crushed by police and weather, and drained by waning interest. Without clear, achievable policy aims, it was a movement that could simultaneously enrapture and frustrate. Though the term was not yet in vogue, Occupy was very much a memeable movement. It would exist, at the very minimum, to be remembered.
It was notable, perhaps above all else, that Occupy Wall Street came in Obama's very first term, just three years after his rapturous election. If history is, inevitably, filled with hinge and pivot points, Occupy was probably one, a recognition that dissatisfaction can run highest when hope is freshest. Many of the protesters and campers at Occupy had been Obama voters; the committed anarchists and socialists had sneered at the first Black president, but they were too minuscule in number to make a mass protest movement possible. The Occupy cohort came for a wide range of reasons, from lack of employment to crushing student debt to anger over America's forever wars, launched by the president who had once been the most popular in recorded American history. Obama, elected on a promise to reverse the damage of Bush, had not been able to rapidly improve the material conditions of those who had been most enthusiastic about his ascent. In Occupy Wall Street was this implicit tension. A generation had looked to Obama and found him, three years on, lacking.
Black Lives Matter, too, would come in the time of the first Black president. Obama's wary embrace of racial politics angered a millennial generation, coming of age during Iraq and the meltdown of the economy, that demanded more confrontation, more accountability. They were restless. The 2000s would be the first decade in modern times in which young middle-class people became aware that they would probably enjoy fewer material comforts than their parents. The 1980s-born, teenagers and twenty-somethings during the twin eruptions of Occupy and BLM, were the children of the baby boomers, those born into uncommon affluence. The boomers' youngest years came as America's allies and enemies alike lay devastated by the Second World War. They were presented, in young adulthood, with heavily subsidized or completely free higher education. They were, by their twenties, able to readily acquire property in either cities or suburbs. The 1960s protest movements that consumed the baby boom generation were fed, in large part, by fears of being drafted into the Vietnam War—not by rage, at least among white people, over a frail job market or capitalism's excesses. Martin Luther King Jr. would meld identity and class-based concerns—his assassination came as he was preparing to march with striking sanitation workers—but the era would, first and foremost, be defined by opposition to racism and war. The middle class (at least the white middle class) did not yet feel downward mobility.
In the 2000s, that sense was everywhere. The United States was a less overtly racist nation than it had been in the 1960s, but no one, Obama included, could usher in a post-racial utopia. Black Lives Matter, like Occupy Wall Street, could have come into being without a social media and smartphone-enabled internet, but its reach and constitution would have been different. In 2013, the hashtag #BlackLivesMatter appeared on Twitter in the wake of the acquittal of George Zimmerman, a Florida man who had fatally shot an unarmed Black teenager named Trayvon Martin the year before. The originators of the hashtag were the activists Alicia Garza, Patrisse Cullors, and Opal Tometi. They would all achieve a degree of fame, but none rivaling the civil rights leaders of the second half of the twentieth century. BLM, centered from the start on combating police brutality and anti-Black racism in America, would grow far more prominent after Eric Garner and Michael Brown, two more unarmed Black men, were killed in the summer of 2014. Garner's death, in Staten Island, came first, and it mattered because of smartphone technology: Ramsey Orta, a friend of Garner's and a member of the activist group Copwatch, filmed the encounter between Garner and police. Garner had been selling cigarettes illegally when NYPD officers approached him and placed him in a banned chokehold, taking him to the ground and killing him. Garner's pleas of "I can't breathe" became a national rallying cry.
The apex of Black Lives Matter wouldn't come until 2020, after a police officer in Minneapolis killed George Floyd, but it was in the summer of 2014 that the movement was truly born. That August, Michael Brown was shot and killed by a police officer in Ferguson, a suburb of St. Louis. Brown's killing would ultimately prove more ambiguous—a U.S. Department of Justice investigation cleared the officer, Darren Wilson, of civil rights violations and found that forensic evidence supported Wilson's account that he acted in self-defense—but it was, even more than Garner's death, the catalyst for Black Lives Matter, what made the movement a momentary national force. Calls for ending police brutality were not new; what did seem different was the sorrow and rage. At minimum, as with Occupy, an awareness had grown that some system had failed and would be in need of dramatic reform or utter destruction. Policing in America had to change. The question, as always, was whether decentralized protest would be enough. A policy agenda could be clear or unclear, depending on which leader—anointed or self-appointed—was speaking. Demands could be diffuse, and the political scene was metastasizing. Leaders, inevitably, are flawed, but leaders make a movement durable. They are a ballast. Occupy and BLM, in 2000s and early 2010s fashion, lacked both leaders and ballast. They would leave their imprint on society; they inarguably mattered. They were also held captive by a certain permutation of internet culture that had taken hold in this twenty-first century, in these deeply consequential 13 years and nine months. Social media bred ephemerality. Post-Empire was much faster than Empire. And speed could be tiring. By the day Trump descended his golden escalator, Black Lives Matter was already in retreat. And no one had tried to forcibly occupy a park in at least three years.
A rapidly evolving internet would weaken or obliterate other Empire institutions altogether. Broadcast television could never be the same, as fewer shows drew audiences of tens of millions in a single night. In a nation that once had many mass communal entertainments—M*A*S*H, All in the Family, The Cosby Show, Seinfeld, Friends, and many others—there would be, by the time of Trump's ascension, only one: the Super Bowl. Saturday Night Live, a staple of Empire, kept its relevance in the 2010s as individual sketches were consumed on the internet hours or days after the show first aired. Very few would sit down, at the appointed hour, to watch the show in its entirety. Social media and streaming services would ensure curated entertainments and cultural moments for a nation of more than 300 million. Offerings would simultaneously increase and decrease: the internet was the new repository for American and global cultural knowledge, with almost any show or movie (or a clip, at least) summonable with a few keystrokes. At the same juncture, large movie studios would dramatically narrow their fare, relying almost exclusively on remakes and sequels as post-Empire culture appeared to exhaust itself. The original summer blockbuster, a staple of the 1990s, would wither and die. Consolidation in the publishing industry began to limit what the reading public could enjoy. By the 2020s, countercultures themselves were only barely possible. In the 2000s, they had enjoyed their last broad flourishing, with the indie rock and cinema movements attracting fanfare and breaking, at times, into the mainstream. Freshly produced art struggled for an audience. Ultimately, post-Empire would prioritize, or incentivize, the old over the new, as the rediscovered and repurposed outcompeted whatever freshly emerged. By the middle of 2022, old music accounted for a stunning 72 percent of all consumption in the U.S. market.
Single television shows, movies, pop cultural figures, and politicians would lose much of the power they commanded in the twentieth century. No television show is more representative of Empire's last gasp than American Idol. Launched in 2002, the show rested on a conceit that is unfathomable today: one program, watched by an enormous number of Americans, manufactures the careers of pop stars. Each week, the show airs and judges decide who they like or don't like. The judges themselves, a pair of music industry executives and a 1980s pop star mostly forgotten at the show's inception, become very famous. Eventually, a popular vote determines the winner. American Idol continued to exist into the 2010s, past the Obama era and into Trump's presidency, but it mattered most in the wake of September 11th. A terrorist attack and a successful television show have nothing in common but one obvious thing: the mass audience. Post-Empire, only a terrorist attack could command such attention, but this was not the case in the early 2000s, when the internet remained in its transitional phase, bereft of social media and the smartphones that would ferry it so intimately into the lives of most Americans. There is no American Idol, as we know it, with this evolved internet. Appointment-viewing television simply doesn't matter anymore. Pop stars arrive through streaming services and TikTok videos or are held hostage by them; one 2010s pop star, Halsey, complained in 2022 that their record label was holding back a new single until a "fake viral moment" was created on TikTok. In such a landscape, traditional broadcast television has no chance.
If American Idol wasn't the very last music phenomenon or mass cultural event, it was of a kind that can never be repeated. The numbers, by the standards of the current age, are difficult to comprehend. For seven consecutive years, American Idol was the highest-rated television show in the United States. The 2002 season finale drew 23 million viewers. The 2003 season finale drew 38 million. In 2006, that number was nearly matched. Ratings tell one story, though, and careers tell another. No single institution, ever again, will mint so many pop stars. At the time of American Idol's 10th anniversary in 2012, show contestants had topped the Billboard charts 345 times. The winners and top contestants were household names. The most successful, Kelly Clarkson and Carrie Underwood, stood atop the pop music landscape. Chris Daughtry, Jennifer Hudson, Ruben Studdard, Fantasia, Adam Lambert, and many others produced number-one hits. By the late 2000s, rival networks—American Idol was a Fox property—were distraught, unsure how they would compete. "This is a big monolith sitting out there. It's the ultimate schoolyard bully," said a scheduler for CBS. It would not last, though not because CBS or NBC concocted a rival program to match American Idol's immense popularity. The run of dominance would end because Empire had to end. The millennials coming of age in the 2010s had other means of finding entertainment and discovering the stars of their generation. The so-called monolith, the schoolyard bully, was like the armored knight who suddenly encounters a Gatling gun. Technology will not be intimidated. Justin Bieber, the pop star of the next generation, was discovered on YouTube.
There is no culture without politics, and no politics without culture. And all of it would be wrenched in unimaginable directions by the advent of social media and the smartphone. From 9/11 to Trump's ascent—from the Iraq War to Black Lives Matter, from American Idol to TikTok—the assumptions undergirding American society were rapidly, and perhaps permanently, undone. "People played by the old rules, despite a growing recognition that those rules were flawed," Chuck Klosterman wrote of the 1990s. In the decade that followed, the people and their leaders stopped playing by those rules altogether.
"The internet itself was mostly a suggestion; enough professional class Americans didn’t bother with email or even personal computers."
The professional class most certainly used email every day in 2001. I'm not sure when exactly office work became sitting in front of a computer all day, that was before my time, but I was working in offices by the mid-to-late 90s and it absolutely entailed sitting at a computer and emailing all day by then.
This is great long form journalism - an enjoyable column. Your paragraphs on Occupy were insightful. It's interesting how that has been practically forgotten. No one talks about it anymore.
Your views on the parallel rise of alternative media would fit in this nicely. People justifiably turning away from the Empire's Approved Stenographers (NYT, CBS, ABC, NBC, Fox et al) contributed a lot to what you describe here.