When George W. Bush was sworn in as the 43rd president of the United States, it was, in every sense, a callback to another era. The most powerful men flanking him had served his father, the 41st president, or prior administrations. The day itself appeared sepia-toned: a thick cover of clouds swept over the Capitol and dignitaries slipped on raincoats, hoping to repel an imminent drizzle. Bush’s remarks were brief and the inauguration, in the view of the agenda-setting journalists and pundits who absorbed and made meaning of it all, was “subdued.” The New York Times decreed that an “old-fashioned aura” had settled over Bush the Younger’s momentous day. A Republican, Bush was nevertheless echoing an icon of the last century, at least vaguely, in John F. Kennedy, the newspaper reported. “The enemies of liberty in our country should make no mistake: America remains engaged in the world, by history and by choice, shaping a balance of power that favors freedom,” Bush said. “We will defend our allies and our interests.” These remarks were interpreted to mean that Bush believed in a United States that would no longer act unilaterally on the world stage. “Balance of power,” for the Times and other influential outlets, was the crucial phrasing. “We will show purpose without arrogance,” the new president said.
January 20th, 2001 still existed in a sort of caul. Destruction and terror waited in the near-future; Bush was merely another president not at all unlike the man he had just defeated in one of the most contested elections in American history, effectively decided by the Supreme Court weeks prior. Americans would be forgiven, entering the new century, for thinking that election was an apex or climax, the moment before events became predictable again and most could tune out of politics altogether. From the future, the 2000 election can appear shockingly modern, with a Republican and Democrat locked in close combat, the nation evenly divided, but the numerical outcome masked a lack of interest leading into November and the general belief, among young voters at least, that the stakes were inordinately low. Al Gore, the Democratic candidate, hailed from the center of his party; he was the son of a senator and an Ivy League graduate. George Bush, the Republican candidate, hailed more from the center than the right of his party; he was the son of a president and also an Ivy League graduate. Bush was born in 1946. Gore was born in 1948. Neither offered generational change: Bill Clinton, the outgoing president, had also been born in 1946. Bush’s victory appeared to be little more than a reconstitution of his father’s first term. Dick Cheney, his vice president, had served as George H.W. Bush’s defense secretary. Paul Wolfowitz, the incoming deputy secretary of defense, had worked under the elder Bush as well.
For the Americans who chose to casually or more intensely consume politics, the means by which they learned about Bush or Gore hadn’t changed all that much over the final 20 years of the century. Sure, television technology had improved—nearly everyone owned a color TV by 2001—and the internet, by then in roughly half of American households, was still a secondary way to follow the news. By the late 1990s, many newspapers had introduced websites, which functioned almost entirely as digital reproductions of what could be found in print. They were mostly ignored. The greatest revolution, most agreed, had been 24-hour cable news, pioneered by CNN in 1980. The news cycle, in theory, was endless, but it was confined, rather inertly, to television screens fixed in homes or public places. Overtly partisan cable television had only just been imagined into existence, with Roger Ailes’ Fox News debuting in 1996. MSNBC, not yet obviously liberal, was launched that same year. Cellphones, for those who owned them, could do little more than place calls, and the calls that mattered were still mostly made on office landlines. Newspapers, radio, and nightly news broadcasts determined, in near-totality, what Americans understood about their country and world. The newspapers arrived in the morning or afternoon, and their editors decided what would be fit for consumption.
This was Empire, the postwar political and technological consensus that endured. There were lurches and leaps, inklings of the hope and darkness ahead, but there were reasons to believe the future would not dramatically depart from the past. Bush, quite literally, resembled a president who came before him and there was a clear, ongoing convergence between the two parties, even as certain conservatives like Newt Gingrich, the fiery House speaker, heralded a rightward shift. In 2001, there was still lingering triumphalism from the end of the Cold War, the Soviets dead, Russia now a place where American political operatives could, with pride, engineer presidential elections. Democrats and Republicans alike could agree that communism was bad, capitalism was good, technology offered a brighter tomorrow, and marriage was between a man and a woman. As rainclouds gathered over Washington, a stiff chill in the air, onlookers could be forgiven for closing their eyes as the younger Bush spoke and not being sure whether a Republican or a Democrat was addressing them across the National Mall. We will show purpose without arrogance. Empire demanded, for television and newsprint, performative decorum, and Bush had plenty of it on that day.
Bush’s own politics, at the cusp of the new century, were a blending of old and new, and he would come to embody this shift from Empire to post-Empire. He was a Republican in Texas who had defeated a Democratic incumbent, Ann Richards, to win his first statewide campaign. Richards would be the last Democratic governor of Texas. In the 1990s, there was still nothing particularly unusual about Democrats winning elections in otherwise very conservative states, states the Democratic candidates for president had stopped carrying. Republicans, too, could find plenty of success in liberal states that had, on the presidential level, stopped voting for Republicans. This was due to a foundational quirk of American history that was set to expire at the end of Empire, its demise accelerated by the rise of the internet and the collapse of the newspaper industry. At the dawn of the 2000s, the two major political parties of America retained a degree of ideological diversity that was, for much of the twentieth century, a defining feature of the political system and, more often than not, an enabler of progress.
Political scientists of the postwar period, unable to know the virulent polarization that would afflict the nation after their deaths, would bemoan this state of affairs. “Alternatives between the parties are defined so badly that it is often difficult to determine what the election has decided even in broadest terms,” the American Political Science Association warned in the 1950s. The Democrats, since the end of the Civil War, had been the party of the Jim Crow South. Republicans, born in the era of Lincoln, were both liberal reformers and aspiring plutocrats. During the New Deal period, the Democratic Party would migrate left on economic concerns, winning a new generation of working-class voters while still maintaining its southern, racist base. Heterogeneous America was messily sorted across both parties. The twentieth century was not, in any sense, an era of domestic peace, but the pitched battles over civil rights, overseas wars, and communism were not neatly arranged between the two parties. No party had a monopoly on vitriol, jingoism, or cosmopolitanism. As late as the 1950s, Black voters were split almost evenly between the Republicans and Democrats. The same was true for union members, Catholics, and men. After Lyndon Johnson, a Texas Democrat, helped engineer the passage of landmark civil rights legislation in the 1960s and Richard Nixon, in 1968, capitalized on racist backlash to help his ticket carry parts of the South, the parties began to realign and polarize along geographic and ideological lines—at least on the presidential level. Ticket-splitting, however, would remain a defining feature of the political scene into the 2000s, allowing local Democrats and Republicans to win territory where the national party was losing clout. Bipartisanship remained possible because there were enough Democrats and Republicans who shared policy viewpoints with members of the other party, and voters, more connected to local patronage networks, rewarded lawmakers who successfully secured pork for their districts. Ideology itself could matter only so much, particularly when vast swaths of the American public were congregated around the same nightly news broadcasts and regional newspapers. A furious Nixonite and a McGovern peacenik in Chicago would still, at the beginning of each day, have to read the Tribune.
In Bush the younger were hints of the coming world, life after Empire. He cared far more about religion than his father, reading the Bible daily and talking openly about how Jesus changed his life. He said he heard God’s call to run for the presidency. It was not yet obvious, in 2001, that the evangelical Christian right would become a defining force in the GOP, the cohort of voters all ambitious Republicans would have to flatter, cajole, and ultimately deliver for. These voters were clear-eyed about the stakes of each election cycle, gravitating to candidates who promised to change the make-up of the Supreme Court and overturn hated decisions, like Roe v. Wade. Bush’s views on abortion were somewhat cloudy, but he was, unequivocally, the president who would stand for the evangelical right and convince them, once and for all, that they belonged in the Republican Party. Donald Trump, a post-Empire president, would prove that a champion of Christians needn’t know the Bible or worship any God. The president, merely, would have to deliver. At the end of the twentieth century and the beginning of the twenty-first, though, appearances like these still mattered. Trump himself, a celebrity dilettante with little connection to the Republican Party, had no notion of the movement he’d one day lead. “I’m very pro-choice,” Trump said in 1999. “It may be a little bit of a New York background because there is some different attitude in different parts of the country and I was raised in New York and grew up and worked and everything else in New York City.”
The morning of the Bush inauguration was something of a pathetic fallacy: a muddle of gray clouds, a day backward-looking and ultimately forgettable. The parties had traded places in power and America, after months of anger and disbelief over an election decided on hanging chads, was ready to move on. Politics was simply not a 24/7 preoccupation, not a means by which one could organize an identity. Political news arrived with all other news, bound in a newspaper or shuffled among broadcast segments. Even Fox News and MSNBC, still nascent, were for the peculiar die-hards. CNN existed for those with a particular interest in watching loops of plane crashes or wildfires. Politico, the D.C. insider’s fix, would not be founded for another six years. Bush, handsome and graying with his put-on Texan drawl, was the 54-year-old steward of this world, and he would do. He resembled an American president. His keen interest, upon taking office, was what he had called education reform. After vowing on the campaign trail that he would give the parents of public school students vouchers to attend private schools, he found that such a plan could not get through Congress. Instead, he took up a coming 2000s obsession, standardized testing. Students, Bush declared, must be tested in as many grades as possible. By the summer, Bush was declaring a semblance of victory in a Rose Garden speech. Civility and integrity, he said, had come back to Washington. He touted what he believed to be the first great achievement of his administration, a $1.35 trillion tax cut spread over 10 years. The first significant tax cut in two decades, the legislation cut income tax rates across the board, reducing the lowest rate from 15 percent to 10 percent and the highest rate from 39.6 percent to 35 percent. The 10 percent rate was made retroactive to January 1st. In August, about to head off to his ranch in Crawford, Texas, for a month-long vacation, Bush could be forgiven for believing that he would be remembered, in part, as the Republican president who fully indulged in supply-side economics and did little more. Foreign policy, at least, was sleepier terrain.
Thomas Pynchon’s Gravity’s Rainbow, a multifarious epic of the postwar era, famously begins with the sentence “A screaming comes across the sky.” Published in 1973, the novel opened with a line that seemed to anticipate what was to come—the greatest of all shock events, a final merger of fact and fiction, the hyperreal. The concept, coined by the French sociologist and philosopher Jean Baudrillard, described a condition in which consciousness could not distinguish reality from a simulation of reality. For those not alive on September 11th, 2001—or those simply too young to remember the day the twenty-first century began—Baudrillard’s formulation is a useful one. None of it, simply, seemed real. All of it was beyond America’s preconceived notions of violence, terror, and spectacle. None of the simulacra of disaster manufactured in the prior century offered any adequate preparation. Empire stars like Arnold Schwarzenegger, Bruce Willis, and Sylvester Stallone had not acted in any film like this one. There were no popular novels to point the way, no made-for-TV movies or series to seed the imagination. No radio plays. The baby boomers, entering middle age, had been reared on fears of nuclear apocalypse. Their children, of Generation X and the early millennial years, treated nuclear threat as kitsch, retreating to alien invasions, monster attacks, and volcanic eruptions. A world-famous New York landmark, the Empire State Building, had been obliterated on screen in 1996. But the perpetrator was a hostile alien race, firing a laser beam from a gigantic spaceship.
Hijacked airplanes flying into and destroying the Twin Towers, the stark monoliths that were derided upon their construction—this was not even dream logic. It was assumed they’d blight the skyline forever, for another thousand years. Over time, into the 1990s, they became icons themselves, melding comfortably with the city’s perception of itself. The observation decks were a popular attraction, as was the mall below. The city’s white-collar workforce, flusher than it had been at the nadir of the 1970s fiscal crisis, packed the North and South Towers each morning, riding high-speed elevators to their sprawling office floors, where they’d land behind their desktop computers, newspapers still tucked away in briefcases. It was, as has been widely noted elsewhere, a day of clear blue skies.