How Democracies Die PDF

Summary

How Democracies Die analyzes how democratic regimes decline and fall, focusing on the rise of authoritarian tendencies in the U.S. and the role of political institutions in sustaining democracy. Drawing on historical and contemporary examples of democratic breakdown, the book emphasizes the responsibility of political leaders and the two norms, mutual toleration and institutional forbearance, that keep democracies alive.

Full Transcript


Copyright © 2018 by Steven Levitsky and Daniel Ziblatt. All rights reserved. Published in the United States by Crown, an imprint of the Crown Publishing Group, a division of Penguin Random House LLC, New York. crownpublishing.com

CROWN and the Crown colophon are registered trademarks of Penguin Random House LLC.

Library of Congress Cataloging-in-Publication Data
Names: Levitsky, Steven, author. | Ziblatt, Daniel, 1972– author.
Title: How democracies die / Steven Levitsky and Daniel Ziblatt.
Description: First edition. | New York: Crown Publishing. | Includes bibliographical references.
Identifiers: LCCN 2017045872 | ISBN 9781524762933 | ISBN 9781524762940 (pbk.) | ISBN 9781524762957 (ebook)
Subjects: LCSH: Democracy. | Political culture. | Democracy—United States. | Political culture—United States. | United States—Politics and government—2017–
Classification: LCC JC423.L4855 2018 | DDC 321.8—dc23
LC record available at https://lccn.loc.gov/2017045872
ISBN 9781524762933 | Ebook ISBN 9781524762957 | International edition ISBN 9780525574538
Cover design by Christopher Brand

To our families: Liz Mineo and Alejandra Mineo-Levitsky & Suriya, Lilah, and Talia Ziblatt

Contents
Cover
Title Page
Copyright
Dedication
Introduction
Chapter 1: Fateful Alliances
Chapter 2: Gatekeeping in America
Chapter 3: The Great Republican Abdication
Chapter 4: Subverting Democracy
Chapter 5: The Guardrails of Democracy
Chapter 6: The Unwritten Rules of American Politics
Chapter 7: The Unraveling
Chapter 8: Trump Against the Guardrails
Chapter 9: Saving Democracy
Acknowledgments
Endnotes

Introduction

Is our democracy in danger? It is a question we never thought we’d be asking. We have been colleagues for fifteen years, thinking, writing, and teaching students about failures of democracy in other places and times—Europe’s dark 1930s, Latin America’s repressive 1970s. We have spent years researching new forms of authoritarianism emerging around the globe. For us, how and why democracies die has been an occupational obsession. But now we find ourselves turning to our own country. Over the past two years, we have watched politicians say and do things that are unprecedented in the United States—but that we recognize as having been the precursors of democratic crisis in other places. We feel dread, as do so many other Americans, even as we try to reassure ourselves that things can’t really be that bad here. After all, even though we know democracies are always fragile, the one in which we live has somehow managed to defy gravity. Our Constitution, our national creed of freedom and equality, our historically robust middle class, our high levels of wealth and education, and our large, diversified private sector—all these should inoculate us from the kind of democratic breakdown that has occurred elsewhere. Yet, we worry. American politicians now treat their rivals as enemies, intimidate the free press, and threaten to reject the results of elections. They try to weaken the institutional buffers of our democracy, including the courts, intelligence services, and ethics offices. American states, which were once praised by the great jurist Louis Brandeis as “laboratories of democracy,” are in danger of becoming laboratories of authoritarianism as those in power rewrite electoral rules, redraw constituencies, and even rescind voting rights to ensure that they do not lose. And in 2016, for the first time in U.S. 
history, a man with no experience in public office, little observable commitment to constitutional rights, and clear authoritarian tendencies was elected president. What does all this mean? Are we living through the decline and fall of one of the world’s oldest and most successful democracies? — At midday on September 11, 1973, after months of mounting tensions in the streets of Santiago, Chile, British-made Hawker Hunter jets swooped overhead, dropping bombs on La Moneda, the neoclassical presidential palace in the center of the city. As the bombs continued to fall, La Moneda burned. President Salvador Allende, elected three years earlier at the head of a leftist coalition, was barricaded inside. During his term, Chile had been wracked by social unrest, economic crisis, and political paralysis. Allende had said he would not leave his post until he had finished his job—but now the moment of truth had arrived. Under the command of General Augusto Pinochet, Chile’s armed forces were seizing control of the country. Early in the morning on that fateful day, Allende offered defiant words on a national radio broadcast, hoping that his many supporters would take to the streets in defense of democracy. But the resistance never materialized. The military police who guarded the palace had abandoned him; his broadcast was met with silence. Within hours, President Allende was dead. So, too, was Chilean democracy. This is how we tend to think of democracies dying: at the hands of men with guns. During the Cold War, coups d’état accounted for nearly three out of every four democratic breakdowns. Democracies in Argentina, Brazil, the Dominican Republic, Ghana, Greece, Guatemala, Nigeria, Pakistan, Peru, Thailand, Turkey, and Uruguay all died this way. More recently, military coups toppled Egyptian President Mohamed Morsi in 2013 and Thai Prime Minister Yingluck Shinawatra in 2014. In all these cases, democracy dissolved in spectacular fashion, through military power and coercion. But there is another way to break a democracy. It is less dramatic but equally destructive. Democracies may die at the hands not of generals but of elected leaders—presidents or prime ministers who subvert the very process that brought them to power. Some of these leaders dismantle democracy quickly, as Hitler did in the wake of the 1933 Reichstag fire in Germany. More often, though, democracies erode slowly, in barely visible steps. In Venezuela, for example, Hugo Chávez was a political outsider who railed against what he cast as a corrupt governing elite, promising to build a more “authentic” democracy that used the country’s vast oil wealth to improve the lives of the poor. Skillfully tapping into the anger of ordinary Venezuelans, many of whom felt ignored or mistreated by the established political parties, Chávez was elected president in 1998. As a woman in Chávez’s home state of Barinas put it on election night, “Democracy is infected. And Chávez is the only antibiotic we have.” When Chávez launched his promised revolution, he did so democratically. In 1999, he held free elections for a new constituent assembly, in which his allies won an overwhelming majority. This allowed the chavistas to singlehandedly write a new constitution. It was a democratic constitution, though, and to reinforce its legitimacy, new presidential and legislative elections were held in 2000. Chávez and his allies won those, too. Chávez’s populism triggered intense opposition, and in April 2002, he was briefly toppled by the military. 
But the coup failed, allowing a triumphant Chávez to claim for himself even more democratic legitimacy. It wasn’t until 2003 that Chávez took his first clear steps toward authoritarianism. With public support fading, he stalled an opposition-led referendum that would have recalled him from office—until a year later, when soaring oil prices had boosted his standing enough for him to win. In 2004, the government blacklisted those who had signed the recall petition and packed the supreme court, but Chávez’s landslide reelection in 2006 allowed him to maintain a democratic veneer. The chavista regime grew more repressive after 2006, closing a major television station, arresting or exiling opposition politicians, judges, and media figures on dubious charges, and eliminating presidential term limits so that Chávez could remain in power indefinitely. When Chávez, now dying of cancer, was reelected in 2012, the contest was free but not fair: Chavismo controlled much of the media and deployed the vast machinery of the government in its favor. After Chávez’s death a year later, his successor, Nicolás Maduro, won another questionable reelection, and in 2014, his government imprisoned a major opposition leader. Still, the opposition’s landslide victory in the 2015 legislative elections seemed to belie critics’ claims that Venezuela was no longer democratic. It was only when a new single-party constituent assembly usurped the power of Congress in 2017, nearly two decades after Chávez first won the presidency, that Venezuela was widely recognized as an autocracy. This is how democracies now die. Blatant dictatorship—in the form of fascism, communism, or military rule—has disappeared across much of the world. Military coups and other violent seizures of power are rare. Most countries hold regular elections. Democracies still die, but by different means. Since the end of the Cold War, most democratic breakdowns have been caused not by generals and soldiers but by elected governments themselves. Like Chávez in Venezuela, elected leaders have subverted democratic institutions in Georgia, Hungary, Nicaragua, Peru, the Philippines, Poland, Russia, Sri Lanka, Turkey, and Ukraine. Democratic backsliding today begins at the ballot box. The electoral road to breakdown is dangerously deceptive. With a classic coup d’état, as in Pinochet’s Chile, the death of a democracy is immediate and evident to all. The presidential palace burns. The president is killed, imprisoned, or shipped off into exile. The constitution is suspended or scrapped. On the electoral road, none of these things happen. There are no tanks in the streets. Constitutions and other nominally democratic institutions remain in place. People still vote. Elected autocrats maintain a veneer of democracy while eviscerating its substance. Many government efforts to subvert democracy are “legal,” in the sense that they are approved by the legislature or accepted by the courts. They may even be portrayed as efforts to improve democracy—making the judiciary more efficient, combating corruption, or cleaning up the electoral process. Newspapers still publish but are bought off or bullied into self-censorship. Citizens continue to criticize the government but often find themselves facing tax or other legal troubles. This sows public confusion. People do not immediately realize what is happening. Many continue to believe they are living under a democracy. 
In 2011, when a Latinobarómetro survey asked Venezuelans to rate their own country from 1 (“not at all democratic”) to 10 (“completely democratic”), 51 percent of respondents gave their country a score of 8 or higher. Because there is no single moment—no coup, declaration of martial law, or suspension of the constitution—in which the regime obviously “crosses the line” into dictatorship, nothing may set off society’s alarm bells. Those who denounce government abuse may be dismissed as exaggerating or crying wolf. Democracy’s erosion is, for many, almost imperceptible. — How vulnerable is American democracy to this form of backsliding? The foundations of our democracy are certainly stronger than those in Venezuela, Turkey, or Hungary. But are they strong enough? Answering such a question requires stepping back from daily headlines and breaking news alerts to widen our view, drawing lessons from the experiences of other democracies around the world and throughout history. Studying other democracies in crisis allows us to better understand the challenges facing our own democracy. For example, based on the historical experiences of other nations, we have developed a litmus test to help identify would-be autocrats before they come to power. We can learn from the mistakes that past democratic leaders have made in opening the door to would-be authoritarians —and, conversely, from the ways that other democracies have kept extremists out of power. A comparative approach also reveals how elected autocrats in different parts of the world employ remarkably similar strategies to subvert democratic institutions. As these patterns become visible, the steps toward breakdown grow less ambiguous—and easier to combat. Knowing how citizens in other democracies have successfully resisted elected autocrats, or why they tragically failed to do so, is essential to those seeking to defend American democracy today. We know that extremist demagogues emerge from time to time in all societies, even in healthy democracies. The United States has had its share of them, including Henry Ford, Huey Long, Joseph McCarthy, and George Wallace. An essential test for democracies is not whether such figures emerge but whether political leaders, and especially political parties, work to prevent them from gaining power in the first place—by keeping them off mainstream party tickets, refusing to endorse or align with them, and when necessary, making common cause with rivals in support of democratic candidates. Isolating popular extremists requires political courage. But when fear, opportunism, or miscalculation leads established parties to bring extremists into the mainstream, democracy is imperiled. Once a would-be authoritarian makes it to power, democracies face a second critical test: Will the autocratic leader subvert democratic institutions or be constrained by them? Institutions alone are not enough to rein in elected autocrats. Constitutions must be defended—by political parties and organized citizens, but also by democratic norms. Without robust norms, constitutional checks and balances do not serve as the bulwarks of democracy we imagine them to be. Institutions become political weapons, wielded forcefully by those who control them against those who do not. This is how elected autocrats subvert democracy—packing and “weaponizing” the courts and other neutral agencies, buying off the media and the private sector (or bullying them into silence), and rewriting the rules of politics to tilt the playing field against opponents. 
The tragic paradox of the electoral route to authoritarianism is that democracy’s assassins use the very institutions of democracy—gradually, subtly, and even legally—to kill it. — America failed the first test in November 2016, when we elected a president with a dubious allegiance to democratic norms. Donald Trump’s surprise victory was made possible not only by public disaffection but also by the Republican Party’s failure to keep an extremist demagogue within its own ranks from gaining the nomination. How serious is the threat now? Many observers take comfort in our Constitution, which was designed precisely to thwart and contain demagogues like Donald Trump. Our Madisonian system of checks and balances has endured for more than two centuries. It survived the Civil War, the Great Depression, the Cold War, and Watergate. Surely, then, it will be able to survive Trump. We are less certain. Historically, our system of checks and balances has worked pretty well—but not, or not entirely, because of the constitutional system designed by the founders. Democracies work best—and survive longer—where constitutions are reinforced by unwritten democratic norms. Two basic norms have preserved America’s checks and balances in ways we have come to take for granted: mutual toleration, or the understanding that competing parties accept one another as legitimate rivals, and forbearance, or the idea that politicians should exercise restraint in deploying their institutional prerogatives. These two norms undergirded American democracy for most of the twentieth century. Leaders of the two major parties accepted one another as legitimate and resisted the temptation to use their temporary control of institutions to maximum partisan advantage. Norms of toleration and restraint served as the soft guardrails of American democracy, helping it avoid the kind of partisan fight to the death that has destroyed democracies elsewhere in the world, including Europe in the 1930s and South America in the 1960s and 1970s. Today, however, the guardrails of American democracy are weakening. The erosion of our democratic norms began in the 1980s and 1990s and accelerated in the 2000s. By the time Barack Obama became president, many Republicans, in particular, questioned the legitimacy of their Democratic rivals and had abandoned forbearance for a strategy of winning by any means necessary. Donald Trump may have accelerated this process, but he didn’t cause it. The challenges facing American democracy run deeper. The weakening of our democratic norms is rooted in extreme partisan polarization—one that extends beyond policy differences into an existential conflict over race and culture. America’s efforts to achieve racial equality as our society grows increasingly diverse have fueled an insidious reaction and intensifying polarization. And if one thing is clear from studying breakdowns throughout history, it’s that extreme polarization can kill democracies. There are, therefore, reasons for alarm. Not only did Americans elect a demagogue in 2016, but we did so at a time when the norms that once protected our democracy were already coming unmoored. But if other countries’ experiences teach us that polarization can kill democracies, they also teach us that breakdown is neither inevitable nor irreversible. Drawing lessons from other democracies in crisis, this book suggests strategies that citizens should, and should not, follow to defend our democracy. 
Many Americans are justifiably frightened by what is happening to our country. But protecting our democracy requires more than just fright or outrage. We must be humble and bold. We must learn from other countries to see the warning signs—and recognize the false alarms. We must be aware of the fateful missteps that have wrecked other democracies. And we must see how citizens have risen to meet the great democratic crises of the past, overcoming their own deep-seated divisions to avert breakdown. History doesn’t repeat itself. But it rhymes. The promise of history, and the hope of this book, is that we can find the rhymes before it is too late. 1 Fateful Alliances A quarrel had arisen between the Horse and the Stag, so the Horse came to a Hunter to ask his help to take revenge on the Stag. The Hunter agreed but said: “If you desire to conquer the Stag, you must permit me to place this piece of iron between your jaws, so that I may guide you with these reins, and allow this saddle to be placed upon your back so that I may keep steady upon you as we follow the enemy.” The Horse agreed to the conditions, and the Hunter soon saddled and bridled him. Then, with the aid of the Hunter, the Horse soon overcame the Stag and said to the Hunter: “Now get off, and remove those things from my mouth and back.” “Not so fast, friend,” said the Hunter. “I have now got you under bit and spur and prefer to keep you as you are at present.” —“The Horse, the Stag, and the Hunter,” Aesop’s Fables On October 30, 1922, Benito Mussolini arrived in Rome at 10:55 A.M. in an overnight sleeping car from Milan. He had been invited to the capital city by the king to accept Italy’s premiership and form a new cabinet. Accompanied by a small group of guards, Mussolini first stopped at the Hotel Savoia and then, wearing a black suit jacket, black shirt, and matching black bowler hat, walked triumphantly to the king’s Quirinal Palace. Rome was filled with rumors of unrest. Bands of Fascists—many in mismatched uniforms—roamed the city’s streets. Mussolini, aware of the power of the spectacle, strode into the king’s marble-floored residential palace and greeted him, “Sire, forgive my attire. I come from the battlefield.” This was the beginning of Mussolini’s legendary “March on Rome.” The image of masses of Blackshirts crossing the Rubicon to seize power from Italy’s Liberal state became fascist canon, repeated on national holidays and in children’s schoolbooks throughout the 1920s and 1930s. Mussolini did his part to enshrine the myth. At the last train stop before entering Rome that day, he had considered disembarking to ride into the city on horseback surrounded by his guards. Though the plan was ultimately abandoned, afterward he did all he could to bolster the legend of his rise to power as, in his own words, a “revolution” and “insurrectional act” that launched a new fascist epoch. The truth was more mundane. The bulk of Mussolini’s Blackshirts, often poorly fed and unarmed, arrived only after he had been invited to become prime minister. The squads of Fascists around the country were a menace, but Mussolini’s machinations to take the reins of state were no revolution. He used his party’s 35 parliamentary votes (out of 535), divisions among establishment politicians, fear of socialism, and the threat of violence by 30,000 Blackshirts to capture the attention of the timid King Victor Emmanuel III, who saw in Mussolini a rising political star and a means of neutralizing unrest. 
With political order restored by Mussolini’s appointment and socialism in retreat, the Italian stock market soared. Elder statesmen of the Liberal establishment, such as Giovanni Giolitti and Antonio Salandra, found themselves applauding the turn of events. They regarded Mussolini as a useful ally. But not unlike the horse in Aesop’s fable, Italy soon found itself under “bit and spur.” Some version of this story has repeated itself throughout the world over the last century. A cast of political outsiders, including Adolf Hitler, Getúlio Vargas in Brazil, Alberto Fujimori in Peru, and Hugo Chávez in Venezuela, came to power on the same path: from the inside, via elections or alliances with powerful political figures. In each instance, elites believed the invitation to power would contain the outsider, leading to a restoration of control by mainstream politicians. But their plans backfired. A lethal mix of ambition, fear, and miscalculation conspired to lead them to the same fateful mistake: willingly handing over the keys of power to an autocrat-in-the-making. — Why do seasoned elder statesmen make this mistake? There are few more gripping illustrations than the rise of Adolf Hitler in January 1933. His capacity for violent insurrection was on display as early as Munich’s Beer Hall Putsch of 1923—a surprise evening strike in which his group of pistol-bearing loyalists took control of several government buildings and a Munich beer hall where Bavarian officials were meeting. The ill-conceived attack was halted by the authorities, and Hitler spent nine months in jail, where he wrote his infamous personal testament, Mein Kampf. Thereafter, Hitler publicly committed to gaining power via elections. Initially, his National Socialist movement found few votes. The Weimar political system had been founded in 1919 by a prodemocratic coalition of Catholics, Liberals, and Social Democrats. But beginning in 1930, with the German economy reeling, the center-right fell prey to infighting, and the Communists and Nazis grew in popularity. The elected government collapsed in March 1930 amid the pain of the Great Depression. With political gridlock blocking government action, the figurehead president, World War I hero Paul von Hindenburg, took advantage of a constitutional article giving the head of state the authority to name chancellors in the exceptional circumstance that parliament failed to deliver governing majorities. The aim of these unelected chancellors—and the president—was not only to govern but to sideline radicals on the left and right. First, Center Party economist Heinrich Brüning (who would later flee Germany to become a professor at Harvard) attempted, but failed, to restore economic growth; his time as chancellor was short-lived. President von Hindenburg turned next to nobleman Franz von Papen, and then, in growing despondency, to von Papen’s close friend and rival, former defense minister General Kurt von Schleicher. But without parliamentary majorities in the Reichstag, stalemate persisted. Leaders, for good reason, feared the next election. Convinced that “something must finally give,” a cabal of rivalrous conservatives convened in late January 1933 and settled on a solution: A popular outsider should be placed at the head of the government. They despised him but knew that at least he had a mass following. And, most of all, they thought they could control him. 
On January 30, 1933, von Papen, one of the chief architects of the plan, dismissed worries over the gamble that would make Adolf Hitler chancellor of a crisis-ridden Germany with the reassuring words: “We’ve engaged him for ourselves….Within two months, we will have pushed [him] so far into a corner that he’ll squeal.” A more profound miscalculation is hard to imagine. The Italian and German experiences highlight the type of “fateful alliance” that often elevates authoritarians to power. In any democracy, politicians will at times face severe challenges. Economic crisis, rising public discontent, and the electoral decline of mainstream political parties can test the judgment of even the most experienced insiders. If a charismatic outsider emerges on the scene, gaining popularity as he challenges the old order, it is tempting for establishment politicians who feel their control is unraveling to try to co-opt him. If an insider breaks ranks to embrace the insurgent before his rivals do, he can use the outsider’s energy and base to outmaneuver his peers. And then, establishment politicians hope, the insurgent can be redirected to support their own program. This sort of devil’s bargain often mutates to the benefit of the insurgent, as alliances provide outsiders with enough respectability to become legitimate contenders for power. In early 1920s Italy, the old Liberal order was crumbling amid growing strikes and social unrest. The failure of traditional parties to forge solid parliamentary majorities left the elderly fifth-term prime minister Giovanni Giolitti desperate, and against the wishes of advisors he called early elections in May 1921. With the aim of tapping into the Fascists’ mass appeal, Giolitti decided to offer Mussolini’s upstart movement a place on his electoral group’s “bourgeois bloc” of Nationalists, Fascists, and Liberals. This strategy failed—the bourgeois bloc won less than 20 percent of the vote, leading to Giolitti’s resignation. But Mussolini’s place on the ticket gave his ragtag group the legitimacy it would need to enable its rise. Such fateful alliances are hardly confined to interwar Europe. They also help to explain the rise of Hugo Chávez. Venezuela had prided itself on being South America’s oldest democracy, in place since 1958. Chávez, a junior military officer and failed coup leader who had never held public office, was a political outsider. But his rise to power was given a critical boost from a consummate insider: ex-president Rafael Caldera, one of the founders of Venezuelan democracy. Venezuelan politics was long dominated by two parties, the center-left Democratic Action and Caldera’s center-right Social Christian Party (known as COPEI). The two alternated in power peacefully for more than thirty years, and by the 1970s, Venezuela was viewed as a model democracy in a region plagued by coups and dictatorships. During the 1980s, however, the country’s oil-dependent economy sank into a prolonged slump, a crisis that persisted for more than a decade, nearly doubling the poverty rate. Not surprisingly, Venezuelans grew disaffected. Massive riots in February 1989 suggested that the established parties were in trouble. Three years later, in February 1992, a group of junior military officers rose up against President Carlos Andrés Pérez. Led by Hugo Chávez, the rebels called themselves “Bolivarians,” after revered independence hero Simón Bolívar. The coup failed. 
But when the now-detained Chávez appeared on live television to tell his supporters to lay down their arms (declaring, in words that would become legendary, that their mission had failed “for now”), he became a hero in the eyes of many Venezuelans, particularly poorer ones. Following a second failed coup in November 1992, the imprisoned Chávez changed course, opting to pursue power via elections. He would need help. Although ex-president Caldera was a well-regarded elder statesman, his political career was waning in 1992. Four years earlier, he had failed to secure his party’s presidential nomination, and he was now considered a political relic. But the seventy-six-year-old senator still dreamed of returning to the presidency, and Chávez’s emergence provided him with a lifeline. On the night of Chávez’s initial coup, the former president stood up during an emergency joint session of congress and embraced the rebels’ cause, declaring: It is difficult to ask the people to sacrifice themselves for freedom and democracy when they think that freedom and democracy are incapable of giving them food to eat, of preventing the astronomical rise in the cost of subsistence, or of placing a definitive end to the terrible scourge of corruption that, in the eyes of the entire world, is eating away at the institutions of Venezuela with each passing day. The stunning speech resurrected Caldera’s political career. Having tapped into Chávez’s antisystem constituency, the ex-president’s public support swelled, which allowed him to make a successful presidential bid in 1993. Caldera’s public flirtation with Chávez did more than boost his own standing in the polls; it also gave Chávez new credibility. Chávez and his comrades had sought to destroy their country’s thirty-four-year-old democracy. But rather than denouncing the coup leaders as an extremist threat, the former president offered them public sympathy—and, with it, an opening to mainstream politics. Caldera also helped open the gates to the presidential palace for Chávez by dealing a mortal blow to Venezuela’s established parties. In a stunning about-face, he abandoned COPEI, the party he had founded nearly half a century earlier, and launched an independent presidential bid. To be sure, the parties were already in crisis. But Caldera’s departure and subsequent antiestablishment campaign helped bury them. The party system collapsed after Caldera’s 1993 election as an antiparty independent, paving the way for future outsiders. Five years later, it would be Chávez’s turn. But back in 1993, Chávez still had a major problem. He was in jail, awaiting trial for treason. However, in 1994, now-President Caldera dropped all charges against him. Caldera’s final act in enabling Chávez was literally opening the gates—of prison—for him. Immediately after Chávez’s release, a reporter asked him where he was going. “To power,” he replied. Freeing Chávez was popular, and Caldera had promised such a move during the campaign. Like most Venezuelan elites, he viewed Chávez as a passing fad—someone who would likely fall out of public favor by the time of the next election. But in dropping all charges, rather than allowing Chávez to stand trial and then pardoning him, Caldera elevated him, transforming the former coup leader overnight into a viable presidential candidate. On December 6, 1998, Chávez won the presidency, easily defeating an establishment-backed candidate. 
On inauguration day, Caldera, the outgoing president, could not bring himself to deliver the oath of office to Chávez, as tradition dictated. Instead, he stood glumly off to one side. Despite their vast differences, Hitler, Mussolini, and Chávez followed routes to power that share striking similarities. Not only were they all outsiders with a flair for capturing public attention, but each of them rose to power because establishment politicians overlooked the warning signs and either handed over power to them (Hitler and Mussolini) or opened the door for them (Chávez). The abdication of political responsibility by existing leaders often marks a nation’s first step toward authoritarianism. Years after Chávez’s presidential victory, Rafael Caldera explained his mistakes simply: “Nobody thought that Mr. Chávez had even the remotest chance of becoming president.” And merely a day after Hitler became chancellor, a prominent conservative who aided him admitted, “I have just committed the greatest stupidity of my life; I have allied myself with the greatest demagogue in world history.” — Not all democracies have fallen into this trap. Some—including Belgium, Britain, Costa Rica, and Finland—have faced challenges from demagogues but also have managed to keep them out of power. How have they done it? It is tempting to think this survival is rooted in the collective wisdom of voters. Maybe Belgians and Costa Ricans were simply more democratic than their counterparts in Germany or Italy. After all, we like to believe that the fate of a government lies in the hands of its citizens. If the people hold democratic values, democracy will be safe. If citizens are open to authoritarian appeals, then, sooner or later, democracy will be in trouble. This view is wrong. It assumes too much of democracy—that “the people” can shape at will the kind of government they possess. It’s hard to find any evidence of majority support for authoritarianism in 1920s Germany and Italy. Before the Nazis and Fascists seized power, less than 2 percent of the population were party members, and neither party achieved anything close to a majority of the vote in free and fair elections. Rather, solid electoral majorities opposed Hitler and Mussolini—before both men achieved power with the support of political insiders blind to the danger of their own ambitions. Hugo Chávez was elected by a majority of voters, but there is little evidence that Venezuelans were looking for a strongman. At the time, public support for democracy was higher there than in Chile—a country that was, and remains, stably democratic. According to the 1998 Latinobarómetro survey, 60 percent of Venezuelans agreed with the statement “Democracy is always the best form of government,” while only 25 percent agreed that “under some circumstances, an authoritarian government can be preferable to a democratic one.” By contrast, only 53 percent of respondents in Chile agreed that “democracy is always the best form of government.” Potential demagogues exist in all democracies, and occasionally, one or more of them strike a public chord. But in some democracies, political leaders heed the warning signs and take steps to ensure that authoritarians remain on the fringes, far from the centers of power. When faced with the rise of extremists or demagogues, they make a concerted effort to isolate and defeat them. Although mass responses to extremist appeals matter, what matters more is whether political elites, and especially parties, serve as filters. 
Put simply, political parties are democracy’s gatekeepers. — If authoritarians are to be kept out, they first have to be identified. There is, alas, no foolproof advance warning system. Many authoritarians can be easily recognized before they come to power. They have a clear track record: Hitler led a failed putsch; Chávez led a failed military uprising; Mussolini’s Blackshirts engaged in paramilitary violence; and in Argentina in the mid–twentieth century, Juan Perón helped lead a successful coup two and a half years before running for president. But politicians do not always reveal the full scale of their authoritarianism before reaching power. Some adhere to democratic norms early in their careers, only to abandon them later. Consider Hungarian Prime Minister Viktor Orbán. Orbán and his Fidesz party began as liberal democrats in the late 1980s, and in his first stint as prime minister between 1998 and 2002, Orbán governed democratically. His autocratic about-face after returning to power in 2010 was a genuine surprise. So how do we identify authoritarianism in politicians who don’t have an obvious antidemocratic record? Here we turn to the eminent political scientist Juan Linz. Born in Weimar Germany and raised amid Spain’s civil war, Linz knew all too well the perils of losing a democracy. As a professor at Yale, he devoted much of his career to trying to understand how and why democracies die. Many of Linz’s conclusions can be found in a small but seminal book called The Breakdown of Democratic Regimes. Published in 1978, the book highlights the role of politicians, showing how their behavior can either reinforce democracy or put it at risk. He also proposed, but never fully developed, a “litmus test” for identifying antidemocratic politicians. Building on Linz’s work, we have developed a set of four behavioral warning signs that can help us know an authoritarian when we see one. We should worry when a politician 1) rejects, in words or action, the democratic rules of the game, 2) denies the legitimacy of opponents, 3) tolerates or encourages violence, or 4) indicates a willingness to curtail the civil liberties of opponents, including the media. Table 1 shows how to assess politicians in terms of these four factors. A politician who meets even one of these criteria is cause for concern. What kinds of candidates tend to test positive on a litmus test for authoritarianism? Very often, populist outsiders do. Populists are antiestablishment politicians—figures who, claiming to represent the voice of “the people,” wage war on what they depict as a corrupt and conspiratorial elite. Populists tend to deny the legitimacy of established parties, attacking them as undemocratic and even unpatriotic. They tell voters that the existing system is not really a democracy but instead has been hijacked, corrupted, or rigged by the elite. And they promise to bury that elite and return power to “the people.” This discourse should be taken seriously. When populists win elections, they often assault democratic institutions. In Latin America, for example, of all fifteen presidents elected in Bolivia, Ecuador, Peru, and Venezuela between 1990 and 2012, five were populist outsiders: Alberto Fujimori, Hugo Chávez, Evo Morales, Lucio Gutiérrez, and Rafael Correa. All five ended up weakening democratic institutions.

Table 1: Four Key Indicators of Authoritarian Behavior

1. Rejection of (or weak commitment to) democratic rules of the game
- Do they reject the Constitution or express a willingness to violate it?
- Do they suggest a need for antidemocratic measures, such as canceling elections, violating or suspending the Constitution, banning certain organizations, or restricting basic civil or political rights?
- Do they seek to use (or endorse the use of) extraconstitutional means to change the government, such as military coups, violent insurrections, or mass protests aimed at forcing a change in the government?
- Do they attempt to undermine the legitimacy of elections, for example, by refusing to accept credible electoral results?

2. Denial of the legitimacy of political opponents
- Do they describe their rivals as subversive, or opposed to the existing constitutional order?
- Do they claim that their rivals constitute an existential threat, either to national security or to the prevailing way of life?
- Do they baselessly describe their partisan rivals as criminals, whose supposed violation of the law (or potential to do so) disqualifies them from full participation in the political arena?
- Do they baselessly suggest that their rivals are foreign agents, in that they are secretly working in alliance with (or the employ of) a foreign government—usually an enemy one?

3. Toleration or encouragement of violence
- Do they have any ties to armed gangs, paramilitary forces, militias, guerrillas, or other organizations that engage in illicit violence?
- Have they or their partisan allies sponsored or encouraged mob attacks on opponents?
- Have they tacitly endorsed violence by their supporters by refusing to unambiguously condemn it and punish it?
- Have they praised (or refused to condemn) other significant acts of political violence, either in the past or elsewhere in the world?

4. Readiness to curtail civil liberties of opponents, including media
- Have they supported laws or policies that restrict civil liberties, such as expanded libel or defamation laws, or laws restricting protest, criticism of the government, or certain civic or political organizations?
- Have they threatened to take legal or other punitive action against critics in rival parties, civil society, or the media?
- Have they praised repressive measures taken by other governments, either in the past or elsewhere in the world?

Keeping authoritarian politicians out of power is more easily said than done. Democracies, after all, are not supposed to ban parties or prohibit candidates from standing for election—and we do not advocate such measures. The responsibility for filtering out authoritarians lies, rather, with political parties and party leaders: democracy’s gatekeepers. Successful gatekeeping requires that mainstream parties isolate and defeat extremist forces, a behavior political scientist Nancy Bermeo calls “distancing.” Prodemocratic parties may engage in distancing in several ways. First, they can keep would-be authoritarians off party ballots at election time. This requires that they resist the temptation to nominate these extremists for higher office even when they can potentially deliver votes. Second, parties can root out extremists in the grass roots of their own ranks. Take the Swedish Conservative Party (AVF) during the perilous interwar period. The AVF’s youth group (an organization of voting-age activists), called the Swedish Nationalist Youth Organization, grew increasingly radical in the early 1930s, criticizing parliamentary democracy, openly supporting Hitler, and even creating a group of uniformed storm troopers. The AVF responded in 1933 by expelling the organization. 
The loss of 25,000 members may have cost the AVF votes in the 1934 municipal elections, but the party’s distancing strategy reduced the influence of antidemocratic forces in Sweden’s largest center-right party. Third, prodemocratic parties can avoid all alliances with antidemocratic parties and candidates. As we saw in Italy and Germany, prodemocratic parties are sometimes tempted to align with extremists on their ideological flank to win votes or, in parliamentary systems, form governments. But such alliances can have devastating long-term consequences. As Linz wrote, the demise of many democracies can be traced to a party’s “greater affinity for extremists on its side of the political spectrum than for [mainstream] parties close to the opposite side.” Fourth, prodemocratic parties can act to systematically isolate, rather than legitimize, extremists. This requires that politicians avoid acts—such as German Conservatives’ joint rallies with Hitler in the early 1930s or Caldera’s speech sympathizing with Chávez—that help to “normalize” or provide public respectability to authoritarian figures. Finally, whenever extremists emerge as serious electoral contenders, mainstream parties must forge a united front to defeat them. To quote Linz, they must be willing to “join with opponents ideologically distant but committed to the survival of the democratic political order.” In normal circumstances, this is almost unimaginable. Picture Senator Edward Kennedy and other liberal Democrats campaigning for Ronald Reagan, or the British Labour Party and their trade union allies endorsing Margaret Thatcher. Each party’s followers would be infuriated at this seeming betrayal of principles. But in extraordinary times, courageous party leadership means putting democracy and country before party and articulating to voters what is at stake. When a party or politician that tests positive on our litmus test emerges as a serious electoral threat, there is little alternative. United democratic fronts can prevent extremists from winning power, which can mean saving a democracy. — Although the failures are more memorable, some European democracies practiced successful gatekeeping between the wars. Surprisingly big lessons can be drawn from small countries. Consider Belgium and Finland. In Europe’s years of political and economic crisis in the 1920s and 1930s, both countries experienced an early warning sign of democratic decay—the rise of antisystem extremists—but, unlike Italy and Germany, they were saved by political elites who defended democratic institutions (at least until Nazi invasion several years later). During Belgium’s 1936 general election, as the contagion of fascism was spreading from Italy and Germany across Europe, voters delivered a jarring result. Two authoritarian far-right parties—the Rex Party and the Flemish nationalist party, or Vlaams Nationaal Verbond (VNV)—surged in the polls, capturing almost 20 percent of the popular vote and challenging the historical dominance of three establishment parties: the center-right Catholic Party, the Socialists, and the Liberal Party. The challenge from the leader of the Rex Party, Léon Degrelle, a Catholic journalist who would become a Nazi collaborator, was especially strong. Degrelle, a virulent critic of parliamentary democracy, had departed from the right edges of the Catholic Party and now attacked its leaders as corrupt. He received encouragement and financial support from both Hitler and Mussolini. 
The 1936 election shook the centrist parties, which suffered losses across the board. Aware of the antidemocratic movements in nearby Italy and Germany and fearful for their own survival, they confronted the daunting task of deciding how to respond. The Catholic Party, in particular, faced a difficult dilemma: collaborate with their longtime rivals, the Socialists and Liberals, or forge a right-wing alliance that included the Rexists, a party with whom they shared some ideological affinity but that rejected the value of democratic politics. Unlike the retreating mainstream politicians of Italy and Germany, the Belgian Catholic leadership declared that any cooperation with the Rexists was incompatible with party membership and then pursued a two-pronged strategy to combat the movement. Internally, Catholic Party leaders heightened discipline by screening candidates for pro-Rexist sympathies and expelling those who expressed extremist views. In addition, the party leadership took a strong stance against cooperation with the far right. Externally, the Catholic Party fought Rex on its own turf. The Catholic Party adopted new propaganda and campaign tactics that targeted younger Catholics, who had formerly been part of the Rexist base. They created the Catholic Youth Front in December 1935 and began to run former allies against Degrelle. The final clash between Rex and the Catholic Party, in which Rex was effectively sidelined (until the Nazi occupation), centered around the formation of a new government after the 1936 election. The Catholic Party supported the incumbent Catholic prime minister Paul van Zeeland. After van Zeeland regained the premiership, there were two chief options for forming a government: The first was an alliance with the rival Socialists, along the lines of France’s “Popular Front,” which van Zeeland and other Catholic leaders had initially hoped to avoid. The second was a right-wing alliance of antisocialist forces that would include Rex and VNV. The choice was not easy; the second option was supported by a traditionalist faction that sought to upset the fragile van Zeeland cabinet by rallying the Catholic rank and file, organizing a “March on Brussels,” and forcing a by-election in which Rex leader Degrelle would run against van Zeeland. These plans were thwarted in 1937 when Degrelle lost the by-election, largely because the Catholic Party MPs had taken a stand: They refused to go with the traditionalists’ plan and instead united with the Liberals and Socialists behind van Zeeland. This was the Catholic Party’s most important gatekeeping act. The Catholic Party’s stand on the right was also made possible by King Leopold III and the Socialist Party. The election of 1936 had left the Socialist Party as the largest party in the legislature, which gave it the prerogative to form a government. However, when it became evident that the Socialists could not gain enough parliamentary support, rather than call a new election— which may have handed even more seats to extremist parties—the king met with leaders of the largest parties to talk them into a power-sharing cabinet, led by incumbent prime minister van Zeeland, which would include both the conservative Catholics and the Socialists but exclude antisystem parties on both sides. Although the Socialists distrusted van Zeeland, a Catholic Party man, they nevertheless put democracy ahead of their own interests and endorsed the grand coalition. 
A similar dynamic unfolded in Finland, where the extreme-right Lapua Movement burst onto the political stage in 1929, threatening the country’s fragile democracy. The movement sought the destruction of communism by any means necessary. It threatened violence if its demands were not met and attacked mainstream politicians whom it deemed collaborators with Socialists. At first, politicians from the governing center-right Agrarian Union flirted with the Lapua Movement, finding its anticommunism politically useful; they met the movement’s demands to deny communist political rights while tolerating extreme-right violence. In 1930, P. E. Svinhufvud, a conservative whom the Lapua leaders considered “one of their own,” became prime minister, and he offered them two cabinet posts. A year later, Svinhufvud became president. Yet the Lapua Movement continued its extremist behavior; with the communists banned, it targeted the more moderate Social Democratic Party. Lapua thugs abducted more than a thousand Social Democrats, including union leaders and members of parliament. The Lapua Movement also organized a 12,000-person march on Helsinki (modeled on the mythical March on Rome), and in 1932, it backed a failed putsch aimed at replacing the government with one that was “apolitical” and “patriotic.” As the Lapua Movement grew more radical, however, Finland’s traditional conservative parties broke decisively with it. In late 1930, the bulk of the Agrarian Union, the liberal Progress Party, and much of the Swedish People’s Party joined their main ideological rival, the Social Democrats, in the so-called Lawfulness Front to defend democracy against violent extremists. Even the conservative president, Svinhufvud, forcefully rejected—and eventually banned—his former allies. The Lapua Movement was left isolated, and Finland’s brief burst of fascism was aborted. It is not only in distant historical cases that one finds successful gatekeeping. In Austria in 2016, the main center-right party (the Austrian People’s Party, ÖVP) effectively kept the radical-right Freedom Party (FPÖ) out of the presidency. Austria has a long history of extreme-right politics, and the FPÖ is one of Europe’s strongest far-right parties. Austria’s political system was growing vulnerable because the two main parties, the Social Democratic SPÖ and the Christian Democratic ÖVP, which had alternated in the presidency throughout the postwar period, were weakening. In 2016, their dominance was challenged by two upstarts—the Green Party’s former chairman, Alexander Van der Bellen, and the extremist FPÖ leader Norbert Hofer. To the surprise of most analysts, the first round left Van der Bellen and the right-wing outsider Hofer as the two candidates in a second-round runoff. After a procedural error in October 2016, the runoff was held in December. At this point, several leading politicians, including some from the conservative ÖVP, argued that Hofer and his Freedom Party had to be defeated. Hofer had appeared to encourage violence against immigrants, and many questioned whether an elected Hofer would privilege his party in ways that violated longstanding norms of the president remaining above politics. In the face of this threat, some important ÖVP leaders worked to defeat Hofer by supporting their ideological rival, the left-leaning Green candidate, Van der Bellen. 
The ÖVP’s presidential candidate, Andreas Khol, endorsed Van der Bellen, as did Chairman Reinhold Mitterlehner, Cabinet Minister Sophie Karmasin, and dozens of ÖVP mayors in the Austrian countryside. In one letter, former chairman Erhard Busek wrote that he endorsed Van der Bellen “not with passion but after careful deliberation,” and that, furthermore, the decision was motivated by the sentiment that “we don’t want congratulations from Le Pen, Jobbik, Wilders and the AfD [and other extremists] after our presidential elections.” Van der Bellen won by a mere 300,000 votes. This stance took considerable political courage. According to one Catholic Party mayor of a small city outside Vienna, Stefan Schmuckenschlager, who endorsed the Green Party candidate, it was a decision that split families. His twin brother, another party leader, had supported Hofer. As Schmuckenschlager explained it, power politics sometimes has to be put aside to do the right thing. Did the endorsements from the ÖVP help? There is evidence that they did. According to exit polls, 55 percent of respondents who identified as ÖVP supporters said they voted for Van der Bellen, and 48 percent of Van der Bellen voters said they had voted for him to prevent Hofer from winning. In addition, the strong urban/rural division that has always marked Austrian politics (between left-wing urban areas and right-wing rural areas) was dramatically diminished in the second round in December 2016, with a surprising number of traditional rural conservative states switching to vote for Van der Bellen. In short, in 2016, responsible leaders in the ÖVP resisted the temptation to ally with an extremist party on their own ideological flank, and the result was that party’s defeat. The FPÖ’s strong performance in the 2017 parliamentary elections, which positioned it to become a junior partner in a new right-wing government, made it clear that the dilemma facing Austrian conservatives persists. Still, their effort to keep an extremist out of the presidency provides a useful model of contemporary gatekeeping. For its part, the United States has an impressive record of gatekeeping. Both Democrats and Republicans have confronted extremist figures on their fringes, some of whom enjoyed considerable public support. For decades, both parties succeeded in keeping these figures out of the mainstream. Until, of course, 2016. 2 Gatekeeping in America In The Plot Against America, American novelist Philip Roth builds on real historical events to imagine what fascism might have looked like in prewar America. An early American mass-media hero, Charles Lindbergh, is the novel’s central figure: He skyrockets to fame with his 1927 solo flight across the Atlantic and later becomes a vocal isolationist and Nazi sympathizer. But here is where history takes a fantastic turn in Roth’s hands: Rather than fading into obscurity, Lindbergh arrives by plane at the 1940 Republican Party convention in Philadelphia at 3:14 A.M., as a packed hall finds itself deadlocked on the twentieth ballot. Cries of “Lindy! Lindy! Lindy!” erupt for thirty uncontained minutes on the convention floor, and in a moment of intense collective fervor, his name is proposed, seconded, and approved by acclamation as the party’s nominee for president. Lindbergh, a man with no political experience but unparalleled media savvy, ignores the advice of his advisors and campaigns by piloting his iconic solo aircraft, Spirit of St. 
Louis, from state to state, wearing his flight goggles, high boots, and jumpsuit. In this world turned upside down, Lindbergh beats Franklin Delano Roosevelt, the incumbent, to become president. And Lindbergh, whose campaign is later revealed to be linked to Hitler, goes on to sign peace treaties with America’s enemies. A wave of anti-Semitism and violence is unleashed across America. Many Americans have found parallels between the 2016 presidential election and Roth’s work of fiction. The premise—an outsider with dubious democratic credentials comes to power with the aid of a foreign nation—cannot help but resonate. But the comparison raises another striking question: Given the severity of the economic crisis in 1930s America, why didn’t this happen here? — The reason no extremist demagogue won the presidency before 2016 is not the absence of contenders for such a role. Nor is it the lack of public support for them. To the contrary, extremist figures have long dotted the landscape of American politics. In the 1930s alone, as many as eight hundred right-wing extremist groups existed in the United States. Among the most important figures to emerge during this period was Father Charles Coughlin, an anti-Semitic Catholic priest whose fiery nationalist radio program reached up to forty million listeners a week. Father Coughlin was openly antidemocratic, calling for the abolition of political parties and questioning the value of elections. His newspaper, Social Justice, adopted pro-fascist positions in the 1930s, naming Mussolini its “Man of the Week” and often defending the Nazi regime. Despite his extremism, Father Coughlin was immensely popular. Fortune magazine called him “just about the biggest thing ever to happen to radio.” He delivered speeches to packed stadiums and auditoriums across the country; as he traveled from city to city, fans lined his route to see him passing by. Some contemporary observers called him the most influential figure in the United States after Roosevelt. The Depression also gave rise to Louisiana governor and senator Huey Long, who called himself “the Kingfish.” Long was described by the historian Arthur M. Schlesinger Jr. as “the great demagogue of the day, a man who resembled…a Latin American dictator, a Vargas or a Perón.” The Kingfish was a gifted stump speaker, and he routinely flouted the rule of law. As governor, Long built what Schlesinger described as “the nearest approach to a totalitarian state the American republic has ever seen,” using a mix of bribes and threats to bring the state’s legislature, judges, and press to heel. Asked by an opposition legislator if he had heard of the state constitution, Long replied, “I’m the constitution just now.” Newspaper editor Hodding Carter called Long “the first true dictator out of the soil of America.” When Franklin Roosevelt’s campaign manager, James A. Farley, met Mussolini in Rome in 1933, he wrote that the Italian dictator “reminded me of Huey Long.” Long built a massive following with his call to redistribute wealth. In 1934, he was said to have “received more mail than all other senators combined, more even than the president.” By then his Share Our Wealth movement had more than 27,000 cells across the country and a mailing list of nearly eight million names. Long planned a presidential run, telling a New York Times reporter, “I can take this Roosevelt….I can out-promise him. And he knows it.” Roosevelt viewed Long as a serious threat but was spared when Long was assassinated in September 1935. 
America’s authoritarian tendency persisted through the post–World War II golden age. Senator Joseph McCarthy, who used the Cold War fear of communist subversion to promote blacklisting, censorship, and book banning, enjoyed wide backing among the American public. At the height of McCarthy’s political power, polls showed that nearly half of all Americans approved of him. Even after the Senate’s 1954 censure of him, McCarthy enjoyed 40 percent support in Gallup polls. A decade later, Alabama governor George Wallace’s defiant segregationist stance vaulted him to national prominence, leading to surprisingly vigorous bids for the presidency in 1968 and 1972. Wallace engaged in what journalist Arthur Hadley called the “old and honorable American tradition of hate the powerful.” He was, Hadley wrote, a master at exploiting “plain old American rage.” Wallace often encouraged violence and displayed a casual disregard for constitutional norms, declaring: There is one thing more powerful than the Constitution….That’s the will of the people. What is a Constitution anyway? They’re the products of the people, the people are the first source of power, and the people can abolish a Constitution if they want to. Wallace’s message, which mixed racism with populist appeals to working-class whites’ sense of victimhood and economic anger, helped him make inroads into the Democrats’ traditional blue-collar base. Polls showed that roughly 40 percent of Americans approved of Wallace in his third-party run in 1968, and in 1972 he shocked the establishment by emerging as a serious contender in the Democratic primaries. When Wallace’s campaign was derailed by an assassination attempt in May 1972, he was leading George McGovern by more than a million votes in the primaries. In short, Americans have long had an authoritarian streak. It was not unusual for figures such as Coughlin, Long, McCarthy, and Wallace to gain the support of a sizable minority—30 or even 40 percent—of the country. We often tell ourselves that America’s national political culture in some way immunizes us from such appeals, but this requires reading history with rose-colored glasses. The real protection against would-be authoritarians has not been Americans’ firm commitment to democracy but, rather, the gatekeepers—our political parties. — On June 8, 1920, as Woodrow Wilson’s presidency was winding down, Republican delegates gathered to choose their nominee in the flag-draped but poorly ventilated Chicago Coliseum, where the withering heat reached over one hundred degrees. After nine ballots over four days, the convention remained undecided. On Friday evening, in Suite 404 on the thirteenth floor of the nearby Blackstone Hotel, Republican National Committee Chairman Will Hays and George Harvey, the powerful publisher of Harvey’s Weekly, hosted a rotating group of U.S. senators and party leaders in the original “smoke-filled back room.” The Old Guard, as journalists called them, poured themselves drinks, smoked cigars, and talked late into the night about how to break the deadlock to get a candidate the 493 delegates needed for the nomination. The leading contender on the convention floor was Major General Leonard Wood, an old ally of Theodore Roosevelt who had generated popular enthusiasm in the primaries and dominated the ballot earlier in the week, with 287 delegates. He was followed by Illinois governor Frank Lowden, California senator Hiram Johnson, and Ohio senator Warren G. Harding, trailing in a distant fourth place with only 65½ delegates.
From the convention floor, reporters wrote, “Nobody is talking Harding…[He is] not even considered as among the most promising dark horses.” But as reporters heard rumors about the discussions taking place at the Blackstone, the most motivated of them found their way to the thirteenth floor of the hotel and quietly gathered in the hallways outside Suite 404 to catch a glimpse as leading senators—including Henry Cabot Lodge of Massachusetts, McCormick of Illinois, Phipps of Colorado, Calder of New York, former senator Crane of Massachusetts, and others—came and went. Inside Suite 404, the upsides and downsides of each candidate were carefully reviewed and debated (Knox was too old; Lodge didn’t like Coolidge). At one in the morning, seven members of the Old Guard remained in the room and took a “standing vote.” Called in at 2:11 A.M. by George Harvey, a stunned Harding was informed that he had been selected. Word spread. By the next evening, on the tenth ballot and to the great relief of the sweltering delegates, Warren G. Harding received an overwhelming 692½ convention delegates amid rousing cheers. Though he garnered just over 4 percent of the primary vote, he was now the Republican Party’s 1920 presidential nominee. Nobody likes smoke-filled rooms today—and for good reason. They were not very democratic. Candidates were chosen by a small group of power brokers who were not accountable to the party rank and file, much less to average citizens. And smoke-filled rooms did not always produce good presidents—Harding’s term, after all, was marked by scandal. But backroom candidate selection had a virtue that is often forgotten today: It served a gatekeeping function, keeping demonstrably unfit figures off the ballot and out of office. To be sure, the reason for this was not the high-mindedness of party leaders. Rather, party “bosses,” as their opponents called them, were most interested in picking safe candidates who could win. It was, above all, their risk aversion that led them to avoid extremists. Gatekeeping institutions go back to the founding of the American republic. The 1787 Constitution created the world’s first presidential system. Presidentialism poses distinctive challenges for gatekeeping. In parliamentary democracies, the prime minister is a member of parliament and is selected by the leading parties in parliament, which virtually ensures that he or she will be acceptable to political insiders. The very process of government formation serves as a filter. Presidents, by contrast, are not sitting members of Congress, nor are they elected by Congress. At least in theory, they are elected by the people, and anyone can run for president and—if he or she earns enough support—win. Our founders were deeply concerned with gatekeeping. In designing the Constitution and electoral system, they grappled with a dilemma that, in many respects, remains with us today. On the one hand, they sought not a monarch but an elected president—one who conformed to their idea of a republican popular government, reflecting the will of the people. On the other, the founders did not fully trust the people’s ability to judge candidates’ fitness for office. Alexander Hamilton worried that a popularly elected presidency could be too easily captured by those who would play on fear and ignorance to win elections and then rule as tyrants. 
“History will teach us,” Hamilton wrote in the Federalist Papers, that “of those men who have overturned the liberties of republics, the great number have begun their career by paying an obsequious court to the people; commencing demagogues, and ending tyrants.” For Hamilton and his colleagues, elections required some kind of built-in screening device. The device the founders came up with was the Electoral College. Article II of the Constitution created an indirect election system that reflected Hamilton’s thinking in Federalist 68: The immediate election should be made by men most capable of analyzing the qualities adapted to the station, and acting under the circumstances favorable to deliberation, and to a judicious combination of all the reasons and inducements which were proper to govern them. The Electoral College, made up of locally prominent men in each state, would thus be responsible for choosing the president. Under this arrangement, Hamilton reasoned, “the office of president will seldom fall to the lot of any man who is not in an eminent degree endowed with the requisite qualifications.” Men with “talents for low intrigue, and the little arts of popularity” would be filtered out. The Electoral College thus became our original gatekeeper. This system proved short-lived, however, due to two shortcomings in the founders’ original design. First, the Constitution is silent on the question of how presidential candidates are to be selected. The Electoral College goes into operation after the people vote, playing no role in determining who seeks the presidency in the first place. Second, the Constitution never mentions political parties. Though Thomas Jefferson and James Madison would go on to pioneer our two-party system, the founders did not seriously contemplate those parties’ existence. The rise of parties in the early 1800s changed the way our electoral system worked. Instead of electing local notables as delegates to the Electoral College, as the founders had envisioned, each state began to elect party loyalists. Electors became party agents, which meant that the Electoral College surrendered its gatekeeping authority to the parties. The parties have retained it ever since. Parties, then, became the stewards of American democracy. Because they select our presidential candidates, parties have the ability—and, we would add, the responsibility—to keep dangerous figures out of the White House. They must, therefore, strike a balance between two roles: a democratic role, in which they choose the candidates that best represent the party’s voters; and what political scientist James Ceaser calls a “filtration” role, in which they screen out those who pose a threat to democracy or are otherwise unfit to hold office. These dual imperatives—choosing a popular candidate and keeping out demagogues—may, at times, conflict with each other. What if the people choose a demagogue? This is the recurring tension at the heart of the presidential nomination process, from the founders’ era through today. An overreliance on gatekeeping is, in itself, undemocratic—it can create a world of party bosses who ignore the rank and file and fail to represent the people. But an overreliance on the “will of the people” can also be dangerous, for it can lead to the election of a demagogue who threatens democracy itself. There is no escape from this tension. There are always trade-offs. — For most of American history, political parties prioritized gatekeeping over openness. 
There was always some form of a smoke-filled room. In the early nineteenth century, presidential candidates were chosen by groups of congressmen in Washington, in a system known as Congressional Caucuses. The system was soon criticized as too closed, so beginning in the 1830s, candidates were nominated in national party conventions made up of delegates from each state. Delegates were not popularly elected; they were chosen by state and local political party committees, and they were not bound to support particular candidates. They generally followed the instructions of the state party leaders who sent them to the convention. The system thus favored insiders, or candidates backed by the party leaders who controlled the delegates. Candidates who lacked support among their party’s network of state and local politicians had no chance of success. The convention system was also criticized for being closed and undemocratic, and there was no shortage of efforts to reform it. Primary elections were introduced during the Progressive era; the first was held in Wisconsin in 1901, and in 1916, primaries were held in two dozen states. Yet these brought little change—in part because many states didn’t use them, but mostly because elected delegates were not required to support the candidate who won the primary. They remained “unpledged,” free to negotiate their vote on the convention floor. Party leaders—with their control over government jobs, perks, and other benefits—were well-positioned to broker these deals, so they remained the presidency’s gatekeepers. Because primaries had no binding impact on presidential nominations, they were little more than beauty contests. Real power remained in the hands of party insiders, or what contemporaries called “organization men.” For prospective candidates, securing the backing of the organization men was the only viable road to the nomination. The old convention system highlights the trade-offs inherent to gatekeeping. On the one hand, the system wasn’t very democratic. The organization men were hardly representative of American society. Indeed, they were the very definition of an “old boys” network. Most rank-and-file party members, especially the poor and politically unconnected, women, and minorities, were not represented in the smoke-filled rooms and were thus excluded from the presidential nomination process. On the other hand, the convention system was an effective gatekeeper, in that it systematically filtered out dangerous candidates. Party insiders provided what political scientists called “peer review.” Mayors, senators, and congressional representatives knew the candidates personally. They had worked with them, under diverse conditions, over the years and were thus well-positioned to evaluate their character, judgment, and ability to operate under stress. Smoke-filled back rooms therefore served as a screening mechanism, helping to keep out the kind of demagogues and extremists who derailed democracy elsewhere in the world. American party gatekeeping was so effective that outsiders simply couldn’t win. As a result, most didn’t even try. Consider Henry Ford, the founder of the Ford Motor Company. One of the richest men in the world in the early twentieth century, Ford was a modern version of the kind of extremist demagogue Hamilton had warned against. Using his Dearborn Independent as a megaphone, he railed against bankers, Jews, and Bolsheviks, publishing articles claiming that Jewish banking interests were conspiring against America. 
His views attracted praise from racists worldwide. He was mentioned with admiration by Adolf Hitler in Mein Kampf and described by future Nazi leader Heinrich Himmler as “one of our most valuable, important, and witty fighters.” In 1938, the Nazi government awarded him the Grand Cross of the German Eagle. Yet Ford was also a widely admired, even beloved, figure in the United States, especially in the Midwest. A “poor farm boy who made good,” the plainspoken businessman was revered by many rural Americans as a folk hero, alongside such presidents as Washington and Lincoln. Ford’s restless imperiousness eventually lured him into politics. He began with opposition to World War I, launching an amateurish but high-profile “peace mission” to Europe. He dipped in and out of politics after the Great War, nearly winning a Senate seat in 1918 and then flirting with the idea of running for president (as a Democrat) in 1924. The idea quickly generated enthusiasm, especially in rural parts of the country. Ford for President clubs sprang up in 1923, and the press began to write of a “Ford Craze.” That summer, the popular magazine Collier’s began a weekly national poll of its readers, which suggested that Ford’s celebrity, reputation for business acumen, and unremitting media attention could translate into a popular presidential candidacy. As the results rolled in each week, they were accompanied by increasingly reverential headlines: “Politics in Chaos as Ford Vote Grows” and “Ford Leads in Presidential Free-for-All.” By the end of the two-month straw poll of upward of 250,000 readers, Henry Ford ran away from the competition, outpacing all twelve contenders, including President Warren Harding and future president Herbert Hoover. With these results, Collier’s editors concluded, “Henry Ford has become the issue in American politics.” But if Ford harbored serious presidential ambitions, he was born a century too soon. What mattered far more than public opinion was the opinion of party leaders, and party leaders soundly rejected him. A week after publishing the results of its readers’ poll, in a series of articles, including one titled “The Politicians Pick a President,” Collier’s reported the results of its poll of the ultimate insiders—a group of 116 party leaders in both parties, including all members of the Republican and Democratic Party National Committees, 14 leading governors, and senators and congressmen in each party. Among these kingmakers, Ford lagged in a distant fifth position. The Collier’s editors observed that fall: When Democratic [Party] chieftains are asked: “What about Ford?” they all shrug their shoulders. Almost without a single exception the men who constitute what is usually known as the “organization” in every State are opposed to Ford. In all the States except where there are presidential primaries these men practically hand-pick the delegates to the national conventions….Nobody denies the amount of Ford sentiment among the masses of the people—Democratic and Republican. Every Democratic leader knows his State is full of it—and he is afraid of it. He thinks, however, that because of the machinery of selection of delegates there is little likelihood that Ford will make much of a showing. Despite popular enthusiasm for his candidacy, Ford was effectively locked out of contention. Senator James Couzens called the idea of his candidacy ridiculous. “How can a man over sixty years old, who…has no training, no experience, aspire to such an office?” he asked. 
“It is most ridiculous.” It is, therefore, not surprising that when Ford was interviewed for Collier’s at the end of that long summer, his presidential ambitions were tempered: I can’t imagine myself today accepting any nomination. Of course, I can’t say…what I will do tomorrow. There might be a war or some crisis of the sort, in which legalism and constitutionalism and all that wouldn’t figure, and the nation wanted some person who could do things and do them quick. What Ford was saying, in effect, was that he would only consider running if the gatekeeping system blocking his path were somehow removed. So, in reality, he never stood a chance. Huey Long didn’t live long enough to test the presidential waters, but despite his extraordinary political skills, popularity, and ambition, there is good reason to think that he, too, would have been stopped by the partisan gatekeepers. When he was elected to the Senate in 1932, Long’s norm-breaking behavior quickly isolated him from his peers. Lacking support among Democratic Party leaders, Long would have stood no chance of defeating Roosevelt at the 1936 convention. He would have had to mount an independent presidential bid, which would have been extraordinarily difficult. Polls suggested that a Long candidacy could divide the Democratic vote and throw the 1936 race to the Republicans but that Long himself had little chance of winning. Party gatekeeping also helped confine George Wallace to the margins of politics. The segregationist governor participated in a few Democratic primaries in 1964, performing surprisingly well. Running against civil rights and under the slogan “Stand Up for America,” Wallace shocked the pundits by winning nearly a third of the vote in Wisconsin and Indiana and a stunning 43 percent in Maryland. But primaries mattered little in 1964, and Wallace soon bowed out in the face of an inevitable Lyndon Johnson candidacy. Over the next four years, however, Wallace campaigned across the country in anticipation of the 1968 presidential race. His mix of populism and white nationalism earned him strong support among some white working-class voters. By 1968, roughly 40 percent of Americans approved of him. In other words, Wallace made a Trump-like appeal in 1968, and he enjoyed Trump-like levels of public support. But Wallace operated in a different political world. Knowing that the Democratic Party establishment would never back his candidacy, he ran as the candidate of the American Independent Party, which doomed him. Wallace’s performance—13.5 percent of the vote—was strong for a third-party candidate, but it left him far from the White House. We can now grasp the full scale of Philip Roth’s imaginative leap in his novel The Plot Against America. The Lindbergh phenomenon was not entirely a figment of Roth’s imagination. Lindbergh—an advocate of “racial purity” who toured Nazi Germany in 1936 and was awarded a medal of honor by Hermann Göring—emerged as one of America’s most prominent isolationists in 1939 and 1940, speaking nationwide on behalf of the America First Committee. And he was extraordinarily popular. His speeches drew large crowds, and in 1939, according to Reader’s Digest editor Paul Palmer, his radio addresses generated more mail than those of any other person in America. As one historian put it, “Conventional wisdom had had it that Lindbergh would eventually run for public office,” and in 1939, Idaho senator William Borah suggested that Lindbergh would make a good presidential candidate.
But here is where we return to reality. The Republican Party’s 1940 convention was not even remotely like the fictionalized one described in The Plot Against America. Not only did Lindbergh not appear at the convention, but his name never even came up. Gatekeeping worked. In the conclusion of their history of radical-right politics in the United States, The Politics of Unreason, Seymour Martin Lipset and Earl Raab described American parties as the “chief practical bulwark” against extremists. They were correct. But Lipset and Raab published their book in 1970, just as the parties were embarking on the most dramatic reform of their nomination systems in well over a century. Everything was about to change, with consequences far beyond what anyone might have imagined. — The turning point came in 1968. It was a heart-wrenching year for Americans. President Lyndon Johnson had escalated the war in Vietnam, which was now spiraling out of control—16,592 Americans died in Vietnam in 1968 alone, more than in any previous year. American families sat in their living rooms each evening watching the TV nightly news, assaulted with ever more graphic scenes of combat. In April 1968, an assassin gunned down Martin Luther King Jr. Then, in June, within hours of his winning the California Democratic presidential primary, Robert F. Kennedy’s presidential campaign—centered on opposition to Johnson’s escalating war—was abruptly halted by a second assassin’s gun. The cries of despair in Los Angeles’s Ambassador Hotel ballroom that night were given expression by novelist John Updike, who wrote that it felt as if “God might have withdrawn His blessing from America.” Meanwhile, the Democrats grew divided between supporters of Johnson’s foreign policy and those who had embraced Robert Kennedy’s antiwar position. This split played out in a particularly disruptive manner at the Democratic convention in Chicago. With Kennedy tragically gone, the traditional party organization stepped into the breach. The party insiders who dominated on the convention floor favored Vice President Hubert Humphrey, but Humphrey was deeply unpopular among antiwar delegates because of his association with President Johnson’s Vietnam policies. Moreover, Humphrey had not run in a single primary. His campaign, as one set of analysts put it, was limited to “party leaders, union bosses, and other insiders.” Yet, with the backing of the party regulars, including the machine of powerful Chicago mayor Richard Daley, he won the nomination on the first ballot. Humphrey was hardly the first presidential candidate to win the nomination without competing in primaries. He would, however, be the last. The events that unfolded in Chicago—displayed on television screens across America—mortally wounded the party-insider presidential selection system. Even before the convention began, the crushing blow of Robert Kennedy’s assassination, the escalating conflict over Vietnam, and the energy of the antiwar protesters in Chicago’s Grant Park sapped any remaining public faith in the old system. On August 28, the protesters turned to march on the convention: Blue-helmeted police attacked protesters and bystanders, and bloodied men, women, and children sought refuge in nearby hotels. The so-called Battle of Michigan Avenue then spilled over into the convention hall itself.
Senator Abraham Ribicoff of Connecticut, in his nomination speech for antiwar candidate George McGovern, decried “the gestapo tactics” of the Chicago police, looking—on live television—directly at Mayor Daley. As confrontations exploded on the convention floor, uniformed police officers dragged several delegates from the auditorium. Watching in shock, NBC anchor Chet Huntley observed, “This surely is the first time policemen have ever entered the floor of a convention.” His coanchor, David Brinkley, wryly added, “In the United States.” The Chicago calamity triggered far-reaching reform. Following Humphrey’s defeat in the 1968 election, the Democratic Party created the McGovern–Fraser Commission and gave it the job of rethinking the nomination system. The commission’s final report, published in 1971, cited an old adage: “The cure for the ills of democracy is more democracy.” With the legitimacy of the political system at stake, party leaders felt intense pressure to open up the presidential nomination process. As George McGovern put it, “Unless changes are made, the next convention will make the last look like a Sunday-school picnic.” If the people were not given a real say, the McGovern–Fraser report darkly warned, they would turn to “the anti-politics of the street.” The McGovern–Fraser Commission issued a set of recommendations that the two parties adopted before the 1972 election. What emerged was a system of binding presidential primaries. Beginning in 1972, the vast majority of the delegates to both the Democratic and Republican conventions would be elected in state-level primaries and caucuses. Delegates would be preselected by the candidates themselves to ensure their loyalty. This meant that for the first time, the people who chose the parties’ presidential candidates would be neither beholden to party leaders nor free to make backroom deals at the convention; rather, they would faithfully reflect the will of their state’s primary voters. There were differences between the parties, such as the Democrats’ adoption of proportional rules in many states and mechanisms to enhance the representation of women and minorities. But in adopting binding primaries, both parties substantially loosened their leaders’ grip over the candidate selection process—opening it up to voters instead. Democratic National Committee chair Larry O’Brien called the reforms “the greatest goddamn changes since the party system.” George McGovern, who unexpectedly won the 1972 Democratic nomination, called the new primary system “the most open political process in our national history.” McGovern was right. The path to the nomination no longer had to pass through the party establishment. For the first time, the party gatekeepers could be circumvented—and beaten. The Democrats, whose initial primaries were volatile and divisive, backtracked somewhat in the early 1980s, stipulating that a share of national delegates would be elected officials—governors, big-city mayors, senators, and congressional representatives—appointed by state parties rather than elected in primaries. These “superdelegates,” representing between 15 and 20 percent of national delegates, would serve as a counterbalance to primary voters—and a mechanism for party leaders to fend off candidates they disapproved of. The Republicans, by contrast, were flying high under Ronald Reagan in the early 1980s. Seeing no need for superdelegates, the GOP opted, fatefully, to maintain a more democratic nomination system.
Some political scientists worried about the new system. Binding primaries were certainly more democratic. But might they be too democratic? By placing presidential nominations in the hands of voters, binding primaries weakened parties’ gatekeeping function, potentially eliminating the peer review process and opening the door to outsiders. Just before the McGovern–Fraser Commission began its work, two prominent political scientists warned that primaries could “lead to the appearance of extremist candidates and demagogues” who, unrestrained by party allegiances, “have little to lose by stirring up mass hatreds or making absurd promises.” Initially, these fears seemed overblown. Outsiders did emerge: Civil rights leader Jesse Jackson ran for the Democratic Party nomination in 1984 and 1988, while Southern Baptist leader Pat Robertson (1988), television commentator Pat Buchanan (1992, 1996, 2000), and Forbes magazine publisher Steve Forbes (1996) ran for the Republican nomination. But they all lost. Circumventing the party establishment was, it turned out, easier in theory than in practice. Capturing a majority of delegates required winning primaries all over the country, which, in turn, required money, favorable media coverage, and, crucially, people working on the ground in all states. Any candidate seeking to complete the grueling obstacle course of U.S. primaries needed allies among donors, newspaper editors, interest groups, activist groups, and state-level politicians such as governors, mayors, senators, and congressmen. In 1976, Arthur Hadley described this arduous process as the “invisible primary.” He claimed that this phase, which occurred before the primary season even began, was “where the winning candidate is actually selected.” Members of the party establishment—elected officials, activists, allied interest groups—were, thereby, not necessarily locked out of the game. Without them, Hadley argued, it was nearly impossible to win either party’s nomination. For a quarter of a century, Hadley was right.

3 The Great Republican Abdication

On June 15, 2015, real estate developer and reality-TV star Donald Trump descended an escalator to the lobby of his own building, Trump Tower, to make an announcement: He was running for president. At the time, he was just another long-shot candidate who thought his wealth and celebrity might give him a chance or, at the very least, allow him to bask in the spotlight for a few months. Like fellow businessman Henry Ford a century earlier, Trump held some extremist views—his most recent experience with politics had been as a “birther,” questioning whether President Barack Obama was born in the United States. To the extent that leading media and political figures took him seriously, it was to denounce him. But the primary system had opened up the presidential nomination process more than ever before in American history. And openness is always double-edged. In this new environment, a wider range of politicians, from George McGovern to Barack Obama, could now compete seriously for the presidency. But the window was now also open to true outsiders—individuals who had never held elective office. In the twenty-three years between 1945 and 1968, under the old convention system, only a single outsider (Dwight Eisenhower) publicly sought the nomination of either party.
By contrast, during the first two decades of the primary system, 1972 to 1992, eight outsiders ran (five Democrats and three Republicans), an average of 1.25 per election; and between 1996 and 2016, eighteen outsiders competed in one of the two parties’ primaries—an average of three per election. Thirteen of these were Republicans. The post-1972 primary system was especially vulnerable to a particular kind of outsider: individuals with enough fame or money to skip the “invisible primary.” In other words, celebrities. Although conservative outsiders Pat Robertson, Pat Buchanan, and Steve Forbes did not manage to overcome the effects of the invisible primary during the 1980s and 1990s, their relative success provided clues into how it might be done. Forbes, an extraordinarily wealthy businessman, was able to buy name recognition, while Robertson, a televangelist who founded the Christian Broadcasting Network, and Buchanan, a television commentator (and early Republican proponent of white nationalism), were both colorful figures with special media access. Although none of them won the nomination, they used massive wealth and celebrity status to become contenders. But in the end, celebrity outsiders had always fallen short. And so on that early-summer afternoon in the gilded lobby of Trump Tower, there seemed no reason to think things would be different. To win the nomination, Trump would have to compete in an intricate web of caucuses and primaries against sixteen other candidates. Many of his rivals boasted the kind of résumé that had been the hallmark of successful candidates in the past. At the head of the pack was former Florida governor Jeb Bush, son and brother of former presidents. There were other governors, as well, including Wisconsin’s Scott Walker, Louisiana’s Bobby Jindal, New Jersey’s Chris Christie, and Ohio’s John Kasich, and several rising Republican stars—younger, media-savvy politicians such as Senators Marco Rubio and Rand Paul, who hoped to replicate Barack Obama’s fast track to the presidency. Texas, home to three of the last eight elected presidents, offered two more candidates: Senator Ted Cruz and former governor Rick Perry. Besides Trump, two other outsiders threw their hats into the ring: businesswoman Carly Fiorina and neurosurgeon Ben Carson. Trump could not hope to win the support of the establishment. Not only did he lack any political experience, but he wasn’t even a lifelong Republican. Whereas Bush, Rubio, Cruz, Christie, Walker, and Kasich all had deep Republican roots, Trump had switched his party registration several times and had even contributed to Hillary Clinton’s campaign for the U.S. Senate. Even after Trump began to surge in the polls, few people took his candidacy seriously. In August 2015, two months after Trump declared his candidacy, Las Vegas bookmakers gave him one-hundred-to-one odds of winning the White House. And in November 2015, as Trump sat high atop the Republican polls, Nate Silver, founder of the FiveThirtyEight blog, whose uncannily accurate predictions in the 2008 and 2012 elections had earned him fame and prestige, wrote an article titled “Dear Media: Stop Freaking Out About Donald Trump’s Poll Numbers.” The article predicted that Trump’s weakness among party insiders would spell his demise. Despite Trump’s seemingly large lead, Silver assured us, his chances of winning the nomination were “considerably less than 20 percent.” But the world had changed. Party gatekeepers were shells of what they once were, for two main reasons.
One was a dramatic increase in the availability of outside money, accelerated (though hardly caused) by the Supreme Court’s 2010 Citizens United ruling. Now even marginal presidential candidates—Michele Bachmann, Herman Cain, Howard Dean, Bernie Sanders—could raise large sums of money, either by finding their own billionaire financier or through small donations via the Internet. The proliferation of well-funded primary candidates indicated a more open and fluid political environment. The other major factor diminishing the power of traditional gatekeepers was the explosion of alternative media, particularly cable news and social media. Whereas the path to national name recognition once ran through relatively few mainstream channels, which favored establishment politicians over extremists, the new media environment made it easier for celebrities to achieve wide name recognition—and public support—practically overnight. This was particularly true on the Republican side, where the emergence of Fox News and influential radio talk-show personalities—what political commentator David Frum calls the “conservative entertainment complex”—radicalized conservative voters, to the benefit of ideologically extreme candidates. This gave rise to such phenomena as Herman Cain, the former Godfather’s Pizza CEO and radio talk-show host who rocketed to the top of the Republican polls in late 2011 before flaming out because of scandal. The nomination process was now wide open. While the rules of the game hardly guaranteed the rise of a Trump-like figure, they could no longer prevent it, either. It was like a game of Russian roulette: The chances of an extremist outsider capturing the presidential nomination were higher than ever before in history. — Although many factors contributed to Donald Trump’s stunning political success, his rise to the presidency is, in good measure, a story of ineffective gatekeeping. Party gatekeepers failed at three key junctures: the “invisible primary,” the primaries themselves, and the general election. Trump finished dead last in the invisible primary. When the actual primary season began on February 1, 2016, the day of the Iowa Caucus, he had no endorsements among Republican power brokers. Measured by the backing of governors, U.S. senators, and congressional representatives at the time of the Iowa Caucus, Jeb Bush won the invisible primary with 31 endorsements. Marco Rubio finished second with 27. Ted Cruz finished third with 18, followed by Rand Paul with 11. Chris Christie, John Kasich, Mike Huckabee, Scott Walker, Rick Perry, and Carly Fiorina all won more endorsements than Trump. By all standard wisdom, then, Trump’s candidacy was a nonstarter. If history were any guide, his lead in the polls would inevitably fade. Trump’s performance in the first state contest, Iowa—24 percent, good for second place—did little to alter these expectations. After all, outsiders Pat Robertson (25 percent of the vote in 1988), Pat Buchanan (23 percent in 1996), and Steve Forbes (31 percent in 2000) had all finished second in Iowa but faded away soon thereafter. Then Trump did something no previous outsider had done: He easily won subsequent primaries in New Hampshire and South Carolina. Still, he was shunned by the party establishment. On the day of the South Carolina primary, Trump did not yet have a single endorsement from a sitting Republican governor, senator, or congressperson.
It was only after winning South Carolina that Trump gained his first supporters: congressional backbenchers Duncan Hunter (California) and Chris Collins (New York). Even as he proceeded to rout his Republican rivals at the polling stations, Trump never gained a substantial number of endorsements. When the primary season ended, he had forty-six—less than a third of Marco Rubio’s total and barely as many as the long-ended Bush campaign. By the time Trump rolled to victory in the March 1 Super Tuesday primaries, it was clear that he had laid waste to the invisible primary, rendering it irrelevant. Undoubtedly, Trump’s celebrity status played a role. But equally important was the changed media landscape. From early on in the campaign, Trump had the sympathy or support of right-wing media personalities such as Sean Hannity, Ann Coulter, Mark Levin, and Michael Savage, as well as the increasingly influential Breitbart News. Although Trump initially had a contentious relationship with Fox News, he reaped the benefits of its polarized media landscape. Trump also found new ways to use old media as a substitute for party endorsements and traditional campaign spending. A “candidate with qualities uniquely tailored to the digital age,” Trump attracted free mainstream coverage by creating controversy. By one estimate, the Twitt
