CHAPTER TEN: THE UNITED STATES IN A CONSERVATIVE ERA, 1981-2001

Terms for Week 10

  • The Sunbelt
  • Social Conservatives
  • Jerry Falwell/Moral Majority
  • “Reaganomics”
  • George Herbert Walker Bush
  • The Culture Wars
  • Operation Desert Storm
  • Clarence Thomas/Anita Hill
  • Osama bin Laden
  • The Taliban
  • The underclass
  • Undocumented immigrants
  • NAFTA
  • Newt Gingrich/Contract With America
  • Serbia
  • The End of Welfare as We Know It
  • Monica Lewinsky
  • Bush v. Gore

CHANGING ATTITUDES TOWARD GOVERNMENT

During the 1930s, when the United States was in the throes of the Great Depression, most Americans welcomed and indeed demanded an activist government that would reinvigorate the economy and protect their rights.  Over the years, however, attitudes toward government and what it can and should accomplish have undergone a dramatic shift.  The following quotes from four American Presidents reflect that shift.

The liberal party is a party which believes that, as new conditions and problems arise beyond the power of men and women to meet as individuals, it becomes the duty of the Government itself to find new remedies with which to meet them.  The liberal party insists that the Government has the definite duty to use all its power and resources to meet new social problems with new social controls to insure to the average person the right to his own economic and political life, liberty, and the pursuit of happiness.

Franklin D. Roosevelt, June 16, 1941

Statements are made labeling the Federal Government an outsider, an intruder, an adversary... The people of this (TVA) area know that the United States Government is not a stranger or an enemy.  It is the people of fifty states joining in a national effort...  Only a great national effort by a great people working together can explore the mysteries of space, harvest the products at the bottom of the ocean, and mobilize the human, natural, and material resources of our lands.

John F. Kennedy, May 18, 1963

Government cannot solve our problems.  It can't set our goals.  It cannot define our vision.  Government cannot eliminate poverty, or provide a bountiful economy, or reduce inflation, or save our cities, or cure illiteracy, or provide energy.

Jimmy Carter, January 19, 1978

Government is not the solution to our problem.  Government is the problem.

Ronald Reagan, January 20, 1981

Source: John M. Blum, The National Experience: A History of the United States (New York: Harcourt Brace Jovanovich, 1989), p. 812.


RONALD REAGAN: THE MAKING OF A PRESIDENT

In the following discussion, historians George Brown Tindall and David Emory Shi briefly describe Ronald Reagan's rise to the Presidency.

Born in the drab prairie town of Tampico, Illinois, in 1911, the son of an often-drunk shoe salesman and a devout, Bible-quoting mother, Ronald Reagan graduated from tiny Eureka College in 1932 during the depths of the Great Depression.  He first worked as a radio sportscaster before starting a movie career in Hollywood in 1937.  He served three years in the army during the Second World War, making training films.  At the time, as he recalled, he was a Democrat, “a New Dealer to the core” who voted for Franklin D. Roosevelt four times.

After the war Reagan became president of the acting profession’s union, the Screen Actors Guild (SAG).  His leadership of SAG honed his negotiating skills and intensified his anti-communism as he fended off efforts to infiltrate the union.  He learned “from firsthand experience how Communists used lies, deceit, violence, or any other tactic that suited them.”   Reagan campaigned for Harry S. Truman in the 1948 presidential election, but during the fifties he decided  that federal taxes were too high.  In 1960 he campaigned as a Democrat for Richard Nixon, and two years later he joined the Republican Party.  Reagan achieved [political] stardom in 1964 when he delivered a rousing speech on national television on behalf of [Arizona Senator] Barry Goldwater’s presidential candidacy.

Republican conservatives found in Ronald Reagan a new idol, whose appeal survived the defeat of Goldwater in 1964.  Those who dismissed Reagan as a minor actor and a mental midget underrated his many virtues, including the importance of his many years in front of a camera.  Politics is a performing art, all the more so in an age of television, and few if any others in public life had Reagan’s stage presence.  Blessed with a baritone voice and a wealth of entertaining stories, he was a superb speaker who charmed audiences.  Wealthy admirers convinced Reagan to run for governor of California in 1966, and he won by a landslide….  

Source: George Brown Tindall and David Emory Shi, America: A Narrative History (New York: W.W. Norton and Company, 2013), pp. 1098-1099.


THE RISE OF THE NEW RIGHT

Historians George Brown Tindall and David Emory Shi briefly describe the rise of the “New Right” as a major force in American politics by the 1980s.

By the eve of the 1980 election, Ronald Reagan had benefited from demographic developments that made his conservative vision a major asset.  The 1980 census revealed that the proportion of the population over age sixty-five was soaring and that Americans were moving from the Midwest and the Northeast to the sunbelt states of the South and the West.  Fully 90 percent of the nation’s total population growth during the eighties occurred in southern or western states.  These population shifts forced a massive redistricting of the House of Representatives, with Florida, California, and Texas gaining seats and northern states such as New York losing them.

The sunbelt states were attractive not only because of their mild climate; they also had the lowest tax rates in the nation, the highest rates of economic growth, and growing numbers of evangelical Christians and retirees.  Such attributes also made the sunbelt states fertile ground for the Republican Party.  This dual development—an increase in the number of senior citizens and the steady relocation of a significant portion of the population to conservative regions of the country, where hostility to “big government” was deeply rooted—meant that demographics were carrying the United States toward Reagan’s conservative philosophy.

A related development during the 1970s was a burgeoning tax revolt that swept across the nation as a result of the prolonged inflationary spiral.  Inflation increased home values, which in turn brought a dramatic spike in property taxes.  In California, Reagan’s home state, voters organized a massive grassroots taxpayer revolt.  Skyrocketing property taxes threatened to force many working class people from their homes.  The solution?  Cut back on the size and cost of government to enable reductions in property taxes.  In June 1978, tax rebels in California, with Reagan’s support, succeeded in getting Proposition 13 on the state ballot.  An overwhelming majority of voters, both Republicans and Democrats, approved the measure, which slashed property taxes by 57 percent and amended the state constitution to make raising taxes much more difficult.  The tax revolt in California soon spread across the nation as other states passed measures similar to those in California.  The New York Times compared the phenomenon to a “modern Boston Tea Party.”

Source: George Brown Tindall and David Emory Shi, America: A Narrative History (New York: W.W. Norton and Company, 2013), pp. 1099-1100.


THE MORAL MAJORITY

Religious Conservatives emerged in the late 1970s as a major force in the movement of the nation to the right and played a crucial role in Ronald Reagan’s presidential election victory in 1980.  The largest and most influential of these organizations was the Moral Majority founded in 1979 by the Rev. Jerry Falwell, a self-proclaimed fundamentalist preacher of the Thomas Road Baptist Church in Lynchburg, Virginia.  What follows is a brief discussion of the Moral Majority and its leader.  

The tax revolt [of the late 1970s] fed into a national conservative resurgence that benefited from a massive revival of evangelical religion aggressively working to influence social and political change at the local and national levels.  By the [1980s], religious conservatism was no longer a local or provincial phenomenon.  Catholic conservatives and Protestant evangelicals now owned television and radio stations, operated numerous schools and universities, and organized “mega churches” that sprang up in the sprawling suburbs, where they served as animating centers of social activity and spiritual life.  A survey in 1977 revealed that more than 70 million Americans described themselves as “born-again Christians.”  And religious conservatives formed the strongest grassroots movement of the late twentieth century.  During the 1970s and 1980s, they launched a cultural crusade against the forces of secularism and liberalism.

The Reverend Jerry Falwell’s Moral Majority (later renamed the Liberty Alliance), formed in 1979, expressed the major political and social goals of the religious right wing: the economy should operate without “interference” by the government, which should be reduced in size; the Supreme Court decision in Roe v. Wade (1973) legalizing abortion should be reversed; Darwinian evolution should be replaced in school textbooks by the biblical story of creation; prayer should be allowed back in public schools; women should submit to their husbands; and Soviet communism should be opposed as a form of pagan totalitarianism.

Falwell, the “televangelist” minister of a huge Baptist church in Lynchburg, Virginia, stressed that the Moral Majority was not a religious fellowship; it was a purely political organization open to conservatives of all faiths.  “If you would like to know where I am politically,” Falwell told reporters, “I am to the right of wherever you are.  I thought [Arizona Senator Barry] Goldwater was too liberal.”

The moralistic zeal and financial resources of the religious right made its adherents formidable opponents of liberal political candidates and programs.  Falwell’s Moral Majority recruited over four million members in eighteen states.  Its base of support was in the South and was strongest among Baptists, but its appeal extended across the country…. 

Source: George Brown Tindall and David Emory Shi, America: A Narrative History (New York: W.W. Norton and Company, 2013), p. 1100.


CHRISTIANITY AND PUBLIC LIFE: OPPOSING VIEWPOINTS

With the rise of the Moral Majority, a new, intense debate arose over the role of religion in public life.  Moral Majority supporters argued that Christian conservatives had a right to express their views and advocate for the America they felt was God-loving and God-fearing.  Secular opponents often felt the religious right wanted to impose its views on all Americans regardless of the ideas and freedoms of others.  In the passages below we see those views expressed by President Ronald Reagan in a 1983 speech to the National Association of Evangelicals, and by Yale University President A. Bartlett Giamatti in a 1981 speech to his university’s undergraduates.

President Ronald Reagan: I want you to know that this administration is motivated by a political philosophy that sees the greatness of America in you [who help] people in your families, churches, neighborhoods, communities, the institutions that foster and nourish values like concern for others and respect for the rule of law under God.

Now, I don't have to tell you that this puts us in opposition to, or at least out of step with, a prevailing attitude of many who have turned to a modern-day secularism, discarding the tried and time-tested values upon which our very civilization is based.  No matter how well intentioned, their value system is radically different from that of most Americans. And while they proclaim that they're freeing us from superstitions of the past, they've taken upon themselves the job of superintending us by government rule and regulation. Sometimes their voices are louder than ours, but they are not yet a majority....

Freedom prospers when religion is vibrant and the rule of law under God is acknowledged. When our Founding Fathers passed the First Amendment, they sought to protect churches from government interference. They never intended to construct a wall of hostility between government and the concept of religious belief itself.

A. Bartlett Giamatti: A self-proclaimed “Moral Majority,” and its satellite or client groups, cunning in the use of a … blend of old intimidation and new technology, threaten the values [of pluralism and freedom].... From the maw of this “morality” come those who presume to know what justice for all is; come those who presume to know which books are fit to read, which television programs are fit to watch.... From the maw of this “morality” rise the tax-exempt [ministers] who believe they, and they alone, possess the “truth.”  There is no discussion, no dissent. They know…. What nonsense. What dangerous, malicious nonsense….

We should be concerned that so much of our political and religious leadership acts intimidated for the moment and will not say with clarity that this most recent denial of the legitimacy of differentness is a radical assault on the very pluralism of peoples, political beliefs, values, forms of merit and systems of religion our country was founded to welcome and foster. 

Liberty protects the person from unwarranted government intrusions into a dwelling or other private places.  In our tradition the State is not omnipresent in the home.  And there are other spheres of our lives and existence, outside the home, where the State should not be a dominant presence.  Freedom extends beyond spatial bounds.  Liberty presumes an autonomy of self that includes freedom of thought, belief, expression, and certain intimate conduct.

Sources: Ronald Reagan, Speaking My Mind: Selected Speeches (New York: Simon & Schuster, 1989), pp. 169-180; Yale University Archives.  Reprinted in James A. Henretta, Rebecca Edwards, and Robert O. Self, America: A Concise History (New York: Bedford/St. Martin's, 2012), p. 919.


THE COMPUTER AGE ARRIVES

The vignette below describes the emergence of the personal computer and, ironically, its debt to the counterculture generation.

One of the most significant technical developments of the 1970s was the personal computer.  Personal computers (PCs) sprang from several sources, notably the military's patronage of microelectronics and the interests of hobbyists in democratizing the use of computers.  An essential component of the PC was the integrated circuit, which formed all its electrical parts out of a flat piece of silicon, photo etching the connections between them. It was devised independently at Texas Instruments and at Fairchild Semiconductor Laboratories, in Palo Alto, California, which was an incubator for many of the engineers who would develop the computing industry in what came to be known as Silicon Valley, the region heavy with computer firms on the peninsula south of San Francisco.  Although integrated circuits were not developed with military patronage, the Defense Department and NASA provided a sizable fraction of the early market for them.  One Minuteman II missile used 2,000; the Apollo guidance system, 5,000.  By the late 1960s, engineers in Silicon Valley were creating an integrated circuit on a small chip containing the calculating circuits equivalent to all those in a mainframe computer of the 1950s.  In 1973, the Intel Corporation, founded by several veterans of Fairchild, announced that it had produced such a chip: the 8080.

The development of the personal computer was encouraged by the abundant technical resources of Silicon Valley (notably the electronics graduates from nearby Stanford University and the University of California at Berkeley, and the engineering innovations from local firms such as Hewlett-Packard) and by the inspiration that hobbyists drew from timesharing computers.  Built around a central computer that automatically allocated processing time to different individuals, timesharing gave users in their offices access to their personal files and encouraged them to think they could have interactive access to their own computers at any time for any purpose.  Computer hobbyists, some of them in tune with the countercultural ambience of the San Francisco Bay Peninsula, called for bringing computing power to the people by, for example, providing the public with free access to timeshared terminals.  One enthusiast recalled the "strong feeling that we were subversives.  We were subverting the way the giant corporations had run things."

In 1974, a small firm that three hobbyists had founded in Albuquerque, New Mexico, to sell radio transmitters for model airplanes went beyond the dream of universal terminal access to put computers themselves into everyone's hands.  They started marketing a personal computer kit called the Altair.  Selling for $397, the Altair ran on the Intel 8080 chip and was an instant hit with hobbyists, even though it had no keyboard or monitor.  It spurred Bill Gates, a twenty-year-old Harvard student, and his high school friend Paul Allen, twenty-two, to write a software program for it that they licensed to the Albuquerque firm.  Gates dropped out of Harvard to develop the Microsoft Corporation, a software firm he and Allen founded in 1975 for the Altair venture.  In 1976, Steve Wozniak, twenty-five, and Steve Jobs, twenty, began marketing a comparable personal computer, the Apple.  Both were T-shirts-and-jeans devotees of the hobbyist electronics culture in Silicon Valley, where they grew up; Jobs, with long hair and sandals, was an acolyte of vegetarianism, the Beatles, and transcendental meditation.  They built the first Apples in the home garage of Jobs's parents.

Eager to expand the business, Jobs and Wozniak relinquished their T-shirts for suits, obtained venture capital, and in 1977 brought out the Apple II, which included a keyboard, a monitor, and a floppy-disk drive for storage.  A later version operated with a mouse and pull-down menus, both of which had originally been developed under contracts with the Defense Department and NASA.  By this time, several other companies were selling personal computers.  Software for them was initially confined to educational programs and games such as the wildly popular "Pac-Man," but in 1979 VisiCalc, a spreadsheet program, came on the market and demonstrated the value of the PC for business.

Bill Gates had already warned the hobbyists that he would consider free sharing of the software that Microsoft had produced for the Altair a form of piracy.  By the late 1970s, personal computing was rapidly turning away from its countercultural origins into a lucrative for-profit enterprise.  In 1981, IBM entered the PC market, enlisting Microsoft to provide the operating software for its machines.  In response, Microsoft bought a software package that had been devised at Seattle Computer Products by Tim Paterson, a recent college graduate, and provided it to IBM as MS-DOS (short for "Microsoft Disk Operating System").  Gates sold IBM the right to use the system but maintained Microsoft's ownership, an arrangement that permitted the company eventually to earn billions of dollars by selling the right to use the system, which soon became an industry standard, to other makers of personal computers.  The PC caught on so fast that two years later Time magazine designated the personal computer its "Machine of the Year."

Source: Pauline Maier, Inventing America: A History of the United States, vol. 2 (New York, 2003), pp. 991-993.


"GREED IS GOOD": THE 1980s

Historian Pauline Maier, in this vignette, provides one description of the 1980s.

 The Reagan years reminded some observers of the 1920s, not only in the ebullience of the prosperity but in the unevenness of it, and in the naked materialism of the culture associated with it. Between 1982 and 1988, the gross domestic product grew at an average annual rate of about 4 percent, generating more than 630,000 new businesses, 11 million jobs, and a drop in the unemployment rate from 7.4 percent to 5.5 percent. By 1988, mortgage rates had plummeted roughly 40 percent, and by 1989 median family income corrected for inflation had shot up 12.5 percent.

Corporate profits broke records, and so did the stock market, at least until October 19, 1987, when the Dow Jones industrial average (an indicator of stock-market value) plummeted 508 points, losing almost a quarter of its worth, wiping out $750 billion in paper wealth, and generating fears that the country might be headed for another Depression.  But the jitters were short-lived.  By 1989, the Dow Jones had more than doubled its level in 1982.

The decade produced a new group called "yuppies," a term derived from "young urban professionals": upwardly mobile men and women with degrees in law or business, dressed for success and exuding the ambitions of an unrestrained materialism.  Americans of all sorts became absorbed with celebrities: professional athletes, television newscasters, entertainers, clothing designers, even chefs, most of whom were admired for their professional skills but also for their opulent incomes.  Among the heroes of Wall Street were manipulators of junk bonds, loans issued to finance the purchase of corporations for prices far higher than the corporations were worth.  Some of the heroes, who received several hundred million dollars a year in commissions, were later exposed as crooked and went to jail.

Tom Wolfe's bestselling novel The Bonfire of the Vanities relentlessly explored the culture of avarice, but reality outdid fiction.  Amid the weakened oversight of Reaganite deregulation, a number of savings-and-loan institutions were looted by white-collar thieves, some of whom bought yachts and threw lavish entertainments.  Ivan Boesky, one of the financial buccaneers of the decade (he later went to jail for fraudulent manipulations), proclaimed, "Greed is all right...everybody should be a little greedy," a sentiment that pervaded the popular film Wall Street.

Source: Pauline Maier, Inventing America: A History of the United States, vol. 2 (New York, 2003), pp. 1026-1027.


THE INTERNET

Few late 20th Century inventions have so profoundly changed U.S. society as the Internet.  Indeed, you are reading this vignette courtesy of internet technology.  A brief history of the Internet appears below. 

Like so many innovations that changed the way people lived, the Internet originated in the national defense program's patronage of science and technology.  It was principally conceived in the late 1960s by a computer scientist at MIT named J. C. R. Licklider as a network that would preserve communications in the event of nuclear attack.  In the seventies, scientists and engineers at different institutions developed the essential hardware and software that would permit different types of computers and networks to communicate with each other through an intermediate service provider.  With the sponsorship of the Defense Department, a nationwide network rapidly developed among industrial and university scientists.  It was used mainly for email, which was pioneered in 1971 and which an authoritative 1978 report dubbed a "smashing success" that would "sweep the country."

Between the mid-1980s and early 1990s, partly at the initiative of then-Senator Al Gore, the Internet was transferred to civilian control and then opened up to commercial use.  In the meantime, scientists in Europe developed a program to retrieve information from any computer connected to the Internet by latching on to its standard address (called a "URL," for universal resource locator).  They also devised a language ("html," for hypertext markup language) for presenting text and images, and protocols ("http," for hypertext transfer protocol) for transferring them from one computer to another.  Programmers at a government computing facility in Illinois, having devised a browser, left in 1994 to develop a new, commercial version that they called Netscape.  Together, these innovations led to the birth of the World Wide Web.  After the mid-1990s, the Web spread with the freely accessible Internet across the globe.  Its diffusion was accompanied by an avalanche of companies founded to exploit it commercially, most of them with URLs that ended in the designation ".com" and were known accordingly as "dot.com" companies.  By early 1999, about 74 million people, including two out of five adults, were accessing the Internet.

Source: Pauline Maier, Inventing America: A History of the United States, vol. 2 (New York, 2003), pp. 1065-1066.


THE E-MAIL REVOLUTION BEGINS

The following passage from a 1985 Los Angeles Times article describes the advent of electronic mail.  At the time, electronic mail services charged $40 to sign up or $10 for a monthly service rate.  Ironically, Microsoft advertised its Word program for the Apple Macintosh in that section of the paper for $149.95.

From offices in San Francisco, the Bechtel Group, Inc. coordinates its Tedi River gold-mining operations around the globe in Papua New Guinea by exchanging information over a computer message network.  In Mexico agricultural scientists are using computer links to remote experimental crop stations to monitor data on new strains of wheat being grown there.  And in Dearborn, Michigan, the Society of Manufacturing Engineers coordinates plans for its annual convention and distributes abstracts of technical papers to engineers across the United States over a computer communications system.

Today, information that might otherwise require costly long distance calls or delay for postal delivery can be exchanged across town or around the world virtually in an instant via "electronic mail"-- a computer-to-computer communications system regarded as the most revolutionary since the telegraph and telephone replaced horseback couriers more than a century ago.

"Electronic mail will be the 21st Century version of the telex--which it clearly makes obsolete," said communications consultant Richard Miller, president of International Telematics in Palo Alto.  He predicted dramatic changes in international communications.  "It allows me, for example, to send you a message regardless of where in the world either one of us is at the time."

Although still a fledgling industry, with revenues last year estimated at $200 million, electronic mail use is growing at an annual rate of nearly 60%--faster than any other segment of the computer industry, according to analysts.

Last year, for example, Columbus, Ohio-based CompuServe--the nation's largest electronic mail service--doubled its subscribers to 185,000.  And Echo, a Marina del Rey-based newcomer, has established 14,000 subscribers in less than a year, adding 3,000 in the last month alone.  Today there are an estimated 1 million electronic "mailboxes" in use...

"In the next decade electronic communication is going to become as routine as making phone calls," said Jan Lewis, an analyst for InfoCorp, a Cupertino, California-based marketing research firm.  She predicted that the average home in the mid-1990s will be equipped with a telephone with a built-in computer that will permit easy access not only to electronic mail, but to various databases, the latest stock quotes, weather reports and computerized directory assistance.  "We won't even have to memorize telephone numbers anymore," Lewis said.

The electronic mail concept is not new.  Back in the days when mail was delivered by horseback, telegraph--the original electronic communications system--revolutionized the way the world conducted business.  News of a gold discovery in the West, for example, could be relayed in a matter of hours to financial centers in the East.  Today, however, the computer has squeezed the hours down to milliseconds.  It is technically possible today to move the contents of an entire set of encyclopedias from a computer in Chicago to another terminal in Los Angeles in the time it takes to read this sentence.

The increasing business use of electronic mail will affect consumer use as well.  "People who use it in the office are going to want to use it at home," said Michael J. Cavanagh, executive director of the Electronic Mail Association--a Washington-based industry group...citing the example of an early electronic mail network set up a few years ago through the Defense Department--a system designed for the exchange of important scientific information.  "After a time they found that there were also personal messages being exchanged, like plans for Friday night poker games," said Cavanagh.

He conceded, however, that consumer growth will lag behind business use of electronic mail.  "More people need to buy personal computers and telephone modems for their homes," he said.  "Until they do, we'll have the same problem that the telephone had for the first few decades--that is, even if some of the earliest users had a telephone, the chances were that very few of their friends did.  So, who could they call?  That's the case now with electronic mail," Cavanagh said.  "Its consumer value will increase as the number of subscribers increases."

Source: Los Angeles Times, February 24, 1985, Part VI, p. 1.


THE IRAN-CONTRA AFFAIR 

The Iran-Contra Affair was the greatest scandal to hit the Reagan Administration.  In 1986 reporters discovered that the United States had been secretly selling arms to Iran and using the proceeds to fund Contra guerrilla activity in Nicaragua.  Both actions were illegal, although only one person, Admiral John Poindexter, the President’s National Security Advisor, received a jail sentence for his actions.  A brief account of the scandal appears below.

…Reports surfaced in late 1986 that the United States had been secretly selling arms to Iran in the hope of securing the release of American hostages held in Lebanon by extremist groups sympathetic to Iran. Such action contradicted [President Ronald] Reagan's repeated insistence that his administration would never negotiate with terrorists. The disclosures angered America's allies as well as many Americans who vividly remembered the 1979 Iranian takeover of their country's embassy in Tehran. 

Over the next several months, revelations reminiscent of the Watergate affair disclosed a complicated series of covert activities carried out by administration officials. At the center of what came to be called the Iran-Contra affair was marine lieutenant colonel Oliver North. A swashbuckling aide to the National Security Council who specialized in counterterrorism, North, from the basement of the White House, had been secretly selling military supplies to Iran and using the money to subsidize the Contra rebels fighting in Nicaragua at a time when Congress had voted to ban such aid.  Oliver North's illegal activities, it turned out, had been approved by national security adviser Robert McFarlane; McFarlane's successor, Admiral John Poindexter; and CIA director William Casey. Both Secretary of State George Shultz and Secretary of Defense Caspar Weinberger criticized the arms sale to Iran, but their objections were ignored, and they were thereafter kept in the dark about what was going on. Later, on three occasions, Shultz threatened to resign over the continuing operation of the “pathetic” scheme. After information about the secret (and illegal) dealings surfaced in the press, McFarlane attempted suicide, Poindexter resigned, and North was fired. Casey, who denied any connection, left the CIA for health reasons and died shortly thereafter from a brain tumor.

Under increasing criticism, Reagan appointed both an independent counsel and a three-man commission, led by former Republican senator John Tower, to investigate the scandal. The Tower Commission issued a devastating report early in 1987 that placed much of the responsibility for the bungled Iran-Contra affair on Reagan's loose management style. During the spring and summer of 1987, a joint House-Senate investigating committee began holding hearings into the Iran-Contra affair. The televised sessions revealed a tangled web of inept financial and diplomatic transactions, the shredding of incriminating government documents, crass profiteering, and misguided patriotism.

The investigations of the independent counsel led to six indictments in 1988.  A Washington jury found Oliver North guilty of three relatively minor charges but innocent of nine more serious counts, apparently reflecting the jury’s view that he had acted as an agent of higher-ups.  His conviction was later overturned on appeal.  Of those involved in the affair, only John Poindexter got a jail sentence--six months for his conviction on five felony counts of obstructing justice and lying to Congress.

Source: George Brown Tindall and David Emory Shi, America: A Narrative History (New York: W.W. Norton, 2013), pp. 1109-1110.


BUSH 41

George Herbert Walker Bush, the 41st President of the United States, took office on the heels of the Reagan Revolution.  Many conservatives expected him to continue the conservative policies of his former boss.  That proved far more difficult than he and his supporters realized.  In the account below we are introduced to the first President Bush.

Americans, who are intensely proud of their democratic institutions, have often picked patricians and multimillionaires as their presidential candidates. In the years between 1904 and 1960, these included Theodore Roosevelt, William Howard Taft, Herbert Hoover, FDR, Adlai Stevenson, and JFK.

In selecting George Herbert Walker Bush as its nominee in 1988, the Republican Party followed this tradition.  Bush, who turned sixty-four in June of that year, was the son of Dorothy Walker and Prescott Bush, a stern and accomplished man who had risen to become managing partner of Brown Brothers Harriman, a top Wall Street firm, and to serve between 1953 and 1963 as a Republican senator from Connecticut.

George Bush fashioned a record that would have made any parent proud.  Sent to Phillips Academy, Andover, he flourished as a student and an athlete and was chosen president of his senior class.  After graduating in 1942, he quickly joined the navy, becoming its youngest pilot.  During World War II he flew fifty-eight missions in the Pacific, receiving the Distinguished Flying Cross for completing a mission in a burning plane before bailing out into the sea.  After the war he attended Yale, where he captained the baseball team and graduated Phi Beta Kappa in 1948.  His accomplishments to that point were as substantial as those of any presidential candidate in modern United States history.

After leaving Yale, Bush set out for Texas, where family money helped him fare handsomely in the oil development business.  He then turned to the world of politics, serving on Capitol Hill as a Republican representative from a suburban Houston district between 1967 and 1971.  After losing to Lloyd Bentsen in a race for the Senate in 1970, he served as Nixon's ambassador to the United Nations and then as chairman of the GOP National Committee.  Ford then tapped him to head the CIA.  Though he failed to win the GOP presidential nomination in 1980, he became Reagan's running mate.  For the next eight years he was a discreet and totally loyal vice president--so conscientiously loyal that many people, slow to recognize that he was keenly ambitious and competitive, derided him as a “wimp” and an “errand boy” who would never be able to stand on his own.  Others who knew him (including some Reagan loyalists) thought he was a political chameleon who had no strong opinions.  Most people who worked with him, however, found him to be an unusually genial, courteous, and well-mannered man.  He generated considerable and lasting loyalty among his inner circle of friends and advisers.

Though Bush hit a few snags on the path to his nomination in 1988, he proved to be more popular in the primaries than his major foes, Kansas senator Robert Dole and Pat Robertson, who had resigned his ordination in the Southern Baptist Convention before launching his challenge.  By March, Bush was assured of the nomination, which, like all major party presidential nominees in late twentieth-century America, he took on the first ballot at the GOP convention in August. (By then, the political conventions had become scripted, anachronistic rituals, not decision-making events.) Bush selected Indiana senator J. Danforth Quayle as his running mate, arousing widespread complaints that he had chosen a poorly regarded senator who had used family connections to escape the draft during the Vietnam War.

Though the criticism of Quayle, an often lampooned figure, was unnerving, Bush, who was distrusted by many conservatives, hoped that his running mate, a vocal supporter of “family values,” would help him with white evangelical Christian voters.  In accepting the presidential nomination, Bush gave a strong speech in which he pledged to hold the line on taxation.  “Read my lips,” he told the delegates. “No new taxes.”

….Bush was a centrist Republican who had never been comfortable with the right wing within his party.  A moderate Episcopalian, he had few ties with the evangelical Christian Right, members of which tended to regard him as an irreligious country-club Republican.  But in 1988, as during his years as vice president, he was careful not to antagonize it.  Like many other Republicans, he demonized the “L word,” liberal.  Opposing gun control, he supported voluntary prayers in the public schools and the death penalty for people who committed extraordinarily violent crimes.  He opposed abortion, except in cases of rape or incest, or to save the life of the mother.  Among the aides who helped him get in touch with socially conservative religious people was his eldest son, George W. Bush.  Young George had kicked a serious drinking habit two years earlier and had found God.  “It was goodbye Jack Daniels, hello Jesus,” he said.

Republicans zeroed in with special zeal on what they called the “revolving door prison policy” of [Democratic nominee and Massachusetts governor Michael] Dukakis's governorship.  This was a program, instituted by a Republican predecessor, which enabled prisoners to take brief furloughs.  Most states, including California during Reagan's tenure as governor, had comparable programs, as did the federal prison system, though only Massachusetts made it available to lifers.  One of these Massachusetts prisoners was Willie Horton, a convicted first-degree murderer, who on a weekend furlough had repeatedly beaten and stabbed a man and assaulted and raped his fiancée.  Dukakis, defending the program, did not discontinue it until April 1988.  Some of Bush's state party committees and independent groups circulated pictures of Horton, an ominous-looking black man, and produced TV ads that showed streams of prisoners going in and out of prison via a turnstile.  Though Bush's national committee disavowed material that identified and pictured the prisoner, there was no doubting that the Bush team knew and approved of the ads.

Tactics like these revealed that Bush, for all his gentility, could be ruthless in pursuit of his ambitions.  They also indicated that social and cultural issues involving crime and race continued to be large and divisive in American life, and that Republicans would use these issues, as they had since 1968, in order to blunt the appeal of bread-and-butter economic platforms favoring Democrats.  In 1988, these negative tactics virtually dominated Bush's campaign, making it difficult to know what he stood for.  At times, he dismissed “the vision thing,” that is, the idea that he should try to sell any sweeping or inspiring message (as Reagan had tried to do) to the American people.

Though Bush did not match Reagan's resounding success of 1984, he won easily, thereby becoming the nation's forty-first president. (When his son George W. Bush became the forty-third in 2001, many people labeled the father “Bush 41.”)  Receiving 48.9 million votes to 41.8 million for Dukakis, he captured 53.4 percent of the vote to his opponent's 45.6 percent.  He carried forty states for a triumph in the electoral college of 426 to 111. He was especially strong in the South, which (like Reagan in 1984) he swept, and where he won an estimated 66 percent of the white vote. Analysts concluded that he received 60 percent overall of middle-class votes and that he even bested Dukakis, 50 percent to 49 percent, among women voters. Three straight GOP victories in presidential elections indicated that the Republican Party, which had been badly battered by Watergate, had staged a considerable comeback. 

Source: James T. Patterson, Restless Giant:  The United States from Watergate to Bush v. Gore (New York: Oxford University Press, 2005), pp. 218-219, 222-224.


THE CLARENCE THOMAS-ANITA HILL CONTROVERSY

In the brief passage below historian James Patterson describes the controversial nomination of Judge Clarence Thomas, which spurred the first significant public discussion of sexual harassment in the workplace.

President George H.W. Bush's nomination of Clarence Thomas to the Supreme Court in 1991 provoked one of the nastiest congressional battles in years.  Thomas, only forty-three years old at that time, was a relatively rare phenomenon: a black conservative.  In selecting him, Bush and his advisers expected that liberals, while disapproving of Thomas's conservative opinions, would think twice before rejecting an African American.  But Thomas, as head of the Equal Employment Opportunity Commission (EEOC) and then as a federal judge, had opposed affirmative action procedures.

Advocates of choice fretted that he would vote to overturn Roe v. Wade. The national boards of the NAACP and the Urban League opposed his nomination. They were appalled to think of such a conservative taking the “black seat” that Thurgood Marshall, “Mr. Civil Rights,” had filled since 1967.

Thomas's road to confirmation in the Senate, already bumpy, became rocky when Anita Hill, a black law professor at the University of Oklahoma, charged that he had sexually harassed her when she worked for him at the EEOC.  Her reluctant but startling testimony, which among other things revealed that Thomas looked at porn movies, infuriated his supporters.  Thomas, denying Hill's accusations, exclaimed hotly that he was the victim of a “high-tech lynching for uppity blacks.”  Some Democrats, including a few liberals, worried about the wisdom of voting against a black nominee.  Thomas's nomination opened a rift between his supporters, most of them Republican men, and a great many women, black as well as white.

All but two of forty-six Republicans in the Senate ultimately voted for Thomas's confirmation, which was finally approved, 52 to 48, in October 1991.  But the partisan, often vicious fight, reminiscent in many ways of the battle against the nomination of [Robert] Bork that had polarized the Senate in 1987, left bruises.  Politically liberal women charged that the Senate, which was 98 percent male, had brushed aside Hill's accusations concerning sexual harassment.  They resolved to fight back at the polls, where in 1992 women succeeded in winning five Senate races and increasing their numbers in the House and in state legislatures.  Encouraged, they hailed 1992 as “the Year of the Woman.”  Hill's accusations also highlighted the issue of sexual harassment.  Sexual harassment lawsuits, which multiplied in number during the 1990s, further empowered the courts as arbiters of the rights revolution in American life.

Source: James T. Patterson, Restless Giant:  The United States from Watergate to Bush v. Gore (New York: Oxford University Press, 2005), pp. 243-244.


OPERATION DESERT STORM

In 1991, in the largest operation since the Vietnam War, a U.S.-led military coalition drove the Iraqi Army out of Kuwait after that country had been invaded and occupied five months earlier.  Historian James Patterson provides a brief description of that conflict, which lasted from January 17 to February 28, 1991.

The coalition attack featured two phases.  The first entailed massive bombings of Kuwait, Baghdad, and other Iraqi cities and installations [beginning on January 17, 1991].  This lasted thirty-nine days.  Coalition aircraft fired off 89,000 tons of explosives, a minority of them laser-guided “smart bombs” and cruise missiles, at these targets.  The bombing was frightening, causing the desertion of an estimated 100,000 or more of the 400,000 or so Iraqi troops that were believed to have been initially deployed.  It was also devastating.  The air offensive destroyed Iraq's power grid and sundered its military communications.  A number of contemporary observers thought that the bombing created severe food shortages, damaged facilities guarding against water pollution, provoked outbreaks of cholera and typhoid, and seriously disrupted medical care.

Some of this air war was featured on network television and on Cable News Network (CNN), which since its creation in 1980 had established itself as a successful network providing twenty-four-hours-a-day news of events all over the world.  CNN offered mesmerizing coverage from Baghdad of streaking missiles and flashing explosions. Iraqi Scud missiles were zooming off toward targets in Israel and Saudi Arabia, and American Patriot missiles were shooting up to intercept them. Though one of the Scuds (of the eighty-six or so believed to have been fired during the war) killed twenty-eight American military personnel in Dhahran, Saudi Arabia, most of them did no serious damage. American claims to the contrary, later studies concluded that the Patriots had relatively little luck hitting the Scuds. The Scuds, however, did prompt a widely told joke: “Question: How many Iraqis does it take to fire a Scud missile at Israel? Answer: Three, one to arm, one to fire, the third to watch CNN to see where it landed.” President Bush avidly followed CNN, which he said was a quicker source of news than the CIA.

With Iraqi defenses rendered virtually helpless, coalition forces undertook the second stage, Operation Desert Storm, of the Gulf War.  Led by American general Norman Schwarzkopf, this was the ultimate demonstration of the [chairman of the Joint Chiefs of Staff General Colin] Powell Doctrine, which called for the dispatch of overwhelmingly superior military power.  Some of the troops used GPS gadgetry enabling them to navigate the desert.  In only 100 hours between February 23 and 27, this army shattered Iraqi resistance.  Tens of thousands of Iraqi troops were taken prisoner.  Most of the rest fled from Kuwait into Iraq.

Unreliable early estimates of Iraqi soldiers killed varied wildly, ranging as high as 100,000. The actual numbers, most later estimates concluded, had probably been far smaller. One carefully calculated count, by a former Defense Intelligence Agency military analyst, conceded that estimates varied greatly, but set the maximum number of Iraqi military casualties from the air and ground wars at 9,500 dead and 26,500 wounded. His minimum estimates were 1,500 dead and 3,000 wounded. Civilian deaths, he added, may have been fewer than 1,000. American losses may have been tiny by comparison.  Most estimates agreed that the number of United States soldiers killed in action was 48 (at least 35 of them from friendly fire) and that the number who died in non-battle accidents was 145. A total of 467 American troops were wounded in battle. These same estimates concluded that a total of 65 soldiers from other coalition nations (39 from Arab nations, 24 from Britain, 2 from France) had also been killed in battle.

As Iraqi soldiers fled on February 27, Bush stopped the fighting. The abrupt end allowed many of Hussein's assets, including Russian-made tanks and units of his elite Republican Guard, to escape destruction or capture. When the coalition armies ground to a halt, they left Baghdad and other important cities in the hands of their enemies. Saddam Hussein and his entourage licked their wounds but regrouped and remained in control until another American-led war toppled him from power. 

Source: James T. Patterson, Restless Giant:  The United States from Watergate to Bush v. Gore (New York: Oxford University Press, 2005), pp. 234-235.


THE HEALTH CARE DEBATE IN THE 1990s

On March 23, 2010, President Barack Obama signed into law the Patient Protection and Affordable Care Act.  Often called “Obamacare,” the measure ended an intense political struggle between conservatives and liberals over the government’s role in health care.  While that debate began during the Theodore Roosevelt Administration, it became a contentious national issue in the 1990s.  Here historian James Patterson reminds us of the 1990s debate.

Though jolted by…controversies, [President Bill] Clinton consoled himself with the hope that he would succeed in achieving his major goal of 1993: securing legislation reforming America's jerry-built system of health insurance coverage.  “If I don't get health care,” he said, “I'll wish I didn't run for President.”  As he emphasized in calling for reform, private expenditures for health purposes were continuing to skyrocket, from $246 billion in 1980 to $880 billion in 1993.  Yet more than 35 million Americans, around 14 percent of the population, had no medical insurance, either private or governmental, and another 20 million were said to lack adequate coverage.  Most of these people were poor or unemployed.  Their plight graphically exposed the persistence of poverty and inequality in the “world's richest nation.”

In selecting health insurance reform as his major objective, Clinton surprised many solons on the Hill, who had expected him instead to overhaul the welfare system. After all, he had pledged during the campaign to reform welfare, and in February 1993 he proclaimed that America must “end welfare as we know it,” so that it will “cease to be a way of life.” New York senator Daniel Moynihan, a liberal, was eager to undertake revision of welfare and denied that the country faced a “health care crisis.” Most Americans, he said, had fairly decent coverage. The president ignored Moynihan's appeals…. Reform of health care insurance…was a daunting project that had frustrated previous presidents dating back to Harry Truman. Still, he pressed ahead, entrusting development of a plan to a team headed by his wife and an old friend, Ira Magaziner.  

Unfortunately for advocates of reform, Magaziner and Mrs. [Hillary] Clinton enshrouded their activities in secrecy.  They virtually ignored Congress, including moderate Republicans, as well as the Department of Health and Human Services, where such a proposal might otherwise have gestated.  They listened instead to a host of academics and other “experts,” sometimes in gatherings of 100 or more people who wrangled far into the night.  When a plan finally emerged from this laborious process in September, it was bulky beyond belief, 1,342 pages long.  Liberals were upset that Clinton, perhaps fearing political repercussions in 1996 if he called for tax increases to support a governmentally financed plan, did not recommend a “single-payer” system such as the one in Canada.  Rather, the plan required most employers to pay for 80 percent of their workers' health benefits.  The key to this system would be regional insurance-purchasing alliances that were expected to promote “managed competition” among private health insurers, thereby lowering premiums.  The government was to pay for the uninsured, ensuring universal coverage.

Most liberals agreed that the plan, though complicated, promised to reduce economic inequality in the United States. Some large employers, too, backed it, in the hope that it would reduce the cost of their health care benefits for their workers. From the start, however, the proposal ran into sharp opposition from interest groups, notably from small insurers, who feared that larger companies would squeeze them out of the action, and from many small employers, who bridled at being told to pay for 80 percent of their workers' health premiums. Aroused, they spent millions of dollars on television ads denouncing the plan. On Capitol Hill, [Republican leader Newt] Gingrich aroused his forces to fight the effort. The Clintons, he said, “were going against the entire tide of Western history. I mean centralized, command bureaucracies are dying. This is the end of that era, not the beginning of it.”

Foes such as these seriously damaged chances for reform, as did Clinton when he refused to consider compromises that would have settled for less than universal coverage.  In 1994, when congressional committees began to consider his plans, the opposing interest groups mobilized effectively.  As Moynihan had warned, moreover, it was not only “selfish interest groups” that were cool to Clinton's plans: the majority of Americans (those with health insurance) seemed mostly content with their fee-for-service arrangements and exerted little pressure for the erection of a new and complicated system.  So it was that Clinton's most ambitious dream never even reached a vote on the floor of the Democratic Congress.  It finally collapsed in August 1994.  Badly beaten, the president was forced to drop the issue, leaving millions of Americans without coverage and millions more dependent on the will or the capacity of employers to provide for them.

Source: James T. Patterson, Restless Giant:  The United States from Watergate to Bush v. Gore (New York: Oxford University Press, 2005), pp. 328-330.


TERRORISM IN THE 1990s

In the following vignette historian Pauline Maier describes the emerging Al Qaeda terrorist network led by Osama bin Laden and its relationship to the Taliban, who controlled most of Afghanistan by the late 1990s.

The sanctions against Iraq and the civilian suffering they generated, the presence of American troops on Saudi Arabian soil during and after the Gulf War, and the United States' support of Israel all angered a number of Muslims in the Middle East.  They infuriated Osama bin Laden, a rich Saudi exile living in Afghanistan.  Bin Laden hated the United States enough to finance a network of terror called Al Qaeda, directed against the country.  In February 1993, four Muslim terrorists connected to bin Laden exploded a car bomb in the garage under one of the World Trade Center towers in New York City.  Although they failed in their ambition to topple the tower into its twin, they succeeded in killing 6 and injuring more than 1,000.  In 1996, terrorists drove a truck bomb into an American army barracks in Saudi Arabia itself, killing 19 U.S. military service people.  And in 1998, several other suicide truck bombers blew up the American embassy in Tanzania, killing 11, and [the one] in Kenya, killing 213 Kenyan citizens and injuring thousands of civilians.

A few hours after the attacks in 1998, President Clinton declared, "We will use all the means at our disposal to bring those responsible to justice, no matter what or how long it takes."  In an operation codenamed "Infinite Reach," U.S. planes attacked two targets believed to be associated with bin Laden: the Al Shifa pharmaceutical plant in Sudan, alleged to be a source of biochemical weapons, and a temporary base camp in Afghanistan, labeled by Clinton "one of the most active terrorist bases in the world."  (The owner of the plant denied that he had anything to do with bin Laden, and reporters visiting the site saw no evidence that he did.)  During the trial of the organizers of the Africa bombings, testimony indicated that bin Laden and Al Qaeda had attempted to acquire weapons of mass destruction about five years earlier.

In 1996, the Taliban, a group of extreme Islamic fundamentalists, gained control of Afghanistan and extended their protection to bin Laden as a "guest."  In October 1999, the U.N. Security Council, alarmed, resolved to impose limited sanctions against the Taliban in an effort to force them to turn over bin Laden immediately to a country where he could be brought to justice.  The Taliban refused, and bin Laden and Al Qaeda grew bolder. A year later, terrorists linked to bin Laden attacked the USS Cole while it was anchored in the Yemeni port of Aden, killing 17 of its crew and injuring 47.

Between 1993 and 1999, the FBI's counterterrorism budget more than tripled, to some $300 million a year.  Still, in the wake of so many successful assaults, a number of analysts believed that the United States was inadequately on guard against the war of terrorism that was increasingly being waged against it.  Some contended that it was only a matter of time before the terrorists would strike on American shores with far greater destructive effect than they had achieved in the 1993 bombing at the World Trade Center.

Source: Pauline Maier, Inventing America: A History of the United States, vol. 2 (New York, 2003), pp. 1062-1063.


INCOME INEQUALITY

In the following account James Patterson describes the growing economic gap between the wealthiest and poorest Americans, which seemed to be accelerating at the end of the 20th Century.

Commentators focused grimly on the stressful culture of work in the United States. Americans, they pointed out, toiled for far more hours per week than did people in most other industrialized nations. Workers, stressed out, had little time to spend with their families or to volunteer for community activities. Wages and salaries, though rising for most workers in the late 1990s, never seemed sufficient. As one unhappy Chicagoan, managing director of a company, despaired in 1997, “I earn more in a month than my dad did in a year, but I feel my life is more difficult.” He added, “I don't gamble, I don't have season tickets to the Bulls. How can I earn this much but not have anything left over?”

Some of these many complaints about American economic, environmental, and social conditions in the mid- and late 1990s were on target. Until very late in the decade, poverty remained stubborn, reflecting not only the large number of low-income female-headed families and racial inequalities but also the persistence of holes in the nation's safety net. Thanks to poverty, drug abuse, and lack of adequate prenatal services in many low-income areas, infant mortality in the United States, though roughly half of what it had been in the 1970s, continued to be higher than it was in twenty-five other industrialized nations.

The “underclasses” in America's urban ghettos, Native Americans on reservations, and migrant workers and other low-income people in depressed rural areas still struggled to stay afloat. As in the 1970s and 1980s, long-range structural trends advancing the spread of relatively low-wage service work, as well as competition from abroad, threatened American manufacturing jobs. The wages of production and non-supervisory workers continued to stagnate.  Though Congress raised the minimum wage in 1996 (from $4.25 to $5.15 an hour), its real buying power, having fallen since the 1970s, continued to decline.

Americans with full-time employment (still reckoned, as it long had been, at forty hours per week) did in fact work considerably longer hours on average, perhaps 350 to 400 more a year, than did Western Europeans, who enjoyed shorter workdays and more holidays.  Many Europeans (living in the cradle of Calvinism) were stunned by the strength of the work ethic in the United States and deplored the stress that they said it created. Critics were also correct to point out that American energy use remained enormous: With approximately 6 percent of the world's population, the United States in the late 1990s was annually responsible for a quarter of the planet's total energy consumption and emitted a quarter of the world's greenhouse gases. By 2002, the United States had to import 54 percent of its crude oil, compared to less than 40 percent during the frightening energy crises of the late 1970s.

It was also true that America's eager consumers and investors were continuing to amass levels of personal debt that were far higher than those in other nations. People were also gambling more than ever, and speculating eagerly in the stock market, sometimes as day traders and as members of proliferating investment clubs. Given the listed value of stocks, this activity was hardly surprising: Between January 1991, when the Dow Jones industrial average hit a low of 2,588, and January 2000, by which time it had soared to a high of 11,722, stock prices more than quadrupled.  In the same month of 2000, starry-eyed (though, as it later turned out, badly blurred) visions of the future led AOL to acquire Time Warner for $180 billion in stock and debt. This, the largest corporate merger in United States history, was but the most spectacular instance of a merger mania that dwarfed that of the Reagan years. Successful investors such as Warren Buffett (the “oracle of Omaha”) of Berkshire Hathaway and Peter Lynch, who managed the Magellan Fund of Fidelity Investments, received adulatory attention in the media, in a culture that seemed more mesmerized than ever by dreams of moneymaking. By 2001, 51 percent of American families had some investments in stock, compared to 32 percent in 1989 and 13 percent in 1980.

Trouble lay ahead, especially for tech-obsessed buyers who were plunging their money into increasingly overpriced “dot-com” stocks. Federal Reserve chairman Alan Greenspan, who most of the time intoned that the United States was entering a “new age” economy of huge potential, paused in December 1996 to warn against America's “irrational exuberance.” Greenspan did not want to stick a pin in the bubble, however, and stock prices continued to increase greatly until early 2000. By that time, enthusiastic onlookers were declaring that the United States had become a “shareholder nation.”  The boom in stocks, part of the larger advance of prosperity in the late 1990s, did much to give Americans, already feeling good about the end of the Cold War, a triumphant but illusory sense of empowerment.

Most economists agreed that inequality of income, as measured by shares of national earnings held by various levels of the income pyramid, was not only continuing to rise in the United States but also that it was sharper than in other industrial nations. The share of aggregate income held by the poorest one-fifth of American households declined from 4.4 percent of total income in 1975 to 3.7 percent in 1995, or by almost one-sixth. The share held by the richest fifth increased in the same years from 43.2 percent to 48.7 percent, a rise of more than 12 percent. The IRS reported in 1999 that 205,000 American households had incomes of more than $1 million. The very wealthy, including many CEOs, were enjoying salaries, perks, and comforts on an unprecedented scale. By 1998, the average income of the 13,000 wealthiest families in the United States was 300 times that of average families. These families earned as much income as the poorest 20 million families.

Why this inequality continued to mount remained disputed. Some writers emphasized that top corporate leaders had become greedier and less paternalistic and that tax cuts favoring the very wealthy were to blame. Others stressed that racial discrimination still played a key role, and that female-headed families, which were disproportionately African American, and steadily larger numbers of relatively poor immigrants weighted the bottom of the income pyramid. The increase in immigration was surely an important source of ascending inequality. Virtually all analysts agreed that another major cause of inequality was lack of growth in relatively well paid manufacturing employment and the ongoing rise in the number of low-wage service-sector jobs. Many of these openings were taken out of necessity by women, recent immigrants, and other people with low levels of education and skill.

All these factors helped account for the worsening of economic inequality. So did the actions of some large corporations. The inflated sense of monetary entitlement expressed by a large number of corporate executives (“we made big profits for the company, and we deserve big rewards,” they insisted) exposed in an especially crass and magnified fashion the entitlement mentality of much of the culture at large.  Some major corporations, anxious to lessen huge obligations, began cutting back or discontinuing long-promised defined-benefit pension and medical plans. Many employers continued to take a hard line with trade unions, whose losses of members badly sapped the bargaining power of organized labor. Lobbying effectively in Washington and in state capitals, representatives of big business, including titans of agribusiness, demanded, and often received, generous subsidies, protections, and tax breaks from legislators. Not without cause, liberals (and others) concluded that the harshly dog-eat-dog approach of many American business leaders in the 1990s was creating a new, often nastier “corporate culture.”

Trends in education further threatened equality of opportunity in America. In the “knowledge economy” of globalization and computerization that spread in the 1990s, specialized expertise became particularly important in the professions, the sciences, and the business world, yet the cost of tuition and fees at most colleges and universities increased at a considerably more rapid rate than wages and salaries.  Although a handful of wealthy private universities managed to offer substantial financial aid to students, very few could afford to establish “need-blind” admissions programs or to set aside the large sums necessary to support graduate students. The sons and daughters of wealthy parents, enabled to attend expensive private schools and elite universities, were gaining an increasingly enviable edge over their economically less fortunate competitors. By 2000, many critics worried that an intergenerational educational elite was taking shape, which in the future would dangerously expand the power of class privilege in the United States.

Source: James T. Patterson, Restless Giant:  The United States from Watergate to Bush v. Gore (New York: Oxford University Press, 2005), pp. 349-353.


CULTURE WARS

By the 1970s liberals and conservatives disagreed not only over domestic political issues and U.S. foreign policy goals; they now reserved their most intense arguments for the ways in which American society was being culturally transformed by changing attitudes toward the family, religion, and sex.  Conservatives argued that these changes, if unchecked, would lead to the decline of American society. Thus they and liberals engaged in a “culture war” to determine which “type” of America would emerge.

Robert Bork, having been denied a seat on the Supreme Court by the Senate in 1987, emerged in the 1990s as a belligerent conservative in the “culture wars,” as contemporary writers saw them, of that contentious decade. He opened his angry, widely noticed Slouching Toward Gomorrah (1996) by citing William Butler Yeats's “The Second Coming,” written in 1919 in the aftermath of World War I. Among the poem's despairing lines were these: “Things fall apart; the center cannot hold; / Mere anarchy is loosed upon the world, / The blood-dimmed tide is loosed, and everywhere / The ceremony of innocence is drowned; / The best lack all conviction, while the worst / Are full of passionate intensity.”

The subtitle of Bork's polemical book, Modern Liberalism and American Decline, highlighted two main themes that many conservatives bemoaned in the 1990s: America was in “decline,” and liberals were to blame for culture wars that were splintering the nation. Bork wrote, “There are aspects of almost every branch of our culture that are worse than ever before and the rot is spreading.” He fired at an array of targets: America's “enfeebled, hedonistic culture,” its “uninhibited display of sexuality,” its “popularization of violence in ... entertainment,” and “its angry activists of feminism, homosexuality, environmentalism, animal rights-the list could be extended almost indefinitely.” He closed by complaining that the United States was “now well along the road to the moral chaos that is the end of radical individualism and the tyranny that is the goal of radical egalitarianism. Modern liberalism has corrupted our culture across the board.”

Bork was by no means the only writer to lament the “decline” of America in the 1990s. Carl Rowan, an African American journalist, weighed in, also in 1996, with an irate book titled The Coming Race War in America: A Wake-up Call. Though his major target was the totally different one of white racism, Rowan agreed that the United States was “in decline ... on the rocks spiritually, morally, racially, and economically.” He added, “Everywhere I see signs of decadence, decay, and self-destruction.” America, he said, was “sinking in greed” and in “sexual rot and gratuitous violence.” Inviting readers' attention to the fates of ancient Rome and Greece, and of the British Empire, Rowan despaired, “this country ... is in precipitous decline.”

Perceptive readers might have noted that Rowan's allusion to Rome, Greece, and the British Empire echoed warnings that Paul Kennedy, a liberal, had issued in his widely cited Rise and Fall of the Great Powers, published in 1987. They would also have known that the ideological and cultural warfare that seemed to wrack the early and mid-1990s had its origins in battles that had escalated as far back as the 1960s. These had heated up in the late 1980s, when the literary critic Allan Bloom, in his aggressively titled The Closing of the American Mind, had lashed out against the trivialization of American intellectual life. In the same year, E. D. Hirsch Jr., in Cultural Literacy: What Every American Needs to Know, more temperately complained of what he and fellow authors considered to be the bewildering and culturally divisive nature of curricula in the schools.

Jeremiads about “American decline,” however, seemed to hit a larger cultural nerve in the early and mid-1990s. Many of these, like Bork's, emanated from conservatives who were feeling marginalized by liberalizing cultural changes and who were outraged by what they perceived as ever-expanding evils:  sexual immorality, violent crime, vulgarity and sensationalism in the media, schools without standards, trash that passed as “art,” and just plain bad taste. As Zbigniew Brzezinski wrote in 1993, a “massive collapse ... of almost all established values” threatened to destroy American civilization. What was one supposed to think, other critics demanded, about a jury that awarded a woman $2.9 million because she had spilled hot coffee from McDonald's that had badly scalded her? Or about Bill Clinton, president of the United States, who responded to a questioner who asked him on MTV whether he wore boxers or briefs? Perhaps thinking of the youth vote, Clinton replied, “Usually boxers.”

As earlier, many conservative writers located the source of cultural decline in the way Americans, especially boomers, had raised their children. For culture warriors such as these, old-fashioned “family values” were among the highest of virtues. Alarmed by what they perceived as the catch-as-catch-can quality of family life, they highlighted articles reporting that only 30 percent of American families sat down to eat supper together, as opposed to the 50 percent that were supposed to have done so in the 1970s. As David Blankenhorn, director of the revealingly named Institute for American Values, exclaimed in 1993, America's central problem was “family decline.” He added, “It's not the ‘economy, stupid.’ It's the culture.”

Religious conservatives, enlarging organizations such as the Family Research Council, swelled this chorus of laments, evoking outcries from liberals who warned that the Religious Right was becoming ever more aggressive in waging wars against abortion, gay rights, and other matters. Though [Rev. Jerry] Falwell, having weakened the Moral Majority by giving highly controversial speeches (in one, he defended the apartheid policies of South Africa), disbanded the organization in early 1989, a new force, the Christian Coalition, grew out of Pat Robertson's presidential campaign of 1988. Appearing on the scene in 1989, it rapidly gained visibility under the leadership of Ralph Reed, a young, boyish-faced Georgian who had earlier headed the College Republican National Committee. Reed displayed extraordinary political, organizational, and fund-raising skills and managed at the same time to earn a PhD in history from Emory University in 1991. By mid-1992, the Christian Coalition claimed to have more than 150,000 members and to control Republican parties in several southern states.

In the early 1990s, another religious group, the Promise Keepers, also came into being. Founded by Bill McCartney, football coach at the University of Colorado, this was an all-male organization of evangelical Christians who vowed to cherish their wives and children and thereby strengthen family life in America. Growing slowly at first, Promise Keepers surged forward by mid-decade. At its peak in 1997, it staged a massive meeting and rally on the mall in Washington, where an estimated 480,000 men were said to promise to be loving and supportive husbands and fathers.

Many Americans who joined groups such as these were still contesting the divisive cultural and political legacy of the 1960s, a secular legacy, as they saw it, of pot smoking, bra burning, love beads, radical feminism, black power, crime in the streets, pornography and sexual license, abortion, family decline, Darwinian ideas of evolution, and gross-out popular culture. Stung by what they considered to be the hauteur of upper-middle-class liberals, they complained that an elitist left-wing liberal culture had captured universities, foundations, Hollywood, and the media. A “great disruption” was ravaging late twentieth-century America.

Source: James T. Patterson, Restless Giant:  The United States from Watergate to Bush v. Gore (New York: Oxford University Press, 2005), pp. 254-257, 260-261.


IMMIGRATION: THE DEBATE CONTINUES

This vignette describes the issue of immigration, which returned to the center of national political debate by the 1980s and has remained there ever since.

In challenging George H. W. Bush for the GOP presidential nomination in 1992, Patrick Buchanan proclaimed that rising immigration was threatening to tear the United States apart. “Our own country,” he said, “is undergoing the greatest invasion in its history, a migration of millions of illegal aliens yearly from Mexico .... A nation that cannot control its own borders can scarcely call itself a state any longer.”

Though Buchanan was an especially vocal opponent of large-scale immigration, he was by no means the only American to fret about the “Balkanization” of the nation, or about the surge of “multiculturalism,” as rising rights-consciousness by various minorities was dubbed at the time. Six years earlier, 73 percent of voters in California had approved Proposition 63, which amended the state constitution so as to establish English as the state's “official language.” Seventeen other states followed California's example in the late 1980s. Though Proposition 63 was not implemented in California, its symbolic thrust, aimed in part against bilingual education programs, was clear. In California, as in Texas and other states where high numbers of immigrants had been arriving since the 1970s, ethnic tensions were rising….

Given the rapidly rising number of immigrants to America since the 1970s, it was hardly surprising that alarmists such as Buchanan captured attention in the late 1980s and 1990s. These numbers were striking compared to those of the recent past. Between the early 1920s, when restrictive and racially discriminatory immigration laws had been enacted, and the late 1960s, when new, more liberal legislation of 1965 began to take effect, immigration to the United States had remained low. In the thirty-four years between 1931 and 1965, the total number of legal immigrants had averaged around 150,000 per year, or around 5 million in all. Thereafter, the numbers exploded in size, to 4.5 million during the 1970s, 7.3 million during the 1980s, and 9.1 million during the 1990s. Many millions more (guesses placed these numbers at between 250,000 and 350,000 per year in the 1980s and 1990s) entered illegally.  The total number of immigrants who came to the United States between 1970 and 2000 was therefore estimated to be slightly more than 28 million. Their arrival raised the percentage of Americans who were foreign-born from 4.7 in 1970 (an all-time twentieth-century low) to 10.4 in 2000, or 29.3 million people in a total population that had risen from 203.3 million in 1970 to 281.4 million thirty years later.

The primary origins of these new arrivals, Latin America and East Asia, were strikingly different from those early in the century, when most immigrants had come from Eastern and Southern Europe. Between 1980 and 2000, only two million people arrived from Europe, most of them from Eastern Europe or from the Soviet Union and its successor states. Considerably more people, 5.7 million, came from Asia, and 6.8 million were natives of Mexico, Central America, and the Caribbean. An additional million hailed from South America. Smaller numbers migrated from Africa (around 600,000) and from Canada (250,000). Some 4 million legal immigrants, nearly a fourth of the total of legal arrivals from all countries during these twenty years, came from Mexico alone.

If migrants from Asia and from south of the border were identified as “people of color,” as many were, the United States was experiencing something of a “complexion revolution” in these years. Nearly three-fourths of the newcomers were Asian (26 percent of the total) or Latino (46 percent) by background.  By 2002, the number of American people (immigrants and others) identified as Latinos (38.8 million, or 13.5 percent of the population) had surpassed the number who were African American (36.7 million, or 12.7 percent of the population). The number of Asian Americans, which had been tiny in 1970, had also become impressive: 13 million, or 4 percent of the population by 2002. As of 2000, more than half of California's population was Asian, Latino, or black….

Thanks to an odd coalition of interests with influence in Congress, the liberal immigration policies that had flourished since 1965 managed to survive. This coalition united legislators (many of them conservative on other issues) who heeded the interests of employers in their constituencies (cash crop farmers, retail chain managers, hotel and restaurant owners, parents looking for housekeepers or babysitters) with liberals and others who sympathized with the plight of would-be immigrants (many of them refugees from oppression) and who proclaimed the virtues of cultural pluralism and ethnic diversity. Allying with the employer interests that clamored for low-wage workers, Americans with multicultural views such as these, including increasing numbers of newly naturalized voters, were more successful politically than they had been earlier in the century, when Congress had enacted tough, racially discriminatory immigration laws and when highly ethnocentric Americanization programs had proliferated in school districts. The political influence of pro-immigration views such as these was one of many indications that the United States in the 1980s and 1990s, a more welcoming nation than many other Western countries, was more receptive to ethnic diversity, more tolerant, than it had been in the past.

A cartoon in 2003 captured the political power of pro-immigrant interests in the United States, interests that helped to sustain one of the greatest social and cultural changes of late twentieth-century American history. It depicted a cluster of reporters with microphones surrounding a United States senator. A newsman asked him, “So you endorse the idea of sending all illegal immigrants back where they came from, Senator?” He replied, “Right! As soon as the grass trimming, cleanup, farm picking, and fast food work is done.”

Source: James T. Patterson, Restless Giant:  The United States from Watergate to Bush v. Gore (New York: Oxford University Press, 2005), pp. 292-296, 298-304.


DREAMS OF PROSPERITY: NEWPORT, OREGON, AND LATINO IMMIGRATION

In a 1992 Eugene [Oregon] Register-Guard article, reporter Larry Bacon describes the experiences of recently arrived Latino immigrants in Newport, Oregon.  Here is his account. 

NEWPORT - Narciso Tamayo and Jesus Hernandez came to Newport in search of better lives for themselves and their families.  Tamayo, 37, a former shoemaker from the industrial city of Purisime del Bustos in central Mexico, left his family behind and came to Newport four years ago when there were few Hispanics in the area.  Hernandez, 23, a former fisherman from the seacoast city of Puerto Angel in the south of Mexico, arrived in March with his wife and two children.  At the same time Hernandez arrived, hundreds of other Hispanics also came looking for jobs in Newport's new whiting processing industry.  The workers brought racial diversity to a community where few people of color have lived before.  But life in the United States is not always easy for the two men, and their dreams have proved somewhat elusive.

Yet Tamayo has grown comfortable with his new life in Newport.  He's learned to speak English fairly well.  He has friends in both the Anglo and Hispanic communities.  He's been able to find enough work at the seafood plants to stay employed almost year-round.  He makes from $18,000 to $24,000 a year, much more than he could hope to make in Mexico, and still spends two months each winter at home with his family.

Even though he hopes to bring his family to this country someday, he has some reservations. "I am afraid the white people have prejudice about my kids," he says.  Most of the prejudice he's experienced has not been overt: "It's something you can feel when they see you."  He recalls a white co-worker telling him a joke based on the racial stereotype that Mexicans steal. "It's like he was trying to be nice, but at the same time put the knife inside," Tamayo says. Prejudice kept the local Eagles lodge from accepting him as a member, both he and a lodge official say.  Tamayo rejected a friend's advice to sue the lodge for discrimination, however. "I don't want to make trouble with anybody," he says.  Dick Gearin, president of the Eagles lodge, says Tamayo and three other Hispanics were "blackballed" by three members who were angry about problems some other Hispanics had caused at a lodge function.  At the time, three negative votes could bar anyone from membership.  Gearin, who helped sponsor Tamayo..., says he and most other lodge members were so upset by the blackballing that they changed the rules.  Now members are admitted by majority vote.  Tamayo and his friends have since joined the Eagles lodge at nearby Toledo.

Meanwhile, Hernandez and his wife, Saray Gabriel Luna, are less concerned about prejudice than learning English and making their way in a new country. They say they have made Anglo friends who have been warm and friendly. The friends, primarily from their church, have invited them to dinner and given them clothes for their children.

They have had help learning American ways from Luna's older sister, Maria Luisa Dale, who married an Anglo and moved to Newport eight years ago.  The young newcomers lived with the Dales for four months until they could rent a one-bedroom apartment of their own.  Hernandez dreams of making enough in the fish plants to return to Puerto Angel and buy a small fishing boat for about $3,000.  But it is expensive for them to live in Newport, and they have saved little so far.  Their salaries ($5.75 an hour for him and $5.25 an hour for her part-time work) are eaten up by living expenses, particularly rent.  Their tiny apartment costs $340 a month.  Now the whiting season is over, and they have both been laid off. They are looking for any type of work to tide them over until whiting season begins again next April...

Source: Eugene [Oregon] Register-Guard, November 8, 1992, p. 1


NAFTA

In 1992 President George Herbert Walker Bush negotiated the North American Free Trade Agreement with the governments of Canada and Mexico.  The agreement, however, was finally approved by Congress during the Clinton Administration and went into effect in January 1994.  NAFTA, as it was called, was controversial and extremely unpopular among labor union members and environmentalists.  In this account historian James Patterson describes its impact.

Having secured the budget package, [President Bill] Clinton concentrated on another domestic goal that he favored on its merits and that he hoped would further establish his credentials as a moderate. This was congressional approval of the North American Free Trade Agreement that Bush had negotiated with Canada and Mexico in December 1992. The agreement proposed to create a free-market trading zone involving the three nations. Clinton, a strong advocate of more open trade, allied himself with leading corporate figures and with Republicans in Congress, including Gingrich.  In the process he encountered heated opposition from labor union leaders and from many Democrats, including House majority leader Gephardt, who feared that American corporations would move their operations to cheap-labor Mexico, thereby harming American workers. Opponents of NAFTA also demanded better safeguards against environmental pollution that they expected would spread in Mexico and across its border into the United States. Clinton, however, refused to compromise, and NAFTA, approved in late 1993, went into effect in January 1994.

NAFTA did not seem to greatly benefit Mexico, which suffered, as earlier, from widespread poverty and unemployment. Struggling peasants raising maize, hit hard by competition from the United States, were devastated. These and other desperately poor people continued to stream into the United States, provoking rising tensions in many parts of the Southwest. Meanwhile, soil and air pollution, already heavy in many areas of Mexico, increased. Whether NAFTA was good or bad for the economy of the United States continued to be hotly debated during the 1990s and later. Clinton and a great many economists maintained that breaking down trade barriers forced American exporters to become more efficient, thereby advancing their competitiveness and market share. American workers, therefore, would benefit, at least in the long run. The flight of  American jobs to Mexico, moreover, turned out to be smaller than many NAFTA opponents had predicted, and thanks to America's strong economy in the late 1990s, most people who were displaced from their jobs in the United States seemed to find other work. America's unemployment rate decreased from 6.1 percent in 1994 to a low of 4 percent in 2000.

But some corporations did move operations to Mexico, and pollution did plague some areas near the Mexican-American border. Labor leaders, complaining of the persisting stagnation of manufacturing wages in the United States, continued to charge that American corporations were not only “outsourcing” jobs to Mexico (and to other cheap-labor nations) but were also managing to depress payrolls by threatening to move. When the American economy soured in 2001, foes of NAFTA stepped up their opposition to it.

Source: James T. Patterson, Restless Giant:  The United States from Watergate to Bush v. Gore (New York: Oxford University Press, 2005), pp. 333-334.


IMMIGRATION, GLOBALIZATION, AND “INSOURCING”

While much of the national attention was focused on NAFTA and the possibility of American jobs going abroad, there were growing numbers of immigrants who were taking low-wage jobs in the United States both before and after the treaty was ratified.  What follows is an account of two of those immigrants, Petra Mata from Mexico and Feiyi Chen from China.

Petra Mata: My name is Petra Mata. I was born in Mexico. I have completed no more than the sixth grade in school. In 1969, my husband and I came to the U.S. believing we would find better opportunities for our children and ourselves. We first arrived without documents, then became legal, and finally became citizens. For years I moved from job to job until I was employed in 1976 by the most popular company in the market, Levi Strauss & Company. I earned $9.73 an hour and also had vacation and sick leave. Levi's provided me and my family with a stable situation, and in return I was a loyal employee and worked there for fourteen years.  On January 16, 1990, Levi's closed its plant in San Antonio, Texas, where I had been working, leaving 1,150 workers unemployed, a majority of whom were Mexican-American women. The company moved its factory to Costa Rica....

As a result of being laid off, I personally lost my house, my method of transportation, and the tranquility of my home. My family and I had to face new problems. My husband was forced to look for a second job on top of the one he already had. He worked from seven in the morning to six at night. Our reality was very difficult. At that time, I had not the slightest idea what free trade was or meant….

Feiyi Chen: My name is Feiyi Chen. I immigrated to the United States in December 1998 from China. I began my working career as a seamstress in a garment factory because I did not speak English and the garment manufacturing industry was one of the few employment opportunities available to me. I typically worked ten hours a day, six days a week, at a backbreaking pace .... I learned from some of the older garment workers that garment workers in San Francisco actually made a decent living before free trade lured many of the better paying garment factories over to other countries and forced the smaller and rule-abiding factories to shut down because they could not compete with the low cost of production from neighboring countries.....

Working as a seamstress and an assembly worker has always been hard, but with so many of the factories leaving the country in search of cheaper labor, life for immigrant workers like myself is getting worse. For example, many garment workers who were paid one dollar for sewing a piece of clothing are now only making fifty cents for the same amount of work. There are a lot of garment workers who still work ten hours a day but make less than thirty dollars a day.  

Source: Christine Ahn, Shafted: Free Trade and America's Working Poor (Oakland: Food First Books, 2003), pp. 32-38.


THE CONTINUING SIGNIFICANCE OF RACE IN THE 1990s

In the 1970s some conservative and liberal social observers confidently predicted that class would soon overtake race as the major social divide in American society.  In the following account of the 1990s historian James T. Patterson explores how that change may yet occur, but as of the last decade of the 20th Century race remained the great marker of division.

In 2004, the black scholar Henry Louis Gates was among the many worried Americans who looked back on recent trends and surveyed some roads not yet taken. The 1990s, Gates concluded, were “the best of times, the worst of times.” As he pointed out, black-white relations continued to be the nation's number one socio-economic problem.

Central to this problem was the enduring power of social class--a power nearly as great, many observers maintained, as that of race. Though middle-class blacks gained economically in the late 1990s, the median money income of African American households in 2000 was still only 69 percent as high as that of white households. Statistics measuring personal resources that include not only income but also inheritance, possessions, and investments revealed that the average net worth of African Americans may actually have declined relative to that of whites, from one-eighth of white net worth in the 1970s to one-fourteenth by 2004.

Though poverty rates among blacks were declining, millions of African Americans remained in need. Their poverty rate in 2000 was still 2.5 times that of whites. The unemployment rate for blacks (7.6 percent in 2000) remained more than twice what it was for whites (3.5 percent). A third or more of black people who were eligible for means-tested programs like food stamps or Medicaid were unaware of their eligibility.  African Americans were far more likely than whites to lack health insurance. For these and other reasons, the life expectancy of blacks continued to lag behind that of whites: In 2000, it was 71.2, compared to 77.4 for whites.

The “underclass” problem, while slightly less severe during the more prosperous late 1990s, had surely not gone away. The grim statistics on black crime and imprisonment were vivid, shaming reminders of that fact. Dramatizing these problems, Louis Farrakhan, head of the Nation of Islam, organized a widely publicized Million Man March of black men in Washington in 1995. Black men, he said, were going to “clean up their lives and rebuild their neighborhoods.”

Equally worrisome were numbers regarding poverty among black children. By 2000, these numbers looked better than they had between 1970 and 1995, when more than 40 percent of black children under the age of eighteen had been so classified. But in large part because of the still high percentages of female-headed black families, 30.4 percent of black children under the age of eighteen lived in poverty in 2000. Many of these children had serious health problems: Like millions of children in low-income white families, they suffered from high rates of asthma, mental retardation, lead poisoning, diabetes, and learning disabilities.

It was also obvious that residential segregation remained widespread in the 1990s. Though it was true that one-half of African Americans resided in neighborhoods that were at least 50 percent non-black, an additional 40 percent lived within almost wholly black enclaves. In a great many parts of the country, America remained a nation of vanilla suburbs and chocolate cities. Some suburbs, too, such as Prince George's County, Maryland, had become heavily black in composition. Many blacks, of course, preferred to reside in predominantly black areas; living close to whites had little appeal to them. It was also clear that significant cultural preferences continued to hinder relaxed interracial socialization: Mexican Americans, blacks, and whites had distinctly different tastes in music, film, and television shows. In any event, truly mixed-race neighborhoods and social groups remained very much the exception rather than the rule in the United States.

Studies of marriage further exposed continuing racial divisions. Concerning this ever-sensitive issue, some statistics suggested that increasing numbers of black-white marriages might be launching a trend toward interracial amalgamation. In 2000, for instance, there were an estimated 363,000 black-white married couples in the United States, a 70 percent increase over the number (211,000) in 1990. This signified a rise in the percentage of married African Americans with non-black spouses from 6 to 10 within only ten years. The percentage in 2000, moreover, was higher than the percentage of Jewish-gentile marriages had been in 1940, a rate that escalated to 50 percent during the ensuing sixty years. In time, some people speculated, comparably rapid increases in black-white marriages might develop. Moreover, percentages of white-black cohabitation in the 1990s were thought to be higher than those of white-black marriage. Statistics such as these made it obvious that America had moved substantially beyond the situation in 1967, when the Supreme Court, in Loving v. Virginia, had finally ruled that laws against interracial marriage were unconstitutional.  As of the early 2000s, however, the percentage of blacks and whites who were intermarrying was still small, far smaller than the percentages of American-born Latinos, American-born Asians, or Native Americans who were doing so.  And neither TV nor Hollywood appeared eager in the early 2000s to depict romance across the color line. In the new century, it was premature to predict large increases in the number of black-white marriages in the future.

Source: James T. Patterson, Restless Giant:  The United States from Watergate to Bush v. Gore (New York: Oxford University Press, 2005), pp. 308-310.


SERBIA 

The major military intervention of the United States in the last two decades of the 20th Century took place in the divided nation formerly known as Yugoslavia.  The long-simmering cauldron of ethnic tension among the Croatians, Bosnians, Serbs, and smaller groups spilled over into open violence and the independence of new nations. The violence, further complicated by rivalries between Christians and Muslims in these new nations, often turned into campaigns of ethnic cleansing (genocide), which prompted a number of European nations and eventually the United States to intervene.  What follows is a brief description of the U.S. intervention.

During the politically tense summer months of 1995, [President Bill] Clinton tried to reestablish diplomatic relations with Vietnam. Though his initiative aroused heated controversy, it succeeded in July after senatorial veterans of the war (John Kerry of Massachusetts, John McCain of Arizona, and Robert Kerrey of Nebraska) aided him politically by stating that Vietnam was not hiding POWs.

Debates over the recognition of Vietnam, however, paled in seriousness compared to political controversies over the fighting and ethnic cleansing, ferociously conducted by Bosnian Serbs, that continued in 1994 and 1995 to ravage Croatia and Bosnia. Clinton, carefully studying public opinion polls, still recognized that most Americans (like NATO allies) remained nervous indeed about being swept into the carnage. The United Nations, equally cautious, had agreed only to keep its weakly supported force of 6,000 peacekeepers in Bosnia and (as of late 1994) to authorize minor air strikes by NATO planes. These were pinpricks that in no way deterred the Bosnian Serbs. In July 1995, the Serbs forced some 25,000 women and children to flee chaotically from the town of Srebrenica, a “safe area” in eastern Bosnia that sheltered 40,000 Muslim refugees who were supposedly being protected by the U.N. peacekeepers. The Bosnian Serbs then murdered between 7,000 and 7,500 Muslim men and boys.

This barbaric act coincided with major developments on the military front. A Croatian army, having been trained in part by the United States, joined uneasily with Muslim military forces and waged a devastating offensive that soon drove the Bosnian Serbs out of Croatia and northwestern Bosnia. President Milosevic of Serbia, who once had dreamed of controlling sizeable chunks of Croatia and Bosnia, watched glumly as thousands of Serbs fled toward Belgrade.

These dramatic events spurred new approaches to the situation in the region. The massacre at Srebrenica appalled many Americans, including members of Congress who had long urged the United States to take a tougher stance against the Serbs. The impressive Croat-Muslim offensive, they insisted, indicated that forceful NATO engagement would finally win the war…  As Clinton pondered his options, the Serbs shelled the marketplace of Sarajevo, killing thirty-eight civilians. This attack on August 28 induced the president to act. Two days later he authorized American participation in massive NATO air strikes against Bosnian Serb positions around Sarajevo.

Many Americans opposed this move, perceiving it as meddling in a faraway civil war. But seventeen days of more extensive bombing smashed Serbian positions. Together with the continuation of aggressive fighting by Croatian and Muslim ground forces, the bombing forced Milosevic to negotiate. In November, he met for three weeks with European and American representatives and with Croatian and Bosnian Muslim leaders for talks at an American airbase in Dayton, Ohio. These discussions confirmed an uneasy cease-fire and produced a settlement. Under the Dayton Peace Accords, brokered by Assistant Secretary of State Richard Holbrooke, a single state, the Federation of Bosnia and Herzegovina, was created. It was to have a “dual government” in which Muslims and Croats were to share power. Displaced persons were to return to their homes, and an international tribunal was to try alleged war criminals. An international authority was to oversee the area. The United States agreed to send 20,000 troops to the region as part of a force of 60,000 NATO soldiers who would uphold the accords.

Because these troops, heavily armed, were expected to succeed in maintaining order, and because Clinton indicated that the American soldiers would leave within a year, most Americans seemed to acquiesce in this considerable expansion of United States military presence abroad…. The forceful military intervention of the United States, aided by its NATO allies, was a turning point…in the post-Cold War history of American foreign policies. It signified that the world's number one military power had decided to use its awesome might as part of efforts to stamp out killing in Europe that otherwise was unlikely to stop. It also suggested that the end of the Cold War would not enable the United States to retreat back across the Atlantic: Its international responsibilities as a military giant might be difficult in the future to limit.

Source: James T. Patterson, Restless Giant:  The United States from Watergate to Bush v. Gore (New York: Oxford University Press, 2005), pp. 368-371.


MONICAGATE

The following account describes the biggest presidential scandal since Watergate, the one that ultimately led to the impeachment of President William Jefferson Clinton.

Thanks in large part to the influence of a family friend, who contributed generously to the Democratic Party, Monica Lewinsky secured an internship at the White House in July 1995. Raised in Brentwood, O. J. Simpson's Los Angeles neighborhood, and a recent graduate of Lewis and Clark College in Oregon, she was then twenty-one years old. Once in the nation's capital, she lived at the Watergate apartment complex. Four months later, on November 15, she and Bill Clinton had a tryst in the White House, the home, of course, of the president and his wife. This was the second day of the partial shutdown of the federal government that had stemmed from partisan battling over the budget that summer and fall, and the White House staff of 430 had temporarily shrunk to 90.

Their tryst was the first of ten meetings, nine of them involving Lewinsky performing oral sex, that the furtive couple arranged over the next sixteen months, most of them between November 15, 1995, and April 7, 1996, by which point Lewinsky, who had by then become a government employee, had been transferred to the Pentagon. Most of these rendezvous took place either in a small private study just off the Oval Office or in a windowless hallway outside the study. According to Lewinsky, the president telephoned her frequently, fifteen times to engage in phone sex. The couple also exchanged gifts. The last of their serious assignations took place on March 29, 1997, after which they had two meetings at the Oval Office, in August and December 1997, involving kissing.

There the matter might have rested. Though Lewinsky was a bold and brazen young woman who flirted with the president, lifting up her skirt before him and flashing her thong underwear, and though Clinton was known for his roving eye, no one on the scene imagined that the president was engaging in oral sex next to the Oval Office. Until January 1998, the American public had no idea that such meetings might be taking place.  If Clinton's near-legendary luck had held out, as it might have done if he had been chief executive during pre-Watergate days when reporters had turned a relatively blind eye to the promiscuity of politicians, he would have joined a number of other American presidents who had engaged in extramarital relations without being publicly exposed while in office.

A complex set of developments deprived Clinton of such good fortune. One of these was the partisan zeal of conservative political enemies, who had long regarded him as a schemer and a womanizer. Well before January 1998, when the Lewinsky affair hit the headlines, the president had been forced to hire attorneys to contest two investigations into his conduct. Both had landed him in the glare of often sensational news coverage. One of these was the probe by Kenneth Starr, the independent counsel who in mid-1994 had begun to dig into the Clintons' involvement in a range of matters, notably the Whitewater River land development project in Arkansas. The other was the civil suit that Paula Corbin Jones had filed in May 1994, alleging that as governor of Arkansas in 1991 he had sexually harassed her in a Little Rock hotel room.

Though Clinton's lawyers had fought these probes every step of the way, they had not been able to quash them. In May 1997, the Supreme Court unanimously rejected Clinton's claim that the Constitution immunized him while in office from civil suits such as the one that Jones had initiated. It ruled that the case must go forward. Many legal analysts criticized this decision, arguing that it placed all future chief executives at risk of having to contend with time-consuming, potentially frivolous litigation alleging misbehavior that might have taken place before a president took office. The Court's decision nonetheless stood, forcing Clinton and a battery of high-priced legal help to cope with the suit thereafter.

Still, neither Starr nor the lawyers helping Jones began to learn much about Clinton's involvement with Lewinsky until the fall of 1997. At that point Linda Tripp, a co-worker of Lewinsky's at the Pentagon, became a major actor in the drama that was soon to unfold.  Tripp, nearly fifty years old when she met Lewinsky in 1996, nursed a number of grudges, especially against Clinton and his aides, who had earlier moved her out of her secretarial position at the White House and sent her to the Pentagon. When she came across Lewinsky, a fellow exile at the Pentagon, she realized that her young colleague was infatuated with Clinton. Pretending to befriend her, she learned that Lewinsky had enjoyed a sexual relationship with the president.

Starting in September 1997, Tripp secretly recorded a number of conversations in which Lewinsky revealed intimate details of her trysts. Tripp then began sharing the tapes with Paula Jones's legal team, which had been searching for evidence concerning extramarital affairs that Clinton might have engaged in over the years. When Lewinsky submitted an affidavit in January 1998 in the Jones case, Starr (who had talked to Jones's attorneys before becoming independent counsel and who closely monitored developments in that case) saw a chance to broaden his probe.

This complicated chain of circumstances and personal connections indicated several facts. First, Lewinsky, like many people having affairs, could not keep a secret. Second, Tripp was a friend from hell. Armed with apparently damning information from Tripp, Starr sought authority from Attorney General Janet Reno to broaden his office's inquiries. In particular, he hoped to prove that Clinton had engaged in a conspiracy to obstruct justice by getting Lewinsky a new job in New York and by encouraging her to commit perjury in her affidavit in the Jones case, and that he had violated federal law in his dealings with witnesses, potential witnesses, or others concerned with that case.

Reno, confronted with developments such as these, recommended to the three-judge federal panel empowered to oversee the activities of independent counsels that Starr receive authority to widen his investigation, and the panel quickly gave it. For Starr, who had been unable to gather evidence that implicated the president in improprieties surrounding the Whitewater deals, this authority was wonderful. It was the key decision that enabled him to dig deeper and deeper and that led to one astonishing news story after another, and eventually to the impeachment of the president.

At the same time, moreover, word leaked out, via the Drudge Report, an Internet site, that Clinton had been involved in an ongoing sexual relationship with a White House intern. The next day the Report updated the information, naming Lewinsky. Newspapers, distrusting the source, were at first reluctant to print such a sensational story, but the facts seemed to check out. When the Washington Post published the Drudge Report's information on January 21, its revelations rocked the nation. On and off for the next thirteen months, “Monicagate” dominated the front pages.

Clinton, who was a dogged fighter, refused to give ground… Pursuing a bold strategy, Clinton denied everything in the first few days after the news broke: to his wife, Cabinet members, friends, aides, and interviewers. On January 26, in the presence of his wife and Cabinet officers in the Roosevelt Room of the White House, he faced a battery of television cameras and vigorously proclaimed his innocence.

Wagging his finger forcefully at the cameras, he exclaimed: “I want to say one thing to the American people. I want you to listen to me. I'm going to say this again. I did not [here he slashed his finger downward] have sexual relations with that woman, Ms. Lewinsky. I never told anybody to lie, not a single time, never. These allegations are false [more vigorous gestures here] and I need to go back to work for the American people.”

His wife, Hillary, accepting her husband's statement, forcefully backed him up the next day, telling a vast television audience watching the Today show that his administration was being victimized by a “vast right-wing conspiracy.” Denouncing Starr, she added: “We get a politically motivated prosecutor who ... has literally spent four years looking at every telephone ... call we've made, every check we've ever written, scratching for dirt, intimidating witnesses, doing everything possible to try to make some accusation against my husband .... It's not just one person, it's an entire operation.”

Source: James T. Patterson, Restless Giant:  The United States from Watergate to Bush v. Gore (New York: Oxford University Press, 2005), pp. 387-390.


THE IMPEACHMENT OF PRESIDENT BILL CLINTON

In the vignette below historian James Patterson describes the second impeachment of a sitting president.

Congressional Republicans…were far less tolerant of Clinton's behavior than were the majority of the American people. Many of them hoped to advance partisan interests and raise campaign funds from their conservative base. On October 8, a month after they had received Starr's report, House Republicans voted to proceed with an impeachment inquiry. In speeches on and off the floor they excoriated the president. With the off-year elections less than a month away, they hoped for a GOP triumph and then for impeachment, which, as prescribed by the Constitution, would enable the Senate to put the president on trial. A vote of two-thirds of the senators would remove the president from office.

These Republicans were following the action from way out in right field, unable to see how out of touch they were with the major players or with public opinion. This became especially clear in the election results that November. In some gubernatorial races, notably in Texas, where George W. Bush scored a resounding victory, the GOP had cause for celebration. But the voting produced no changes in the partisan lineup of the Senate, which remained in the control of the GOP, 55 to 45. In the House, Democrats gained five seats, slightly reducing the Republican majority to 222 to 212.

This was the first off-year election since 1934 in which the party holding the White House had actually enlarged its numbers in the House of Representatives. [Congressman Newt] Gingrich, who had led the charge against Clinton, felt humiliated. Three days later he resigned as Speaker and announced that he would leave the House at the end of his term in January 1999. It also became known that he had been having an affair with a congressional aide young enough to be his daughter. When his designated replacement as Speaker, Robert Livingston of Louisiana, was revealed as an adulterer in December, he, too, announced that he would leave Congress in January. He proclaimed, “I must set the example that I hope President Clinton will follow.” Not all the impeachers, it seemed, were of unimpeachable moral character.

Undeterred by setbacks such as these, the Republican majority in the House pressed ahead to impeach the president. On December 19, a day on which Clinton had authorized massive American air strikes on Iraq, they achieved their aims. On largely party-line votes, Republicans prevailed on two of four counts recommended by the House Judiciary Committee. One charge, which stated that the president had committed perjury before a federal grand jury, passed, 228 to 206. The other, which asserted that he had obstructed justice in the Jones case, was approved by a vote of 221 to 212. Clinton was the first duly elected president in United States history to be impeached.

Because conviction in the Senate required a two-thirds vote of sixty-seven, the outcome of the trial in the upper chamber, where Republicans had only fifty-five seats, was never in doubt. The proceedings nevertheless lasted for thirty-seven days. House Republicans presenting their case to the Senate asserted that Clinton's perjury and obstruction of justice met the constitutional definition of “high crimes and misdemeanors” and justified his removal. 

Clinton's lawyers retorted that while he had acted badly, his misbehavior hardly justified the extreme step of ejecting him from office. On February 12, 1999, the Senate rejected the perjury charge, 55 to 45, with ten Republicans joining all forty-five Democrats in the majority. The vote on the obstruction of justice charge was closer, 50 to 50, with five Republicans joining forty-five Democrats in opposition. Clinton had won.

Though the biggest battles were over, several legal issues explored by Starr remained to be settled. By June 30, 1999 (at which point authorization for independent prosecutors was allowed to lapse), his investigations into Whitewater and other matters had succeeded in generating twenty indictments, fourteen of which resulted in pleas of guilty or convictions. Among those jailed were Webster Hubbell, a former law partner of Hillary Clinton and a deputy attorney general under Clinton, and Arkansas governor Jim Guy Tucker. Hubbell was guilty of fraud and tax evasion, Tucker of fraud. A month later Judge Susan Webber Wright, who had dismissed the Jones case in 1998, ruled that Clinton had given under oath “false, misleading, and evasive answers that were designed to obstruct the judicial process.” Holding him in contempt, she ordered him to pay Jones's lawyers $90,000.

On Clinton's last day in office, he admitted giving false testimony regarding his relations with Lewinsky and was fined $25,000, to be paid to the Arkansas Bar Association. His law license was suspended for five years. In return he received immunity as a private citizen from prosecution for perjury or obstruction of justice. Finally, in March 2002, prosecutor Robert Ray, who had succeeded Starr in late 1999, delivered the concluding report of the investigation. It occupied five volumes and ran for 2,090 pages. It concluded that there was insufficient evidence that Bill or Hillary Clinton had been guilty of any crimes relating to Whitewater. It was estimated at that time that the investigation had cost American taxpayers a total of $60 million….

Several…conclusions about Monicagate seem irrefutable. First, though the partisan confrontation had been extraordinary, it was but the most sensational episode in a series of highly politicized conflicts over cultural issues that had arisen in the 1960s, that had divided American society in the 1970s and 1980s, and that had exploded in the culture wars of the early 1990s. Those struggles, like Monicagate, had revealed ideological polarization over standards of sexual behavior, as well as sharp differences over religion and a range of other socially divisive matters. The impeachment and trial of the president was the latest (though it was not to be the last) of these politicized struggles, most of which liberal Americans, backed especially by younger people, had won.

Source: James T. Patterson, Restless Giant:  The United States from Watergate to Bush v. Gore (New York: Oxford University Press, 2005), pp. 395-399.


THE CHANGING QUALITY OF LIFE AT THE DAWN OF THE 21st CENTURY

In the 1960s Americans became increasingly concerned about the mostly man-made degradation of the environment.  That concern led to the environmental movement, beginning in 1970 with the first Earth Day and continuing to the present time.  In the account below historian James Patterson assesses the changes brought by that movement and, more broadly, the transformation of American life since the 1960s.

In a great many ways the environmental movement that had surged ahead in the 1970s had become a mainstream phenomenon by the 1990s.  By then, pressure from activists, including “eco-feminists,” had prompted increasing public awareness about the dangers from toxic chemicals and lead poisoning. Other activists had stopped the authorization of environmentally controversial dams.  Recycling became the norm in most communities.  Acid rain decreased by one-half between 1970 and 2000. The ongoing development of high-yield agriculture led to the reforestation of a great deal of once cultivated land. Though runoff from agricultural chemicals was damaging, tougher controls on the dumping of sewage and industrial waste enabled many streams and lakes, including Lake Erie, where pollution had been near catastrophic, to regenerate.

Curbs on emissions from cars and smokestacks had helped clean the air. Despite population growth and a doubling of car miles traveled between 1970 and 2000, smog declined by one-third over those thirty years. The spread of energy-efficient household appliances, significant since the 1970s, slowed the rise in use of electricity. SUVs aside, most automobiles were more fuel-efficient than they had been in the 1970s. Thanks to improvements such as these, and to lower oil prices, energy costs, which had peaked at 13 percent of GNP during the oil crisis of 1979, declined to between 6 and 7 percent between 1995 and 1999.  Per capita consumption of energy in America, though increasing since the mid-1980s, rose more gradually than did population, or than economic output per capita.  The Very Bad Old Days of the 1970s, when extraordinarily wasteful uses of energy had helped to provoke national crises, appeared by 2000 to have ended.

In other ways, too, the quality of life in the late 1990s was better for most people than it had been in the 1970s and 1980s. One such change involved food. Though the massive consumption of junk food (and the sedentary life of riding around in cars and watching television) helped to drive a rise in obesity, the majority of Americans were also enjoying considerably greater choice in deciding what and where to eat.  In supermarkets as well as in urban and suburban restaurants, which proliferated greatly during the late 1990s, a wide variety of fresh, local, and seasonal foods, as well as ethnic and organic foods, was becoming more readily available.

Television chefs (Julia Child had been a pioneer in the field) captured growing audiences. Wealthy patrons of restaurants in major cities such as New York could feast on all manner of imaginative appetizers, salads, entrees, and desserts. Consumption of fine wines rose enormously. No longer could it dismissively be said, as it often had been, that most American people soldiered on in a bland and unimaginative gastronomic culture of casseroles, turkey and stuffing, and for those who could afford it, Sunday dinners of roast beef, potatoes, and apple pie.

Another improvement for most people was more basic: in health. Though 14 percent of Americans (roughly 40 million) still suffered from a lack of health insurance at the turn of the century, a host of technological advances in medicine continued to better the quality of life for the majority of people who had adequate coverage.  The introduction of more effective antiretroviral drugs was at last moderating the epidemic of AIDS in the United States. Preventive measures were becoming effective in improving personal health: Per capita smoking continued to decline, lessening mortality from tobacco, and bans on smoking began to cleanse the air in public places. Rates of infant mortality slowly decreased. Thanks especially to improvements in dealing with cardiovascular disease, life expectancy at birth, which had averaged 70.8 years in 1970, rose to 76.9 by 2000.  Better roads (and seat belt laws and tougher penalties for drunk driving) made it safer to drive: Though the number of miles driven greatly increased, fatalities from motor vehicle accidents declined absolutely, from 51,090 in 1980 to 41,820 in 2000.

Insofar as household possessions were concerned, Americans had never had it so good. In 2001, a record-high percentage of houses, 68 percent, were owner-occupied, up from 64 percent in 1990.  The living spaces of the housing units built in the 1990s were even larger on the average than earlier (and households were smaller in size), thereby offering more personal comfort and privacy and making room for a wide variety of goods and gadgets. Many other goods, such as automobiles, were of higher quality than in earlier years and cost less in inflation-adjusted dollars. The website eBay was becoming an extraordinarily popular destination for bargain hunters. Wal-Mart was a special boon to low- and middle-income shoppers. It was estimated that sales at Wal-Mart stores helped to lower the rate of inflation nationally by as much as 1 percent per year.

At the turn of the century, the United States was truly a utopia of consumer goods, conveniences, and personal comforts. Of the 107 million households in the country in 2001, 106 million had color television (76 million had two or more sets); 96 million, VCR and/or DVD players; 92 million, microwave ovens; 84 million, electric clothes washers; 82 million, cable TV; 81 million, either room or central air-conditioning; 79 million, electric or gas clothes dryers; 60 million, personal computers; and 51 million, access to the Internet. More than 85 million households had one or more cars or trucks.  Americans had the means to enjoy travel as never before--for a total of 1,602 billion miles in 2000, as compared to 1,418 billion miles in 1990 and 1,126 billion miles in 1980. Consumer choice was even more dazzling than earlier, to the point of prompting shoppers to complain of “catalog-induced anxiety.” Comforts and possessions once only dreamed of--two or three cars, sail and power boats, frequent and faraway travel, second homes--were becoming affordable for steadily higher numbers of people who had ascended to the ranks of the upper-middle and upper classes.

Source:  James T. Patterson, Restless Giant:  The United States from Watergate to Bush v. Gore (New York: Oxford University Press, 2005), pp. 354-357.


BUSH v. GORE, 2000

The Bush-Gore Presidential Campaign of 2000 proved to be one of the most divisive in recent memory.  Texas Governor George W. Bush, the Republican nominee, narrowly defeated Vice President Al Gore even though the Democratic nominee won a plurality of the popular vote.  What follows is a discussion of that campaign.

And so the fateful presidential campaign of 2000 gathered momentum. At the start, the leading contenders, Vice President Gore and Texas governor Bush, faced opponents in the primaries. Gore, however, beat Bill Bradley, a former senator from New Jersey, in early contests and wrapped up the nomination by Super Tuesday (a date when a host of primaries took place) in March. Later, he chose Senator Joseph Lieberman of Connecticut to be his running mate.  Lieberman was the first Jew in American history to be honored with such a major party nomination. Bush lost the early New Hampshire primary to John McCain, a maverick Republican senator from Arizona who had spent five years as a POW in North Vietnam. But Bush rallied, buoyed by rough campaign tactics and by a war chest so enormous that he rejected federal funding for the primary season. After Super Tuesday, he, too, had sewn up the nomination. He later selected Dick Cheney, his father's tough-minded, conservative defense secretary, to run with him.

Other candidates--notably Ralph Nader for the Green Party and Patrick Buchanan, this time for the Reform Party--also entered the presidential race.  Nader, disdaining the policies of both major parties, directed much of his fire at the power and greed of multinational corporations. Buchanan, an isolationist, demanded a cutback of immigration and a nationalistic foreign policy. It was evident from the start of the campaign, however, that while Nader and Buchanan might be spoilers, they had no chance to win.

It was equally clear that the major party candidates in 2000--as had often been the case in the recent past--enjoyed the blessings of privileged and politically well-connected backgrounds. Gore, the son and namesake of a United States senator from Tennessee, had grown up in the nation's capital, where he had gone to private school before attending Harvard University. From his earliest years, he was groomed by his father to run for the presidency.  In 1988, at the age of thirty-nine, he had made an ambitious but abortive run for the office. Bush, like his father, had attended private school and graduated from Yale. He also received an MBA degree from the Harvard Business School. Moving to west Texas, he later became owner of the Texas Rangers baseball team, which he sold in 1998 for $15 million. The vice-presidential nominees, Lieberman and Cheney, had also attended Yale (though Cheney eventually graduated from the University of Wyoming, in his home state).

From the outset, the contest between Gore and Bush seemed too close to call, but though the race was tight, it was otherwise unexciting until the election itself. Bush, pointing to optimistic budget projections that anticipated a multitrillion-dollar federal surplus by 2010, called for huge tax cuts. The surplus, he said, was “not the government's money” but “the people's money.” In asserting the virtues of tax cuts, he was following in the footsteps of many GOP candidates, notably Reagan in 1980 and Dole in 1996. Bush proposed to allow younger workers voluntarily to divert some of their Social Security payroll taxes to personal retirement accounts. He favored drilling for oil in the Arctic National Wildlife Refuge, discounted prescription drugs for senior citizens through private insurance, a larger federal commitment to public education, and tax-supported vouchers so that parents could pay tuition at parochial and other private schools. The United States, he said, must discard its “soft bigotry of low expectations” about poorly performing students and improve its educational standards.

America, Bush emphasized, should become an “ownership society” in which enterprising people, not the federal government, played the dominant role. More of a social conservative than his father, he identified himself as a born-again, evangelical Christian, and he opposed abortion save in cases of rape or incest or when a woman's life was endangered. He repeatedly criticized the Clinton administration for risking the lives of American soldiers in places such as Haiti and for “nation-building” in foreign policy….  As he said in a debate with Gore on October 11: “I believe the role of the military is to fight and win war....  I don't want to be the world's policeman.”

Many political pundits predicted that Gore would beat Bush. After all, he was an experienced national politician who had served as a representative and senator from Tennessee before becoming an active, influential vice president…. “Dubya,” as many people called Bush (after his middle initial), seemed in contrast to be unsophisticated and inexperienced.  Though he had swept to victory in his races to be governor of Texas… he appeared to have no special qualifications to be president of the United States--save perhaps that…he was George H.W. Bush's son.

Gore, however, squandered his advantages. In the first debate, which he had been expected to win easily, he was overconfident and condescending--smirking, sighing audibly, rolling his eyes, and raising his eyebrows when Bush was speaking…. Gore never seemed comfortable as a campaigner… Critics observed that he was “stiff,” “wooden,” pompous, and inconsistent, seeming to flip-flop in order to cater to critics of his earlier statements and to please whatever constituency he was trying to impress…..

Bush, meanwhile, continued to surprise people. To be sure, he was far from articulate, once proclaiming, “Our priorities is our faith.” On another occasion, he exclaimed, “Families is where our nation finds hope, where wings take dreams.” Democrats mocked him as Governor Malaprop. Bush nonetheless came across as a folksy, energetic, physically expressive, and well-organized campaigner… Though Bush (like Gore) did not arouse great popular enthusiasm, many voters believed that he was a straight shooter who meant what he said….

When the polls closed, it was apparent that Gore had edged Bush in the popular vote.  Official tallies later showed that he had captured 50,992,335 votes (48.4 percent of the total) to Bush's 50,455,156 (47.9 percent). This was a margin of 537,179.  Nader won 2,882,897, or 2.7 percent of the total, and Buchanan received 448,895, or 0.42 percent….

The election revealed that social and cultural divisions had persisted. Like previous Democratic candidates, Gore scored impressively among low-income and new immigrant voters, city dwellers, supporters of gun control, members of labor union households, and blacks, winning an estimated 90 percent of African American votes. He was more popular among women than among men and among singles than married people, who were older on the average and (especially in the millions of dual-income families) wealthier.  Bush, however, fared better among low-income and lower-middle-class voters in rural and suburban areas than Dole had done in 1996. He was the choice of 54 percent of white voters, of 51 percent of white Catholic voters, and of 59 percent of people who said they attended church at least once a week.

On election night many states were too close to call. It was impossible to predict with certainty which candidate would win a required majority (270) of the 538 electoral college votes.  At 7:49 P.M. the major networks…said that Gore had won the state of Florida.  More than six hours later, at 2:16 A.M., FOX News, a conservative channel, announced that Bush had taken the state--and the election--whereupon ABC, CBS, NBC, and CNN followed suit within the next four minutes. Still later in the morning, the networks admitted that the outcome remained uncertain….

When Gore heard the networks announce for Bush, he rang him up to offer congratulations, only to be alerted by aides that he still had a good chance in Florida. So he phoned Bush back, saying: “Circumstances have changed dramatically since I first called you. The state of Florida is too close to call.” Bush responded that the networks had confirmed the result and that his brother, Jeb, who was Florida's Republican governor, had told him that the numbers in Florida were correct. “Your little brother,” Gore is said to have replied, “is not the ultimate authority on this.”

Americans, having gone to bed not knowing which candidate had won, awoke the next day to learn that the outcome was still uncertain. Within a few days, however, it seemed that Gore was assured of 267 electoral college votes, three short of the number needed to win the election. Bush then appeared to have 246. The eyes of politically engaged Americans quickly shifted to Florida, where twenty-five electoral votes remained at stake. If Bush could win there, where the first “final count” gave him a margin of 1,784 votes (out of more than 5.9 million cast in the state), he would have at least 271 electoral votes. Whoever won Florida would become the next president.

There followed thirty-six days of frenetic political and legal maneuvering…featuring partisan disputes over poorly constructed ballots used in various Florida counties. Flaws such as these had long existed throughout the United States…but they fell under an especially harsh glare of public scrutiny in 2000. In predominantly Democratic Palm Beach County, for instance, more than 3,000 voters--many of them elderly Jews--were apparently confused by so-called butterfly ballots and mistakenly voted for Buchanan, the Reform Party nominee, instead of for Gore. Some voters there, bewildered by the ballots, punched more than one hole; these “overvotes,” like thousands of overvotes on variously constructed ballots in other counties, were not validated at the time.

Elsewhere, partially punched ballots emerged from voting machines leaving “hanging,” “dimpled,” “pregnant,” or other kinds of “chads.” In the parlance of the time, these were “undervotes”--that is, ballots that voters may well have tried to punch but that voting machines did not record as valid at the time.  It was estimated that the total of disputed undervotes (61,000) and overvotes (113,000) in Florida was in the range of 175,000. Irate Democrats further charged that officials at the polls unfairly invalidated the registrations of thousands of African American and Latino voters and tossed out a great many ballots in predominantly black neighborhoods. Police, they claimed, intimidated blacks who therefore backed away from voting in some northern Florida counties.  Gore's supporters further alleged that thousands of Floridians, many of whom were African American, were inaccurately included on long lists of felons and therefore denied the franchise under Florida law. Angry controversies over issues such as these heated up partisan warfare that raged throughout the nation.

From the beginning, Bush backers, led by former secretary of state James Baker…relied especially on the efforts of Bush's brother, Jeb, and on the Republican-controlled Florida legislature, which stood ready to certify GOP electors. Federal law, Republicans maintained, set December 12 (six days before America's electors were to vote on December 18) as a deadline after which Congress was not supposed to challenge previously certified electors. If Gore called for a manual recount of ballots, Republicans clearly planned to turn for help to the courts in an effort to stop such a process or to tie it up in litigation so that no resolution of the controversy could be achieved by December 12.

On November 21, the Florida Supreme Court intervened on behalf of Gore, unanimously approving manual recounts in the four counties and extending to November 26 the deadline for these to be completed. Furious Bush supporters immediately responded by charging that Florida's Democratic judges, apparently a majority on the court, were trying to “steal” the election…. With tempestuous disputes flaring between hosts of lawyers and party leaders who were hovering over exhausted recount officials, Republicans on November 22 submitted a brief along these lines asking the United States Supreme Court to review the issue. The Court quickly agreed to do so and set December 1 as the date for oral argument.

Republicans insisted in their brief that the Florida court had changed the rules after the election and that it had violated Article II of the United States Constitution, which said that state legislatures, not state courts, were authorized to determine the manner in which electors were named….Gore's lawyers predictably and furiously contested this claim, insisting that the Florida legislature did not have the authority to override Florida state law and the state constitution, which provided for judicial review of the issue by the state courts…

One day after hearing the oral arguments, at 10:00 P.M. on the deadline date of December 12 itself, the U.S. Supreme Court delivered [its decision]. In Bush v. Gore, five conservative justices…focused on the question of equal protection. They stated that the recounts authorized by the Florida court violated the right of voters to have their ballots evaluated consistently and fairly, and that the recounts therefore ran afoul of the equal protection clause.

Their decision ended the fight. Bush, having been certified as the winner in Florida by 537 votes, had officially taken the state. With Florida's 25 electoral votes, he had a total of 271 in the electoral college, one more than he needed. He was to become the forty-third president of the United States.

Eighty-year-old Justice John Paul Stevens, who had been appointed to the Court by President Ford, wrote in dissent, “Although we may never know with complete certainty the identity of the winner of this year's Presidential election, the identity of the loser is perfectly clear. It is the Nation's confidence in the judge as an impartial guardian of the rule of law.”

Source: James T. Patterson, Restless Giant:  The United States from Watergate to Bush v. Gore (New York: Oxford University Press, 2005), pp. 404-417.


9/11

It is fitting that the final vignette in this manual addresses the events of September 11, 2001.  Here historian Pauline Maier describes the cataclysmic events in New York City and Northern Virginia and the massive, spontaneous outpouring of support for both the victims and the nation.  The events and our response serve to remind us of our connection to our collective history and to each other.

On Tuesday, September 11, 2001, America's world was suddenly and dramatically transformed.  Within the space of an hour and a half that morning, two passenger airliners took off from Logan Airport in Boston, and two others took off from Newark Airport in New Jersey and Dulles Airport in Washington, D.C.  All four, bound for California, were loaded with fuel.  At some point not long after the planes were airborne, each was commandeered by four or five hijackers armed with box cutters and knives.

At 8:45 A.M. one of the planes from Boston crashed into the north tower of the 110-story World Trade Center in lower Manhattan, tearing a huge hole in the building and setting it ablaze.  Eighteen minutes later, the second plane out of Boston struck the south tower and exploded.  At 9:43, the plane from Dulles crashed into the Pentagon. Shortly after 10, the south tower of the World Trade Center, its reinforced concrete supports severely weakened by the intense heat of the jet fuel fire, collapsed, showering a torrent of debris into the streets below. Just before 10:30, the north tower followed its twin into the dust, releasing a tremendous cloud of debris and smoke and severely damaging a nearby 47-story building--later in the day it, too, fell--and setting others in the area on fire.  In Washington, in the meantime, the portion of the Pentagon that had been hit also collapsed.

Passengers on the fourth flight, in touch with relatives via cell phones, learned about the attacks on the Trade Center and the Pentagon; they concluded that their plane was being flown to a target as well.  Some decided to storm the cockpit, with the result that the plane crashed in a field southeast of Pittsburgh rather than into a building. (It was, in fact, headed toward the nation's capital.)  All forty-four people aboard were killed.

Within less than an hour of the first crash at the World Trade Center, the Federal Aviation Administration halted all flights at American airports for the first time in the nation's history and diverted to Canada all transatlantic aircraft bound for the United States. President Bush was in Florida, but the White House was evacuated and so were all other federal office buildings in the capital.  Secret Service agents armed with automatic rifles were deployed opposite the White House in Lafayette Park.  In New York, the stock exchanges and all state government offices were closed.

At a news conference in the midafternoon, New York's Mayor Rudolph Giuliani, asked about the number killed, said, "I don't think we want to speculate about that--more than any of us can bear."  That evening, the city reported that hundreds of its police officers and firefighters on the scene were dead or missing.  In the weeks that followed, estimates of the deaths at the World Trade Center ran as high as 6,000 (they were later reduced to 3,000).  Some 200 people died in the crash at the Pentagon...

The attacks of September 11 prompted an outpouring of patriotism rarely seen since Pearl Harbor.  American flags appeared in shop windows and on homes, buildings, cars and trucks, overpasses, and bridges.  Millions of Americans pinned red, white, and blue streamers on their jackets. Across the country, people attended services for the victims, sent money to assist their families, and gave blood for the survivors.  Commentators everywhere extolled the heroism of the firefighters and police who died in the line of duty at the World Trade Center.  Thousands flocked to Ground Zero, now hallowed ground, solemnly peering at the smoldering ruins and the workmen removing the debris. Many posted prayers, notices of the missing, and poems on the protective chain-link fences at the site and on any available wall space (including phone booths) around the city.

September 11 heightened awareness of the fact that the United States, as the world's sole superpower, was an integral part of what was becoming a global civilization. The day after the attacks, the French newspaper Le Monde ran the headline "Nous sommes tous Américains" (We are all Americans).  The victims at the World Trade Center included the nationals of more than eighty nations.  The multinational and multicultural nature of American society was revealed by the names of lost spouses, parents, and children, hundreds of them on posterboards pleading for information about them--people named Schwartzstein, Henrique and Calderon, Kikuchihara and Tsoy, Cassino, Staub, and Egan, Williams, Caulfield, and Wiswall.

On a sheet of paper tacked up in New York's Grand Central Station in late October, an anonymous poet cried out:

Six thousand fallen heroes

The six thousand angels, their trumpets blaring

Are calling us to arms, Waking us up from our selfish slumber

To the truth of our lives, the evil in the world 

We must stop, turn, stand up together as one,

Arm in arm, pillars of strength 

Many observers declared that September 11 had ushered the United States into a new era.  Perhaps it had...  Another poem posted at Grand Central Station told the perpetrators of September 11 why the nation remained strong and resilient:

Well, you hit the World Trade Center, but you missed America

America isn't about a place, America isn't even about a bunch of buildings

America is about an IDEA.

The idea, forged and enlarged through almost four centuries of struggle, had come to include many elements.  The overarching ones--the Fourth of July standards of freedom, equality, democracy, and opportunity--continued to transcend the nation's diversity, bind it together, and at once invigorate and temper its response to the shadowy threats it was now compelled to confront.

Source: Pauline Maier, Inventing America: A History of the United States, vol. 2 (New York, 2003), pp. 1082-1086.
