Joseph S. Lucas and Donald A. Yerxa, Editors
"Permanent War for Permanent Peace" | America and the Western Way of War | No Clear Lessons from the Past | Tocqueville, Powell, Miller, and September 11 | Teaching Religion in American Schools and Colleges | Teaching the Holocaust in America | The Rediscovery of John Adams | Remembering National History
November 2001
Volume III, Number 2
Permanent
War for Permanent Peace: American Grand Strategy since World War II
by
Andrew J. Bacevich
In
his widely praised appearance before a joint session of Congress on September
20, 2001, George W. Bush put to rest any lingering doubts about
the legitimacy of his presidency. After months during which it had constituted a battle cry of sorts, “Florida” reverted to being merely a state.
Speaking
with confidence, conviction, and surprising eloquence, Bush reassured Americans
that their commander-in-chief was up to the task at hand: they
could count on him to see the nation through the crisis that had arrived
nine days earlier with such awful and terrifying suddenness. To
the extent that leadership contains elements of performance art, this particular
performance was nothing short of masterful, delighting the president’s
supporters and silencing, at least for a time, his critics. The
moment had seemingly found the man.
Yet
however much the atmospherics surrounding such an occasion matter—and they
matter a great deal—the historian’s attention is necessarily drawn elsewhere. Long
after passions have cooled and anxieties have eased, the words remain,
retaining the potential to affect subsequent events in ways large or small.
What
did the president actually say? What principles did he enunciate? From which sources did he (or his speechwriters) draw the phrases that he spoke and the aspirations or sentiments that they signified? What unstated assumptions lurked behind? Looking beyond the crisis of the moment, what does this particular rendering of America’s relationship to the world beyond its borders portend for the future?
In
this case, more than on most others, those questions may well matter. Not
since the Cold War ended over a decade ago has an American statesman offered
an explanation of foreign policy principles and priorities that enjoyed
a half-life longer than a couple of news cycles. Bush’s
father during his single term in office and Bill Clinton over the course
of eight years issued countless pronouncements touching on this or that
aspect of U.S. diplomacy or security policy. None
achieved anything even remotely approaching immortality. (George H. W.
Bush’s “This will not stand”—uttered in response to Saddam Hussein’s invasion
of Kuwait in 1990—might have come close. But given the unsatisfactory conclusion of the Persian Gulf War and its frustrating aftermath—with
Bush’s nemesis evincing a Castro-like knack for diddling successive administrations
—the rhetorical flourish that a decade ago sounded brave reads in retrospect
like warmed-over Churchill).
George
W. Bush’s speech outlining his war on terrorism may prove to be the exception.
It qualifies as the first foreign policy statement of the post-Cold War
era with a chance of taking its place alongside Washington’s farewell,
Monroe’s doctrine, Roosevelt’s corollary, and Wilson’s fourteen points
among the sacred texts of American statecraft. Or perhaps a more apt comparison
might be to another momentous speech before a joint session of Congress,
delivered by Harry S. Truman on March 12, 1947.
A
looming crisis in a part of the world that had only infrequently commanded
U.S. attention prompted President Truman to appear before Congress. A faltering
British Empire had just announced that it could no longer afford to support
Greece, wracked by civil war and deemed acutely vulnerable to communist
takeover. Britain’s withdrawal would leave a power vacuum in southeastern
Europe and the Near East, with potentially disastrous strategic consequences.
Filling that vacuum, in Truman’s judgment, required immediate and decisive
American action.
In
short, Truman came to the Capitol not to promulgate some grand manifesto
but simply to persuade Congress that the United States should shoulder
the burden that Britain had laid down by providing aid to shore up the
beleaguered governments of Greece and of neighboring Turkey. But Senator
Arthur Vandenberg, a recent convert from isolationism (and thus presumed
to possess special insights into the isolationist psyche) had cautioned
Truman that enlisting the support of skeptical and tightfisted legislators
would require that the president first “scare hell out of the American
people.” Truman took Vandenberg’s counsel to heart.
Thus,
the president described the challenges of the moment as nothing short of
pivotal. History, he told the Congress and the nation, had reached a turning
point, one in which “nearly every nation must choose between alternative
ways of life.” Alas, in too many cases, the choice was not one that they
were at liberty to make on their own. Militant minorities, “exploiting
human want and misery,” and abetted by “aggressive movements” from abroad
were attempting to foist upon such nations the yoke of totalitarianism.
Left unchecked, externally supported subversion would lead to the proliferation
of regimes relying upon “terror and oppression, a controlled press and
radio, fixed elections, and the suppression of personal freedoms”—the very
antithesis of all that America itself stood for. According to Truman, the
United States alone could stem this tide. In what came to be known as the
Truman Doctrine, he declared that henceforth “it must be the policy of
the United States to support free peoples who are resisting attempted subjugation
by armed minorities or by outside pressures.”
Truman
did not spell out detailed guidelines on where this general statement of
intent might or might not apply. In the matter at hand, Congress responded
positively to the president’s appeal, appropriating $400 million of economic
and military assistance for Greece and Turkey. But things did not end there.
Truman’s open-ended commitment to protect governments threatened by subversion
continued to reverberate. His successors treated it as a mandate to intervene
whenever and wherever they deemed particular U.S. interests to be at risk.
America’s putative obligation to defend free peoples everywhere (some of
them not very free) provided political and moral cover for actions overt
and covert, wise and foolish, successful and unsuccessful, in virtually
every quarter of the globe. Over the next four decades, in ways that Truman
himself could never have anticipated, his eponymous doctrine remained the
cornerstone of U.S. foreign policy.
George
W. Bush’s speech of September 20 bears similar earmarks and may well give
birth to a comparable legacy. This is not because Bush any more than Truman
consciously set out to create such a legacy. But in making his case for
a war on terror, Bush articulated something that has eluded policymakers
since the collapse of the Soviet Union deprived the United States of a
readily identifiable enemy: a coherent rationale for the wide-ranging use
of American power on a global scale. Truman had placed the problems besetting
Greece and Turkey in a broad strategic context. The threat to those two
distant nations implied a threat to U.S. security and to the security of
the world at large. On September 20, Bush advanced a similar argument.
The events of September 11 may have targeted the United States, but they
posed a common danger. The fight was not just America’s. “This is the world’s
fight,” Bush said. “This is civilization’s fight.”
Truman
had depicted a planet in the process of dividing into two opposing camps—the
free world against totalitarianism. Bush portrayed an analogous division
—with “the civilized world” now pitted against a terrorist network intent
on “remaking the world—and imposing its radical beliefs on people everywhere.”
Echoing Truman, Bush insisted that history had reached a turning point.
Once again, as at the beginning of the Cold War, circumstances obliged
nations to choose sides. “Either you are with us,” he warned, “or you are
with the terrorists.” Neutrality was not an option.
As
in 1947 so too in 2001, the stakes were of the highest order. In the course
of enunciating the doctrine that would bear his name, President Truman
had alluded to freedom—free peoples, free institutions, liberty, and the
like—eighteen separate times. President Bush’s presentation of September
2001 contained fourteen such allusions. According to Bush, the events of
September 11 showed that “freedom itself is under attack.”
Casting
the U.S. response to that attack not simply in terms of justifiable self-defense
or retaliation for an act of mass murder but as necessary to preserve freedom
itself imbued Bush’s speech with added salience. Although its meaning is
both continually shifting and fiercely contested, freedom by common consent
is the ultimate American value. In political rhetoric, it is the ultimate
code word.
Defining
the war against terror as a war on behalf of freedom served the administration’s
purposes in two important ways, both of them likely to have longer-term
implications. First, it enabled President Bush to affirm the nation’s continuing
innocence—not only in the sense that it is blameless for the events of
September 11 but more broadly that its role in the world cannot be understood
except as benign.[1]
“Why do they hate us?” the president asked rhetorically. “They hate our
freedoms,” he replied, “our freedom of religion, our freedom of speech,
our freedom to vote and assemble and disagree with each other.” In offering
this litany of estimable values as the only conceivable explanation for
“why they hate us,” Bush relieved himself (and his fellow citizens) of
any obligation to reassess the global impact of U.S. power, political,
economic, or cultural. That others—to include even our friends—view America’s
actual influence abroad as varied, occasionally problematic, and at times
simply wrongheaded is incontrovertible. The president’s insistence on describing
the United States simply as a beacon of liberty revalidated a well-established
national preference for discounting the perceptions of others.
Second,
sounding the theme of freedom enabled Bush to situate this first war of
the twenty-first century in relation to the great crusades of the century
just concluded. Alluding to the perpetrators of the September 11 attack,
the president declared that “We have seen their kind before. They are the
heirs of all the murderous ideologies of the twentieth century. … [T]hey
follow the path of fascism, and Nazism, and totalitarianism. And they will
follow that path all the way, to where it ends: in history’s unmarked grave
of discarded lies.”
The
president did not need to remind his listeners that the dangers posed by
those murderous ideologies had legitimized the rise of the United States
to great power status in the first place. It was the mobilization of American
might against the likes of Germany, Japan, and the Soviet Union that had
hastened the demise of the ideologies they represented. A new war on
behalf of freedom and against evil provides renewed legitimacy to the
exercise of American power both today and until the final elimination of
evil is complete.
Furthermore,
engagement in such a war removes the fetters that have hobbled the United
States in its use of power since the last ideological competitor fell into
its grave. The most important of those constraints relates to the use of
force. Since the end of the Cold War military power has emerged as never
before as the preferred instrument of American statecraft. Military preeminence
forms an integral component of U.S. grand strategy—an effort to create
an open and integrated international order, conducive to the values of
democratic capitalism, with the United States enjoying a position of undisputed primacy. But absent an adversary
on a par with Nazi Germany or the Soviet Union, policymakers during the
1990s found themselves unable to explain to the American people generally
exactly why the United States needed to exert itself to remain the world’s
only superpower—why the need to spend more on defense than the next eight
or ten strongest military powers combined? With U.S. security seemingly
more assured than at any time in recent memory, they found themselves similarly
hard-pressed to translate military preeminence into useful policy outcomes
in places far from the American homeland—why the need to intervene in Somalia,
Haiti, Bosnia and elsewhere?
The
Clinton administration justified its penchant for military intervention
by insisting that it acted to succor the afflicted, restore democracy,
and prevent genocide. Yet in virtually every case the facts belied such
claims. Moreover, even if the purest altruism were motivating Bill Clinton
periodically to launch a few cruise missiles or send in the Marines, Americans
weren’t buying it. Ordinary citizens evinced precious little willingness
to support foreign-policy-as-social-work if such efforts entailed even
a remote risk to U.S. troops. The hope of salvaging a multi-ethnic Bosnia
might stir the hearts of journalists and intellectuals, but the cause was
not one that the average American viewed as worth dying for. As a result,
during the 1990s, the greatest military power in history found itself hamstrung
by its own self-imposed shackles, above all, an obsession with casualty
avoidance. The United States could actually employ its military only with
advanced assurance that no American lives would be lost. The Kosovo conflict
of 1999 epitomized the result: a so-called humanitarian war where U.S.
pilots bombed Belgrade from 15,000 feet while Serb forces, largely unmolested,
pursued their campaign of ethnic cleansing on the ground.
The
fact that these various experiments in peacemaking and peacekeeping almost
inevitably resulted in semi-permanent deployments of questionable efficacy,
trampling on expectations that armed intervention should produce prompt
and clear-cut results, only accentuated popular discontent. Bald-faced
lies by senior U.S. officials—remember the fraudulent promises that the
troops would be out of Bosnia within a year?—didn’t help much.
Now
President Bush’s declaration of war on terrorism offers a way out of that
predicament, making it possible for policymakers to reclaim the freedom
of action that the Truman Doctrine had provided in earlier decades. Under
the terms of the Bush Doctrine, the constraints that hampered the U.S.
in the 1990s need not apply. The calculations governing tolerable risk
change considerably. The gloves can come off—not just in the campaign against
Osama bin Laden, but against any other group or regime that this administration
or any of its successors can plausibly tag with supporting terrorist activity.
The Republican Party that had once codified the lessons of the Vietnam
War in the Weinberger Doctrine has now chucked that doctrine overboard,
telling Americans that they must expect war to be a protracted and
ambiguous affair, a long twilight struggle with even the definition of
victory uncertain.[2]
Furthermore,
defining our adversary as “terrorism” itself makes it all the easier to
avert our eyes from the accumulating evidence suggesting that it is the
quasi-imperial role that the United States has asserted that incites resistance—and
that it will continue to do so. In fact, as Daniel Pipes has correctly
noted, terror is a tactic, not an enemy.[3]
But by insisting that our present quarrel is with terrorism—rather than,
for example, with radical Islam—the United States obscures the irreconcilable
political differences underlying this conflict. We willfully ignore the
fact that bin Laden’s actions (however contemptible) represent an expression
of strongly held convictions (however warped): a determination by whatever
means necessary to overturn the existing American imperium in the Middle
East and the Persian Gulf. Thus do we sustain the pretense that America
is not an empire.
In
the weeks immediately following the terrorist attack on New York and Washington,
a rift about how best to proceed appeared at the highest levels of the
Bush administration. Should the United States embark upon what the president
in an unscripted moment referred to as an all-out “crusade” against global
terror? Or should it limit itself to identifying and eliminating the network
that had actually perpetrated the September 11 attack? In the near term,
the advocates of the narrow approach seemingly prevailed. When Operation
Enduring Freedom began on October 7, 2001, the United States singled out
bin Laden’s apparatus and the Taliban for destruction. Yet U.S. officials
also hinted that the just launched offensive constituted only the first
phase of a multi-part campaign—carefully refraining from specifying what
phase two or phase three might entail. It turned out that the president
had not rejected the idea of a crusade; he had merely deferred it while
keeping all options open.[4]
Assuming
that the first phase of Operation Enduring Freedom succeeds, the doctrine
that President Bush enunciated on September 20 will provide a powerful
argument for those eager to move on to the next phase. Finding a suitable
candidate to play the roles of Al-Qaeda and the Taliban will present few
difficulties: the State Department roster of terrorist organizations is
a lengthy one; regimes suspected of supporting terror include Iraq, Iran,
Syria, Libya, the Palestinian Authority, Sudan, Yemen, North Korea, perhaps
even our new-found ally Pakistan, just for starters.
To
put it another way: Operation Enduring Freedom may be the first instance
of the U.S. waging “war on terror.” But it is unlikely to be the last.
The quest for Enduring Freedom points where the pursuit of absolutes always
has in international relations: toward permanent war waged on behalf of
permanent peace. The Bush Doctrine, like the Truman Doctrine that it supersedes,
offers policymakers a veritable blank check to fight those wars.
Andrew
J. Bacevich is professor of international relations at Boston University.
His book Indispensable Nation: U.S. Foreign Policy in a Global Age is forthcoming.
_________________________________________
[1]
In an effort to prevent any misunderstanding on this point, a statement
of my own personal views may be in order. There exists no conceivable justification
for the terrorist attacks that occurred on September 11. The United States
bears absolutely no responsibility for them. Even if the United States
were guilty of all the crimes that its harshest critics accuse it of committing,
the events of September 11 remain irredeemably evil and utterly and completely
unacceptable.
[2]
The secretary of defense described America’s new war this way: “Forget
about ‘exit strategies;’ we’re looking at a sustained engagement that carries
no deadline. We have no fixed rules about how to deploy our troops.” Donald
H. Rumsfeld, “A New Kind of War,” New York Times, September 27, 2001. On
another occasion Rumsfeld suggested that “victory is persuading the American
people and the rest of the world that this is not a quick matter that is
going to be over in a month or a year or even five years.” Quoted in Thomas
E. Ricks and Steven Mufson, “In War on Terrorism, Unseen Fronts May Be
Crucial,” Washington Post, September 23, 2001.
[3]
Daniel Pipes, “What Bush Got Right—and Wrong,” Jerusalem Post, September
26, 2001.
[4]
As early as October 8, 2001, the United States had notified the UN Security
Council that “We may find that our self-defense requires further actions
with respect to other organizations and other states”—seen as hinting at
a wider war. Irwin Arieff, “US Warns It May Target Others,” Boston Globe,
October 9, 2001.
America and
the Western Way
of War
by
Victor Davis Hanson
We
have suffered a great—and still not determined—loss in the United States,
perhaps as many killed on September 11 as at Iwo Jima, almost twice as
many as the dead at Shiloh, and perhaps ten times the fatalities of the
Coventry Blitz. A trillion dollars has vanished at once from our markets;
forty billion dollars of Manhattan real estate was vaporized; and, more
importantly, thousands of human lives were lost. But the incineration of
innocent civilians in our cities is not due—pace the Taliban and
Mr. Falwell—to our intrinsic weaknesses or decadence, but rather, like
the Greeks in the weeks before Thermopylae, attributable to our naiveté,
unpreparedness, and strange ignorance of the fact that there are some in
the world who envy and hate us for who we are rather than what we have
done.
Yet
bin Laden and the Taliban terrorists have made a fatal miscalculation.
They should read Thucydides about the nature of democracies aroused, whether
Athenian or Syracusan. Like all absolutists who scoff at the perceived
laxity and rot of Western democracies and republics, these cowardly murderers
have slapped an enormous power from its slumber, and the retribution of
American democracy will shortly be both decisive and terrible, whether
manifested in special operations, conventional firepower, or both. The
bloody wages of this ignorance of the resilience of a free people are age-old
and unmistakable— Xerxes’ 60,000 washed ashore at Salamis, 80,000 of the
Sultan’s best floating in the waters off Lepanto, 100,000 lost in the streets
of Tokyo.
Over
some 2,500 years of brutal warring, the real challenge for a Western power
has always been another Western power—more Greeks dying in a single battle
of the Peloponnesian War than all those who fell against the Persians,
Alexander butchering more Greeks in a day than did Darius III in three
years, the Boers killing more Englishmen in a week than the Zulus did in
a year, more Americans falling at Antietam than were killed in fifty years
of frontier fighting. And in the present conflict, America is not fighting
England, Germany, a westernized Japan, or even China or India, nations
that so desperately seek to emulate our military organization, training,
and armament.
Western
nations at war from the Greeks to the present are not weak, but enormously
lethal—far out of proportion to their relatively small populations and
territories. This frightful strength of the West is not an accident of
geography, much less attributable to natural resources or genes. The climate
of Egypt of the Pharaohs did not change under the Ptolemies, but the two
were still quite different societies. Mycenaeans spoke Greek and raised
olives, but they were a world away from the citizens of the city-state
that later arose amid their ruins.
Nor
is our power merely an accident of superior technology; rather it is found
in our very ideas and values. The foundations of Western culture—freedom,
civic militarism, capitalism, individualism, constitutional government,
secular rationalism, and natural inquiry relatively immune from political
audit and religious backlash— when applied to the battlefield have always
resulted in absolute carnage for their adversaries. Setbacks from Cannae
to Little Big Horn led not to capitulation, but rather to study, debate,
analysis—and murderous reprisals. Too few men too far away, a bad day,
terrible weather, silly generals like Custer, or enemy geniuses such as
Hannibal—all in the long haul can usually be trumped by a system, an approach
to war that is emblematic of our very culture. For good or evil, these
terrible protocols of the West at war will soon make themselves known to
the ignorant in Afghanistan and beyond.
Indeed,
such ideals have already appeared even in the first few hours of the attack—doomed
airline passengers first voting on their decision to storm the hijackers
to prevent further carnage to their countrymen; the Congress freely voting
—and finding—vast sums of capital for military operations; bizarre military
hardware and frightening weapons of death glimpsed on our television screens
as they head eastward; media critics and pundits openly lauding and criticizing
U.S. actions past, present, and future, and thereby crystallizing the nature
both of the threat and our response; individual rescue workers, aided by
sophisticated and huge machines, on their own initiative devising ad
hoc methods of saving victims and restoring calm to a devastated city.
Neither
the genius of Mithridates nor the wasting diseases of the tropics nor the
fanaticism of the Mahdists have stopped the heroes, idealists, megalomaniacs,
and imperialists of past Western armies, whose occasional lapses have prompted
not capitulation, but responses far more deadly than their enemies’ temporary
victories. This is not a question per se of morality, but of military
capability and power. It would have been less hurtful for all involved
had the thug Pizarro stayed put in Spain or the sanctimonious Lord Chelmsford
kept out of Zululand.
In
our peace and affluence, ignorant about the military history of Vietnam
and in awe of the suicidal fanaticism of our enemies, we Americans of this
complacent age have forgotten the lethal superiority of the Western way
of war—the Greeks losing only 192 at Marathon, Alexander the Great destroying
an empire of 70 million with an army of 40,000, Cortés wrecking
an imperial people of 2 million in less than two years, or a small band
of British redcoats ending the power of Cetshwayo and his Zulus for good
in less than a year. The arsenal at tiny 16th century Venice—based on principles
of market capitalism and republican audit, despite a West torn by Catholicism,
Orthodoxy, and Protestantism—launched far better and more numerous galleys
than those of the entire Ottoman navy. After Lepanto, the salvage crews
collected the Ottoman cannons —themselves copied on Venetian and German
designs—for scrap, so inferior were they to their European models. At Midway,
American code breakers—products of free universities, nursed on egalitarianism
and free to inquire without political and religious censure—helped to win
the battle before it had even begun. There was nothing like them in the
Japanese military. We are not supposed to say such things, but they are
true and give us pause for reflection upon the prognosis of the present
military crisis.
Greek
hoplites, like all Western armies, defined discipline not as sword play,
captive taking, or individual bravado, but as keeping in rank, marching
in time, drilling, and attacking in unison. And so at the battle of Cunaxa
in 401 they slaughtered their Persian opponents, while incurring not a
single fatality. Roman legions, Spanish harquebusiers, and English squares
followed in the identical tradition, and left corpses all over the globe.
After the disaster at Cannae—Hannibal’s genius resulted in 600 dead legionaries
a minute—Roman legions nevertheless grew, while Carthaginian mercenary
armies shrank. Such civic militarism is a trademark of Western militaries,
where soldiers are not serfs or tribesmen, but fight as citizens with rights
and responsibilities. The last radio transmissions of the doomed New York
City firemen reveal not just professionalism, but a real sense of egalitarianism
and democratic affinity.
In
the months to come, American ground and air forces, with better weapons,
better supplies, better discipline, and more imaginative commanders—audited
constantly by an elected congress and president, critiqued by a free press—will
in fact dismantle the very foundations of Islamic fundamentalism. Indeed,
the only check on the frightful power of Western armies— other than other
Western armies—has rarely been enemy spears or bullets, but the very voices
of internal dissent—a Bernardino de Sahagún aghast at his people’s
cruelty in Mexico, a Bishop Colenso remonstrating the British government
about the needless destruction of Zululand, or an American Jane Fonda in
Hanoi to undermine the war in Vietnam. The Taliban and the hosts of murderers
at bases in Pakistan, Iraq, and Syria may find solace from Western clergy
and academics, but they shall not discover reprieve from the American military.
America
is not only the inheritor of the European military tradition, but in many
ways its most frightful incarnation. Our multiracial and radically egalitarian
society has taken the concepts of freedom and market capitalism to their
theoretical limits. While our critics often ridicule the crassness of our
culture and the collective amnesia of our masses, they underestimate the
lethal military dynamism that accrues from such an energetic and restless
citizenry, where past background means little in comparison to present
ambition, drive, and ingenuity.
Look
at a sampling of the names of the dead firemen in New York—Michael Weinberg,
Manuel Mojica, Paddy Brown, Joseph Angelini, Gerard Schrang, James Amato,
Sean Hanley, Tarel Coleman, Joseph Gulleckson, and Jose Guadalupe. These
rescuers were united not by hue or accent, but, like those in the legions,
a shared professionalism and desire for action. So our creed is not class,
race, breeding, or propriety, but unchecked energy as so often expressed
in our machines, brutal competitiveness, and unleashed audacity—frightful
assets when we turn, as we shall shortly, from the arts of production to
those of destruction.
The
world, much less the blinkered fundamentalists, has not seen a United States
unleashed since World War II and has apparently forgotten all this; we
should not shudder at the false lesson of Vietnam but at the real seminar
of the ages that they are about to learn. Americans are kind, and we are
a generous people. But when wronged, held in contempt, and attacked in
peace, we define victory as the absolute annihilation of our adversaries
and then turn to a very peculiar but very deadly way of making war. In
our reckoning, real humanity is redefined not by smug sermonizing about
the sanctity of life, but by ending the lives of killers who will kill
innocents until stopped.
So
we are a schizophrenic people of sorts, a nation of amateurs that can almost
magically transform itself into a culture of professional killers. In 1860,
Grant was a clerk and Sherman a failed banker and then teamster; in 1865,
they were cruel masters in the art of unmitigated carnage, their huge armies
the most deadly of the age. The world before September 11 has now passed,
and what is to follow is murky, chaotic, and unpredictable. But there is
one constant—we eventually will fight back and when we do, we will most surely win.
Victor Davis Hanson is a military historian and most recently the author of Carnage and Culture: Landmark Battles From Salamis to Vietnam (Doubleday, 2001).
No
Clear Lessons from the Past
by
David Kaiser
On December
8, 1941, President Franklin Roosevelt labeled the attack on Pearl Harbor a “date which will live in infamy,” and Congress declared
war on Japan. Three days later the United States was at war with Germany. Three years and nine months later, with about 300,000 young Americans dead, our enemies were entirely defeated, and the creation of a new world began. The destruction of the World Trade Center and the attack on the Pentagon may have had a similar emotional impact upon the American people, but the task of eliminating the threats posed by terrorism makes the Second World War seem almost simple by comparison. It
is most unlikely that we will suffer 300,000 people killed over the next
four years, but it is equally unlikely that we will have eliminated terrorism
by then either.
Although
the United States
seemed woefully unprepared when the Second World War broke out—all the
more so since the Pacific fleet had been crippled—both the problem we faced
and the eventual solution were already quite clear. As Churchill put it in his memoirs, the new coalition of the Soviet Union, Britain, and the United States had many times the combined resources of Germany, Italy, and Japan, and the eventual destruction of the Axis was only a matter of production, mobilization, and deployment—that is, of time. We understood both the threat—the Axis military forces—and the appropriate response—the eventual conquest of two medium-sized countries, Germany and Japan. Despite many further reverses during 1942, the allies rapidly achieved
superiority and reached their objectives.
Comparisons
between September 11 and Pearl Harbor—focusing on America’s unpreparedness
and the emotional shock felt by the nation—have already become commonplace.
Beyond that, however, this historical analogy offers little guidance. The
threat is not a purely military one, nor can it be easily dealt with by
military means. Apparently, the threat
is a large, well-trained organization—allied to other similar organizations—based
in a remote and unfriendly country, but living everywhere and nowhere. Osama
bin Laden has made clear that he wants to eliminate American influence,
and the regimes that depend on it, from the Middle East.
His weapon is not traditional war, but terror. Ideally, terrorists represent
a problem for law enforcement rather than the military, but law enforcement
agencies in various Middle Eastern countries allow them to operate—some
from ideological sympathy and some out of fear. In
theory, that deprives these governments of all legitimacy and makes them
enemies of the United States.
In practice, it may make the problem we face insoluble for many years to
come.
World
reaction suggests that the United States can now build a broad coalition
designed to make it impossible for organized terrorism to operate anywhere—
a coalition including not only Western Europe and our Asian allies, but
also Russia and other former Soviet States, which have already been victims
of terrorism themselves. Arab states such as Egypt and Algeria, well accustomed
to terrorist threats, also seem willing to participate. But even if we
set aside Iraq, the full cooperation of the predominantly Muslim nations
is highly unlikely.
Although
we now have a right and a duty to strike at any perpetrators we can identify,
it seems to me far from certain that the kind of precision strikes in which
the American military now specializes will be able to destroy Osama bin
Laden, much less his organization, within Afghanistan. That country is
very large—approximately 1000 by 400 miles of mostly mountainous terrain—and
has a population of more than twenty million people. The Soviet Union had
no success operating there; can our army expect much more? Can we really
commit the resources necessary to establish law and order in a hostile
country in which Muslim fundamentalists are the strongest political force?
Can we conquer Iraq, which the Bush administration clearly suspects of
complicity, at the same time? Is the western world prepared to re-occupy
large portions of the Middle East for decades to come?
And
there are further concerns. Bin Laden and his associates could flee to
a neighboring country—Iraq is most likely. Is there any doubt that they
would continue trying to mount fresh outrages? New security measures may
make another incident like September 11 unlikely, but other kinds—some
even worse—are entirely possible. Won’t a greater American presence in
the region increase the number of their recruits? Might it not actually
topple some friendly governments?
The
new anti-terrorism coalition which must now form—and which needs, if at
all possible, to work through the United Nations—must discover effective
means of putting pressure on states that refuse to cooperate. These may
include refusing to allow their nationals to live abroad—a draconian measure,
certainly, but one that seems to be both logical and appropriate, given
the difficulty of distinguishing innocent people from terrorists who threaten
thousands of people. But perhaps most important of all, we must enlist
all the nuclear powers of the world in an attempt to inventory and secure
every single nuclear warhead in their possession. Clearly, the men who
flew planes into the World Trade Center would have detonated such a warhead
if they could have gotten their hands on it. This is the most urgent problem
that we face.
Given
the nature of modern society and its vulnerabilities, we will not be safe
from hijackers and bombers until effective and cooperative political authorities
essentially rule the world. Missile defense will do nothing to bring that
about. No matter what happens, we will probably have to endure more attacks
for at least ten years. We must try to establish some momentum toward a
true new world order, and the role of traditional military force in this
process is anything but clear.
David
Kaiser, a historian, teaches at the U.S. Naval War College. He is the author
of American Tragedy: Kennedy, Johnson, and the Origins of the Vietnam War
(Harvard University Press, 2000).
Tocqueville,
Powell, Miller, and September 11
by
Walter LaFeber
The
September 11 attacks have changed how Congress and the Bush administration
think about security, the economy, budget issues, and (at least temporarily)
even partisanship.
Unknown,
of course, is how long this different thinking and America's New War (as
several television stations now brand these beginnings of the 21st
century) will shape the politics and spending policies of a nation usually
reluctant to commit itself over long periods to quite different, and often
individually restrictive, priorities—especially when the old priorities
and ways of thinking produced the highly affluent, if often cloying but
comfortably trivial, society of the late 1990s. Monica Lewinsky and Forrest
Gump meet Mullah Omar and Osama bin Laden.
A
few film critics are already beginning to ponder this pivotal question.
They are wondering aloud whether Hollywood
will finally have to begin making movies for adults. Perhaps, although
it will be economically tempting, not to mention easier, to produce the
same kind of mindless escapism, heroism-from-a-safe-distance, technology-instead-of-thought,
easy-to-understand-good-wars, and other types of platitudinous films. The
idea of a good war, for example, looked one way when viewed from the perspective
of soldiers who came under fire to destroy Hitlerism and Japanese militarism.
It looked different from the perspective of many civilians who were able
to enjoy an adequate income for the first time in a generation. The rationing
of monthly amounts of meat, sugar, and gasoline between 1942 and 1945 appeared
less restrictive to those who during the 1930s could not afford to buy
meat or automobiles. Given this perspective, the New War against terrorists
does not promise to be a good war. The New War is suggestive much less
of Pearl Harbor than of Tocqueville's warning
170 years ago that Americans will have trouble conducting a complex, secretive,
long-term foreign policy because they have a short attention span, have
too little historical understanding to comprehend long-term interests,
don't trust unforthcoming governments, and prefer such domestic pursuits
as making money. To some students of U.S. foreign
policy, this formulation has become known as the Tocqueville problem: how
can a complex, pluralistic, entrepreneurial society be organized and disciplined
over long periods to support a foreign policy or war? This problem has
necessarily been at the center of presidential concerns in every war Americans
have become involved in since the late 1790s.
Good
wars, if Tocqueville is correct, turned out to be those when some were
sent to die for a great cause and most remained safe and increasingly prosperous
at home. The wars of 1812 and 1861-1865 may well have been necessary, but
given the near-defeat and destruction of U.S. property in the first, and
the incredible bloodshed in the second, few would have called them good.
President George W. Bush recognized a massive fact of American history
when he at once dispatched a professional (that is, non-conscripted) force
overseas, while telling Americans to invade shopping malls, pocketbooks
at the ready. During World War II, President Roosevelt asked Americans
to save, especially through war bonds, as they enjoyed increasing prosperity.
In the New War, President Bush orders Americans to spend; both the Tocqueville
problem and the nature of 21st-century capitalism demand it.
It
would seem this New War might be different and move outside Tocqueville’s
categories. For the first time since 1812, a foreign enemy has inflicted
heavy civilian casualties on the U.S. mainland, indeed on the nation’s
largest city. The question nevertheless has to be asked as Tocqueville
formulated it: whether, and if so how long, Americans are willing to change
their way of acting and, above all, thinking in a war that, as the president
constantly warns, can last decades and not (as in, say, Kuwait or Kosovo)
hours or months. A thirty-minute longer wait at airline check-in counters
does not count in answering this question. Gun battles at 30,000 feet with
unbalanced passengers, or a long economic downturn during an extended,
unpredictable war, do.
How
Americans think about this new war, and how patient they will be in accepting
the government’s definition of the demands for that war, will depend not
only on whether, as occurred during the Vietnam conflict, they estimate
that the costs are far outrunning the benefits. Contrary to weird advertisements
recently appearing in a number of college newspapers, the peace movement
of the 1960s and early 1970s was not anti-American. But it was, contrary
to these historically ignorant advertisements, much less important in terminating
the U.S. involvement than were business leaders (who, as they told Secretary
of Defense Clark Clifford in 1968, were frightened of the terrible economic
conditions looming just ahead if policy were not changed), or the U.S.
military forces which, with their own institutions at stake, began a fundamental
reevaluation of their relationship to both American society and U.S. foreign
policies. This turned out to be a rethinking that led by the mid-1980s
to the Weinberger or, as it is now known, Powell Doctrine. The Doctrine
demanded that presidents meet certain criteria before placing troops in
harm’s way. It sometimes, but not usually, meant not sending them out at
all.
This
self-analysis undertaken by American military leaders spoke directly to
the Tocqueville problem (and was obviously caused by it). The reevaluation
has turned out to be one of the most significant historical developments
of the last quarter-century. This Doctrine, more than Afghanistan’s mountainous
terrain or the fear of an Islamic backlash, will determine U.S. policy,
especially while Colin Powell is secretary of state. Stated more accurately,
the effects of Afghanistan’s topography, the nature of Islam’s response,
and the political temperature of the American people will be viewed through
the prism of the Powell Doctrine. Whether one agrees or (as in Madeleine
Albright’s case) disagrees with the Powell Doctrine, it seems that at the
moment U.S. military and economic power came to dominate what was termed
a unipolar world power structure, the military’s new thinking went far
in determining how that power would or would not be used.
Such
fundamental rethinking is rare in American history, and for students of
U.S. history, and especially foreign policy, such rethinking is long overdue.
What has passed for rethinking has too often been either narcissistic analysis
of American culture without reference to the arena of international power
within which that culture exists, or an examination of an international
history whose most notable characteristic is an emphasis on other cultures
(often defined by western-designed categories), without any apparent understanding
of, or concern for, the many forms of U.S. domestic power that so largely
shaped that international history over the past century. When written this
narrowly, so-called international history takes the exercise of considering
Hamlet without the Prince to entirely new levels. If the origins and evolution
of the present crisis are to be understood, a starting place will be the
policies of the sole superpower over the past 40 or so years—the starting
place, it needs to be emphasized, not the ending place. The reaction of
parts of the globe and of foreign cultures to what, as Martin Sklar has
noted, has actually been a century of diplomacy demanding the open door
for American ideas, goods, and often military presence, must also be closely
examined. But a starting place for this analysis is not the reaction of
foreign cultures.
For
all the recent talk about soft power (that is, culture separate from the
hard power of economic, especially technological, pressure, and of coercion
by military force), the present crisis is shaped by cultural values attached
to direct economic-technological pressure, and, at times, when the need
arises and the Powell Doctrine permits, to force. Soft power is an oxymoron
historically as well as clever in literary terms. McDonald’s power to change
the way other people eat is not due solely to its hamburgers, but to its
myriad technologies and U.S. marketing practices. Americans like to distinguish
neatly between the supposed soft power of their corporations and the hard
power of their government, especially its military. Many people who destroy
American property in southern France or parts of Asia do not make such
careful distinctions.
These
several views of the same American phenomenon would seem to go along with
the parts of post-modern cultural theory which maintain that texts and
events have to be interpreted relatively according to context, reader,
or the author’s multiple meanings. But these views have nothing to do with
such theory. They have everything to do with an understanding of American
power and its reception in other cultures. In any event, the relativism
related to so-called post-modernism loses some of its explanatory power
when applied to bin Laden’s unqualified text that he would like to kill
every American he can reach (unless the American “changes” to bin Laden’s
own view of the world).
American
intellectuals, and too many others, have experienced this problem before.
In the 1930s, a generation of liberals, exemplified by Carl Becker, endured
severe frustration and guilt resulting from their involvement with the
Woodrow Wilson administration’s rape of the truth in 1918-1919 in order
to make the world safe for democracy (and, as well, to deal with the Tocqueville
problem at home). Many of these disillusioned dealt with their problems
by retreating into a relativism that paralyzed them as Hitler came to power.
Becker’s famous analysis in his American Historical Association presidential
address of how many ways a document could be read, depending on who was
reading it, proved not to be useful in understanding Mein Kampf.
It took the so-called realism of Reinhold Niebuhr and Hans Morgenthau to
take U.S. intellectuals and policymakers over to a more practical approach.
Given the shortcomings of that realism, not least its refusal to consider
the domestic power blocs that shape the means and ends of foreign policy,
one hopes something better will emerge from the present rendezvous with
relativism’s results.
It
is difficult to find a silver lining in a war that kills nearly 6,000 civilians
at the outset. But one beneficial result might come from a new emphasis
on reexamining the internal dynamics of the world’s superpower —how they
have reshaped or tried to reshape other societies, how they have nearly
led to a larger number of U.S. military interventions in the ten years
of post-Cold War than in 40 years of Cold War (despite the supposed restraints
of the Powell Doctrine), and how the Tocqueville problem can worsen dramatically
if, as was the case in 1952 or the late Vietnam war years, the New War
lingers on with the accompaniment of American war dead and economic stagnation.
This is not a prediction of what will occur, but a consideration of what
can occur if the Tocqueville problem is ignored, or is smoothed over temporarily
by misleading government statements and cheerleading.
Given
the nature of the Osama bin Laden and Saddam Hussein regimes, and the explicit
threats they pose to American lives (threats that have already been made
real), a war has to be fought. The reasons for that war, deeply rooted
in the history of politics, foreign policy, and technology, had better
be understood and explained by both government officials and historians.
The trade-off of military needs, if this New War is to be successfully
waged, against the requirement that Americans become associated with highly
undemocratic, militaristic, even medieval, regimes, will have to be explained
and debated. The tradeoff of internal security against the restriction
of civil liberties (that panoply of liberties for which the war is allegedly
being fought) will have to be explained and debated. The simultaneous waging
of the war against terrorism while carefully considering how Americans
should think about other foreign policy problems, such as a rapidly changing
China and an increasingly unstable Latin America, has to be explained and
debated. Doing all this simultaneously challenges the Tocqueville problem
with a dangerous overload.
Playwright
Arthur Miller once reformulated the Tocqueville problem by remarking that
Americans respond to a call for righteousness if they mistake it for a
call to lunch. The New War will be an ultimate test of Miller’s skepticism,
and one hopes he is wrong. Meanwhile, it might also be remembered that
in the hard power world of international affairs and terrorism, there is
no free lunch.
Walter
LaFeber is the Marie Underhill Noll professor of American history at Cornell
University. He is the author of The Clash: U.S.—Japanese Relations throughout
History (W.W. Norton, 1997), which won the Bancroft Prize.
Teaching
Religion in American Schools
and Colleges: Some Thoughts for the 21st
Century
by
Wilfred M. McClay
Of
all the surprises the 20th century
had in store for us, none was greater than the amazing persistence of religion.
For better or worse, the older dream of a fully privatized religious faith
and a fully secularized public life, the dream of what the Rev. Richard
John Neuhaus memorably labeled “the naked
public square,” seems to be losing its hold on the national imagination.
Religion has gained a new lease on life in contemporary America,
partly because its would-be displacers have failed to supply an equally
compelling framework for meaningful, morally coherent lives. Like it or
not, we seem to be launched into an era in which once-settled terms of
separation between religion and public life are in the process of being
renegotiated.
One
would not know any of this was happening, however, if one were to judge
only by the content of texts and courses in American history offered in
our schools and colleges. There, old assumptions still reign,
the tenets of a soft, cautious, inoffensive secularism that omits rather
than debunks. One still encounters only the most tangential or unavoidable
mention of American religious beliefs and practices. This is not just a
matter of the disappearance of religious perspectives on the subject
matter at hand. Less defensibly, it also has entailed the near-disappearance
of religion as a subject worthy of the serious attention of any
educated person. The result is an account of the American past that is
radically deficient.
To
be sure, scholars and teachers should strive to maintain a Weberian
distance between themselves and the vagaries of public opinion. Otherwise,
they have nothing distinctive to offer, and their salt will lose its savor.
But the renewed visibility of religion in American society should also
prod us to consider ways we can make that subject better reflected in the
way American history is studied and taught. Such a shift would remind us
that the exclusion of religious factors from the story of the American
past is a gross distortion, tantamount to the exclusion of such massive
structural factors as climate, geography, economics, and demography--and
that religion is not simply reducible to some combination of material and
social factors. This is not to say, of course, that our understanding of
religion should simply revert to what it was before the advent of secular
historiography. That would be absurd, not least because it would mean throwing
out valuable insights along with the ideological bathwater.
But
when the American Historical Association’s most recent catalogue of publications
designed to “strengthen the study and teaching of history” includes not
a single publication addressing the religious history of the United
States—while making room for such subjects
as “Gender, Sex, and Empire” or “Teaching History with Film and Television”—it
is clear that something is terribly amiss. Students—and especially students
who come from secular backgrounds—desperately need to know about the religious
element that permeates the human past and present. And students of American
history need to encounter a more even-handed understanding of the respective
roles played by both religious and secular perspectives and institutions
in the making of the American nation.
This
will not be an easy understanding to achieve. But it is important to make
a beginning, and in what follows I would like to offer four general suggestions
as to how the current imbalances can be addressed, and how religion, both
as a subject of history and as a perspective on history,
can begin to be accorded its rightful position in the curriculum.
1.
Remember—and take comfort in the fact—that one size does not fit all.
One
of the greatest strengths of American education is its astounding institutional
diversity. We have schools and colleges that are public, private, Roman
Catholic, Jewish, Episcopal and other mainline Protestant, generically
evangelical Protestant, sectarian, nonsectarian, single-sex, “historically”
black, Indian—the list goes on and on. And even among public institutions,
there is enormous variation according to demography and region.
This
is a diversity to be celebrated, and preserved. Under such circumstances,
it would be a mistake of the first order to attempt to devise a standardized
approach to the teaching of, or about, religion. This is true not only
for prudential reasons—i.e., that a school leader cannot do in New York
what he or she can do in Alabama. It is also true for deeper reasons. No
task facing American education is more important than that of preserving
its institutional diversity; and that means that religious schools and
colleges ought to be especially keen about intensifying their efforts to
be distinctive, rather than to be all things to all people. The single
best encouragement we can give to the revitalized teaching of American religion is the self-conscious fostering of precisely such institutional variety. That represents American pluralism at its best, and would be infinitely preferable to the universal adoption of a lowest-common-denominator
canon of knowledge about religion. Let the variety of opinions be robustly
reflected by a competing variety of distinctive institutions—rather
than striving to have them be reflected universally, but anemically, by
a prescribed “diverse” perspective imposed uniformly upon all institutions.
2.
Be clear about what we are studying when we study “religion.”
What
kind of knowledge are we seeking when we study “religion”? Is it knowledge
about the nature of ultimate reality, with “religion” representing the
repository of our most important and far-reaching reflections upon it?
Or is it knowledge about the history of certain social and cultural institutions,
and about the patterns of belief and practices of sacralization and worship
in human societies, that we are exploring? And what should be our proper
attitude or disposition toward the study of “religion”? Obviously, we want
to steer between unrelenting antagonism and uncritical acceptance. But
how exactly do we situate ourselves on the sliding scale of sympathies?
By
and large we want to treat religion as an integral part of life, as something
“lived” as well as thought, felt, and professed. This is all well and good.
Religious history is never just the story of ideas, or even of the institutions
in which those ideas are housed. Often it is the story of the gaps between
profession and action, between what people say and what they do. Sometimes
it seems to be more of an account of human frailty, hypocrisy, and veiled
power relations than one of piety and righteousness. But in the end religious
history has the virtue of presenting us with religion as a central organizing
principle of social and cultural life, and therefore as something highly
integral in importance, as well as highly functional in character.
There
is considerable value in looking at “religion” in this generic and functional
way. But there is also a danger in it—the danger of trivializing religious
belief. The reduction of diverse religious commitments to an algebra of
faith, in which one can plug in the relevant variables for any given situation,
does not really take seriously the intellectual and cognitive power of
specific religions, as they inhabit the minds of believers. It tends to
convert “religion” into something very different from what the believer
thinks it is. This is perhaps not entirely a bad thing, within limits.
But it can quickly take on a pattern well described by the philosopher
George Santayana:
Any
attempt to speak without speaking any particular language is not more hopeless
than the attempt to have a religion that shall be no religion in particular
. . . . Thus every living and healthy religion has a marked idiosyncrasy.
Its power consists in its special and surprising message and in the bias
which that revelation gives to life. The vistas it opens and the mysteries
it propounds are another world to live in; and another world to live in—whether
we expect ever to pass wholly over into it or no—is what we mean by having
a religion. (Reason in Religion)
Religion
is the ultimate in “totalizing discourse,” the master narrative of master
narratives. Hence, students should come away from the study of religion
with the feeling that they have passed through the eye of a massive storm,
through a force of immense power for creation and destruction, and therefore
of immense consequentiality, since every religion is in some way an attempt
to take account of the ultimate and of our proper relationship to it. It
is for that reason—and not out of a misplaced sense of relativism or multiculturalism
—that students should learn to understand and accord basic respect to established
faiths other than their own.
3.
Be prepared for the fact that the reintroduction of religion to the study
of American history will entail a more general change in the way the subject
itself is studied.
Educators
will find it tempting simply to insert the facts and narratives of religious
history into the existing accounts of the American past—a strategy that
might be summarized as, “Add religion, and stir.” Such a move is by no
means to be disdained. It would be a great step in the right direction.
But it would still not be entirely sufficient. The addition of religion
to a program of study will require that historians and teachers of history
take much more seriously, and make much more central, the role of ideas
and consciousness in human history. This will require a dramatic reorientation
in a discipline that has prided itself upon its growing emphasis on general
causes and large-scale structural changes as the only adequate explanations
of historical developments. Such an emphasis has yielded countless valuable
insights, but it only tells us about part of the human condition. Those
disciplines involved in the study of humanity must also take account of
the fact that it is humans that they study, not billiard balls, and that
any explanatory scheme that fails to take account of human volition, human
agency, and human consciousness, is clearly inadequate to the task at hand.
It
is largely meaningless, for example, to talk about what historical actors
are doing without some reference to what they think they are doing.
Our ideas are our maps of reality itself, the blueprints according to which
we order our desires, our morals, our choices, our goals, and our dreams.
This does not mean that we are never impelled by material causes, only
that even strictly materialistic motivations can be understood as
such only through the filter of a set of ideas. The history of ideas and
culture is at the very core of our self-understanding, for it records the
ways that we have wrestled with the most urgent questions of human existence.
Seen in that way, such a record, far from being a mere ghostly procession
of disembodied abstractions, takes on the most pressing and riveting human
importance.
Therefore,
explanations of historical events that attempt to drain the element of
religious conviction from them are reductive falsifications. Our accounts
of the Puritan migration, the Salem witch trials, the American Revolution,
the abolitionist movement, the Civil War, immigration and nativism, the
Progressive movement, the rise of Protestant fundamentalism, the African-American
civil rights movement, the anti-abortion movement, and the like, cannot
pretend to be adequate if they fail to respect the conscious and deeply
felt religious orientation by which the key historical participants believed
themselves to be animated. This does not mean we should simply take historical
actors at their word, and leave it at that. No historian should ever do
that, at least not in any simple sense. Rather, it means refraining from
disregarding their plain words, and the concepts behind their words—a
grossly reductive and historically impoverishing practice that is all too
common.
It
follows, therefore, that a more adequate incorporation of religion within
historical study should emphasize the study of original texts, both
popular and learned, read with a view toward the sympathetic apprehension
of the world picture that lies behind those texts. All students of American
history should read, for example, the Puritan leader John Winthrop’s “A
Modell of Christian Charity,” the speech he gave aboard the Arbella
before she came ashore at Massachusetts Bay in 1630. Such a reading will
disclose to them the profound religious conviction (and the direct Biblical
allusion) behind his famous description of the colony as “a city upon a
hill.” They should read Roger Williams’s Bloudy Tenent of Persecution,
one of the signal documents in the history of American religious liberty
that even today constitutes one of the strongest arguments for a more expansive
reading of the First Amendment’s guarantees of free religious exercise.
They should study the lyrics to Julia Ward Howe’s “Battle Hymn of the Republic,”
to see just how profoundly Northern reformers’ sense of the Unionist cause
was shaped by their religious sentiments. They should also read the writings
of James Henley Thornwell and Robert Lewis Dabney to get a sense of the
very different ways that Protestant Christianity was understood south of
the Mason-Dixon line. Above all, they should read the great political speeches
and documents of American history, ranging from Lincoln’s Second Inaugural
Address to William Jennings Bryan’s “Cross of Gold” speech to Albert J.
Beveridge’s “The March of the Flag” to Franklin D. Roosevelt’s First Inaugural
Address to Martin Luther King, Jr.’s “Letter from a Birmingham Jail,” all
of which are inescapably laced with Biblical allusions, Biblical ideas,
and Biblical sentiments.
To
train students to reach the point where they have the ability to read such
documents with understanding and imaginative sympathy would be a worthy
goal indeed. One has to be realistic about even such a modest goal, of
course. The reading abilities of students, and the level of their “religious
literacy,” including their knowledge of the Bible, are both far below what
one could desire. We all are painfully aware of that fact. Moreover, one
of the dirty little secrets about the debunking “hermeneutics of suspicion”
that prevails in so many of our classrooms is the fact that the casual
debunking that so often masquerades as “critical thinking” is far easier
than the alternative, which requires the acquisition of real knowledge.
But the turnaround of our condition has to begin somewhere, and there is
no better place to begin than with a collection of documents such as those
listed above. There can be no substitute for the kind of direct and relatively
unobstructed window onto the past that an original document provides. Such
documents will make it unmistakably clear that we do violence to the past
when we fail to acknowledge the religious sentiments that pulsate through
it.
4.
Emphasize the importance of “religious literacy” as an essential component
in citizenship and civilized life.
Even
students and teachers who maintain a resolutely secular outlook can be
convinced that some rudimentary knowledge of religion ought to be part
of a decent liberal education, precisely because liberal education is an
initiation into a conversation about the means and ends of one’s civilization.
People of faith have always played a central role in that conversation.
Unless we willfully expunge the voices of religious adherents from the
record of humanity, we have to acknowledge religion’s profound historical
role in the shaping of culture and the formation of morality. Whenever
we think of our attempts to describe and define those things that we think
of as ultimate—life, death, suffering, salvation, guilt, forgiveness, love,
community, virtue, and so on—we have to acknowledge that religion has always
had a central part in such attempts. Our ideas of human rights, law, social
obligation, privacy, social welfare, race, gender, moral responsibility,
education, childrearing, medicine, adulthood, competency, selfhood, liberty,
equality, etc., impel us back to a consideration of religious conceptions
of the human person, if only as an historical starting place.
In
addition, there are processes at work in the contemporary world that promise
to undermine our free-and-easy assumptions about the meaning of the human
person, and for which a fuller understanding of religious perspectives
may form a vital resource. There is, for example, the phenomenon of the
ever-shrinking globe, ushering in a new world that is economically integrated
without being culturally or politically cohesive, and therefore likely
to experience social unrest, massive personal displacement, and pervasive
individual anxiety.
As
we wrestle with this powerful and intrusive push toward greater global
homogeneity, and the inevitable backlash it spawns, it is critically important
to distinguish those things that are, or ought to be, true of all human
beings from those things that are naturally diverse, and rightly peculiar
to human beings dwelling in particular local, regional, and national cultures.
These latter, more particularist considerations should include a respectful
awareness of deep and historically grounded religious commitments and differences.
Nothing could be more misleading than the easy and arrogant assumption,
so common among the college-educated in America, that we now live in the
age of ultimate truths, and that all the world is drawing closer and closer,
bit by bit, to our own “enlightened” secularity. This is a disastrous attitude
to take in confronting a world in which religion’s place looms as large
as ever in the overwhelming majority of people’s lives. Indeed, that assumption
is equally misleading even when applied to the United States alone—which
is why an approach to American history that neglects its religious dimension
is not only inaccurate, but damaging, to the extent that it fosters incomprehension
and intolerance in precisely those circles that ought to know better. One
of the things that a liberal education ought to free us from is the ignorance
and self-absorption of those who cannot imagine any world other than their
own. Cosmopolitanism ought to be something more than just the provincialism
of the “educated.”
Sympathetic
exposure to a wide range of American religious beliefs and practices, then,
should be a central feature of civic education in American schools and
colleges. History is perhaps the perfect discipline within which this exposure
can be effected, simply because teachers and students of history are not
required either to accept or reject particular religious truth claims in
order to appreciate their indisputable importance and influence. In that
sense, the serious study of religion can be an ideal meeting ground between
the believing and unbelieving, an intellectual commons whose very existence
can make a genuine contribution to American pluralism—a pluralism that
is, alas, often more honored in the breach than the observance by our current
educational establishment.
Wilfred
McClay holds the Sun Trust Bank Chair of Excellence in the Humanities at
the University of Tennessee at Chattanooga. He is the author of The Masterless:
Self and Society in Modern America (University of North Carolina Press,
1994), which won the Merle Curti Award from the Organization of American
Historians.
Teaching
the Holocaust in America
by
Paul Lyons
Recently
I taught a master’s level course called “The Holocaust and the American
Experience” with the intent of placing the Shoah
within an historical and comparative framework, recognizing both its distinctive
features and its inevitable similarities to other moments of horror in
human history. I was particularly concerned that my students, future American
public school teachers, be able to respond to the question, “Why the Holocaust
and not (fill in the blank)?”
My
own perspective, informed in part by the work of intellectual historian
David Hollinger and Holocaust scholar Yehuda
Bauer, is that isolating the historical injustices experienced by a particular
group—Jews or others—risks playing into what some have called a hierarchy
of victimhood, a form of identity politics
in which only one’s own suffering counts. Too often educational institutions create
group-oriented studies programs, e.g., Jewish Studies, African-American
Studies, Women’s Studies, Gay and Lesbian Studies, etc., in which students
fixate on a particular group without stretching their imaginations and
ethical concerns to include the broadest range of those who have been mistreated.
Indeed, I want students to begin with the axiom that all peoples have histories
which include glory and shame, that all peoples have the capacity to regress
toward intolerance, ethnocentrism, discrimination and bigotry.
In
this regard, I weigh in on one side of the recent arguments between Christopher
Browning’s “ordinary men” of Police Battalion 101 and Daniel Goldhagen’s
“ordinary Germans,” who, he claims, were carriers of an exterminationist
anti-Semitism. Goldhagen’s approach
seems to me to deny the social and psychological dynamics which might lead
quite conventionally decent human beings to slide down a slippery slope
of committing evil acts of genocide. Given recent history it seems reckless
to deny the human capacity to participate in mass murder. At the same time,
such recognition must not rush headlong toward a Hobbesian
or social-Darwinist condemnation of human nature. What we as educators
need to understand and communicate to our students is the contextual nature
of human behavior, its range and subtleties, and the contradictory ways
that humans respond to moral challenges. As such, we teach humility before
the wonder—the heroism, the cowardice, the insensitivities, the villainies—of
our own natures, our own histories.
My
course began with a consideration of Western and, particularly, American
aggressions against Native-American Indians. It then examined the experience
of African-American slavery and racism, followed by considerations of anti-immigrant
nativism, the Klan of the 1920s, possible fascist dangers during both the
Depression and the McCarthy era, American governmental responses to the
Holocaust, the internment of Japanese-Americans during World War II, the
My Lai massacre, the contemporary dilemmas of multiculturalism and ethnic
diversity, and concluded with an all too brief evaluation of Peter Novick’s
recent and controversial The Holocaust in American Life. How is it that in a nation which was
essentially distant from the Holocaust we have generated such a remarkable
cultural and institutional support system for its remembrance and study?
And, how is it that this same nation, whose most morally indefensible behavior
was directed against Native-Americans and African-Americans, has not found
the resources or the funding to address those genocidal aspects within
our history? I do not share Novick’s questioning of the validity of the
enterprise which has built the U.S. Holocaust Museum. I do, however, join
with Novick in asking what accounts for this seeming imbalance.
At
first, my students seemed wary of what the course was about. After all,
there was virtually nothing in the syllabus that was particular to the
Holocaust. But by the end, most seemed enthusiastic about approaching the
Holocaust comparatively, indeed seeing it as inherently comparative. This
is not to engage in the pernicious game of comparative suffering or to
deny the value of assessing those aspects of the Holocaust which remain
distinctive and unprecedented. It is only to come to an understanding that
all genocides carry such singularities, all are at some level, unique.
The point, however, of historical reconstruction is to frame the particular
within the general, to see simultaneously what is only true of the Shoah
and yet offers insights to other moments of human criminality.
I
tell my students that the study of history is most similar to the theatre.
We must ask what it was like to exist at another moment in time and space.
This is what we mean by a liberal arts education. It’s not enough to imagine
being the oppressed— although that is essential—but one must also consider
what kinds of thoughts, feelings, customs, behaviors, values, idiosyncratic
experiences, led, for example, a white Protestant in the 1920s to join
the Second Klan or to put out cigarettes on the bodies of those engaging
in sit-ins at the lunch counter in Greensboro, North Carolina in February
of 1960. These are the questions which, finally, matter the most in the
lives of our students. Do they go along with peers, or do they stand up
for what is right? Do they join or confront the bullies of the world?
David
Bankier argues that no more than 5 percent of Germans were pathological
anti-Semites while perhaps the same percentage were heroic, “righteous
gentiles,” willing to stand up and be counted, often at great risk to themselves. Based on this, I would suggest that
the vast majority of people turn away from the kinds of injustices which
wind up highlighted in the history books. Such people determine the parameters
of genocide; therefore, we need to know more about such behaviors, those
of “ordinary” men and women who tolerate, sanction, or finally engage in
acts of evil.
In
my book Class of ‘66, baby boomer respondents described themselves
as living in a suburban cocoon, walled off from the troubles of nearby
impoverished cities and of global hot spots. Critics of suburbia emphasize, perhaps
unfairly, this propensity to turn away from the life of the polis.
In fact, many suburban people are extensively engaged in what Herbert Gans
calls their micro-society, participating in charity drives, helping out
neighbors in distress, and generally being good citizens. But they raise the drawbridge over
the moat connecting them to the problems of “the other.” They do not hate
“others,” although they carry very real prejudices. They simply act as
if such folks do not exist within the same moral universe. And if there
is a crisis, if there is a test of moral resolve, if they are placed in
a situation in which they are under peer pressure to conform to a moment
of ugliness and bigotry, they may succumb. As Bankier suggests, at a moment
when their society is in crisis, they may be drawn into the small minority
of haters.
We
cannot expect all of our students to become “righteous gentiles.” We will
be fortunate indeed if we increase at the margin those who are willing
to stand up for justice. But human behavior being what it is, we remain
burdened with the knowledge of how difficult it is to educate individuals
to identify with all of “the others,” to construct a global identity focused
on human rights. Given the trauma of the Great War, Sigmund Freud asserted
not only that reason and enlightenment were fragile, but also that something
in human intelligence never allowed the darkness to become all-engulfing,
that the light of humane thought had a surprising persistence.
Our goal as educators is to widen that ray of light, to assist a few more
ordinary men and women to resist the extraordinarily evil and to stretch
toward the extraordinarily good.
When
we began the course, many students bridled at the use of the word “holocaust”
in the title of our first reading about Native Americans. Some were still
feeling the exclusivity of language, the competition over suffering and
victimhood. In one of our last discussions, we looked at a piece by Native-American
author Robert Allen Warrior, in which he discusses his ambivalent feelings
about the Exodus story. Warrior speaks of how strongly he was compelled
by Martin Luther King’s Exodus imagery of going to the mountaintop, seeing
the Promised Land, crossing the river Jordan. He reports being stunned
at the realization that Indians were in fact the Canaanites of the American
experience and that, as such, the compelling Exodus story was tarnished
by its impositions upon all of the indigenous peoples of the world. As
Warrior concludes, “I read the Exodus stories with Canaanite eyes.” I added that I had come to similar
conclusions when I visited Vietnam in 1985. As an old anti-war activist
in the land that my people had napalmed and strafed with Agent Orange poisons,
I realized that the Vietnamese had earlier migrated down from the Red River
Valley to the Mekong, displacing the indigenous Champa, who retreated to
the highlands in the face of the invasion by a technologically more advanced
people. Such contradictions, of peoples simultaneously victimizer and victim,
were at the heart of my course.
Each
October, we engage in what is often a fruitless, painful exercise regarding
Columbus Day. When I was a boy, we thoughtlessly honored this extraordinary
and daring sailor. That was before a Eurocentric norm came under challenge.
Beginning in the 1960s, activists and then scholars raised essential and
telling questions about the consequences of this “discovery” of a “New
World,” questions about its genocidal impact on Native-American Indians,
and on the Africans who would be involuntarily dragooned to the Western
Hemisphere in the infamous Middle Passage to generate profits from sugar,
rice, and cotton slave plantations. In fact, we can only celebrate Columbus
Day if we all agree that its significance is inherently contradictory and
double-edged, as is the story of all peoples. In one of my classes the
students unanimously argued that the world would have been better off without
Columbus, the United States, and the rest of the nations of the hemisphere.
Actually, they didn’t add that last part; I did. I informed my Puerto Rican
and Colombian students that without Columbus, they wouldn’t exist; there
would be no Puerto Rico, Mexico, or Colombia in the strict sense of those
terms. They were taken aback. We then were able to begin a more nuanced
dialogue about the contradictory nature of historical experience, including
that of Columbus.
My
friend Dan Bar-On, the Israeli psychologist who brings together Jews and
Germans, Palestinians and Israelis, black and white South Africans, and
Protestant and Catholic Irish, has taught me how difficult it is for victims
to give up their desires for vengeance, their rage at the injustices they’ve
experienced; how difficult it is for victimizers and, especially, their
accomplices, to come face to face with what they have wrought in causing
human suffering. Coming to grips with one’s own culpability
is a tremendous moral burden, but I would advise against an approach that
relies on guilt. What matters, finally and in the moment, are outcomes,
consequences, actual behaviors informed by beliefs. As educators, we need
to focus on changed behaviors. What do we want to happen as a result of
our teaching of the Holocaust? Frankly, focusing on anti-Semitism does
not strike at the heart of the moral dilemmas facing many students. As
Peter Novick so strenuously argues:
…for
most Americans deploring the Holocaust is a rather ritualistic, albeit,
undoubtedly well-meant, gesture towards the Jews who ask them to do so—a
cost-free avowal that, as decent people they are moved by the murder of
the Jews.…the memory of the Holocaust is so banal, so inconsequential,
not memory at all, precisely because it is so uncontroversial, so unrelated
to real divisions in American society, so apolitical.
In
this America of ours, students need to struggle with what could be both
costly and quite political. My own view is that the best
way to help students respond to moral challenges is to help them to understand
the contradictory strands of heroism and knavery, the victimized and the
victimizing, of many of our peoples. They need to know about prejudice
directed against the Irish, Italians and Poles; they also need to know
about the process by which such groups became “white.” They need to examine
how turn-of-the-century immigrants imposed the same stereotypes used against
them to fend off and marginalize other groups like Puerto Ricans and African-Americans.
The epicenter of discourse must be the horror and the wonder, the pain
and the humor of these groups’ historical amnesia or, perhaps more accurately,
selective memory.
Indeed,
what I propose and seek to implement in my Holocaust course is a form of
the Socratic notion of knowing oneself. When peoples know their own double
helix, the intertwining of their burdens and their inspirations, their hidden
shames and forgotten accomplishments, they are more likely to recognize the
same complexity in others. That is the real challenge
of Holocaust education.
Recently,
Henry Louis Gates has provided all of us with an exemplary approach. His
television history of Africa was willing to address the complicity of Africans
in slavery and the slave trade. He knew he would face criticism for this
washing of one’s dirty linen in public, in front of “the other.” And he
has. But Gates understands that such complicity in no way undermines the
moral assault on slavery and racism that his series wages; indeed, such
a morally complex narrative strengthens the ethical challenge by making
clearer the fullness of the tragedy and the evil. Another extraordinary example which
models the contradictory nature of bias is the theatrical work of Anna
Deavere Smith. In “Fires in the Mirror,” her one-person show about the
Crown Heights clashes between Hasidic Jews and African-Americans, Smith
listens to and then performs multiple voices, from Al Sharpton to Lubavitcher
Hasids, from housewives to street toughs—young, old, black, Jewish, rich,
poor, the enraged and the saddened. She offers no solutions, only questions
within a framework of empathy and a sense of our responsibility—individual
and collective—to bridge the kinds of differences which can yield such
insularities, hatreds, and, indeed, crimes. Gates and Smith stand as African-Americans,
boldly stating that their people are strong enough, mature enough, and
proud enough to present the fullness of their historical legacy.
In
my seminar some of the very best discussions occurred following the viewing
of the 1945 documentary “The House I Live In,” featuring Frank Sinatra.
The very young Sinatra grabs a smoke following a recording session only
to encounter a group of kids beating up a young boy. Sinatra gently interrogates
them to discover that they are picking on a Jew because “he ain’t our kind.”
Sinatra, then a New Deal Democrat and Popular Front supporter, evokes the
multicultural messages of the war effort and sings about what America means
to him, a nation of “all races and religions.” Such a film reflected the
anti-fascist pluralism which integrated the turn-of-the-century immigrants—
Catholics and Jews, Italians, Poles, and Greeks—into what came to be called
the Judeo-Christian tradition. The discussion begins when I ask what is
left out: how would this remarkable short be remade in the early twenty-first
century? Let’s rewrite it to include women, people of color, gays and lesbians,
the disabled.
As
teachers we struggle with students who hold back from authentically discussing
issues of prejudice, who go silent or simply echo agreement. It is hard
work to achieve honest discussions; all students enter with bruises. One
must establish a trusting environment for such discussions to be fruitful.
Trust doesn’t exist at the beginning of a class; I tell students that the
handshake is an apt metaphor for our relations —I hold your hand, you hold
mine— we trust one another but I also prevent you from hitting me in case
that is your hidden desire. We trust and mistrust simultaneously. And then
we can begin to have an honest discourse.
Anti-Semitism,
at its heart, is fear of “the other.” To effectively inoculate against
it, one must work toward a more generic injection, one that sensitizes
students to the extraordinary contradiction of our history: we are a nation
of the most magnificent promises and dreams—equality, unalienable rights,
the right to pursue happiness—contradicted by our nightmarish acts against
various forms of “the other,” from Native-American Indians to the disabled.
Our glory, that which leads the rebels in Tiananmen Square and in Prague
and in Soweto to look toward the American example, is intertwined with
our shame. And that is how it is, to one degree or another, for all peoples
and all faiths.
We
have lots of work to do, but the beginnings are grounded in a profound
sense of humility at the task before us. My experience teaching about the
Holocaust and the American experience encourages me to think that we can
take some joy in the ways in which human beings have translated the roller
coaster of their historical experiences into the best of our cultures.
Let us continue.
(This
is an edited version of a paper presented at the 30th Popular Culture Association
and 22nd American Culture Association Annual Conference, for a panel “Vietnam
War: Vietnam & The Holocaust,” chaired by the author, New Orleans,
April 22, 2000.)
Paul
Lyons teaches U.S. history and Holocaust & Genocide Studies at The
Richard Stockton College of New Jersey. He is the author of A History of
the Movement in Philadelphia: The New Left in the 1960s (University of
Pennsylvania Press, forthcoming).
Daniel
Jonah Goldhagen, Hitler’s Willing Executioners:
Ordinary Germans and the Holocaust (New York, 1996); Christopher Browning,
Ordinary Men: Reserve Police Battalion 101 and the Final Solution in Poland (New York,
1992).
Peter
Novick, The Holocaust in American Life (Boston, 1999).
David
Bankier, The Germans and the Final Solution: Public Opinion Under Nazism
(London, 1992); Issues in the Study of the Holocaust (Jerusalem, 1993).
Paul
Lyons, Class of ‘66: Living in Suburban Middle America (Philadelphia, 1994),
234-237.
Herbert
J. Gans, Middle American Individualism: The Future of Liberal Democracy
(New York, 1988), 4, 64-66.
Paul
Roazen, Freud: Political and Social Thought (New York, 1968), 158-212,
289-322.
Robert
Allen Warrior, “Canaanites, Cowboys, and Indians,” in Rebecca Alpert, ed.,
Voices of the Religious Left: A Contemporary Sourcebook (Philadelphia,
2000).
Daniel
Bar-On, The Indescribable and the Undiscussible: Reconstructing Human Discourse
After Trauma (Ithaca, NY, 1999); Legacy of Silence: Encounters with Children
of the Third Reich (Cambridge, MA, 1989).
Novick,
Holocaust in American Life, 279.
PBS,
Wonders of the African World with Henry Louis Gates Jr., 1999.
PBS,
Fires in the Mirror, 1993.
The
Rediscovery of John Adams
by
Charles W. Akers
With
the spectacular publishing success of David McCullough’s John Adams
following close on the heels of Joseph Ellis’s Passionate Sage and
Pulitzer Prize-winning Founding Brothers, the second president’s
stock seems to have taken a dramatic rise, especially in comparison to
his rival-turned-friendly correspondent, Thomas Jefferson. Essayists have
pounced on the John Adams phenomenon to comment on such things as the state
of popular history and contemporary American intellectual culture, as well
as how Americans remember and commemorate their past. Often
lost in the hoopla is the indispensable work of the editors of the massive
Adams Papers, which has made the recent rediscovery of John Adams possible.
When
John Adams lost the close presidential election of 1800 to Thomas Jefferson,
his reputation as a major founder of the United
States began to dim. Few would remember
or fully appreciate that Adams had been the leading advocate of independence
in the Second Continental Congress, that this New Englander pushed the
Virginian George Washington into leading the American army, that in Europe
he had led the diplomatic struggle for a favorable peace treaty with Great
Britain, or that his presidency had held the new nation together at a time
when it might easily have been shattered.
In
the next two centuries political leaders often divided into ideological
camps of Jeffersonian and Jacksonian democrats
and Hamiltonian economic nationalists, but there were few if any Adamsonians.
When in the decade before the Civil War Charles Francis Adams published
ten volumes of selections of his grandfather’s manuscripts and a biography,
the thought and career of John Adams appeared largely irrelevant to Americans
engaged in taming the continent, ending slavery, and building the economy. Much more interest had been created a decade
earlier by this grandson’s publication of letters of his grandmother, Abigail
Adams. Today, at last, with a definitive publication
of his papers in process, a major evaluation of John Adams’s contributions
to the United States
is underway. Once again, the dependence of historians on the availability
of sources becomes clear.
John
Adams insisted that his papers be preserved, and his heirs largely complied
with his wishes and followed his example. The result was that by 1889 the Adams
manuscripts constituted the largest and most important collection of papers
of any American family. Use of the original manuscripts remained under
the family’s strictest control, but at the beginning of the next century
the collection was removed from the Adams homestead
in Quincy and given to
the Massachusetts Historical Society in trust for fifty years. Only very
limited access was available to a few scholars approved by the heirs. Not
until 1956 did the Society receive full title to the papers and open them
to historians in a microfilm edition (1954-1959) of 608 reels, which if
spread out would extend for five miles. At the same time the Society began a definitive
publication of Adams manuscripts from all sources
that to date has produced thirty-six volumes in several series. In time,
several generations of editors will have published as many as one hundred
volumes, perhaps thirty alone devoted to the diary of John Quincy Adams.
Once the Adams Papers were opened, the late Page Smith accepted the daunting
task of preparing a comprehensive biography of John Adams. His two volumes
of 1140 pages, published in 1962 and 1963, became the standard source for
the next three decades. Smith saw the Adams Papers project
as marking “John Adams’s re-entry into American history.” He recognized
that the inner history of the family, so richly revealed in the manuscripts,
made it essential for a biographer to present Adams “with his foibles and
eccentricities, his blemishes as well as his virtues, so that he may be
seen in his full humanity.” Although the second president exhibited contradictions
and paradoxes in his views and opinions, he remained, in Smith’s view,
“remarkably steadfast” in his “fundamental convictions.” Thus his life
became a “tract for the times” in 1960s America.
Smith
sought to correct Adams’s detractors without exaggerating the three principal
roles of his public life: Massachusetts political leader and delegate to
the Continental Congresses, diplomat, and vice president and president.
Typical of Smith’s balanced approach and moderate judgments was his objection
that historians had tended to dismiss the Adams administration because
it was sandwiched between two great presidencies. Instead, Adams left office
after having given the nation “a small but effective navy, an augmented
army, a solvent treasury, and above all peace.” Whatever he lacked in administrative
skills, his policy and especially his character “served his country well.”
Smith’s monumental biography left historians in his debt for having judiciously
covered the complete Adams.
Despite
a flurry of writing on Adams’s life and thought after the microfilm of
the Adams Papers was issued, no new full biography appeared until John
Ferling’s concise but comprehensive volume in 1992. He brought to this task the extensive
background of his own writing on the American Revolution and a thorough
knowledge of the Adams manuscripts and related studies. The author warned
that this biography was “not meant to be an apologia for John Adams,” who
“often displayed unattractive qualities, including calculation, extensive
ambition, rage, jealousy, and vanity” and who was “ill suited for some
of the public roles he played.” Ferling’s Adams was a “one dimensional
man…given to incessant labor with the solitary goal of becoming a great
man,” driven by “an exaggerated sense of inadequacy.” Yet those who got
close to him “discovered a basic decency that earned their respect.” Ferling
sought to “discover the total person” and to place Adams in his time and
among other major American leaders.
In
his account of the shaping of his subject as a revolutionary, Ferling gave
extra space to the years before Adams left for Europe. Without neglecting
the role of John’s cousin, the passionate radical Samuel Adams, Ferling
saw John as “profoundly conservative” and “full of distrust of the popular
leadership” until he perceived a British threat to American liberties in
1773. Only then did he begin to understand that, short of major British
concessions, those liberties could only be preserved through independence.
In this, Ferling differed from those historians, including Page Smith,
who date Adams's fervent patriotism from the Stamp Act crisis of 1765.
True, he “had been slow to fully embrace the ideology of the popular movement,
but once he did so, he never wavered and he probably was one of the first
to accept the conviction that independence was desirable.” Adams was the
“great conservator of what had been achieved in America before the troubles
with the parent state.” But he feared the damage Americans might do to
their liberties in the creation of new governments, viz., giving the vote
to non-property owners or creating unicameral legislatures.
His
concern for constitution making in all the colonies lifted Adams from the
provincial mindset of other leaders, such as Samuel Adams or, at first,
even Thomas Jefferson. In 1779, Adams encapsulated his views on government
into a constitution for independent Massachusetts, a “document that occupies
a crucial place in American political history, for it ended the period
of legislative-centered government, . . . the pattern that had characterized
most of the earlier state constitutions,” and inaugurated the era of the
system of checks and balances represented by the presence of popularly
elected executives and independent judges, which were to exist coequally
with bicameral assemblies.
One
of Ferling’s strengths was his frequent comparison of Adams with his associates,
nowhere sharper than with fellow commissioner Benjamin Franklin during
the peace negotiations. So inflamed was Adams to deny Franklin his due
share of the credit for the outcome that he acted “irresponsibly” in his
communications to the British envoys. “Falling into a mood of black despair,”
Adams “stewed and simmered” at Franklin’s popularity with the French. Here
as elsewhere, Ferling tended to give Adams’s private musings full value.
Despite
some emphasis on his subject’s inner turmoil, Ferling strived for a balanced
approach to the Adams presidency. He did not minimize the president’s acquiescence
in the Alien and Sedition Acts, as Page Smith did, but he lauded the decision
not to go to war with France. He agreed with those historians who rated
Adams as a “near great” chief executive.
After
Ferling, no new biographies appeared in the 1990s, but two important studies
called for a more positive interpretation. Joseph J. Ellis, in Passionate
Sage, interpreted Adams from the perspective of his retirement years. He was, Ellis contended, “the most
misunderstood and unappreciated ‘great man’ in American history.” He “was
a veritable genius at recognizing what was central and what was peripheral,
what the national interest required and what history would allow.” It was
high time, Ellis maintained, to erect “an Adams monument on the Tidal Basin
in the nation’s capital, … situated sufficiently close to the Jefferson
Memorial, that depending on the time of day and the angle of the sun, he
and Jefferson might take turns casting shadows across each other’s facades.”
In
a study of Adams’s political and constitutional thought, C. Bradley Thompson
concluded that “from the beginning of his public career until the very end,
John Adams always acted on principle and a profound love of country. He
devoted his life and mind to the cause of liberty and the construction
of republican government in America. He wanted liberty, equality and a
virtuous republic as much as any Jeffersonian.” Adams’s political treatise, the often-scorned
A Defence of the Constitutions of Government of the United States of
America, “may very well be the most important reformulation of mixed
and balanced government since Aristotle’s Politics.”
At
the beginning of this year, David McCullough published his 700-page life
of Adams in a widely-heralded first edition of 350,000 copies. The book, an unequivocal celebration
of Adams’s life and career, quickly reached the top of the New York
Times best-seller list. McCullough’s method is clear: he stresses Adams’s
major accomplishments and basic integrity while subordinating the foibles
that for too long had clouded his place among the nation’s founders. True,
he was vain, but it was a vanity “which came from years spent in the service
of other men, without attention to oneself, in the face of exhausting toil
and at the risk of life.” Adams admitted to irritability—his wife said
it was his single flaw—but to McCullough his irritability came from an
eagerness to get things done. Adams was brilliant, honest, ambitious, and
courageous. “He could be high-spirited and affectionate, vain, cranky,
impetuous, self-absorbed, and fiercely stubborn; passionate, quick to anger,
and all forgiving; generous and entertaining.” Without wealth or family
standing, he distinguished himself at the bar and in print. Although opposed
to mob rule, by the time of the Stamp Act in 1765 Adams’s political views
were fixed. “Patriotism burned in him like a blue flame.” No wonder, McCullough
announces, that his original plans to write a dual biography of Adams and
Jefferson had to be adjusted when he found the New Englander a more interesting
subject, and one far more neglected than the Virginian.
Seeming
eager to reach the great events ahead, McCullough passes quickly over the
early years before the Second Continental Congress in 1774 where Adams
first faced the issue of joint colonial opposition to Great Britain. Likewise,
in this earlier period the influence of Samuel Adams is minimized. Although
some historians have exaggerated the role of John’s cousin as the master
revolutionary of colonial Boston, few would treat it as casually as McCullough
does.
Reaching
the Second Continental Congress, McCullough begins his book-long comparison
of Adams with Jefferson, who “wished to avoid the rough and tumble of life
whenever possible” while the older Adams’s “irrepressible desire was to
seize hold of it.” In supporting independence at Philadelphia, Adams made
“the greatest speech” of his life. He, “more than anyone…had made it happen.”
But had he done nothing more than push Washington into command of the
army and Jefferson into writing the Declaration of Independence, “his service
to the American cause would have been very great.”
After
independence, McCullough writes, Adams saw a long, difficult war ahead
and went to work as head of the Board of War until Congress sent him abroad.
A master literary craftsman, McCullough finds great drama in Adams’s winter
crossings of the war-torn and stormy Atlantic, and then in his 1779 journey
overland from Spain to Paris. Calling Adams an early advocate of the importance
of an alliance with France, McCullough provides an extensive account of
the diplomacy that ended the War for Independence and of Adams’s solitary
efforts to obtain financial aid from Holland. Overcoming his suspicions
of fellow commissioner Benjamin Franklin, “Adams was at his best in the
final days of negotiations.” Although “untrained in diplomacy and by temperament
seemingly so unsuited for it, he had succeeded brilliantly,” McCullough
concludes. “It would be said they had won the greatest victory in the annals
of American diplomacy.”
McCullough
never completely abandons his initial plan to write a dual biography of
Adams and Jefferson; at the same time, his comparisons of the two men reveal
why he finds Adams more likeable. Adams filled his diary with inner thoughts;
Jefferson kept account books. In debate, Adams was aggressive; Jefferson
passive. The New Englander was self-made; the Virginian born to wealth.
Adams was frugal; Jefferson a spendthrift. The older Adams, devout; the
younger Jefferson, a skeptic. Adams was bluntly honest; Jefferson given
to dissembling.
McCullough’s
judgment of the Adams presidency is glowing. He left to President Jefferson
a nation at peace “with its coffers full” and “its commerce flourishing,
its navy glorious, its agriculture uncommonly productive and lucrative.”
By thwarting Alexander Hamilton’s plans, “he may have saved the country
from militarism.” All of this had been achieved with “no scandal or corruption.”
Even “if he had too readily gone along with the Alien and Sedition Acts
. . . he had managed nonetheless to cope with a divided country and a divided
party, and in the end achieved a rare level of statesmanship.” In love
with his subject and writing with the grand style of the lives of great
men, McCullough lets his enthusiasm prevent him from more objective analysis.
Biographers
of John Adams are inevitably drawn to the influence of his wife, Abigail,
whose correspondence with her husband and others occupies a large section
of the Adams manuscripts. McCullough states bluntly, “His marriage to Abigail
Smith was the most important decision of John Adams's life.” While president,
he trusted her more than his advisers, and she may “have been decisive
in persuading Adams to support the Sedition Act.” Page Smith assigned her
a therapeutic role: “Abigail insured his sanity . . . she gave him, with
her love, a gyroscope that brought him safety through the stormiest seas.”
Otherwise, the classic manic-depressive tendencies he displayed might
have turned self-destructive. Ferling saw Adams in his early married years
as cruel to his family and insensitive to his wife. Only in later life
after he had achieved a measure of public applause for his accomplishments
was he “at last capable of realizing with Abigail the real intimacy that
had so long eluded him.”
No
biographer denies Abigail’s major importance in her husband’s career, but
the richness of material in the manuscripts supports a variety of interpretations.
It is not surprising that with the opening of the Adams Papers more biographies
of the wife appeared than of the husband. Most saw the wife as essential
to her husband’s roles, admired her courage and independence, and relished
her correspondence. Yet a different Abigail Adams emerged in three books
by Paul C. Nagel, who had an extensive knowledge of the Adamses, their
children and grandchildren, as revealed in the papers. Nagel’s Abigail was stern, “thick
skinned and aggressive,” and a “calamity as a mother.” The husband had
to calm the wife’s fears, not the reverse.
In
public appearances after the publication of his John Adams, David
McCullough joined Joseph Ellis in calling for a national monument to Adams
in Washington. But the New York Times columnist Maureen Dowd responded:
“Let historians fight it out over John Adams. I say we need a monument
to Abigail Adams.” And constitutional scholar Floyd
Abrams objected to a monument for the president who had signed the Sedition
Act. Time will tell whether Adams joins
Jefferson on the Tidal Basin. More certain and more important, the editors
of the Adams Papers are erecting an enduring literary monument that makes
possible the rediscovery of John Adams. And the candid humanity revealed
in these papers continues to contribute to the restoration of Adams’s reputation.
While McCullough certainly admires Adams’s political achievements, he is
even more captivated by the character of this early American leader.
Charles
W. Akers is professor of history emeritus at Oakland University and author
of Abigail Adams, an American Woman (Little Brown, 1980; Longman, 2000).
DISPATCH
FROM THE UNITED KINGDOM
Remembering National History
by
Jeremy Black
The
end of one millennium and the beginning of its successor led to less discussion,
let alone celebration, of the national past than might have been anticipated.
In part this was a deliberate matter of public policy. It was decided at
a very senior ministerial level to include no section on history in the
Millennium Dome at Greenwich.
Similarly, the Project for a Museum
of National History in London
was unsuccessful in its requests for governmental support under the National
Heritage Lottery grant program. Yet two public corporations did use the
millennium to consider the nation’s history. For the first time ever, the
Royal Mail devoted all its stamps for one year (1999), with the exception
of a royal wedding issue, to a single topic, British history in the last
millennium, while the BBC spent a large sum of money to produce A History
of Britain presented by Simon Schama.
Future
historians interested in the creation and sustaining of national images,
and in anniversaries and heritage, might well consider the Royal Mail commemorations
and the BBC series. As the historical advisor to the Royal Mail on the
1999 stamps (I also wrote or revised the explanatory texts sent to the
individual artists who produced the images, and drafted the text on the
presentation packs), and as one of the three trustees responsible for the
historical rationale and prospectus for the History of Britain Museum proposal
(as well as for the proposal from the same group for space in the Dome),
I am uneasily conscious of how little documentary material survives to reveal
and explain the decisions taken about approach and content. In the case
of the Royal Mail, there was no committee to leave minutes; there was only
the correspondence between myself and the relevant official at the Royal
Mail. I am uncertain as to whether the Royal Mail kept minutes or other
papers; my efforts to find out have been unsuccessful. It is noteworthy
that I faced no opposition to my determination to adopt a thematic approach
that gave due weight to non-political histories, such as those of culture
and science, although there was some newspaper criticism of particular
stamps, most particularly the agriculture set.
In
contrast, the Schama series received serious criticism, although Schama himself found
favor with the government, receiving a CBE in the 2001 New Year's Honors
List. In the Times of September 28, 2000, Magnus Linklater pointed
out that Schama’s approach to Britishness “blithely ignore[d] the entire
canon of recent historical work.” Two days later, the anonymous reviewer
in The Economist warned that Schama “runs the risk of reducing the
history of Britain to little more than a soap opera of bloodthirsty warring
kings, jealous siblings and revolting barons . . . [and] risks the charge
of banality.” In the Times Higher Education Supplement of December
8, 2000, Christopher Haigh found “no vision, no theme, no coherence . .
. too much drama . . . a Hollywood version . . . a messy soap opera in
costume,” rich in error.
Equally
disturbing is the failure to make due allowance for contrasting responses
and different approaches, which is not only part of the fascination of
history, but also central to its civic importance, not least as a reminder
of the limitations of authoritarian accounts.
This
dumbing down is linked to a persistent and mistaken tendency in the “media”
to underrate the intelligence and interest of viewers, listeners and readers.
They encounter competing analyses in political debate, so why not for history;
or is the audience supposed to be dimmer? Writers and presenters have to
be clear, but clarity is not the same as simplicity.
Look
at and read Schama, not to smile or squirm at the errors, omissions and
slant, but in order to consider how best to make public history; and also
to think how best to present the wealth of American scholarship in an accessible
fashion.
Jeremy
Black is professor of history at the University of Exeter and author of
Europe and the World, 1650-1830 (Routledge, 2001).