Joseph S. Lucas and Donald A. Yerxa, Editors
Peter A. Coclanis, "PRESIDENT'S CORNER: Rethinking Rethinking American History in a Global Age"
Annabel Patterson, "Whiggism Today"
R.J.B. Bosworth, "Benito Mussolini: Dictator"
William Stueck, "The End of the Korean War: Some Reflections on Contingency and Structure"
Doyne Dawson, "Dispatch from Seoul"
Alfred J. Andrea, "The Silk Road: Part I"
Stephen G. Brush, "Why Did (or Didn't) It Happen?"
An Interview with Richard J. Evans
Thomas Albert Howard, "A 'Religious Turn' in Modern European Historiography"
Avihu Zakai, "Jonathan Edwards's Vision of History"
George F. Nafziger and Mark W. Walton, "The Military Roots of Islam"
Antony T. Sullivan, "Understanding Jihad and Terrorism"
David Moltke-Hansen, "The Rise of Southern Ethnicity"
Kathleen Broome Williams, "Improbable Warriors: Mathematicians Grace Hopper and Mina Rees in World War II"
David Gordon, "France, 1940: National Failure and the Uses of Defeat"
George Ross, "Chirac the Great or de Gaulle the Small"
Max Boot, "Iraq and the American Small War Tradition"
Marc Trachtenberg, "The Origins of the Historical Society: A Personal View"
Historically Speaking: The Bulletin of the Historical Society
June 2003
Volume IV, Number 5
Chirac the Great or de Gaulle the Small? [1]
by George Ross
Jacques
Chirac has become the most disliked Frenchman of our time, called a worm,
a weasel, a surrender monkey, a co-conspirator with Saddam Hussein, and
the individual most responsible for preventing allegedly eager nations
from supporting the American war. Such epithets and charges do little but
obscure the situation, leaving important questions to be answered.
Who Is Jacques Chirac?
Although
one would never know it from American press
coverage, Jacques Chirac has always had warm feelings for the United States.
In his early twenties, after graduating from the Ecole Nationale d’Administration,
France’s factory for producing technocrats, he washed dishes in Harvard
Square, worked his way down the Mississippi to New Orleans, and fell in
love with an American woman. Early in his career he helped make ends meet
by teaching French politics to eager young Americans in the Sweetbriar
junior year in France program.[2]
And as he became more important, he deepened friendships with Americans
and kept in touch with American society. After losing a second bid for
the French presidency in 1988, he flew several times from Paris to New
York with his daughter and principal campaign strategist, Claude, to upgrade
his communications skills at the feet of the great American spin-meisters.
He has even confessed to
“loving fast food” and is a nice guy with lots of American-style gregariousness
who enjoys women, eating and drinking with the people, kissing babies,
and pressing the flesh.
Chirac
was an unlikely candidate for principled stands against U.S. policies.
Although it would be an exaggeration to claim that Chirac has lacked principles,
it is true that he has changed them regularly
to suit the moment. He once announced that political “promises engage only
those who believe in them.” Chirac leaned Left in his youth but, like many
young French technocrats, he then hitched his star to Gaullism, a statist,
nationalist, right-of-center ideology concocted by the founding president
of the French Fifth Republic, to which Chirac initially added
a dose of Algérie Française. By his late twenties
he had become the protégé of Georges Pompidou, then de Gaulle’s
chief of staff, later prime minister and president. Pompidou found Chirac
a seat in Parliament, made him a junior minister, and finally the lynchpin
of his government.
Chirac
became his own man only after Pompidou’s death in 1974 when
he sabotaged his own party’s candidate
to ensure the victory of Valéry Giscard d’Estaing in presidential
elections, in exchange becoming Giscard’s first prime minister. Chirac
dealt with the first oil shock with statist efforts to reflate the economy,
thereby becoming an early explorer of stagflation. He also pioneered policies
to re-absorb the petro-dollars sloshing
around the Middle East with state-underwritten mega-projects like airports,
palaces, arms programs, and nuclear reactors (one to Iraq in particular)
far and wide across the desert.
Disagreements
with Giscard led Chirac to resign in 1976, after which he turned
to building an organizational base to win the presidency, first taking
over the Gaullist Party—renamed the Rassemblement Pour la République
(RPR). He was then elected mayor of Paris in 1977 where henceforth he did
a good Gallic imitation of Richard Daley, Sr., replete with patronage (in
part based on granting cronies cheap access to splendid Paris apartments
that the city owned), illegal campaign financing, shakedowns for aspirants
to city contracts, and a cash-stuffed office safe for friends in need.
Chirac
removed the obstacle of Giscard d’Estaing
being president by quietly urging his
party and Paris machine to avoid efforts to re-elect Giscard in 1981. Chirac
was thus the man second most responsible, behind François Mitterrand,
for electing the first left-wing president of the Fifth Republic. By the
1986 legislative elections, Chirac, leader of the winning center-right
coalition, had become a Reaganite neo-liberal. This did not sell well,
however, and by the presidential campaign in 1988 Chirac had resurrected
Gaullist appeals with an anti-EU edge. Mitterrand, exploiting Chirac’s
changeability, was re-elected after campaigning as la force tranquille.
In
the 1995 campaign that finally made him president, Chirac promised everything
to everyone, including new economic growth, rapid job creation, relief
from unemployment, and policies that would heal France’s “social fracture.”
These promises, made at precisely the moment when austerity in France was
inevitable because of the criteria for Economic and Monetary Union (EMU),
were clearly irresponsible. A few short months later Chirac’s government
introduced a major austerity plan centering on French social policy, which
provoked weeks of strikes and a complete collapse of public support. In
part to rebuild this support Chirac dissolved Parliament in 1997. The election
that followed managed to turn a huge center-right majority into a new government
of the “plural Left” led by Lionel Jospin. For the last five years of his
first term Chirac therefore became a weakened partner in cohabitation.
Chirac’s
re-election in 2002 was a political accident. As sitting president he led
after the first round with but 19.88% of the vote. His small success resulted
primarily from the abysmal campaign of Prime Minister Lionel Jospin, high
levels of abstention, and discontent among voters of a “plural Left” made
even more plural by a crush of competing candidacies. All this enabled
Jean-Marie le Pen, the National Front’s demagogic leader, to squeak into
second place, slightly ahead of Jospin (16.86% to 16.18%). Chirac then
won the second round overwhelmingly, carried by massive and self-righteous
republican mobilization against le Pen.
Why Did Jacques Chirac Do What He Did?
It
is necessary to dispel the absurd contention that Chirac, or the French,
are inclined to pacifism or reluctant to use force. Chirac himself graduated
top of his class from France’s prestigious Saint Cyr officer school, loved
military service, and served as a well-regarded junior infantry officer
during the bloodiest years of the Algerian War. The French, more generally,
have never been gun-shy, as their persistent resort to warfare throughout
the 20th century clearly shows. Chirac’s first major presidential actions
in 1995 were saber-rattling nuclear tests in the South Pacific which defied
international law and opinion. One of his most important policies thereafter
was rebuilding the French military into a professional army, eventually
leading France to the 1998 St. Malo accord with Britain, which launched
the EU’s rapid reaction force. Indeed, even as diplomatic battles at the
UN raged on during late winter 2003,
French soldiers were engaged in skirmishes in at least two places in West
Africa.
Parisians,
when asked by this author to account for Chirac’s recent behavior,
invariably started with Chirac’s
personal situation. At the age of seventy and in the twilight years of
a remarkable career, Chirac, they noted, was well aware of his reputation
as a ruthless political operator with few principles, and Iraq presented
him with an opportunity to go down in history for standing for something
important. The same Parisians usually added that France was a democracy
in which an overwhelming majority opposed U.S. policies on Iraq, thus allowing
Chirac to gain new public support after the peculiar circumstances of the
2002 election.
Fears
about the Bush administration were undoubtedly more important, however.
France had reacted to September 11 with an outpouring of sympathy for the
U.S. and support for action against terrorism. Its wariness grew as the
Rumsfeld-Wolfowitz-Cheney strategy became clearer. France, with its long
experience with the Middle East and Islam and its intimate knowledge of
terrorism, worried about the instability that American action might bring.
Worse still, any contagion of instability around the Mediterranean, the
EU’s “near abroad,” might tip the balance in former French North African
colonies like Algeria, already in a perilous state. Finally, American braggadocio
about nation building and spreading democracy came across as naive and
incendiary.
The
EU, the UN system, NATO, and other international organizations have been
the backbone of peace in postwar Europe. Bush’s unilateral approach to
Kyoto, the chemical arms treaty, and the International Criminal Court,
among other things, indicated that U.S. leaders might henceforth “go multilateral”
only when they completely controlled the outcome. The notion that “the
mission makes the coalition” confirmed this. Moreover, the march to war
in Iraq demonstrated how little difference anyone else’s opinions would
really make. The French knew, along with almost everyone else, that the
Bush administration had decided to go to war well before it tried to obtain
the second UN resolution in early 2003. In
short, Bush’s “my way or the highway” stance threatened a world in which
France had learned to live.
For
a full explanation of what Chirac did, however, we need to look at the
recent history of French foreign policy. In the 1960s Charles de Gaulle
set out lasting “Gaullist” guidelines for French international action.
His major goal was to carve out a sphere of French-led autonomy in Europe
within broader Cold War boundaries. De Gaulle had some success in promoting
the “Europe of nations,” shaping the Common Agricultural Policy, frustrating
British efforts to join the EEC, and blocking attempts to push Europe in
supranational directions. Success was harder in larger geo-political matters,
and de Gaulle, like Chirac, annoyed Americans
sufficiently to prompt boycotts of French wine and cheese. He nonetheless
believed that France should stand steadfast behind the U.S. at moments
of confrontation, as in the Berlin and Cuban Missile crises.
Gaullist
strategy continued after de Gaulle, albeit adapted to changing circumstances.
Mitterrand initiated a modified approach based on new European integration
through the 1992 Single Market program and EMU. This involved dropping
de Gaulle’s earlier quest to make France the strongest free-standing national
economy in Europe. In exchange Mitterrand sought to make France the most
influential player in a single European economy built around Brussels-initiated
liberalization. These new goals came at the cost of dramatic shifts in
domestic policies, however, including medium-term austerity, high unemployment,
new social problems, and rocky electoral situations. Benefits began with
the fact that the new constraints from Brussels (which the French were
prime movers in creating) obliged reforms in France that would make it
more competitive and less prone to inflation. Making the franc into a hard
currency to rival the D-mark, which enhanced French capacities in European
economic and monetary realms, provided additional benefits. Still, little
in these changes posed challenges to the Cold War context. Mitterrand thus
supported NATO and American efforts to install Euro-missiles on the continent,
for example, and with Reagan and Thatcher he encouraged the Gorbachev reforms
in the Soviet Union.
Chirac’s
new Gaullism on Iraq, however, was born in a completely new environment.
Europe, France included, was ill-prepared when the communist bloc collapsed.
Absorbed in, and conflicted about, the new European integration that had
begun in the 1980s, EU members were unable to deal with the disintegration
of Yugoslavia and warfare in the Balkans. The U.S. was eventually forced
to step in. EU members and leaders were tentative in approaching new Central
and Eastern European market democracies. The eventual determination to
welcome new members into the EU has been admirable, but in the interim
intelligent American moves to expand NATO bound these Central and Eastern
European states to American conceptions of collective security. The Maastricht
Treaty on European Union (negotiated in 1991 and ratified in 1993) included
ambitious words about a “Common Foreign Security and Defense Policy” that
were difficult to translate into practice.
The big EU member states—Germany, France, and the UK—first had to reconsider
Cold War defense postures before they could clarify new missions. This
proved a slow process, hampered by a quest for “peace dividends” in military
spending prompted by the harsh budgetary constraints set by EMU. Europe
thus fell behind in security coherence and in instituting the reforms needed
to make it happen. In the meantime the U.S. spent steadily, creating the
strategic and materiel leads that are now so striking.
What
would Gaullism mean in these new circumstances? Cold War Gaullism had always
assumed that Europe would be included in American calculations as long
as France did not disavow basic American positions on global matters. The
Bush administration’s positions implied that Europe would be included in
American considerations only if it signed on unequivocally to decisions
the U.S. had already made. In effect, the U.S. no longer needed Europe
to pursue its global objectives. In addition, the state of play within
Europe had changed significantly. That Great Britain sided with the U.S.
on Iraq was not surprising, but that Italy, Spain, and other European “elders”
did the same, along with the Eastern “new Europeans,” was alarming.
Jacques
Chirac thus had to respond to changes threatening decades of French foreign
policy practice. The anti-war position taken by Chancellor Schröder
in the 2002 German elections prodded Chirac to act because it meant that
France would have an important ally in standing up to the U.S., with Russian
and Chinese support coming only later. Perhaps more important, the German
position on Iraq made renewing the Franco-German “couple” within EU politics
possible. Even weakened by defections among “old” Europeans and the pro-Americanism
of “new” ones, a reinvigorated French-German couple could still wield great
power over Europe’s future. The extravagant ceremonies surrounding the
40th anniversary of the French-German Elysée Treaty in early 2003
symbolized what Chirac had in mind.
New
post-Cold War geopolitical and European issues are likely to take years
to resolve. Even then they may not be resolved in the ways that Chirac
and France hope. In this light, swashbuckling French diplomacy in the run-up
to war in Iraq may have been unduly provocative (even though France’s diplomatic
ineptness pales beside that of the U.S.). Villepin’s rhetorical humiliation
of Powell at the Security Council and Chirac’s injunctions to Eastern Europeans
that they “would have done better to keep their mouths shut” were unnecessarily
inflammatory Gallicisms. French stridency undoubtedly helped to make a
last minute compromise impossible, even if the “no” camp on the second
Security Council resolution might have accepted a deadline on Iraqi disarmament
that the U.S. could have lived with. In general, M. Chirac and M. de Villepin
outdid one another in emulating the men of action they both admire in André
Malraux’s great novels. Indeed, they often acted as if they believed that
France remained a great power with sufficient strength to stand up to the
U.S. The Americans, who knew better, were offended.
The
results of M. Chirac’s campaign have not been good. Until the coalition’s
military success, Chirac benefited from a wave of support at home. Since
then doubts about him as an inconsistent improviser have reemerged. Worse
still, military success in Iraq makes it relatively easy for the Anglo-American
victors to embrace the Germans by playing
on their traditional Atlanticism in ways that will isolate and punish the
French.[3]
The Bush administration certainly will not hesitate to do this, and its
new approach of dividing and ruling the continentals will be reinforced.
M. Chirac’s actions could also embolden the Rumsfeld-Wolfowitz-Cheney faction
in Washington toward further unilateralism, disregard for the UN, and disdain
for the “old Europe.”
What
is now clear is that the reconfiguration of earlier French foreign policy
goals for a post-Cold War world demands ingenuity, assiduity, shrewdness,
and patience. The French president’s brief moment in the international
sun will count much less than his work in the future. Will M. Chirac, with
at least four years left, be up to the job? His past suggests that he is
likely to turn out to be “de Gaulle the small” rather than “Chirac the
Great.” This would be sad, for much is at stake, including the future of
Europe.
George
Ross is Morris Hillquit Professor in Labor and Social Thought and director
of the Center for German and European Studies at Brandeis University. He
is co-author, with Andrew Martin, of Euros
and Europeans: Monetary Integration and the European Model of Society
(Oxford University Press, forthcoming).
[1]
The title is a translated headline borrowed from Le Monde, February
26, 2003, 1.
[2]
For better or worse, I am one product of his teaching both in the Sweetbriar
program and at Sciences-Po Paris, where for a few years he ran the major
seminar for foreign students.
[3]
See the Financial Times, April 19, 2003, 3.
An Interview with Richard J. Evans
conducted by Donald A. Yerxa
Richard
J. Evans is professor of modern history at Cambridge University. A specialist
in German social and cultural history, Evans is also widely known for his
historiographical writing, especially In
Defense of History (1997; first published in the United States in 1999),
for his role as principal expert witness in the David Irving libel trial
(2000), and for his work on the clash of epistemologies when history enters
the courtroom (see his “History, Memory, and the Law: The Historian as
Expert Witness” History and Theory [October 2002]). Professor Evans
has recently written a new introduction to E.H. Carr’s What is History?
and a new afterword to G.R. Elton’s The Practice of History. He
is currently working on a three-volume history of the Third Reich, of which
The Coming of the Third Reich will be published by Penguin Books
in October, 2003. The following interview was conducted on March 21, 2003.
Donald
A. Yerxa: Why did you feel
the need to defend history in the 1990s?
Richard
J. Evans: In Defense of History came about
because I was asked to teach a lecture course on historical epistemology
at Birkbeck College in London, where I was professor of history at the
time, before I moved to Cambridge. As I read in preparation for the course,
I discovered that the literature on questions such as “What is history?”
and “How do we find out about the past?” was either very out of date (Carr
and Elton, for example) or written in a spirit of extreme skepticism by
postmodernist theorists (people like Keith Jenkins and Frank Ankersmit).
Clearly, there was room for an up-to-date statement about historical knowledge
which argued for its possibility, while taking on board the criticisms
of the postmodernists and trying to deal with them openly, rather than
simply ignoring them. As I read more, particularly in the journals, I found
that there was a good deal of debate among historians about postmodernist
and post-structuralist skepticism and hyper-relativism. There were angry
and dire wailings about this, without any real attempt to come to grips
with it. So I developed my lectures, and as I shared them with some colleagues,
they encouraged me to expand them into a book.
Yerxa:
Was history in a state of crisis in the mid- to late 1990s?
Evans:
There was a widespread feeling of an epistemological crisis. Of course,
a lot of historians never even realized there were these postmodernists
out there, so the sense of crisis was not universal in the historical profession.
But those who paid attention to these things realized that there was a
serious theoretical attack underway on the nature and possibility of historical
knowledge. And that did engender a sense of crisis.
Yerxa:
Has the sense of crisis dissipated?
Evans:
Interestingly, I think it has to a large extent. As I said in In Defense
of History, there is a tendency for new methodological and theoretical
approaches to begin by proclaiming their universal validity and their power
to revolutionize the whole of historical study. Then within a short space
of time, they tend to become sub-specialties, with their own journals and
societies where their adherents talk mainly to one another. And that is
exactly what has happened to the extreme relativists among the postmodernists.
Their critique has not left the practice of history unchanged, though the
extreme skepticism which they voiced about historical knowledge has now
subsided into a rather marginal phenomenon. After all, the only possible
reaction from historians who actually did accept these notions was to stop
writing history, and more history is being written today than ever before.
Yerxa:
What has been the legacy of these methodological debates?
Evans:
There have been negative and positive legacies. One noteworthy effect has
been to direct attention to culture and language as ways of explaining
and understanding history. And that has brought us away from the dominant,
socio-economic model of the 1970s and 1980s which held that society and
the economy were the driving forces in history. At that time, ideas took
second rank in the explanatory models of many historians. Historians now
take ideas, language, and culture much more seriously, and I think that
is a good thing. On the other hand, some historians have started to
neglect social and economic factors and to advance a crude cultural or
even linguistic determinism that is just as one-sided as the old economic
determinism.
Another
effect has been that we historians have become more self-conscious in our
practice. In a negative sense, that can mean that historical writing becomes
self-indulgent, simply the expression of personal views, quirks, and opinions.
It can become a very egotistical, narcissistic exercise. On the positive
side, it has caused us to become more honest about our own writing and
research. This has contributed to an interesting phenomenon in the UK:
a tremendous boom in popular history. You can see this especially on television.
A British television producer recently said, “history is the new gardening,”
meaning that gardening programs are giving way to history programs on the
TV channels. And I think that is partly because we now have presenters
like Simon Schama, David Starkey, and Niall Ferguson, who give what's obviously
a personal view of the past but on the basis of mostly authoritative knowledge.
This is in great contrast to the way history was presented in the media
fifteen or twenty years ago when you had the pictures accompanied by an
impersonal, objective-sounding voice-over. Accepting the subjectivity of
their own work has enabled academic historians to take part
in popular history and the dissemination of historical interpretations
and research, and that is a good thing.
Yerxa:
What gives you the greatest cause for hope as you assess the current state
of historical inquiry?
Evans:
The greatest cause for hope is that professional historians are writing
in a way that is much more accessible than it used to be in the 1970s and
1980s when, for all its many virtues, the social science model of history
did not have great readability or popular appeal. Historians are getting
their message across to a much wider readership than they used to. And
academic historians are not leaving the field to amateur historians and
journalists as used to be the case. History is more diverse than it has
ever been, and that’s also a very positive development. There are now many
different kinds of history; everything is grist for the historian's mill,
and that, too, is very good.
Yerxa:
And what concerns you the most about the current state of history?
Evans:
There are at least two developments that give cause for concern. One is
the state of history instruction in the schools, at least in the UK. We
have a national curriculum here that lays down what subjects are to be
taught, and history has been squeezed by other subjects deemed by the government
to be more important. Concentrating on history and combining that with
learning a foreign language, for example, now seems almost impossible.
Consequently, there are virtually no history students, no young historians
coming into the profession in this country who speak any foreign languages
at all. Thus none of my Ph.D. students who are working on subjects in German
and European continental history are British: they are German, Swiss, Canadian,
American, and so on. That is a pity. The great tradition of British historians
who work on France, Germany, Russia, and other countries is coming to an
end. Those subjects will continue to be taught in British universities,
but they will be taught more and more by people from those countries. Interestingly,
these people will have taken all their degrees in British universities,
so their intellectual formation at least is partly British, and that perhaps
is something of a compensating factor.
Also
in the schools here there is an overwhelming concentration on the 20th
century in history teaching, and there is an appalling ignorance and lack
of teaching on any period before 1914. Large swaths of history are simply
going untaught, and that, too, is a great pity. So the state of history
in the schools gives me great cause for concern.
The
other thing that I find worrying is connected with one of the more questionable
side effects of the postmodernist emphasis on subjectivity, and that is
what one might call the moralization and legalization of history. By this
I mean that since the early 1990s historical studies have become more and
more concerned with using moral and legal categories instead of understanding
and explanation based on value-neutral explanatory models and theories
from the social sciences. And that is because many historians now deem
neutrality to be morally undesirable. In a number of areas, the main concern
of young historians seems to be to reach moral judgments regarding the
Crusades, slavery in the American South, Nazi Germany, or whatever it might
be. There are historians working almost exclusively with concepts such
as perpetrators, bystanders, victims, and so on, which don’t help us understand
the historical process in any way. They simply assign praise and blame.
Yerxa:
What do you make of the emphasis on identity and memory?
Evans:
Identity can be a very interesting way of approaching history. One of history’s
main functions can be to illustrate the possibilities of human thought
and behavior, what it means to be human, what forms human identity can
take. While there are quite a few unsatisfactory books about identity (and
the concept as such is a very slippery one), still I think it is a very
interesting and important phenomenon.
On
memory, it is true that there is a lot of historical work now on what some
call public memory or the public commemoration of the past, and memory
is an important subject for historical study. But historians need to maintain
a clear distinction between history and memory. I would get worried if
the study of memory became a substitute for the study of the past, and
I do get a sense that in some areas there is almost more writing about
how people remember the past than there is writing about the past itself.
That is not generally true, of course, but in the area of Nazi Germany,
for instance, there is a rapidly expanding literature on how post-1945
Europe has dealt with pre-1945 Europe. A lot of it is very good and very
interesting, but I think we must study what happened before 1945 as well.
Yerxa:
What do you see as the central task of the historian?
Evans:
The historian’s central task is to understand and to explain the past.
Doing so requires certain other things; it implies, for instance, that
you also have to establish accurate knowledge about the past. I also think
historians have to make an attempt to recreate a sense of what it was like
living in the past and what people were like in the past. History is not
simply an abstract cerebral enterprise; it has a creative, imaginative
side to it as well. But understanding and explanation are the key things
that make history different from chronicle. END
PRESIDENT’S CORNER
Rethinking Rethinking American History in a Global Age
by Peter A. Coclanis
One
of the most interesting developments in American history over the past
decade or so has been the attempt by various scholars to embed or at least
to situate our past in broader narrative frames. If the particular concerns
and methodologies of such scholars have differed, their findings, by and
large, have provided support, whether explicit or implicit, for two related
scholarly initiatives as well: one to relax the association between American
history and the American nation-state, and the other to challenge notions
of American exceptionalism.
The
most celebrated, important—and, certainly, self-conscious—example of the
“broadening” phenomenon in recent years is the 2002 collection entitled
Rethinking American History in a Global Age, edited by Thomas Bender.[1]
This collection at once reprises and encapsulates the principal interpretive
themes that resulted from a multi-year collaborative project entitled “The
Project on Internationalizing the Study of American History,” sponsored
by New York University and the Organization of American Historians, and
funded by the Rockefeller Foundation, the Ford Foundation, the Andrew W.
Mellon Foundation, and the American Council of Learned Societies among
others. This project involved seventy-eight scholars (thirteen from NYU)
who convened four times between 1997 and 2000 in Florence, Italy at NYU’s
“extraordinarily beautiful and peaceful Villa La Pietra.” According to
Bender, the project’s organizers believed that holding the conferences
outside the United States “seemed symbolically to make an important point
about the value of stepping outside of the nation, if only temporarily,
to write a fresher account of it.” Bender also notes that the “consistently
good spirits of the conference . . . owed something to the Tuscan sun,
the delightful gardens of the villa, and the formal but comfortable meeting
rooms” at La Pietra. (I’ll bet. Can anyone spell boondoggle?) In
addition to the volume edited by Bender, the La Pietra project produced
four reports, the last of which “The La Pietra Report: A Report to the
Profession,” also written by Bender, summarizes the project’s main conclusions
and recommendations. All four reports are available at the OAH website
and the website of NYU’s International Center for Advanced Studies.[2]
Rethinking
American History in a Global Age and “The La Pietra Report” can legitimately
be viewed as scholarly companions. Almost all of the essays included by
Bender in Rethinking American History and, obviously, Bender’s “Report”
itself are predicated on a set of interrelated assumptions—“priors,” as
economists say—that at a minimum includes the following: (a) American historians
over time have been limited, if not hamstrung by their narrative fixation
on the nation-state; (b) one such limitation resulting from this fixation
has been the belief in American exceptionalism; (c) more complex and varied
solidarities, processes, and identities, which have perforce been neglected
because of the aforementioned fixation, need greater attention; (d) such
solidarities, processes, and identities may best be observed at levels
of historical analysis greater or smaller than the nation-state; (e) transnational
or supranational levels of historical analysis seem particularly inviting
to American historians in the increasingly global age in which we live
today; and (f) connections to and with histories, historiographies, scholars,
and institutions hitherto viewed as exogenous to the American experience
are not merely desirable, but well nigh imperative.
Let
me say straightaway that many of the essays included in Rethinking American
History are quite stimulating, and that some of the points made in
the companion report are at once reasonable and unobjectionable. A number
of the finest historians working today were involved in the project, so
it is in no way surprising that Rethinking American History includes
a number of excellent pieces. Indeed, but for one or two essays, it is
a fine collection. Moreover, just as Bender et al. claim, some subjects
of historical inquiry are not particularly well suited for national treatment,
much less for organization around the nation-state, and in many cases American
historians probably should design their research projects in trans-,
supra-, and infra-national ways. If many scholars—economic historians,
for example—have long known this, and have long been working in ways concordant
with such knowledge, it never hurts to reinforce a sound point. Similarly,
I’m all for connectivity and am sold on the idea that we have much to learn
from histories, historiographies, scholars, and institutions “a little
beyond,” as the Transcendentalists (who, following the logic of our guides,
should be read along with the German Romanticists!) might put it.
My
problems with both Rethinking American History and the report, it
is fair to say, do not rest so much with the formal challenge being mounted
or with the plan of attack: as suggested above, I found the collection
of essays, tout ensemble, to be interesting and provocative and
the report moderately useful, which is not half bad given the conventions
of the genre (exhortation). Nonetheless, I must point out that I have some
residual concerns about the report’s “specific recommendations” for internationalizing
American history, certain “issues,” as students say, over the committed
and often extreme anti-exceptionalist assumptions informing both texts
(with the notable “exception” of the essay by David Hollinger in Rethinking
American History), and serious difficulties with the accusatory, anachronistic,
let’s-find-America-wanting-if-not-culpable tone and implications of a few
of the individual essays, Marilyn B. Young’s in particular.
First,
the recommendations laid out in the report. Many of these are bromidic,
lacking in rigor, and insensitive to opportunity costs. Take the “Teaching
Objectives” specified in the report, for example, which include statements
about better preparing students to understand the contemporary world and
its historical development, developing in students habits of historical
analysis sensitive to context, and, my personal favorite, an injunction
to “promote in students a more informed sense of and commitment to a global
commons.” We should also work to “[i]ntegrate U.S. history more effectively
into world history” and to “[e]ncourage greater study of languages and
foreign study.” Well, okay, but it would help, too, I suspect, if we encouraged
students to eat their vegetables (grown on the global commons?) and to
be trustworthy, loyal, helpful, friendly, courteous, kind, etc.
Regarding
curricular revision, the report recommends that departments consider reconfiguring
the major, possibly along thematic lines and, in the U.S. history survey,
“connect[ing] American history more strongly to historical themes that
are not exclusively American.” Neither of these recommendations is without
merit, but each comes at a cost. In the former case, one runs the risk—similar
to that associated with systems of matrix management—of creating intricate
inter-braided loops and helixes connected to each other but without a true
core; in the latter case, of imparting fewer and fewer of the essentials
(yes, the essentials) of American history. The report suggests that
“[t]he American Revolution and its aftermath need not be studied as a singular
event, but as part of a global system of empires that over the next two
centuries would be challenged by democratic revolutions,” and that “[t]he
Civil War can be examined as an episode in an international process of
consolidating national territories and empires, all of which was entangled
with the reorganization of systems of labor on a global scale.” These occurrences
can conceivably be analyzed in such ways, and generally are with more sophisticated
students in more advanced pedagogical settings. I am dubious, however,
about reallocating the scarcest of resources—time—in the basic survey away
from George Washington and Abraham Lincoln, as it were, to José
de San Martin, Kemal Ataturk, Muhammad Ali Jinnah, Patrice Lumumba, and
Bishop Carlos Ximenes Belo.
The
recommendations for internationalizing the training of M.A. and Ph.D. students
in American history are similarly problematic. It is proposed that M.A.
requirements, already threadbare, be reconfigured so as to “allow for a
global contextualization of American history, ideally in conjunction with
an M.A. degree or concentration in world history.” After making this proposal
Bender thoughtfully adds, however:
At the level of the M.A. degree one cannot, of course, provide in-depth
preparation for teaching world history (including U.S. history)
or broadly contextual approaches to U.S. history in K-12 or
in a museum. But it is both possible and desirable to provide
the examples and conceptual formulations for such courses,
curricula, and exhibitions.
Let
me get this straight, then: world history; U.S. history; K-12; museums;
plans for lessons and exhibitions—all in thirty credit hours! If it’s Tuesday,
this must be the “long” 19th century, or maybe American history all the
way through the New Deal.
Regarding
the crown jewel of academic history, the Ph.D., the report proposes no
“major structural changes,” but urges “mainly that programs be structured
in such a way as to allow and encourage wider transnational and international
perspectives.” Noble sentiments, but at what price? Should we urge students
in my particular area, American economic history, to forego training in
econometrics and statistics in order to familiarize themselves with Islamic
economic thought or the historiography on the developmental state in Indonesia?
To be sure, each of these topics is inherently interesting, but there’s
no such thing as a free lunch (except perhaps at La Pietra!).
The
matter of opportunity costs becomes more important still in light of the
fact that the report states that “[i]n order to engage . . . international
historiography and [the international] scholarly community, students in
American history will need greater language proficiency than has usually
been required.” Such a requirement, of course, flies in the face of the
movement by generations of graduate students in American history to reduce,
if not to eliminate all language requirements other than English
(which is “foreign” enough to most graduate students!). Furthermore, the
report recommends that “[d]epartments should endeavor to establish bi-lateral
graduate student exchange programs with foreign universities, even including
teaching opportunities abroad for the U.S.-based student, and research
and, perhaps, teaching opportunities for the foreign-based student.”
And let’s eliminate war and world hunger while we’re at it. Having chaired
a large history department for five years and having worked countless hours
to establish (and then to keep alive) a couple of modest graduate student
exchange programs, I for one am dubious about the plausibility, much less
the viability of a gambit such as this, given a world replete with administrative
intransigence, curricular incompatibilities, calendar inconsistencies,
and various and sundry bureaucratic snafus (not even to mention the post-September
11 “regime” of tightened visa controls). We have trouble coordinating our
program with that of Duke University, a school eight miles down the road.
Imagine the problems entailed in establishing, then sustaining collaborative
graduate programs with Gadjah Mada University, Shanxi Normal, Makerere University,
UNAM, or even Heidelberg.
Now
let us return briefly to matters of substance, namely, the anti-exceptionalist
bent that informs and imbues both Rethinking American History and
the La Pietra Report.[3]
Both Bender and most of the authors in the collection he edited are not
only anti-exceptionalists, but also equate a belief in American exceptionalism
with an ideology of American triumphalism. In formal terms, the equation
of these two “isms” is fallacious, and in real terms such an equation is
unwarranted, as David Hollinger makes clear in (among other places) his
fine essay “The Historian’s Use of the United States and Vice Versa” in
Rethinking American History.[4]
In this piece Hollinger demonstrates convincingly that the “national project”
of the United States, like it or not, is exceptional and argues
that it behooves even “internationalizing” Americanists to recognize this
fact. For starters, as Hollinger points out, our history “displays the
most successful nationalist project in all of modern history.” According
to Hollinger:
[t]wo-and-one-quarter
centuries after its founding and 135 years after its Civil War, the United
States is the most powerful nation-state in the world and the only 21st-century
power that operates under a constitution written in the 18th century. Its
significance is measured by its sheer longevity, its influence in the world
arena, and its absorption of a variety of peoples through immigration,
conquest, and enslavement and emancipation.
Hollinger
is no apologist for America, and his analysis of our historical evolution,
though respectful, is hardly triumphalist. No matter, for his voice is
largely overwhelmed in the Bender collection’s anti-exceptionalist/anti-triumphalist
cacophony. In some ways the conflation of exceptionalism and triumphalism
in Rethinking American History is reminiscent of yesteryear’s conflation
of “consensus” history and triumphalism by many academics on the Left,
who often argued as though scholars so diverse as Hofstadter, Hartz, and
Boorstin shared the same views. Just as there was nothing inherently “triumphalist”
about a consensus approach to American history—Hofstadter is a case in
point—there is no logical congruence today between a belief in American
exceptionalism and a triumphalist narrative line.
Then
there is Marilyn B. Young’s snide blame-America-first-last-and-always essay,
which makes editorials in The Nation seem subtle by comparison.[5]
Writing on American foreign policy in the 20th century, Young argues that
“the United States is not exceptional, only exceptionally powerful” and
misses no chance to diminish our successes, exaggerate our failures, and
highlight our inconsistencies and shortcomings in the international realm.
Indeed, she is not content to blister the manner in which American elites
have formulated foreign policy and wielded power in the world, so she takes
pains to get in a few licks as well at the American public, which she considers
solipsistic and self-absorbed (presumably so unlike academics such as herself).
Now one doesn’t have to be in Donald Rumsfeld’s pocket to suggest that
Young’s unmodulated critique of America’s international initiatives in
the 20th century is as cynical as it is untrue. Perhaps a few years under
an international regime dominated by any other plausible 20th-century contender
for “hegemon” status would change her tune, though, frankly, I’m not sure.
For
the record, let me state that I am basically sympathetic to the general
idea of internationalizing American history and have spent my whole career
working along such lines. Since the early 1990s, in fact, while working
on a project on the creation of an integrated global market in rice, I
have spent more research time abroad (particularly in Southeast Asia) than
I have in the U.S. Spending time in archives (and rice paddies) on the
other side of the globe has meant the world to me. I have come to appreciate
radically different perspectives and have learned to ask new questions.
I have had to confront unfamiliar historiographies and languages and see
my way through different research protocols and academic cultures. I have
made many new friends in so doing and now collaborate with scholars in
several different parts of the world. I have cherished these experiences
and acquired the “cosmopolitan feeling” endorsed by Professor Bender, along
with “that welcome sense of defamiliarization” with America that has prompted,
as Bender predicted, “a new and more inquiring curiosity about the American
past.”[6]
That these experiences have strengthened, not weakened, my sense of American
exceptionalism is important to me, but largely irrelevant to my final point.
Let
me draw on one of my “internationalizing” experiences in Southeast Asia,
one from Singapore more specifically. In my last few trips to the city-state,
I’ve found that Singaporeans like to joke about their notoriously “nanny-state”
(and humorless) government’s relentless efforts in recent years to transform
the nation’s economy from one based on manufacturing to one based on “knowledge.”
In one of these jokes—at least I think it is a joke—Singaporeans
claim that the government’s efforts include running articles periodically
in the semi-official newspaper, The Straits Times, with titles such
as “Be Creative: Here’s How.” In reading Rethinking American
History in a Global Age and the La Pietra Report, I couldn’t help but
be reminded of this “joke,” for both texts can be seen as rather humorless,
nanny-ish, semi-official efforts to remake a field, if not a world.
You know: “Be International and Cosmopolitan: Here’s How.” Follow us, for
we are wise.
Peter
A. Coclanis is Albert R. Newsome Professor and chairman of the history
department at the University of North Carolina at Chapel Hill. He is co-author,
with David L. Carlton, of The South, the Nation, and the World: Perspectives
on Southern Economic Development (University of Virginia Press, 2003).
[1]
Thomas Bender, ed., Rethinking American History in a Global Age
(University of California Press, 2002).
[2]
“The La Pietra Report: A Report to the Profession.” The Organization of
American Historians/New York University Project on Internationalizing the
Study of American History. (Thomas Bender, Director)
[3]
In a sense, two special issues of the Journal of American History,
both commissioned under David Thelen’s editorship, should also be considered
in the context of this “internationalizing” project. See the Journal
of American History 86 (September 1999) and 86 (December 1999). The
former is entitled “Rethinking History and the Nation-State: Mexico and
the United States as a Case Study,” while the latter is entitled “The Nation
and Beyond: Transnational Perspectives on United States History.”
[4]
David
A. Hollinger, “The Historian’s Use of the United States and Vice Versa,”
in Bender, ed., Rethinking American History, 381-395.
[5]
Marilyn B. Young, “The Age of Global Power,” in Bender, ed., Rethinking
American History, 274-294.
[6]
“The La Pietra Report,” 5.
Iraq and the American Small War Tradition
by Max Boot
One
would have thought that the defeat of the Taliban would have shattered
for all time the mystique of the guerrilla. Apparently not. Agitated commentators
on Iraq invoke comparisons with Vietnam and warn that allied occupiers
will never be safe.
Such
a nightmare scenario cannot be dismissed out of hand—a good general must
prepare for every contingency—but, if the historical record is anything
to judge by, it is unlikely. The U.S., along with most Western nations,
has a long record of defeating guerrilla resistance all over the world.
And the conditions that were present the only time the U.S. suffered a serious defeat—in
Vietnam—appear to be missing in Iraq today.
The
primary job of the U.S. Army until 1890 was fighting guerrillas—American
Indians, to be exact, the finest irregular warriors in the world. Defeating
them was a slow and arduous process, with some famous setbacks like the
Battle of Little Bighorn. But in the end dogged generals like Nelson Miles
and George Crook managed to capture the last holdouts, such as the Apache
leader Geronimo and the great Sioux chief Sitting Bull.
Much
of the historiography of the Indian Wars focuses on the U.S. Army’s excesses,
such as the massacre at Wounded Knee in 1890. But the army’s ultimate victory
was predicated not upon sheer brutality but upon the essentials of good
counterinsurgency strategy: cutting off the guerrillas from their population
base by herding tribes onto reservations; utilizing friendly Indians for
scouting and intelligence; and being relentless in the pursuit of hostile
braves.
Similar
strategies were utilized, with similar success, by the army in its campaign
to stamp out resistance to U.S. rule in the Philippines after the Spanish-American
War. The Philippine War was long and ugly. It lasted from 1899 to 1902,
with sporadic resistance thereafter, and it cost the lives of 4,200 U.S.
soldiers. But its success was sealed through a daring commando raid undertaken
by Brigadier General Frederick Funston. He dressed a unit of native allies
in insurrecto uniforms and pretended to be their prisoner in order
to capture the rebel chief, Emilio Aguinaldo, in his mountain lair.
Important
as this coup was, it was not enough to assure the long-term acceptance
of a U.S. presence in the Philippines. This could only be done through
measures designed to win the hearts and minds of Filipinos. In the early
days of the occupation, U.S. troops vaccinated children, set up schools,
and repaired roads. Later on, the U.S. granted the Philippines growing
autonomy well ahead of other colonies in Asia. Nationalist leader Manuel
Quezon was driven to complain: “Damn the Americans, why don’t they tyrannize
us more?”
America
went on to wage many more counterinsurgency campaigns in the years after
1898, mainly in the Caribbean, where U.S. troops occupied Panama, Cuba,
Haiti, the Dominican Republic, Nicaragua, and other places for varying
lengths of time. Most of these occupations were carried out by a small
number of Marines who fought guerrillas while being careful not to alienate
the bulk of the civilian population. Resistance was not always entirely
stamped out—the Nicaraguan rebel leader Augusto Sandino eluded capture
from 1927 to 1933—but vigorous policing usually kept the guerrillas isolated
in the outback where they did not pose a threat to large population centers.
The
bulk of the fighting was done by native enlisted men led by U.S. officers
in constabulary outfits like the Haitian Gendarmerie or the Nicaraguan
National Guard. This is an important point to keep in mind: while U.S.
forces possess superior training and firepower, they lack the local knowledge
essential to root out wily opponents. This gap can be bridged only by local
allies, whether serving informally alongside U.S. forces (as with the Northern
Alliance in Afghanistan, or the Kurds in northern Iraq) or in a more formal
military structure (as with free Iraqis who are working for various U.S.
units).
While
U.S. troops fought openly against guerrillas in the pre-World War II period,
after 1945 the emphasis switched to covert operations, with Washington
supplying arms and expertise to friendly governments battling communist
insurgencies. This strategy failed spectacularly in China, which was taken
over by Mao Zedong in 1949 because of the strong support the communists
received from Moscow and the blunders of the Nationalist government. But
this defeat should not erase the memory of victories elsewhere. A small
sample: between 1945 and 1949 Greece defeated the communist-dominated Democratic
Army with U.S. help provided under the Truman Doctrine; between 1946 and
1954 the Philippine government, advised by the “Quiet American,” Edward
Lansdale, put down the Hukbalahap rebellion; and between 1980 and 1992
El Salvador, with U.S. aid provided under the Reagan Doctrine, defeated
the Farabundo Marti National Liberation Front (FMLN).
In
all these instances the U.S. strategy called for carrots and sticks—aggressive
military operations against the rebels combined with liberalizing reforms
to win over the uncommitted populace. And in all these cases the U.S. and
its allies were successful.
The
glaring exception is Vietnam, where the U.S. pursued a similar strategy
with a notable lack of success. America’s failure was due to many factors,
including a ham-handed military campaign that ignored successful counterinsurgency
techniques of the past. But the ultimate problem was that the communist
forces operating in South Vietnam had a “deep rear” in North Vietnam, the
Soviet Union, and China. The U.S. was loath to take decisive military action
against any of these states for fear of widening the war. As a result,
the insurgents always had a safe base of operations across the border and
steady supply lines along the Ho Chi Minh trail. Even so, the Vietcong
did not win the war. They merely softened up the enemy for the conventional
invasion that North Vietnam mounted in 1975.
The
question today is: does Iraq more closely resemble Vietnam or, on the other
hand, the numerous places where U.S. counterinsurgency strategies prevailed?
The answer is the latter. In the first place, Vietnam’s topography—lots
of jungles and mountains—was much more favorable to guerrilla operations
than the deserts and towns of Iraq. And, unlike in Vietnam, it is doubtful
that any neighboring country will want to give long-term support to a Baathist
guerrilla campaign against coalition forces. While neither Syria nor Iran,
which share long borders with Iraq, is friendly to the U.S., they do not
have particularly warm feelings for Saddam Hussein either. In any case,
neither state enjoys superpower patronage, so they would be at the mercy
of U.S. forces if they fomented a wave of terrorist attacks against the
occupation authorities.
But
simply because a guerrilla campaign against the U.S. is unlikely to succeed
does not mean the occupation itself will be a success. That will require
a concerted campaign of “nation building” similar to those the U.S. has
previously undertaken in countries ranging from Germany to Haiti. The lesson
of those campaigns is clear: where U.S. troops stay the course for the
long term (Germany, Italy, Japan, Philippines, Bosnia, Kosovo) they can
change life for the better. Where they pull out too quickly (Cuba, Haiti,
the Dominican Republic, Somalia) things can go to hell in a handbasket
pretty quickly. That’s a point worth remembering as President Bush vows
to draw down U.S. forces in Iraq. Only if the U.S. military is prepared
for a long, long stay will the Bush administration have any hope of carrying
out its vow to turn Iraq into a nascent democracy.
Max
Boot is Olin Senior Fellow at the Council on Foreign Relations and author
of The Savage Wars of Peace: Small Wars and the Rise of American Power
(Basic Books, 2002).
Whiggism Today
by Annabel Patterson
Where
there’s a lot of smoke, there may or may not be a fire, but there’s almost
certainly a story. And there was certainly a great deal of smoke in the
lead review of the Times Literary Supplement for March 14 of this year,
when Jonathan Clark (usually known as J.C.D.) filled two full pages with
a tirade against my Nobody's Perfect: A New Whig Interpretation of History.
Why, one might well ask, would a relatively small book with a serio-comic
title, focusing on the later 18th century in Britain, have raised such
a fuss? I suggest the following answers, in ascending order of significance:
first, the predilections of the reviewer; second, the challenge to received
historiographical notions, especially but not exclusively about the 18th
century; and third, the deeper issues about what constitutes freedom of
the press and fair political and legal process, issues that both American
and British readers must have somewhere in mind or on their consciences
in the early months of 2003.
First,
the reviewer: J.C.D. Clark has made his reputation (“the most controversial
historian of his generation,” “the history profession's leading iconoclast”)
by publishing a series of books on the 18th-century political scene in
Britain and, in The Language of Liberty, 1660-1832, on the roots of American
political culture. It is probably fair to say that what makes him an iconoclast
is his conservatism. One of his favorite figures is Dr. Samuel Johnson,
who makes brief appearances, not greatly to his credit, both in Nobody’s
Perfect and my earlier Early Modern Liberalism, whose title Clark also
deplores. Another is Edmund Burke—the later Burke, that is, the Burke of
the Reflections on the Revolution in France, not the Burke who for a while
supported the American Revolution. Clark is skeptical about the American
Revolution, whose cultural myths he sets himself to demolish. Oddly, the
phrase “American pulpit” occurs twice in this review in relation to myself.
I have apparently mounted it by virtue of holding a chair “at Yale,” while
Clark himself occupies a chair at Kansas. In fact, we are both expatriates
from Britain.
The
real argument between us, however, has two hearts: one, the influence of
Herbert Butterfield’s 1931 The Whig Interpretation of History (also a small
book) and whether that influence has been benign or malign in the field
of historiography; and two, the view of 18th-century Britain we wish to
propagate. Let me speak to Butterfield first. I was initially driven to
unearth this early work of Butterfield’s, which after the experience of
World War II he repudiated, by puzzlement at the way my colleagues in the
legal theory workshops at Yale used the term “whig” with casual contempt,
as shorthand for any opinion that smacked of a belief in historical progress.
They were some distance away from Butterfield’s actual argument, which
was designed to counter the old whig historiography as the line “which
leads through Martin Luther and a long succession of whigs to modern liberty.”
No doubt it did need countering at that time. But Butterfield himself came
to feel that his early arguments had been not only naive but possibly dangerous,
and in 1940 he wrote instead: “It is not necessary or useful to deny that
the theme of English political history is the story of our liberty . .
. . Those who, perhaps in the misguided austerity of youth, wish to drive
out that whig interpretation . . . are sweeping a room which humanly speaking
cannot long remain empty” (The Englishman and His History, 3-4).
It
seemed to me that the period of sweeping had gone on long enough and that
the vicissitudes of history (academic and real history) now dictated once
more that the room should be, at least till the next house-cleaning, plainly
but interestingly refurnished . . . .
Annabel
Patterson is Sterling Professor of English at Yale University. She is the
author of Marvell: The Writer in Public Life (Longman, 1999).
Benito
Mussolini: Dictator
by
R. J. B. Bosworth
In
Britain during the early months of 2003, while the Iraq crisis was coming
to a head, Adolf Hitler was in the news. Politicians found it easy to draw
(usually confused or inapposite) historical parallels between the present
and the 1930s and to dispute again the deep meaning of A.J.P. Taylor’s
old goak that the Munich agreement and the policy of the appeasement
of Germany which produced it were “a triumph of all that was best and most
enlightened in British life.” Historians, too, were arguing about history.
Hitler, it was said, had too high a profile in the training of British
school children. To some it seemed that Hitler and the tale of his regime
and its killing of the Jews had all but taken history over.
An
Australian-born historian and his American readers should leave the British
to sort out their own educational priorities. Yet, the problem of Hitler
hangs over us, too. In a profoundly ironical fashion, the ghost of the
German Führer has seized, or been given, world power in the
mind of history, and not just history. In much of our culture, Hitler has
become the model of the bad dictator, the image of evil, the quintessence
of the “rogue” ruler.
But
the positioning of Hitler in this role is not good either for our sense
of history or of our own society. The adjective which affixes itself most
readily to the Nazi dictator is “mad.” Even Ian Kershaw, in his marvelous,
massive and best-selling biography of the Führer, agrees. But
it is a destructive word signifying that the madman is incomprehensible
and imponderable, a person with whom dialogue or diplomacy, any rational
dealing, is pointless. The madman is the ultimate “Other.” His evil is
beyond our ken; he is not our responsibility.
By
all accounts, Hitler was a very strange person, and the fanaticism of his
ideological obsessions to kill Jews and destroy what he viewed as the evil
empire of Judeo-Bolshevism was profound. At least after 1939, Hitler was
the sort of politician who unswervingly pursued his ideology even when
it undermined his regime’s best interests. But should we make the leap
to assume that other dictators were, or are, mad? I doubt it very much.
I certainly doubt it as a new biographer of Mussolini. In my account, I
argue instead that the Duce, in his combination of strategy and
tactics, his meshing of revolutionary leadership and the continuity of
power, as well as in his coping with his public and private life, was sweatily
human. Benito Mussolini embodied much of his country, region, age, gender,
and class. He was not an insane Other, but someone we can understand. Reviewing
his life we must acknowledge that there but for the grace of humanity go
we . . . .
R.J.B.
Bosworth is professor of history at the University of Western Australia.
His Mussolini was recently published by Oxford University Press.
The
End of the Korean War: Some Reflections on Contingency and Structure
by
William Stueck
In
the February 2003 issue of Historically Speaking Jay Winik discusses
the end and immediate aftermath of the American Civil War in the context
of “the contingent nature of events and the different paths that history
could have taken.” “Comparatively speaking,” he argues, the Civil War ended
well; the war’s conclusion set the stage for a relatively peaceful and
quick return to stability and enduring national unity. It did not have
to be so, he claims, as the alteration of individual decisions, sometimes
apparently trivial ones, could have set in motion a very different pattern
of events.[1]
Winik,
it appears, has a predisposition to see glasses as half full rather than
half empty, as he might easily have reasoned the other way and speculated
on how much better things might have been had Abraham Lincoln decided not
to attend Ford's Theatre on the fateful night of April 14, 1865. More
important, though, he is a proponent of the notion that small decisions
frequently have big and enduring results. Thus it is a task of those who
study the past to grapple not only with what was but also with what might have been—to
engage, in other words, in counterfactual analysis. I will follow in Winik’s
footsteps here in examining another ending, that of the Korean War. In
doing so, I intend at once to support his views and to qualify them.
Like
the conclusion of the American Civil War, with the tumultuous and in many
ways disappointing period of Reconstruction that followed, the Korean War’s
end invites a number of negative judgments. The armistice of July 27, 1953,
left the peninsula divided, with the forces of hostile regimes and their
allies facing each other ominously across a four-kilometer-wide demilitarized
zone. Weapons inspections and arms control broke down in less than five
years, foreshadowing the current crisis over North Korea’s development
of weapons of mass destruction. South Korean President Syngman Rhee’s refusal
to sign the armistice eventually provided North Korea with a rationale
for ignoring its southern counterpart in initiatives to revise the cease-fire
or replace it with a peace treaty. One victory for the United States was
the armistice’s settlement on prisoners of war, which forbade “forced repatriation”;
tens of thousands of North Korean captives on the UN side opted not to
return. But the armistice left large numbers of missing American combatants
unaccounted for, a tragedy not fully rectified to this day. To say the
least, whether from the U.S. perspective of 1953 or today, the end of fighting
in Korea left far-from-ideal conditions . . . .
William
Stueck is Distinguished Research Professor of History at the University
of Georgia. His most recent book is Rethinking the Korean War: A New
Diplomatic and Strategic History (Princeton University Press, 2002).
[1]
“How Wars End and the Writing of History: An Interview with Jay Winik,”
Historically Speaking: The Bulletin of the Historical Society
(February 2003): 18-20.
Dispatch
from Seoul
by
Doyne Dawson
The
experience of Korea in the 20th century was not unique; in all Asian societies
the traditional elites struggled with tensions between ancient national
identities and the challenges of the modern West. But the Korean transition
to modernity was unusually traumatic. The Korean ancien régime
died hard. The Confucian mandarin class resisted Westernizing and modernizing
influences even more stubbornly than did their counterparts in imperial
China. As a result, the “Hermit Kingdom” fell easy prey to the modern army
and navy of Japan, which in 1910 abolished the 1200-year-old Korean monarchy
and annexed the peninsula.
Most
non-Western countries went through a long colonial period, but in the case
of Korea the colonizer was also non-Western. That the conquerors belonged
to the same race and the same Sinitic civilization as the conquered did
not make their relationship easier. Koreans had traditionally regarded
the Japanese as their cultural inferiors. Yet the Japanese conducted experiments
in total cultural assimilation, such as forcing Koreans to take Japanese
names, which no European colonial power would ever have attempted in an
Asian country. The Japanese occupation left bitter memories that still
rankle.
The
Japanese withdrew at the end of the Second World War, to be replaced by
other foreign intruders. With Soviet backing, the first communist regime
in the Far East was set up at Pyongyang in 1948 and would have imposed
itself upon all Korea had it not been for the intervention of the United
States. The war of 1950-53 killed 1,300,000 South Koreans and 55,000 Americans
and left the devastated peninsula divided between an American protectorate
in the South and a Red Chinese protectorate in the North. (During the war
China had replaced the Soviet Union as Pyongyang’s main patron.)
Thus
the new Republic of Korea faced the challenge of reinventing a Korean national
identity following a half-century-long interruption marked by the total
collapse of the old elite, foreign conquest and occupation, a terrible
civil war, and the loss of half the national territory . . . .
Doyne
Dawson teaches in the Asian Studies Program at Sejong University in Seoul.
He is currently working on a comparative study of modernization in Europe
and Asia.
The
Silk Road: Part I
by
Alfred J. Andrea
Recent
events, particularly United States military actions in Afghanistan and
Iraq, have drawn Americans’ attention to Inner Asia. A token of America’s
recent “discovery” of the lands and peoples of Central Asia was evident
on the Mall in Washington, D.C., in late June and early July 2002, when the Smithsonian's
annual Folklife Festival was devoted exclusively to the many cultures of
the ancient Silk Road.
Simply
put, the Silk Road (more correctly, the Silk Roads) was a complex network
of land routes stretching across Asia and connecting the capital
cities of China to the trading emporia of India and the eastern Mediterranean
from about 100 B.C.E. to approximately 1500 C.E.
That
said, we must acknowledge that these are extremely imprecise dates. Long-distance
travel and exchange across Inner Asia existed for thousands of years before
the classic era of the Silk Road, and even today many of the traditional
routes of the Silk Road continue to bear commercial traffic.
In
like manner, the geographic definition—a network of land routes across
Asia, connecting China with India and the Middle East—is equally misleading
and imprecise. In fact, the Silk Road did not stop at water’s edge. Beyond
China and the Levant lay Japan in the Far East and North Africa and Europe
in the Far West. Thanks to ships and shipping lanes, Japan and the far
western regions of North Africa and Europe shared in the goods, ideas,
and other items transmitted across the Silk Road and may be thought of
as part of a Greater Silk Road. The same can be said of the lands and islands
of the South China Sea and the Indian Ocean. In short, goods and ideas
transported across the Silk Road reached cultures that had no terrestrial
connection with Inner Asia.
At
its height in the 7th and 8th centuries C.E., the main overland portion
of the Silk Road stretched for more than 4,000 miles from east to west,
from Chang’an (modern Xi’an), the western capital of the Han and Tang dynasties
in north central China, to Antioch, Tyre, Constantinople, and similar cities
of the Eastern Mediterranean. Along the way, it passed through such fabled
cities as Samarkand (in modern Uzbekistan), Kabul in Afghanistan, Susa
in Iran, Baghdad in Iraq, and Palmyra in Syria. It traversed deserts, steppes,
rivers, and mountain ranges, all of which presented dangers to those who
braved its routes.
Additionally,
bandits preyed on travelers, and strange food, drink, and microorganisms
threatened a traveler’s internal organs. There were also psychic dangers.
Several Silk Road travelers, including Marco Polo, recorded first-hand
accounts of the frightening nocturnal sounds of the Salt Desert of Lop
Nor that disoriented the unwary.
The
dangers of Silk Road travel were ameliorated and the journey was made possible
by oasis caravanserais and urban centers that allowed travelers to progress
from refuge point to refuge point at the pace of about twenty to twenty-five
miles a day—absolute top speed—with a variety of pack animals: Bactrian
camels, oxen, yaks, horses, Arabian camels, donkeys, and even elephants.
Of these, the slow but strong Bactrian, or double-humped, camel, which
could bear loads of up to 300 pounds, did the bulk of the carrying across
the paths of Inner Asia. To further ensure the safety of travelers, shrines
and pilgrimage sites sprang up where they could find spiritual solace and
physical refuge.
The
most famous of these today is a Buddhist complex known as the Mogao Caves,
located not far from the oasis town of Dunhuang, itself situated at a point
of convergence for the main northern and southern routes that skirt the
Tarim Basin and its essentially impassable Taklamakan Desert (“The Place
from which No Living Thing Returns”). The Mogao Caves, dug out of the soft
stone of a cliff face, number around 500 and extend for about a mile. According
to available records, the first cave dates to 366 C.E. The caves are shrines
and cumulatively contain about 3,000 statues and murals, all laboriously
created by resident monks. Were all this art laid end to end, it would
measure sixteen miles long by fifteen feet in height. And this is only
one of many Buddhist cave complexes along the Silk Road.
Along
the Silk Road’s routes merchants moved goods, pilgrims visited holy sites,
missionaries sought out converts, armies marched on expeditions of conquest
and missions of pacification, colonists set out for distant frontier regions,
ambassadors and promised brides journeyed to distant lands to cement alliances,
and imperial administrators traveled to far-flung outposts.
In
addition to manufactured goods, livestock, fruits, and vegetables were
transported to new homes, where they became integral parts of the agrarian
landscapes and the tables of their host cultures. Artistic motifs and styles
in painting and sculpture traveled along these routes, as well as other
forms of artistic expression. Music, dance, and a wide variety of musical
instruments made their way eastward from Persia, India, Central Asia, and
elsewhere and profoundly affected the cultures of China, Korea, and Japan.
And then there were the ideas that flowed every which way, especially religious
concepts. Not all of the exchange was healthy, however. Diseases also traveled
along these pathways, as microorganisms were carried by human and animal
traveler alike.
Travelers
along the Silk Road during its classic era referred to its routes by many
different terms, some of them unrepeatable in polite company. There is
no evidence, however, that any of them called all or part of it “the Silk
Road.” The term is a modern convention. During the 19th and early 20th
centuries, a variety of Western, largely European, adventurers and scholars
began traveling through that vast region of Inner Asia known imprecisely
as Turkestan—essentially lands dominated by pastoral nomads who speak a
variety of Turkic languages—to explore and uncover the largely forgotten
ruins of once-prosperous towns and religious shrines along the ancient
caravan routes that had, so long ago, linked East and West. Within this
context, in 1877 a German geographer, Baron Ferdinand von Richthofen (1833-1905),
the uncle of the Red Baron of World War I fame, coined the term die
Seidenstrassen (the Silk Roads) to indicate that Chinese silk, more
than any other product of value, fueled commerce along these routes. English-speaking
historians later transformed the plural to the singular, hence the Silk
Road. Today many world historians favor the more correct “Silk Routes”
to underscore the fact that this was a vast complex of land and sea routes.
I prefer the more romantic, albeit less precise, Silk Road. After all,
what is history without a touch of poetry? . . . .
Alfred
J. Andrea is professor emeritus of medieval global history at the University
of Vermont. He is editor of the popular world history text, now in its
4th edition, The Human Record: Sources of Global History, Vol. I: To 1700
(Houghton Mifflin, 2001), and author of The Encyclopedia
of the Crusades (Greenwood, forthcoming). During spring semester 2002
he served as Distinguished Scholar-in-Residence at the University of Louisville,
where he offered a seminar on the Silk Road.
Why
Did (or Didn’t) It Happen?
by
Stephen G. Brush
When
I switched from theoretical physics to history, it was in part because
I wanted to find out how and why scientific knowledge had been established.
Scientists want to describe the natural world and also find out what causes
things to happen in that world; presumably historians want to uncover
causes as well as facts in the human world. To what extent is this presumption
valid? Is studying history like studying science?
According
to one version of “The Scientific Method,” proposed by the philosopher
of science Karl Popper (1902-1994), we should use our hypothesis to predict
a fact or event that we don’t already know. If the prediction turns out
to be wrong, we must discard the hypothesis. If the prediction turns out
to be correct, that doesn’t prove the hypothesis is correct—since
some other hypothesis might have produced the same prediction. It does
mean, however, that we can retain our hypothesis—it has been “corroborated”
but not “confirmed”—and prefer it to another hypothesis that has not survived
such a test. On the other hand, a hypothesis that is so flexible it can
explain anything but cannot make any testable predictions is not scientific
at all. (Popper placed Marxism and psychoanalysis in this category of pseudosciences.)[1]
According
to Popper, merely predicting the recurrence of a known phenomenon is not
a real test of a scientific hypothesis: you should predict something that
a person who doesn’t know your theory would not expect to happen.
You get no credit for predicting that the sun will rise tomorrow, but if
you predict, as did Svante Arrhenius at the end of the 19th century, that
continued burning of fossil fuels will lead to global warming, then other
scientists may eventually conclude that your theory is valid.
One
might infer that historians are not scientists because they do not
judge their theoretical explanations by their ability to make successful
predictions, while physicists, chemists, and meteorologists are
scientists because they do. And indeed this is exactly the kind of inference
that has led many people to conclude that the physical sciences are “harder”—they
yield more reliable knowledge—than the biological and social sciences,
and certainly harder than history, which is not a science at all.
There
is just one problem with this inference: in many (perhaps most) cases,
physical scientists do not judge a theory primarily by its success
in making predictions, although that may be a secondary factor in persuading
them to accept it.
After
many centuries of research, geologists have not actually produced a theory
of earthquakes that is generally accepted and makes accurate predictions
about future earthquakes.[2]
Yet no one, to my knowledge, has concluded from this fact that geology
is not a legitimate science. So perhaps we should ask instead: is The Scientific
Method, as defined by Popper, an accurate description of how science works?
There
is also some confusion about what is meant by “prediction.” Physicists
generally use the word to include a deduction of an empirical fact,
whether or not the fact was known before the prediction was published.
If it was not known, one speaks of “prediction in advance” or “forecasting”
or (in the discourse of philosophers of science) “novel prediction.” Formulations
of The Scientific Method by Popper and other non-physicists usually require
that a theory make novel predictions, although some of these statements
are ambiguous. The linguistic usage of physicists suggests that they don’t
think it makes any difference whether a prediction is novel or not, and
in some cases it is quite clear that they don’t think successful novel
predictions should count any more, in evaluating a theory, than successful
non-novel predictions (deductions).[3]
More
generally, if one believes the statements of Einstein, Dirac, Eddington,
and others, physicists like theories that are simple, universally applicable,
logically coherent, beautiful, and consistent with other established theories
and laws of nature; in the long run a theory is expected to make accurate
predictions about empirical facts, but it should not be rejected if it
doesn’t happen to agree with the latest experiment. Conversely, a theory
that does not satisfy those criteria of simplicity, etc., may be rejected
despite its success in predicting new phenomena . . . .
Stephen
G. Brush is Distinguished University Professor of the History of Science
at the University of Maryland. His most recent book, written with
Gerald Holton, is Physics:
The Human Adventure, from Copernicus to Einstein and Beyond (Rutgers
University Press, 2001).
[1]
Karl Popper, The Logic of Scientific Discovery (Hutchinson, 1959);
Conjectures and Refutations (Basic Books, 1962); and Unended
Quest: An Intellectual Autobiography (Open Court, 1976).
[2]
S. G. Brush, “Dynamics of Theory Change: The Role of Predictions,” PSA
1994: Proceedings of the Biennial Meeting of the Philosophy of Science
Association 2 (1995): 133-145; “The Reception of Mendeleev’s Periodic
Law in America and Britain,” Isis 87 (1996): 595-628.
[3]
S. G. Brush, “Prediction and Theory Evaluation: The Case of Light Bending,”
Science 246 (1989): 1124-29.
A
“Religious Turn” in Modern European Historiography?
by
Thomas Albert Howard
In
recent decades a promising new historiographical direction has quietly
opened up. Religion has resurfaced as a category worthy of serious investigation
by European historians—and not just among medieval or early modern scholars,
where religion’s fortunes have never really suffered, but most strikingly
among those toiling in presumably more barren soil: the “secular age” after
the fall of the Bastille and the guillotining of Louis XVI. Passed over
by many historians, who’ve tended to equate modernity with the “disenchantment
of the world,” religion now appears poised for fresh evaluation as a persistent
and protean force in modern societies. Indeed, having abided the inattentiveness
of some of our craft’s best practitioners, religion now stands, importuningly,
at the door. Whence comes this uncanniest of all guests? . . . .
Thomas
Albert Howard is associate professor of history at Gordon College. He is
the author of Religion and the Rise of Historicism (Cambridge University
Press, 2000) and Protestant Theology and the Making of the Modern German
University (Oxford University Press, forthcoming).
Jonathan
Edwards’s Vision of History[*]
by
Avihu Zakai
Once
dubbed by Perry Miller “the greatest philosopher-theologian yet to grace
the American scene,” Jonathan Edwards is now widely recognized as America’s
most important theologian. And he is no less celebrated as a prominent
philosopher, ethicist, and moralist. Edwards’s theology and philosophy
are a matter of great scholarly interest today, and recent studies have
dealt with almost every aspect of his thought. Strangely enough, however,
there has been no serious attempt to explore Edwards’s philosophy of history,
let alone to analyze the content and form of his distinct mode of historical
thinking.
Edwards’s
sense of time, his vision of history, and the development of his historical
consciousness warrant serious attention. Without this, much of his philosophy
and theology are unintelligible; moreover, the significance he accorded
to his actions—as well as the ultimate sacred historical meaning he attached
to his own time, as evidenced by his decisive role in initiating, advancing,
and promoting the Great Awakening, 1740-43—remain uncomprehended . .
. .
Avihu
Zakai is professor of history at the Hebrew University of Jerusalem. Cambridge
University Press recently released his Exile and Kingdom: History and
Apocalypse in the Puritan Migration to America in paperback.
The
Military Roots of Islam
by
George F. Nafziger and Mark W. Walton
The
events of September 11 shoved the Islamic world into the center of our
television screens. An endless stream of experts began explaining to the
American public what there was to be known about Islam. When President
Bush spoke to the American people, he correctly said that the very
word “Islam” means peace. This was a comforting thought, and well in line
with the best ideals of our nation. What he did not say was that the peace
of Islam, within the concept of Dar al-Islam (the House of Islam), applies
only within the borders of Islam; those outside it are exempt from the
peace . . . .
George
F. Nafziger, Captain, USNR-R, Ph.D., has authored several books and articles
on military history. He is a former director of the Napoleonic Society
of America and the Napoleonic Alliance. He is the owner of the Nafziger
Collection, a publishing house specializing in the Napoleonic Wars and
World War II history. Mark W. Walton is an independent researcher. They
are the authors of Islam at War: A History (Praeger, 2003).
Understanding
Jihad and Terrorism[*]
by
Antony T. Sullivan
Pascal
once observed that the necessary basis for moral conduct is clarity of
thought. In recent discussions concerning a putative relationship between
Jihad and terrorism, clear thinking has been notable primarily by its absence.
That is regrettable, especially given the polarities that now seem to characterize
relations between the West and the Muslim world. In this worsening environment,
Christians and Muslims alike would do well to reexamine established ideas.
Such reconsideration has never been more important.
In
the West, Jihad is now understood to mean terrorism, tout court.
And in the Muslim world far too many now understand Jihad as justifying,
indeed demanding, the taking of innocent civilian life. Misperceptions
and ignorance are widespread everywhere.
The
truth is that terrorism and Jihad are not identical twins but historic
enemies. In fact, a new vocabulary is essential to demonstrate the radical
antipathy that separated these concepts until very recent decades. Terrorism
is not only un-Islamic but anti-Islamic, and those who commit terrorism
should be designated as criminals rather than as holy warriors or resistance
fighters. A new focus by Muslims on the Qur’an, at the expense of medieval
Islamic jurisprudence (fiqh), is now very much in order. The best
approach to grasping the real meaning of Jihad may be through analysis
of the linguistic roots of the word. To do that, one must know Arabic,
a qualification that many contemporary commentators on this topic lack.
Above all, what is now needed is a revived and authentically Islamic vocabulary
that definitively separates the concept of Jihad from that of terrorism.
Much of that Islamic vocabulary already exists. The good news is
that precisely that traditional vocabulary is now again being used by many
prominent Muslims when discussing these vexed issues . . . .
Dr.
Sullivan is a senior fellow at the Fund for American Studies. In that capacity
he serves as director of faculty at the International Institute for Political
and Economic Studies in Greece sponsored by the Fund and Georgetown University.
He holds an honorary appointment as an associate at the Center for Middle
Eastern and North African Studies at the University of Michigan. He has
published widely on the Arab and Islamic world with particular attention
to U.S. foreign policy and Christian-Muslim understanding.
[*]Some
of the ideas in this essay are drawn from Antony T. Sullivan, “New Frontiers
in the Ecumenical Jihad Against Terrorism: Terrorism, Jihad, and the Struggle
for New Understandings”—which appeared in the January-February 2003 issue
of The American Muslim, an online journal accessible at www.theamericanmuslim.org—as
well as from Antony T. Sullivan, “The West, Islam, and the Ecumenical Imperative,”
which will appear in a forthcoming book edited by Roger Boase.
The
Rise of Southern Ethnicity
by
David Moltke-Hansen
Southerners
have developed several myths to account for the origins of their common
identity. Three particularly salient myths have generated much scholarly
attention. One is about the region’s leading social classes and ideals.
A second is about the shared antecedents, beliefs, and behaviors of southern
white common folk. A third is about the special fusion of African and European
influences that has distinguished the South’s food, music, religious expressions,
and speech.
The
Cavalier myth and its variants argue that the South’s leadership came from
the gentry or, at least, aspired to the gentlemanly ideal. Plantation
society was built on and expressed this ideal, which helped to distinguish
the South from the North, whose leadership and ideals had different origins
and trajectories. As a result of such legacies and also the “premodern”
culture of most Southerners, honor continued as a principal ideal in the
South, but not in the North. Honor accounts for why southern officers died
at a much greater rate than their men during the Civil War. It also led
Southerners to attack and die in either heroic or appalling numbers, depending
on one’s point of view.
Other
mythmakers and scholars have focused on the common folk, rather than the
elites. Having found that the majority of white Southerners in their own
day share cultural and behavioral characteristics, these writers have asked
whence those commonalities originated. The conviction has been
that the North’s and South’s differences and distinctiveness are rooted
in Europe. There are two variants of the argument:
§ The people who defined southern folk culture came from southern and western England, while the Puritans came from the Midlands and eastern England.
§ Or, more broadly, the great majority of white Southerners came from the Celtic fringes of Great Britain and from Ireland, areas sharing farming practices, community attitudes, and personal habits encountered by many travelers in the South from the early 17th through to the 20th century.
The
mythologizers and scholars who have adopted variants of these folk histories
have tended to focus on the upland South, areas where African Americans
and plantations have not had the centrality they have had in the coastal
plains and lower piedmont. Writers from and on these nether parts of the
South have more often been impressed by the ways people of African and
European origins have influenced one another’s expressive lives, foodways,
and sensibilities. In the accounts reflecting this awareness, the South
is that part of the United States where African-European fusion traditions
came to define a regional culture and so lay the foundation for southern
ethnicity.
The
mythmakers and the scholars only began to develop these origin narratives
for the American South in the second quarter of the 19th century. It was
then that Southerners started to articulate southern identity, to speak
and write of themselves as a people with common interests, culture, and
ideals. In 1825 just a few asserted the identity. By 1845 many shared it.
Outside the region, as well, Southerners had become recognized as a distinct
people.
In
1790 the lyrics to Dixie, composed in the 1850s, would have made
no sense.
I
wish I was in the land of cotton.
Old
times there are not forgotten.
Look
away, look away, Dixieland.
Almost
no cotton grew then in the area later called the American South.
Thirty years later, in 1820, most of the future cotton South had yet to
be brought under the plantation regime. Indeed, as late as 1840 much of
Dixieland had only recently opened to European-American and African-American
settlement. In most of the upland and trans-Appalachian South, even in
1860, the old times were very recent to anyone but Native Americans .
. . .
David
Moltke-Hansen is president of the Historical Society of Pennsylvania. He
is co-author of The Gullah People and Their African Heritage (University
of Georgia Press, 1999).
Improbable
Warriors: Mathematicians Grace Hopper and Mina Rees in World War II
by
Kathleen Broome Williams
Between
1943 and 1945 mathematicians Grace Murray Hopper (1906-1992) and Mina Spiegel
Rees (1902-1997) each held positions from which they influenced the nation’s
ability to wage a modern, math-dependent war. They knew of each other during
the war, and afterward their lives intersected frequently, since they moved
in the same professional circles. These improbable warriors were
among the few women whose war service proved profoundly beneficial to their
later careers.
In
1983 Grace Hopper, then seventy-six years old, was made an admiral by special
appointment of the President of the United States. In 1987 the U.S. Navy
named its new computer center in San Diego for Hopper, and in 1996, four
years after her death, it launched the U.S.S. Hopper, an Arleigh
Burke-class guided missile destroyer. The recipient of numerous
medals, awards, and honorary degrees, Grace Hopper was esteemed as a pioneer
in the field of computing. Admiral Hopper never went to sea, but
her computer expertise and managerial skills made her a pivotal figure
in the Navy's path to the computer age. Even when she retired in 1986—the
oldest serving officer in the Navy—Grace Hopper continued working as a
consultant for Digital Equipment Corporation. She died in 1992 and was
buried with full military honors in Arlington National Cemetery.
At
the heart of all these accomplishments was Hopper’s brilliance in mathematics.
Had it not been for World War II, however, she might never have left the
genteel campus in New York where she was teaching . . . .
Kathleen
Broome Williams is a professor of history at Bronx Community College, CUNY
and the CUNY Graduate School and University Center. She is the author of
Improbable Warriors: Women Scientists and the U.S. Navy in World War
II (Naval Institute Press, 2001), which received the North American
Society for Oceanic History’s 2001 John Lyman Book Award for the best book
in U.S. naval history.
France,
1940: National Failure and the Uses of Defeat
by
David Gordon
The
collapse of France in six weeks was arguably the greatest surprise of World
War II. Few could believe that the rapid collapse of a large, well-equipped
army was the result of military incompetence. Some socialists and communists
blamed capitalism, clericalism, and anti-republican generals. Those newly
triumphant at Vichy faulted anti-clericalism, parliamentary corruption,
and the Marxist parties that had divided the nation. However, all the explanations
about fundamental weaknesses behind the collapse were greatly exaggerated.
France
had not gone to war divided. Nor was France betrayed. As recent literature
demonstrates, France was defeated because her generals made serious mistakes
during the campaign. Robert Doughty put it best when he wrote that the
Germans “outfought the French tactically and outsmarted them strategically.”[1]
The French had moved deep into Belgium early in the 1940 campaign to establish
defensive positions as far from their border as possible. General Maurice
Gamelin’s one audacious gambit, the so-called Breda variant, in which he
decided to move still farther into the Netherlands, depleted his strategic
reserve and fatally weakened the French. When Guderian’s panzers broke
through the Ardennes, there were no strategic reserves left to stop him.
By the time the French were prepared to resist, the battle was already
lost.
Stanley
Hoffmann has famously called Vichy “the revenge of the minorities.” The
Riom trials, begun in February 1942, were part of that revenge. The aim
was to discredit the country’s pre-war Socialist and Radical-Socialist
leadership, as well as Gamelin. But when the political defendants were
able to prove that the mistakes of French strategic planning had been made
long before the creation of the 1936 Popular Front government, the trial
was suspended. The German defeat at Stalingrad a little less than a year
later made debate over 1940 less important. With allied victory a growing
possibility, French interest turned to the nature and extent of collaborationist
guilt.
However,
the signal fact remains that for two years after the armistice every political
group was wedded to some kind of mythologized explanation for the defeat.
No one was willing to see the military failure for what it was. Further,
no one asked why Germany had been allowed to become so strong in the years
prior to 1940 . . . .
David
Gordon is associate professor of history at Bronx Community College, CUNY.
He is working on a study of French business in China and Indo-China from 1920
to 1950.
[1]
Robert Allan Doughty, The Seeds of Disaster: The Development of French
Army Doctrine, 1919-1939 (Archon Books, 1985), 189.
The
Origins of the Historical Society: A Personal View
by
Marc Trachtenberg
What
exactly happened in the American historical profession in the late 20th
century? I can talk at length about how things changed during the period
I’ve been in the profession—that is, the years since I started graduate
school in 1966. I can talk about the sources of my own discontent. I can
talk about the things that led me personally to play a certain role in
the establishment of the Historical Society five years ago. I can, and
will, talk about all these things, but I still don’t really understand
what happened. But I do know how those developments affected me personally.
I’m
a diplomatic historian. The term itself is terribly old-fashioned and,
certainly by the 1980s if not earlier, the whole field of diplomatic history
had come to be viewed as old hat. That field, I think it’s fair to say,
was pushed to the margins of the profession. For that reason, diplomatic
historians like me came to be particularly sensitive to what was going
on within the profession. I certainly became quite disaffected.
What
was so disturbing about what I saw going on around me? Two things, really.
First, I could not believe some of the things people were studying. I could
not believe what was often held up as “cutting edge” historical work. (I
wondered when I heard that phrase: cutting through what? And was “cutting
edge” simply a euphemism for “trendy”?) In any event, those topics struck
me, increasingly, as absolutely trivial. It was hard to imagine that people—indeed,
apparently the majority of the profession—could actually view the kind
of work that was being held up in that way as important. But the
fact is that it was considered important, and that judgment went hand in
hand with a dismissive attitude toward subjects which, in my view at least,
really did matter. It seemed to me obvious that the issues of war and peace
were of fundamental importance, but it was also quite obvious that diplomatic
history, the field devoted to the study of such issues, was frowned upon.
Military history was even more beyond the pale. And political history as
a whole was increasingly regarded as passé.
I was
even more disturbed by what I saw as a growing tendency to treat historical
work as a kind of bludgeon for advancing political agendas. People in fact
became increasingly shameless and overt about using history in this way.
And this, in turn, was linked to a certain tendency to reject the older
standards about proof and evidence, and the old ideals of objectivity and
honesty, which I had absorbed by the time I had finished graduate school.
All
these things seemed to go together. The people working on what I thought
of as trivial topics seemed to be the ones who were most eager to politicize
historical work and to move away from the old standards of historical argumentation.
They also seemed to be the ones who were most interested in pushing fields
like diplomatic history—and to a certain extent even political history
as a whole, not to mention a whole series of other fields—to the margins
of the profession. They talked a lot about “diversity,” but in practice
they certainly did not embrace a live-and-let-live philosophy . . .
.
Marc
Trachtenberg is professor of political science at the University of California,
Los Angeles. His most recent book is A
Constructed Peace: The Making of the European Settlement, 1945-1963 (Princeton
University Press, 1999).