Monday, October 31, 2005

The Cliopatria Awards

The Cliopatriarchs have announced that they are going to launch the first annual Cliopatria Awards for History Blogging. It will be a judged competition in six categories: Best Group Blog, Best Individual Blog, Best New Blog, Best Post, Best Series of Posts, and Best Writing. Here are some more details:
The three committees of judges are:

Best Individual Blog and Best Post
Manan Ahmed, Chairperson
Adam Kotsko
Brian Ulrich

Best New Blog and Best Writing
Jonathan Dresner, Chairperson
Natalie Bennett
Hiram Hover

Best Group Blog and Best Series of Posts
Sharon Howard, Chairperson
Another Damned Medievalist
Timothy Burke

Nominations for the Awards will be open throughout November. You can make nominations here. There you'll find simple guidelines for each award. You may want to use the History Blogroll and/or the History Carnivals to prompt your memory. Judging will take place in December. The winners of the Awards for 2005 will be announced at the Philadelphia convention of the American Historical Association at 9:00 a.m., Saturday 7 January, 2006, and will be posted here at Cliopatria shortly thereafter.

Thursday, October 27, 2005

What is a "Conservative"...or is it "conservative"?

The New Yorker recently devoted a piece (not available online, but here is a snippet) to the little-known historian and poet Peter Viereck, in which the author, Tom Reiss, lauded Viereck as the forgotten founder of the modern American conservative movement.
Viereck became a historian, specializing in modern Russia, and a Pulitzer Prize-winning poet. But, in a series of books published during the late nineteen-forties and early nineteen-fifties (which have recently been reissued by Transaction), he continued to develop his political philosophy. He gave the conservative movement its name and, as the historian George Nash, the author of The Conservative Intellectual Movement in America, says, he "helped make conservatism a respectable word." Moreover, Viereck’s belief that the United States could be a moderating influence, confronting the forces that threaten freedom and democracy without succumbing to liberal optimism, became a central tenet of conservative thought and, with the arrival of neoconservatives in positions of power in Washington, beginning in the nineteen-eighties, of American foreign policy.

Yet Viereck never became a rallying figure. Conservatism remained largely an intellectual movement during its first several decades, from the late nineteen-forties to the late nineteen-seventies—a loose affiliation of scholars and writers who had little more in common than a hatred of liberalism and Communism, which they increasingly saw as indistinguishable. Even in this context, Viereck was an anomaly, insisting on a moral distinction between the moderate and the totalitarian left, and, as conservatives began to attain political influence, denouncing what he perceived as the movement’s demagogic tendencies.
While largely concurring with these points, John J. Miller provides a different angle. Miller also alludes to Nash's comments:
"This was the book which, more than any other of the early postwar era, created the new conservatism as a self-conscious intellectual force," wrote George H. Nash in The Conservative Intellectual Movement in America. "It was this book which boldly used the word 'conservatism' in its title — the first such book after 1945. At least as much as any of his contemporaries, Peter Viereck popularized the term 'conservative' and gave the nascent movement its label."

And so conservatism's naming rights arguably belong to him. Viereck never actually joined the movement, however. When conservatives rallied around Robert A. Taft for president in 1952, in a kind of proto-Goldwater endeavor, Viereck opposed them. He even compared Taft to Robespierre. Two years later, he condemned Joe McCarthy. Then he supported Adlai Stevenson for president. He bought into the liberal academic view espoused by Richard Hofstadter and others that political conservatism was a neurotic form of status anxiety. He spoke of "Midwest hick-Protestant revenge against [the] condescending East" with "the resentment of lower-middle-class Celtic South Boston against Harvard." In 1956, Frank S. Meyer had this to say in National Review: "Viereck is not the first, nor will he be the last, to succeed in passing off his unexceptionably Liberal sentiments as conservatism."

. . . The fundamental weakness of Viereck's conservatism, however, was its disdain of capitalism. In this sense, his brand of conservatism was more aristocratically European than dynamically American. Although Viereck was a strong critic of Communism, he personally preferred a mixed economy to free markets. He once equated "anti-statism" with "plutocracy," and believed the New Deal was worth preserving. Although the early conservatives were an eclectic bunch, their views on capitalism were broadly libertarian and specifically opposed to the New Deal. Viereck may have given conservatism its name, but his achievement was largely semantic. The job of actually defining conservatism fell to the likes of Russell Kirk and Frank Meyer, who quickly eclipsed Viereck.
Miller also believes that the real goal of The New Yorker piece is to put down both William Buckley and President Bush. Thus, some of Miller's skepticism concerning the piece could be attributed to institutional loyalty in the case of the former and political loyalty in the case of the latter. The New Republic's Franklin Foer agrees with many of Miller's points, but thinks that Reiss lauds Viereck for a different, if time-worn, reason:
John J. Miller, however, raises important objections to the New Yorker piece over at National Review Online. Namely, does Viereck really deserve the centrality that the New Yorker accords him? Beyond that, in setting up Buckley as Viereck's foil, the piece neglects Buckley's periodic campaigns against the John Birchers and the lunatic right. Miller accuses the New Yorker of inflating Viereck and caricaturing Buckley to score ideological points. I'm pretty sure that's not the case. More likely, the magazine built up Viereck's importance to justify devoting so many words to him. And, let's face it, every magazine writer has committed this sin before.
I think Foer has a point, but it also seems more than a "happy accident" that boosting Viereck results in disparaging the contributions of Buckley.

Jonah Goldberg provides a good--if a bit too National Review-centric--synopsis of the difference between the historical, small "c" conservatism and the modern, big "C" conservatism, which, ironically, is really derived from liberalism.
“Before the reformation,” wrote Lord Hugh Cecil, “it is impossible to distinguish conservatism in politics, not because there was none, but because there was nothing else.” While lacking in totalitarian technology, European monarchs had a surfeit of totalitarian metaphysics. Everybody believed that the state was there to impose a religious worldview on the whole of society. Dissenters from that worldview didn’t like it. The horrific fights between Catholics and Protestants — not to mention the Inquisition, the expulsions of Jews from various lands, etc — raged for more than a century until finally a few tired folks declared, “let’s call it a draw!” And the compromises inherent to that draw came to be called liberalism. Locke, Hobbes, Smith, Montesquieu, and the gang crafted this neat theory which said the state is formed to protect the interests of individuals. Our rights to life and property exist prior to the state’s right to exist. If the state violates the former it abdicates its claim to the latter.
Finally, some compare the ideals of the contemporary American conservative movement with those of people identified as historical conservatives and claim the former can't truly call themselves conservative. Modern "conservatism" is an appellation attached to a contemporary political movement, whereas the historical use of "conservatism" denoted a class of people who had a vested interest in the status quo. To claim the former are not "real" conservatives because they don't seem to fall in line with the historical conception of conservatism is to play a semantic game.

Tuesday, October 25, 2005

Historic Timewaster

I don't need to be serious all the time. Here's one of those "personality" quizzes that asks: "Which Historic General Are You"? I ended up most like, um, er, Caesar.

"America: Lost in Translation"

Richard Pells offers valuable insight into how real American culture is "lost in translation" by the rest of the world. He also offers a solution.

Monday, October 24, 2005

A Different Kind of Bias in the Classroom?

Not being a professor myself, I sincerely wonder what other professors think about the ethics of students or their parents contributing time or money to a professor's political campaign.
Jennifer Lawless, the Brown University political science professor running against U.S. Rep. Jim Langevin for the 2nd Congressional District Democratic nomination, raised a few eyebrows when Brown's student newspaper, the Brown Daily Herald, reported last week that she had accepted about $5,500 in political contributions from two students and their families.

Lawless quickly decided to return the students' money -- which came to about $1,500 -- but has decided not to return the money from the students' parents. Adam Deitch, Lawless campaign manager, said he has decided there is no 'conflict of interest' in taking political contributions from students' parents, but that the students' money could create the perception of a conflict.

Deitch also said that Lawless is involved in ongoing discussions with Brown administrators concerning allowable student involvement in her campaign. (Source)
I don't ask to stir the pot; I'm just curious what the boundaries are on this one. Could it be compared to a business owner running for office and accepting donations from his employees?

Tuesday, October 18, 2005

Where is the "Liberal Hawk"?

While conservatives are "cracking up" over the Harriet Miers imbroglio, those on the left still have their own problems. To paraphrase the thesis of Drake Bennett's book review in Sunday's Boston Globe: where are the liberal hawks?
Yesterday's referendum on the Iraqi constitution should have been a special triumph for those few liberal thinkers who supported the Iraq War. After all, so-called "liberal hawks," more than their conservative nest mates, have in recent years been the loudest voices for a foreign policy based on human rights and democratic transformation abroad. The last American war, in Kosovo, was a liberal war, led by Bill Clinton and Tony Blair. The Serbian surrender, with the subsequent toppling of Serbian president Slobodan Milosevic, has gone down in history as a victory of military might deployed in the service of liberal humanitarianism. Might not an Iraqi constitution--or at least a constitutional referendum--count as well?

But today the liberal hawks find themselves in a bind. The circumstances in which the country prepared for yesterday's voting--the assassinations, the suicide bombings, the tattered infrastructure--were not exactly what they had in mind. To make matters worse, once it became apparent that Iraq possessed no weapons of mass destruction, the Bush administration took up the liberal arguments as a sort of retroactive casus belli. Thus the liberal hawks find their arguments embraced by an administration they never trusted and whose conduct of the war and occupation they have harshly criticized.
Bennett reviews books by Paul Berman, George Packer, and the upcoming one by Peter Beinart, commenting that "Each thinker, in his way, is trying to salvage something for liberal interventionism out of its ignominious association with Iraq." In particular, Peter Beinart, much like President Bush, is comparing radical Islam to Communism (who did so first? Beinart, I think...).
Beinart's current work looks back. . . to an older, and distinctly American, liberalism. The historical touchstone of Beinart's call to arms (the article that his book grows out of was called "A Fighting Faith," and his book's provisional title is "The Good Fight") is the emergence, in the late 1940s, of what came to be known as the Cold War liberal. Starting in 1947, and led by figures such as Arthur Schlesinger Jr., Reinhold Niebuhr, Walter Reuther, and Eleanor Roosevelt, the Democratic Party, along with liberal organizations like the ACLU and the NAACP and most of the country's major unions, began to adopt a fiercely anticommunist line, distancing themselves from anyone, like the former vice president Henry Wallace, seen as soft on communism. "The health of the democratic left requires the unconditional rejection of totalitarianism," Schlesinger wrote in his 1949 book "The Vital Center."

The parallels to today, Beinart believes, are obvious. . . he argues that the United States faces a new totalitarian threat in al Qaeda, and that liberalism, unlike the American right, has not come to terms with it. "There is little liberal passion," he wrote in his New Republic essay, "to win the struggle against al Qaeda."
Bennett explains that liberal pundit-historian Joshua Micah Marshall has problems with this analogy.
"Unlike communism in 1947," the liberal journalist Joshua Micah Marshall wrote on his widely read blog, "militant Islam simply does not pose an existential threat to our civilization." Marshall has noted that even some hawkish liberals feel Beinart and Berman's analogy between militant Islam and Soviet totalitarianism does more harm than good, because it risks repeating the excesses of Cold War liberalism with much less justification. Beinart's Vital Center liberals, after all, were both handmaidens to McCarthyism and architects of the Vietnam War.
Marshall has more to say, though. In his initial review of Beinart's TNR piece, he focused on the fact that liberals are simply less interested in foreign policy.
The problem is not principally dovishness but rather --- as Peter notes --- that Democrats are by and large simply not sufficiently interested in national security policy, as such. This is at least as much a problem in the Democratic operative world as it is at the grassroots. As I’ve written before, lack of interest in national security policy leads to lack of knowledge. And lack of knowledge leads to tactical and mutable political decisions on national security --- which is both bad on principle but also feeds public perceptions that Democrats aren’t serious about the issue and that they’re not trustworthy guardians of the national security in dangerous times.
To be sure, Marshall's belief that the threat posed by radical Islam is neither as broad nor as obvious as that posed by Communism allows him to give liberals a pass for not having "caught up" to this realization. Nonetheless, it is this uninformed fear of "risk" that, it seems to me, continues to prevail among the left (not necessarily "Democrats"). Many have tried to boil war down to a simplistic syllogism in which every war could be another Vietnam (similarly, the Patriot Act is likened to a caricature of McCarthyism)--especially when a Republican is in the White House. Therein lies the answer to the mystery of the vanishing liberal hawks.

It would seem that these doubts and concerns about the "risks" of U.S. policy are informed not so much by a studied analysis of U.S. History as by political ideology. I don't recall such outrage from the "mainstream left" when President Clinton engaged the U.S. in military ventures throughout the world. (Though Marshall alludes to such a debate. I don't doubt Marshall; it's just that any such debate occurred "under the radar" of an--at the time--casual political and historical observer such as myself.) Nonetheless, what distinguished Bosnia and other Clinton-led ventures was that no obvious benefits accrued to the U.S. We appeared to be acting selflessly, sans "interest." We intervened to stop a genocide that had no real effect on the material well-being of America and to help the Europeans end a conflict in a region that could threaten them but, at least on the surface, not us. In truth, stopping genocide was a materially, though not morally, selfless act. We helped our collective conscience, after all. Ending the conflict was, in fact, in our own best interest: better to stop a small war than enter a big one. So we intervened for humanitarian reasons and to protect our own interests. Both reasons were, and are, perfectly legitimate and recognized as such by people across the political spectrum. So what happened?

The problem that liberal hawks are wrestling with is that, to many of their liberal brethren, an ideologically opposed, and especially an "illegitimate," President simply cannot do anything correctly and will certainly garner no credit when he does. Thus, the effects of American foreign policy, no matter how humanitarian--how liberal--don't matter if those effecting it are one's ideological opponents. As such, the contrast between the liberal reaction to the Balkans and that to Iraq gives us a clue as to what event will help bring the liberal hawks back to prominence. The liberal hawks will fly again when a liberal hawk is "legitimately" elected president--so long as that person doesn't advertise themselves as such until after the primary.

Monday, October 17, 2005

Lincoln, Calhoun and the U.N.'s Dilemma

Michael Brandon McClellan calls upon the pre-Civil War ideologies put forth by Lincoln and Calhoun to advance an interesting theory as to why many Americans seem to reject the U.N.
As was true prior to the Civil War, a legal ideology has emerged that seeks to place primary emphasis upon state equality at the expense of human freedom within the various states. It is plain that the United Nations principle of sovereign equality, like Calhoun's idea of state sovereignty prior to the American Civil War, can lead to the legal protection of the oppressors at the expense of the oppressed. This is necessarily accomplished when equality is emphasized among governments, regardless of their treatment of the individuals they govern.

If Calhoun's idea of "entity equality" is the organizing principle of a legal system, then there is indeed no basis for challenging the internal governance of an autonomous and equal entity. There is no basis upon which to say that a free state is better than a slave state (and thus just in acting to free the slaves), or that a constitutional democracy is better than an absolute despotism (and thus just in removing the despot). In such a framework, what matters is the principle of non-aggression between equal sovereign entities--in the case of Calhoun this meant states, and in the case of the United Nations this means nation states.

Perhaps because Lincoln's ideas have prevailed so emphatically in the United States, it is difficult for Americans to embrace a U.N. system that is moored so securely to the "entity equality" logic of John Calhoun. Just as the moral bankruptcy of Calhoun's political philosophy is so apparent when placed in the natural rights framework of Lincoln, so too is the U.N. framework undermined when viewed in the context of individual human freedom.

To Abraham Lincoln, the value of a government was to be determined not by its sacred sovereignty, but rather by the degree to which it secured the rights of the governed. Lincoln recognized that it is not states, but rather the individuals within states who are the true possessors of rights. Accordingly, even an international legal system cannot remain forever indifferent to the character of the governments it purportedly governs; nor can it remain perpetually neutral between freedom and bondage. The degree to which sovereignty should be afforded states that do not secure the rights of the governed is a question that demands asking now as much as it did in Lincoln's day.
McClellan may be conflating Lincoln's expressed ideology prior to the War with that developed during its course, especially post-Emancipation Proclamation. I'm not a Civil War era expert, but I seem to recall that before the War, Lincoln's argument for maintaining a "United" States wasn't based on a belief that it was hypocritical for slave states to scream for their own rights of sovereign self-determination. Instead, he believed that the U.S. federal government, as outlined by the Constitution, was the ultimate source of power and authority in the land, and that if states could go their own way for any reason, whether it be tariffs or slavery, eventually the Union would dissolve into smaller sections. In the end, Lincoln did publicly come around to regarding the freeing of the slaves as an important goal of the war. Thus, McClellan's theory may be based a bit on a consequentialist view, but it is intriguing nonetheless.

Acephalous hosts History Carnival #18

History Carnival #18 is up, hosted by Acephalous who exhibits the wit and humor we've all come to know.

Thursday, October 13, 2005

Is "Reality TV" Valuable to Historians?

I got sucked up into the original Survivor lo' those many years ago. I was fascinated by the dynamic interplay of individuals as they tried to "Outwit, Outplay, Outlast" in a Lord of the Flies-like setting, and I applauded when "the Snake" (and my home state of Rhode Island's own) Richard Hatch--the original Reality TV Machiavelli--took home the big prize.

At roughly the same time as that first exposure to Reality TV, I had again begun "serious" study by going back to school to pursue an MA in History. As I look back, I wonder why two such seemingly opposite intellectual endeavors--higher education and watching Reality TV--could have both appealed to me at the same time. My first thought is that I had turned to the light fare of Reality TV because it allowed my brain some down time from all of the heavy reading and thinking I was doing at school (on top of that done at work). Now I'm not so sure. Can it be that there is some historical value, or at least some value that historians can take, from watching these shows? Can Flavor Flav teach historians a thing or two?

First, it must be recognized that most (if not all) of the situations engineered into Reality TV shows are innately fictitious and don't necessarily reflect "real life." Some colleges and universities have gone so far as to offer courses centered on analyzing and studying reality television. Others, however, believe that such "reality" is no such thing. In commenting on The Osbournes, Rick Pieto and Kelly Otter observed:
As a popular term, reality TV denotes a variety of shows from Cops to Survivor, from The Bachelor to The Osbournes. The term reality TV implies the documentation of the "reality" of an event or "referent" that somehow, in some way, exists independently of the recording machines that capture the event. Not only does MTV bend the conventions of the fictional genre with its ironic use of opening credits, but it also bends the codes, conventions, and ethics of documentary filmmaking so as to capture a segment of the youth market. This practice efficiently produces an ironic brand of media for a presumed media-savvy (read: young) audience. The footage of police pullovers that is recorded by dashboard-mounted cameras for the reality show Cops, however problematic, more accurately fits the description of reality TV. Programs such as Survivor, The Bachelor, The Real World, and even The Osbournes do not document or observe an independent reality through a camera, as documentary films purport to do; they record the behaviors and activities appropriate to self-consciously constructed situations. As Erica Goode stated in a New York Times article, shows like Survivor, Big Brother, and The Bachelor are direct descendants of the social psychology experiments of the sixties and seventies. The film version of Stanley Milgram's infamous study Obedience to Authority and Philip Zimbardo's 1971 Stanford study provide the generic roots of reality TV. What these texts have in common, from Milgram's study to Big Brother, is the construction of an all-encompassing social situation with compelling rules and rigidly defined roles that influence, in often highly predictable ways, the social actions of the people who are in the situations. What reality TV presents is not the unobtrusive observation of an event that would have existed independently of the camera, but a highly controlled situation that produces a social drama constructed specifically for the camera (or experimenter).
The camera changes how people act. I think most people, at least subconsciously, realize this. However, even with this acknowledged, people are still people, and still can act in unpredictable ways. Elspeth Probyn's perspective very much mirrors my own.
Those who are not fans of the genre often wonder why people watch talkshows or reality TV. My educated guess is that it is something about the staging of real emotion. In the new generation of reality TV there’s also the crude but compelling mixture of team spirit versus rugged individualism. Shows like Survivor give us a vision of human sociality stripped naked. The big draw seems to be that we can watch real, if not very likeable, people do unreal things in order to win. And we get to squeal in outrage about the fact we’re watching it.
That, to me, is the key. Reality TV offers a chance to watch people compete and gives us a better understanding, if often an editorially manipulated one, of how far some people will go to "win." Can any general conclusions about humanity be drawn from such constructed situations? What, if anything, can historians learn about contingency or "free will" by watching these shows? Does it even matter, given that such knowledge is only applicable to these specific arenas and couldn't be applied to our analysis of the past?

There have been some reality shows that have attempted to put people in genuine historical situations. In particular, I recall watching the PBS show Colonial House with both great interest and frustration. The problems with such an endeavor were best summed up by Dennis Cass in his review of the show for Slate.
Colonial House is proof that you can take the man out of the 21st century but you can't take the 21st century out of the man. During Tuesday's episode, for example, cast member Jonathon Allen, a 24-year-old graduate student, tells the colony he is gay. It's an awkward, uncomfortable moment, but not in the way you might think. The weirdness doesn't come from the fact that Allen comes out on television—MTV's The Real World did away with that social taboo ages ago—but from his breaking the rules of Colonial House. "In 1628 I wouldn't even be having this conversation," says Allen, "because the governor would probably put a stop to it and take me out there and kill me." Allen says that he can no longer not be himself—a strange sentiment coming from a man who decided to spend his summer pretending to be a 17th-century indentured servant.

And then there's Gov. Jeff. I never thought I'd be in sympathy with a conservative Baptist minister from Waco, Texas, but Jeff Wyers, playing the colony's governor, seems to be the only person who wants the show to be what it was intended to be. Last night, he attempted to model the colony on the Puritan ideal of a utopian "City on a Hill." But when Gov. Jeff lays down the law—no profanity, women must cover their hair, mandatory church attendance on the Sabbath—almost everyone, in his or her own way, rebels. Saucy indentured servant Paul Hunt keeps swearing up a storm; Mr. and Mrs. Voorhees ditch church to go skinny dipping (!); while freeman Dominic Muir sneaks off to town for a beer and a plate of fries. Implementing historically accurate enforcement measures—wearing scarlet letters, being tied to a wooden stake—proves to be a modern pain in the ass, and work is brought to a near halt. Because the colony is expected to be financially viable (project rules dictate that the cast pay off imaginary "investors"), Gov. Jeff capitulates, proving that at least he's well-versed in that most modern of religions—expediency.

Colonial House is by no means a bad show. On the contrary: It's painstakingly researched, beautifully photographed, and it effectively debunks myths about the colonists as a bunch of dour, buckle-shoed squares. The viewer comes away with a good sense of how arduous life was for early settlers, and somewhere in there is buried a message about the challenges of balancing individual freedom with the individual's responsibility to his or her community. But whether Colonial House provides a true flavor of life in early America, I can't be sure. The next time someone hopes to capture how it truly felt to live in 1628, I hope they hire actors.
In the end, there was some real historical knowledge to be gained by watching Colonial House, but that value derived more from the production values and accuracy of detail than from observing the way the participants acted in a colonial setting. Perhaps the next such endeavor will strive for stricter controls, even if it won't be using actors.

Ultimately, I see some potential value in a more strictly controlled scenario similar to that of Colonial House. I also think there is some limited value in watching more contemporary shows to study how different people act in pressure situations when their own self-interest is endangered. Perhaps, just as with sports, competition in the "reality" genre can tell us something about being human. As such, I think one lesson learned is that most people will do anything to win in the controlled and manipulated settings of Reality TV. Whether such a lesson can help historians in their study of the past, minus the cameras, seems doubtful.

One More Crack At Dan Brown

OK, I promise, I'll never mention The Da Vinci Code again, but Robert Sheaffer at Skeptic has offered a comprehensive and annotated rebuttal of Dan Brown's pop-culture blockbuster. His conclusion:
The rational person can enjoy reading works of fiction or science fiction, even trite fiction like The Da Vinci Code, without worrying excessively about obvious absurdities within the story. But a problem arises when a work of fiction explicitly claims to be more than a work of fiction, when it resonates with other widespread misinformation within the culture, and when 25 million readers are bamboozled by its specious assertions. The alleged “facts” in The Da Vinci Code are no more credible than those in Holy Blood, Holy Grail, from which they are taken. If you think of yourself as a skeptic you’ll do well to realize that very little fact is mixed in with Brown’s fiction. [link to Holy Blood, Holy Grail article at Wikipedia not in original]
Hopefully, by now, many "regular readers" have realized that there is very little history in Brown's book. However, given mankind's history of believing in conspiracies, I doubt it!

About that Che Guevara T-shirt thing

Perhaps I'm making too much of a popular t-shirt image, and though I've never really given Che Guevara much considered thought, what I did know of him made it seem odd that so many people found him, well, "cool." According to Alvaro Vargas Llosa, many of those running around wearing Che's image plastered across their chests may not really know exactly who he was either. Vargas Llosa recently encountered a Che-t-shirt-wearing individual, asked him what he admired about Guevara. . . and then proceeded to refute the claims. Like I said, I'm no expert on Guevara, but to tout him as some sort of noble figure seems rather simplistic. Then again, those who do so are probably more enamored with his "counter-culture" mojo than anything else. After all, he was a cult figure in the '60s when the Vietnam War was going on; now we have Iraq, etc. Fill in the blanks.

Wednesday, October 12, 2005

The Franco-American Empire that Almost Was

Douglas Glover offers a positive review of Philip Marchand's Ghost Empire: How the French Almost Conquered North America. In a bit of French-Canadian and Catholic bias, I offer you this snippet:
He takes time to resurrect the Jesuits, especially the missionaries to the Huron, pointing out that, wrong-headed as they might have been in some ways, they far exceeded any other missionary group (especially Protestant Americans) in their ability to empathize with the natives. Their letters are our best source for what 17th-century native Canadians were like and had a profound influence on European thought from Voltaire (who wrote a novella set among the Huron) to Rousseau and his Noble Savage.

He reminisces about his boyhood rambles in the Massachusetts woods, notes with irony the fact that 40 per cent of the Catholic priests in the St. Ignace area are now Indians (from India), explains the shortcomings of the 17th-century French economic system, mounts a terse refutation of Marx (a modern and an enemy of religion) and then takes a quick stab at Michel Foucault and postmodern historians. But his tone is always amiably ironic; he wears his erudition lightly.

Marchand eschews political correctness, taking a swipe at native Americans for the unecological practice of running bison herds over cliffs. But then he digresses on how the Iroquois influenced Friedrich Engels's theories of the family. He is quite clear that native Americans took it in the neck through disease, war, displacement and economic disruption. But again, his point of view allows a wry observation. The fur trade brought the Iroquois sudden wealth but eroded their industrial base. Marchand says, "Perhaps the situation of the United States, in this regard, was not so different from the Great Lakes Indians four hundred years ago."
Sounds like a book I'd find worth reading. It was my interest in my French-Canadian roots that led me back to my love of History and set me on the path towards getting an MA. (I still hold out hope for going for the PhD, but that'll have to wait for the kids to grow up...or the lottery!)

Tuesday, October 11, 2005

Will there be "Freedom from Fear"?

Franklin Roosevelt's "Four Freedoms" included "Freedom from Fear."
In the future days, which we seek to make secure, we look forward to a world founded upon four essential human freedoms.

The first is freedom of speech and expression--everywhere in the world.

The second is freedom of every person to worship God in his own way--everywhere in the world.

The third is freedom from want--which, translated into world terms, means economic understandings which will secure to every nation a healthy peacetime life for its inhabitants-everywhere in the world.

The fourth is freedom from fear--which, translated into world terms, means a world-wide reduction of armaments to such a point and in such a thorough fashion that no nation will be in a position to commit an act of physical aggression against any neighbor--anywhere in the world.
Frank Furedi thinks we've managed to broaden the definition of "fear" and, in so doing, drawn its shackles even tighter than in Roosevelt's day.
One of the distinguishing features of fear today is that it appears to have an independent existence. It is frequently cited as a problem that exists in its own right, disassociated from any specific object. Classically, societies associate fear with a clearly formulated threat - the fear of plague or the fear of hunger. In such formulations, the threat was defined as the object of such fears: the problem was death, illness or hunger. Today, we frequently represent the act of fearing as a threat itself. A striking illustration of this development is the fear of crime. Today, fear of crime is conceptualised as a serious problem that is to some extent distinct from the problem of crime. That is why politicians and police forces often appear to be more concerned about reducing the public's fear of crime than reducing crime itself.

Yet the emergence of the fear of crime as a problem in its own right cannot be understood as simply a response to the breakdown of law and order. It is important to note that fear as a discrete stand-alone problem is not confined to the problem of crime. The fear of terrorism is also treated as a problem that is independent of, and distinct from, the actual physical threat faced by people in society. That is why so many of the measures undertaken in the name of fighting terrorism are actually oriented towards managing the public's fear of this phenomenon.
What to do?
Although the politics of fear reflects a wider cultural mood, it did not emerge spontaneously. Fear has been consciously politicised. Throughout history fear has been deployed as a political weapon by the ruling elites. Machiavelli's advice to rulers that they will find 'greater security in being feared than in being loved' has been heeded by successive generations of authoritarian governments. Fear can be employed to coerce and terrorise and to maintain public order. Through provoking a common reaction to a perceived threat it can also provide focus for gaining consensus and unity.

Today, the objective of the politics of fear is to gain consensus and to forge a measure of unity around an otherwise disconnected elite. But whatever the intentions of its authors, its main effect is to enforce the idea that there is no alternative.

The promotion of fear is not confined to right-wing hawks banging on the war drums. Fear has turned into a perspective that citizens share across the political divide. Indeed, the main distinguishing feature of different parties and movements is what they fear the most: the degradation of the environment, irresponsible corporations, immigrants, paedophiles, crimes, global warming, or weapons of mass destruction. . .

In one sense, the term politics of fear is a misnomer. Although promoted by parties and advocacy groups, it expresses the renunciation of politics. Unlike the politics of fear pursued by authoritarian regimes and dictatorships, today's politics of fear has no clearly focused objective other than to express claims in a language that enjoys a wider cultural resonance. The distinct feature of our time is not the cultivation of fear but the cultivation of our sense of vulnerability. While it lacks a clearly formulated objective, the cumulative impact of the politics of fear is to reinforce society's consciousness of vulnerability. And the more powerless we feel the more we are likely to find it difficult to resist the siren call of fear.

The precondition for effectively countering the politics of fear is to challenge the association of personhood with the state of vulnerability. Anxieties about uncertainty become magnified and overwhelm us when we regard ourselves as essentially vulnerable. Yet the human imagination possesses a formidable capacity to engage and learn from the risks it faces. Throughout history humanity has learned from its setbacks and losses and has developed ways of systematically identifying, evaluating, selecting and implementing options for reducing risks.

There is always an alternative. Whether or not we are aware of the choices confronting us depends upon whether we regard ourselves as defined by our vulnerability or by our capacity to be resilient.
Roosevelt boldly declared that we could free ourselves of fear as he defined it. He believed in American resiliency, to use Furedi's term, and I doubt he would have allowed himself or the U.S. to be "defined by our vulnerability."

"Blood Feud"

Before reading "Blood Feud" by Brendan I. Koerner, I had vaguely remembered that the Cherokee had held slaves. What I didn't know was that they had freed them and made them full citizens of the tribe.
The Cherokee kept black slaves until 1866, when an emancipation treaty freed them from bondage and granted them full tribal citizenship. Known as the Freedmen, these men and women were embraced by the Cherokee as equals, and often married the offspring of their former masters. . . they identified with local cultures, spoke tribal languages, and took part in tribal religious rites.
Now, however, it looks like the Cherokees aren't quite as keen on recognizing the progeny of these Freedmen. Why?
Leslie Ross has been denied citizenship in the tribe on the grounds that he is not truly Indian. "They said I don't have any Indian blood. They say blacks have never had a part in the Cherokee Nation," says Ross, his usually calm voice swelling with anger. "The thing is, there wouldn't be a Cherokee Nation if it weren't for my great-grandfather. Jesus, he was more Indian than the Indians!" [According to the piece, Leslie Ross is a descendant of Stick Ross, who "is thought to be the illegitimate grandson of Chief John Ross, who led the tribe along the Trail of Tears."]

Ross is just one of at least 25,000 direct descendants of Freedmen who cannot join Oklahoma's largest tribes. Once paragons of racial inclusion and assimilation, the Native American sovereign nations have done an about-face and systematically pushed out people of African descent. "There's never been any stigma about intermarriage," says Stu Phillips, editor of The Seminole Producer, a local newspaper in central Oklahoma. "You've got Indians marrying whites, Indians marrying blacks. It was never a problem until they got some money."
Where did the money come from? Casinos.

"Celebrating wrong Italian?"

In his syndicated column, James C. Bennett explains that there were actually three initial pathways, or "streams," to the New World:
one was the Spanish movement to the Caribbean, Mexico, Peru, and ultimately other areas, stemming from Columbus's voyage; another was the Portuguese movement to Brazil, which was intimately linked to their explorations of Africa predating Columbus; and the third was the stream of peoples from the British Isles and ultimately elsewhere to North America to found the nations of the North American Anglosphere. These three distinct streams founded the three principal cultural-linguistic communities of the Americas.

Although Cabot's voyage to Newfoundland was undoubtedly spurred by news of Columbus's voyages, the expanding English maritime enterprise would sooner or later have recapitulated the Viking achievements in the North Atlantic. There are interesting conjectures about prior voyages from the British Isles to North America before Columbus, from Bristol fishermen working the Grand Banks (not unlikely) to other, more speculative theories, such as Farley Mowat's ideas about voyagers from the Orkneys preceding the Vikings in the Dark Ages.

Whatever the realities of these theories, it is the expansion of the cultures and traditions that form the template on which today's societies in the U.S. and English Canada are built that we should commemorate. Columbus, whatever his merits and demerits may be, is in this regard beside the point.
He provides some historiography on how Columbus gained preeminence in America, too.

The NY Times Digs Hanson's Latest

William Grimes' review of Victor Davis Hanson's new book begins:
What the First World War was for Europe, the Peloponnesian War was for the ancient Greeks. It was also their Napoleonic Wars and their American Civil War. The protracted, ruinous conflict between Athens and Sparta, which dragged on for nearly 30 years (431 B.C. to 404 B.C.) prefigured, in one way or another, nearly every major conflict to come, right up to the present war on terror.

The "war like no other," as Thucydides called it, continues to fascinate because it always seems pertinent, and never more so than in Victor Davis Hanson's highly original, strikingly contemporary retelling of the superpower confrontation he calls "a colossal absurdity."

In his capable hands, the past, more often than not, seems almost painfully present. Thucydides, the great historian of the war, is described as a kind of embedded reporter. The Athenians, relying on local populations under Spartan rule to greet them as liberators, never encountered quite the enthusiasm they anticipated, and the imperial assumptions behind "Athenianism," which Mr. Hanson calls "the Western world's first example of globalization," suggest uncomfortable comparisons. Like the Athenians, he writes, Americans are "all-powerful, but insecure, professedly pacifist yet nearly always in some sort of conflict, often more desirous of being liked than being respected, and proud of our arts and letters even as we are more adept at war."

Mr. Hanson, whose books on classical warfare include "The Western Way of War" and "The Wars of the Ancient Greeks," does not harp on this theme. He directs most of his attention to the war itself, the way it was fought and the profound changes in methods and psychology that took place over time.

Friday, October 07, 2005

President Bush's Speech on Terror to the National Endowment for Democracy

I mentioned President Bush's Speech on the War on Terror to the National Endowment for Democracy below. It has gained some praise from around the web. Jonathan Dresner has also started a much needed dialogue for us historians over at Cliopatria. It's about time we stop and take stock. With elections a year behind us and in front of us, maybe we can all take a step back from the ideological lines we tend to toe and give the war on terror a fresh look. As Jonathan says, "There are good arguments against terror, against Islamism, for putting real effort (including patience) into Iraqi success. There are good arguments for multilateralism, for multiculturalism, for imperial restraint. Honestly, there's a middle ground here, somewhere." Yes, and part of the analysis should be to take a look back and not only see which arguments have been proven wrong, but which have been proven right.

Conservative Revolutionary Mothers?

Ellen Hartigan-O'Connor explains in her review of Carol Berkin's Revolutionary Mothers: Women in the Struggle for America’s Independence:
. . . Berkin focuses on wartime realities, not the ideological causes or implications of the American Revolution. After a succinct summary of the prevailing mid-eighteenth-century view of white women as first and foremost men’s helpmates, she explores how these women struggled to survive wartime scarcity, murderous home-front fighting, the grim realities of life with the army, relocation and abandonment. Politics are secondary because, in Berkin’s view, although the Revolution may have eventually inspired movements for white women’s and African Americans’ rights, the experience of war bred short-term social conservatism. The 'Daughters of Liberty' nobly arose only to sit back down again. 'No sweeping social revolution followed in the wake of the political revolution; indeed, like women and men after many wars, white Americans seemed more eager to return to the life that had been disrupted than to create a new one' (x). The transformations of the war, she argues, tended to be personal and often temporary.

Colonial Animals Bring out Witty Historical Writing

When historians plant tongue in cheek, they deserve recognition. This is how Gregory Nobles opens his review ("Bovine Invaders, Porcine Imperialists" HA!) of Virginia DeJohn Anderson's Creatures of Empire: How Domestic Animals Transformed Early America:
Did cattle cause King Philip's War? Might swine give new meaning to the term Bacon's Rebellion? Could dumb brutes exert agency in shaping human history? The answer to all three questions is "Yes—sort of." And yes, there are many more, and probably better, questions to emerge from this smart and fascinating study of the role of farm animals in seventeenth-century American society. With a clear sense of where she's going and how to get there, Virginia DeJohn Anderson skillfully shepherds us through a familiar time and territory that we thought had already been grazed over far too many times, leading us into greener intellectual pastures, which give us plenty of fresh ideas to chew on.
It got my attention and a couple of guffaws along the way. Just as funny (and startling!) was this excerpt from Anderson's book:
Anderson notes that seventeenth-century England, with a human population of just over five million, also contained an estimated four million cattle, twelve million sheep, and two million pigs. Indeed, she also cites a contemporary reckoning that "the ideal husbandman spent far more time each day with his livestock than with his wife and children—as much as 14 of 17 waking hours" (85). Such familiarity could breed something other than contempt, and some farmers "developed sentimental ties with their animals that seemed to match in emotional intensity their connections to relatives and friends" (91). This emotional intensity occasionally led to licentious excess, and bestiality became both a concern and a capital crime on both sides of the Atlantic, especially in Puritan New England, where four men suffered execution between 1640 and 1647.
Like I said, this is the kind of writing that makes history fun.

Thursday, October 06, 2005

A Progressive on the War on Terror

Sasha Abramsky rebuts his fellow leftists' knee-jerk opposition to President Bush and the War on Terror and calls for Progressives to work to actively formulate a better plan than "pull out of Iraq and everything will be fine." This blind spot could render them politically irrelevant in the future.
They assume that groups like al-Qaida are almost entirely reactive, responding to western policies and actions, rather than being pro-active creatures with a virulent homegrown agenda, one not just of defence but of conquest, destruction of rivals, and, ultimately and at its most megalomaniacal, absolute subjugation.

It misses the central point: that, unlike traditional “third-world” liberation movements looking for a bit of peace and quiet in which to nurture embryonic states, al-Qaida is classically imperialist, looking to subvert established social orders and to replace the cultural and institutional infrastructure of its enemies with a (divinely inspired) hierarchical autocracy of its own, looking to craft the next chapter of human history in its own image.

Simply blaming the never quite defined, yet implicitly all-powerful “west” for the ills of the world doesn’t explain why al-Qaida slaughtered thousands of Americans eighteen months before Saddam was overthrown. Nor does it explain the psychopathic joy this death cult takes in mass killings and in ritualistic, snuff-movie-style beheadings. The term “collateral damage” may be inept, but it at least suggests that the killing of civilians in pursuit of a state’s war aims is unintentional, regrettable; there is nothing unintentional, there is no regret, in the targeting of civilians by al-Qaida’s bombers.

Moreover, many of those who reflexively blame the west do not honestly hold up a mirror to the rest of the world, including the Muslim world, and the racism and sexism and anti-semitism that is rife in many parts of it. If bigotry were indeed the exclusive preserve of the west, their arguments would have greater moral force. But given the fundamentalist prejudices that are so much a part of bin Ladenism, the cry of western racism is a long way from being a case-closer.

We should attend to the way bin Laden and his followers invoke “the west.” They do so alternately to describe any expansive and domineering “first world” economic and political system and, even more ominously, to demarcate a set of ostensibly decadent liberal political, cultural, social, and religious beliefs and practices.

Indeed, what al-Qaida apparently hates most about “the west” are its best points: the pluralism, the rationalism, individual liberty, the emancipation of women, the openness and social dynamism that represent the strongest legacy of the Enlightenment. These values stand in counterpoint to the tyrannical social code idealised by al-Qaida and by related political groupings such as Afghanistan’s Taliban.

In that sense, “the west” denotes less a geographical space than a mindset: a cultural presence or a sphere of anti-absolutist ideas that the Viennese-born philosopher Karl Popper termed the “open society.” In his day, when fascists and Stalinists held vast parts of the globe, the concept of “the west” prevailed over a smaller territory than today. But with the rise of bin Ladenism, the prevalence of this concept again is shrinking.

It is because bin Ladenism is waging war against the liberal ideal that much of the activist left’s response to 11 September 2001 and the London attacks is woefully, catastrophically inadequate. For we, as progressives, need to uphold the values of pluralism, rationalism, scepticism, women’s rights, and individual liberty and oppose ideologies and movements whose foundations rest on theocracy, obscurantism, misogyny, anti-Semitism, and nostalgia for a lost empire.
So what's the difference between current policy and what he proposes?
There are progressive alternatives on the table. In 2004, for example, the New York Academy of Medicine published a report, Redefining Readiness: Terrorism Planning Through the Eyes of the Public, arguing that encouraging a greater public involvement in pre-emptively preparing for large-scale terrorist attacks would actually prove more effective than simply relying on opaque instructions handed out by secret government agencies in the event of a civic emergency.

Others, like Senator Jon Corzine of New Jersey, have stressed the need to safeguard our chemical and nuclear plants, which Bush has refused to do, lest it cost companies money. And John Kerry dwelled on the importance of rounding up the “loose nukes” in the former Soviet Union, a programme Bush has woefully underfunded. . .

In terms of laws to tackle terrorism, instead of activists denouncing any and all special legal powers granted courts and governments in this fight, how about acknowledging that organised terrorism does pose certain tricky legal questions and, from there, attempt to craft responses that, unlike those proposed by the right, don’t result in the creation of legal black holes for terrorism suspects? How about, for example, recognising that in wartime there might be legitimate grounds for pre-emptively detaining a person for a prescribed and limited period of time on a suspicion of plotting a major attack, while still denouncing the notion that such a person doesn’t have the right to an attorney or to a speedy trial?

Conservatives have made the war on terror all about military power and homeland security operations, while rarely addressing global economic inequalities and social injustices. The left’s challenge is to not produce an exact mirror image of this; that is too easy. We always denounce economic inequalities and social injustices. And we’re right to do so. But today that’s not enough.
These are good, broad points, and yes the devil is in the details, but more self-described progressives should take such an approach--indeed, shoulder such responsibility--instead of viewing everything through the prism of whether or not this or that benefits their own ideological position.

UPDATE: Interestingly, many of Abramsky's arguments could have been made by President Bush. Maybe the President read Abramsky's piece before giving the aforelinked speech?

Wednesday, October 05, 2005

So, How Good Were the Old Days?

Providence Journal columnist Edward Achorn recently wrote a piece in which he both complimented historians and attempted to remind people that the "good old days" weren't always so.
Among the great unsung heroes of our culture are surely our history teachers: those who make the past compelling and meaningful. Only history makes sense of the present, giving us the perspective we need to live happy and fulfilling lives -- and avoid falling prey to demagogic politicians.

Too many Americans stumble around in the dark with a weak flashlight, capable of illuminating only a narrow circle of their immediate surroundings. History throws open the curtains, and lets the light flood in, so that we may grasp the true dimensions of the room.

These thoughts come to mind whenever I see the politicians or some of the media tearing down America. I know of people who, having watched too much cable TV news in recent weeks, think our country is worse off than it has ever been -- economically, culturally, politically, environmentally, militarily. Some say they are terrified of the future.

That's very human behavior. We tend to believe our experiences are unprecedented -- that no one loved, or suffered, or dreamed, as we do.
He then listed a litany of contemporary problems--hurricanes, poverty, disease, the economy, etc.--that many feel are worse than ever while he contends that history shows the opposite.
I'd urge anyone who imagines American life is harder and more pressure-packed today than ever to pick up a copy of The Good Old Days -- They Were Terrible!, by Otto L. Bettman. The author points out that Americans in the late 19th Century contended with a life expectancy in the 40s; malaria, cholera and polio; the drudgery and acute loneliness of farm life; filthy surgery and bad health care; brutal oppression of minorities and many women; foul and toxic air; maddening traffic; fire-trap tenements; government corruption on a scale that puts today's Louisiana or Rhode Island to shame; rancid food and adulterated milk; garbage- and feces-strewn streets; deadly train and steamboat travel; and deeply corrupt professional baseball (okay, maybe everything hasn't changed).

This does not mean today's America cannot do better, or that other ways of fighting terrorism shouldn't be considered, or that we should ever stop working to help lift up those who are less fortunate than we. (For one thing, what else would columnists write about?)

But, amidst the tragedies and troubles swirling around us in this imperfect world, we should reflect on the fact that -- from the standpoint of living long, rewarding and healthy lives in freedom -- today's Americans are astoundingly fortunate in their choice of when to be born.
It's a useful reminder, and one that Achorn is not alone in trumpeting. Michael McNeil goes deeper into the historical record to make much the same case, though he fixes "liberals" firmly within his sights.
The fact is that the modern industrial age, in combination with the scientific revolution, and organized along the lines of the modern American model of society (which has now been transferred, more or less, to many another country around the world) has created the only instance in history where the bulk of the population of affected areas can enjoy a life of ordinary (what we think of as “common” nowadays), healthy, leisured, literate, decency. [emphasis in original]
McNeil offers up some historical (sourced!) evidence in a piece worth reading. We are all guilty of looking back at the old days romantically, in one manner or another. It would serve us well to heed reminders such as those offered by Achorn and McNeil before we harken back, rose-colored glasses firmly in place, to a glorified past when all was right. That does not mean we should broadly assume that everything before was bad, though. Rather, we should assume that the common baseline, the water level of progress, now rests at a higher mark than it has in the past. There are surges and swells--and even whirlpools and water spouts--but the natural level the water seeks continues to rise over time.