Tuesday, May 31, 2005
Hell Hath No Fury Like a Woman with a Sword
An article on the historical swordplay group that meets in Worcester's Higgins Armory Museum had this interesting and gender-busting bit.
The sword guild isn't just about boys and their toys. Female fighters, while rare, can be found in the earliest surviving fight manual, the "Walpurgis Fechtbuch," which was written primarily in Latin in central Germany roughly 700 years ago. In that manual, a swordswoman named Walpurgis is depicted fighting a male opponent, and references in subsequent texts show women participating in judicial duels against men, Forgeng points out.
In fact, there's nothing in fencing "that inherently favors the male over the female," Pugliese says. "It helps to be stronger, but I can find you any number of women who are stronger than I am."
"I can't tell you how many times I've been at events where people say, 'Well, women didn't use swords,' and here we have evidence just blowing that out of the water," says Holly Hunt, a Shrewsbury resident who is one of four long-term women members among the Guild's 20 regulars. She says that women historically studied fencing to defend themselves.
Makes sense to me.
Thursday, May 26, 2005
Barbed Wire: Tool of Violence and Oppression?
Edward N. Luttwak, who owns a cattle ranch in Bolivia, reviewed Barbed Wire by Reviel Netz, summarizing and commenting upon some of the author's central arguments.
For Netz, the raising of cattle is not about producing meat and hides from lands usually too marginal to yield arable crops, but rather an expression of the urge to exercise power: “What is control over animals? This has two senses, a human gain, and an animal deprivation”. To tell the truth, I had never even pondered this grave question, let alone imagined how profound the answer could be. While that is the acquisitive purpose of barbed wire, for Professor Netz it is equally – and perhaps even more – a perversely disinterested expression of the urge to inflict pain, “the simple and unchanging equation of flesh and iron”, another majestic phrase, though I am not sure if equation is quite the right word. But if that is our ulterior motive, then those of us who rely on barbed-wire fencing for our jollies are condemned to be disappointed, because cattle does not keep running into it, suffering bloody injury and pain for us to gloat over, but instead invisibly learns at the youngest age to avoid the barbs by simply staying clear of the fence. Fortunately we still have branding, “a major component of the culture of the West” and of the South too, because in Bolivia we also brand our cattle. Until Netz explained why we do it – to enjoy the pain of “applying the iron until – and well after – the flesh of the animal literally burns”, I had always thought that we brand our cattle because they cannot carry notarized title deeds anymore than they can read off-limits signs. Incidentally, I have never myself encountered a rancher who expensively indulges in the sadistic pleasure of deeply burning the flesh of his own hoofed capital, opening the way for deadly infection; the branding I know is a quick thrust of the hot iron onto the skin, which is not penetrated at all, and no flesh burns.
The only additional commentary I can offer is my previous post.
We finally learn who is really behind all these perversities, when branding is “usefully compared with the Indian correlate”: Euro-American men, of course, as Professor Netz calls us. “Indians marked bison by tail-tying: that is, the tails of killed bison were tied to make a claim to their carcass. Crucially, we see that for the Indians, the bison became property only after its killing.”
We on the other hand commodify cattle “even while alive”. There you have it, and Netz smoothly takes us to the inevitable next step:
“Once again a comparison is called for: we are reminded of the practice of branding runaway slaves, as punishment and as a practical measure of making sure that slaves – that particular kind of commodity – would not revert to their natural free state. In short, in the late 1860s, as Texans finally desisted from the branding of slaves, they applied themselves with ever greater enthusiasm to the branding of cows.”
Texans? Why introduce Texans all of a sudden, instead of cowboys or cattlemen? It seems that for Professor Netz in the epoch of Bush II, Texans are an even more cruel sub-species of the sadistic race of Euro-American men (and it is men, of course).
History and Politics
A year ago, Robert McElvaine did a study and analysis of why so many historians are against George W. Bush. That would have been fine, and its outcome predictable. However, the larger story from a historical perspective was that so many historians had already deemed the Bush presidency a failure, even while it was still in its first term. I'd be interested to see this story updated, given that a year has passed since the heat of the election.
But that really isn't my concern. I think that we historians have to be careful about offering such analysis too close to the history of the given event, so to speak, especially if, say, twenty or fifty years from now most historians come to a different conclusion. Could these "future peers" look at such data and wonder whether a group of historians who were so universally wrong in their too-instant "historical analysis" of a president may also have been wrong in other areas? If ideology so colored the contemporary analysis of so many historians, to what degree did it color their historical analysis?
This will be exacerbated, in my opinion, by the obvious belief held by many current historians that, because they are knowledgeable about the past, they are more qualified than most to comment on the present. I always thought that most historians rejected determinism. Sometimes it seems that we don't if it can support our politics. (Iraq=Vietnam, America=Rome, Bush=Hitler, etc.) I'm not saying that historians should exclude themselves from politics, but perhaps a bit more care should be taken before offering one's own historical credentials as proof of expertise and as incontrovertible support for one's own proclamations.
Thursday, May 19, 2005
Kagan and History as Moral Teacher
After reading Phillip Kennicott's pretty derogatory remarks regarding Donald Kagan's "In Defense of History" (the 34th Annual Jefferson Lecture for the National Endowment for the Humanities), I got the impression that Kennicott was being unnecessarily critical and rather hyperbolic in his criticism. This was primarily because there was no access to the text of the speech and, as such, I could only read the snippets quoted by Kennicott. I also wondered what others would have to say. Well, the wait is over. The speech is now available (see above link) and others (George Will and Suzanne Fields, for instance) have weighed in. Now I realize why Kennicott was so critical. I won't parrot Will or Fields, you can read them yourself, but in essence Kagan championed History as a method by which morality can be taught through its own chronicled examples. This has actually caused a debate among the conservatives at The National Review's "The Corner" blog. Ramesh Ponnuru has taken umbrage at George Will's characterization of Kagan's downplaying of religion's role as a moral guidepost, while Jonah Goldberg asks Ponnuru
But what exactly is your problem with the Will column? I didn't love it, but it seems to me one can derive some moral lessons from the history of humanity. I don't think history will ever convince a large number of atheists that God exists -- or vice versa -- but surely your inner Hayek or Burke sees some use for history in moral argumentation?
To this, Ponnuru responded
Well, sure, I see some use for history. But I wouldn't argue, as Will does, for history at the expense of philosophy, religion, and poetry. You need to bring philosophy to bear on history to generate moral conclusions. Nor would I argue for history over philosophy, as Will does, on the basis of philosophy's capacity to generate disagreement.
I think what Ponnuru is not willing to concede, and what Kagan and Will already believe, is that there are some philosophical truths that have been mutually arrived at that are (seemingly) incontrovertibly juxtaposed. As a result, Kagan and Will have turned to the "practicality" of history instead of the idealistic realm of philosophy that, as Kagan put it, "inevitably leads to metaphysics, the investigation of first principles and the problem of ultimate reality, which over the millennia has led to massive disagreement, no progress, cynicism and rejection." Will and Kagan fail to see anything that can be taken out of the "black box" of modern-day philosophy that can be applied pragmatically. Ponnuru thinks that they mean that philosophy is not as practical because it generates disagreement. I think it would be better to say that the disagreements generated by philosophy are particularly tough to resolve.
But back to the original point. It must be accepted that not all Americans will adhere to a religious basis for their morality. Lacking that, only tradition informed by history can perform a similar function. Thus, to me, it is important to ensure that the ideals held by those who came before us not be characterized as without value because of misplaced revisionism based on the "hypocrisy" of the dead white guys. Inherent in this is that we don't downplay or forget from whence these individuals derived their morality. (Please, no comments on the benefits of revision. I agree; I'm only talking about revision in the sense that it is overplayed for political or polemical reasons.)
Wednesday, May 18, 2005
Interpreting the Past - Slate's History Week
Of interest to all (though they probably don't need a plug from my little site), this week is Slate's History Week. A good place to start would be American History 101, a dialogue between Diane Ravitch of New York University, a history curriculum advisor (she was the primary writer for the California History/Social Science Framework adopted by the State Board of Education in 1988) and author of The Language Police, and Jon Wiener of the University of California, Irvine, a contributing editor of the Nation and author, most recently, of Historians in Trouble: Plagiarism, Fraud, and Politics in the Ivory Tower. Here is Part 2 of their debate, and so far both bring up good points. However, I must note that Wiener seems especially inclined to associate history with contemporary politics. Tsk, tsk.
"Deep Blue Campuses"
The Leadership Institute has produced a new report, "Deep Blue Campuses," that really offers nothing new or earth-shattering regarding campus political leanings:
American campuses are very different from the nation that surrounds them. The differences are especially profound when it comes to politics. The United States is closely divided between the two major parties. No such division exists on any major college campus...
Employees at Harvard University gave John Kerry $25 for every $1 they gave George W. Bush. At Duke University, the ratio stood at $8 to $1. At Princeton University, a $302 to $1 ratio prevails.
The Kerry/Bush split in the number of donations is even more extreme. John Kerry received 257 donations of $200 or more from Stanford, while his opponent got just 28. At Northwestern, Kerry received 100 contributions and Bush six. Georgetown University donations swung 132 to six in Kerry’s favor.
Deep Blue Campuses examines the political donations of employees at the top twenty-five national universities listed in U.S. News and World Report’s 2004 college issue. Specifically, the booklet compares donations in the 2004 election cycle to the two major presidential candidates, George W. Bush and John Kerry.
Although George Bush claimed a bare majority of votes in the actual election, John Kerry trounced him in donations received from colleges and universities. In fact, John Kerry received the lion’s share of donations from workers at all twenty-five schools featured in U.S. News and World Report’s annual survey. At one school (Dartmouth), Kerry posted an infinite advantage: FEC records show 39 donations to Kerry but not a single Dartmouth employee donating to George W. Bush’s campaign.
Now, I don't believe that it necessarily follows that political inclinations manifest themselves to a large degree in the classroom, but, in my experience and others', there are certainly some detectable "shadings" and asides that do crop up from time to time. Academics are almost inherently political and it is naive to believe that these numbers are no big deal. While some sort of "ideology test" would be a violation of the spirit of open debate, perhaps what these reports can accomplish is to give those in "the Academy" pause before diverting too far from the course work or lecture at hand. I'm not just talking about the "liberals" here, either, given the recent reports out of the Air Force Academy that a culture of active evangelical Christian proselytizing, and perhaps even religious discrimination, has sprung up.
While the debate, in the near term, has taken on a predictable ideological bent, the real goal should be to promote and champion the cause of true intellectual freedom on campus. Without real debate on campus, the sharp edge of rhetoric dulls and critical thinking becomes less critical. Perhaps for those permanently ensconced in the Ivory Tower this is nothing of concern, but those whom they ostensibly teach, the students who pay the often-high tuitions that subsidize the academic profession, enter the world unable to critically scrutinize ideas that do and don't align with their own preconceptions. If they have never really been forced to defend their ideas, will they be able to once outside the ivy-covered walls?
Thus, for purely ideological reasons if nothing else (and to approach this from a cynical angle), it would seem that the best way to indoctrinate students and keep them on board once they leave campus would be to offer them real counter-arguments, perhaps even presented by those who genuinely believe them (gasp), against which the students can practice. Yes, perhaps some will actually have their minds changed, but probably more won't. As a result, they will be better prepared to deal with those who don't necessarily march in ideological lock-step with them in the "real world."
Hanson: The Bush Doctrine's Next Test
In "The Bush Doctrine's Next Test," Victor Davis Hanson examines the inconsistency of championing the spread of democracy and maintaining support for autocratic regimes that, to this point, has exemplified the "Bush Doctrine."
The President’s bold plan appears to be based on a model of democratic contagion. We have seen such infectious outbreaks of popular government in Latin America and Eastern Europe, so we know the prognosis is not fanciful. But in the Muslim and Arab Middle East, democracy has no real pedigree and few stalwart proponents. Thus, recalcitrant autocracies will inevitably serve as sanctuaries and strongpoints for those trying to reverse the verdict in an Afghanistan, an Iraq, or a Lebanon; the idea that these same anti-democratic societies are supported by the U.S. is presently embarrassing and eventually unsustainable.
Fortunately, however, the reverse is also true. A metamorphosis of these same dictatorships would help accelerate the demand for democratization elsewhere. Far from representing a distraction in the struggle against current front-line enemies like Iran and Syria, the reformation of Egypt, Pakistan, and Saudi Arabia would only further isolate and enfeeble those states—as William Tecumseh Sherman’s “indirect approach” of weakening the rear of the Confederacy, at a considerably reduced loss of life, helped to bring to a close the frontline bloodshed of northern Virginia, or as Epaminondas the Theban’s freeing of the Messenian helots dismantled the Spartan empire at its very foundations.
Egypt, Pakistan, and Saudi Arabia are not the equivalent of the Soviet Union’s satellite states of Albania, Bulgaria, and Romania. Rather, they are the East Germany, Hungary, and Poland of the unfree Middle East: pivotal nations upon whose fate the entire future of the Bush Doctrine may well hinge.
The whole thing is worth a read and Hanson treats the problems with the Bush Doctrine fairly. He also makes an important point regarding the inevitable criticisms that the BD will endure from those ideologically opposed, on both the Left and Right, to the President's plan.
At home, the Left will continue to score points against the President, citing either the impracticality of his policy when news is bad or, when things seem headed in the right direction, decrying the cultural chauvinism in thinking that Western concepts like democracy can be “privileged” over indigenous forms of rule. The Right will warn against the danger of betraying trusted allies, or of playing into the hands of popular extremists—or of giving an inside track to European and Chinese commercial interests that exhibit no such squeamishness about doing business with tyrants.
Some of these apprehensions are well grounded. Violent upheaval followed by Islamist coups could endanger world commerce well beyond oil, in the choke points of the straits of Hormuz and the Suez Canal. And, as we saw with Arafat on the one hand and the Iranian clerics on the other, plebiscites can indeed become the basis for years of Western appeasement of despotic rule. But a pre-9/11-like stasis is even worse: change is inevitable in any case, and there may be only a brief window to ensure that it is democratic and stays that way, rather than Islamist, reactionary, and/or nuclear.
I can't recall where I heard it, but someone said that perhaps the reason the President has seized upon the ideal of spreading democracy in this manner is that, up until now, nothing else has worked; democracy in the Middle East is just about the only thing that hasn't been tried. Of course the result won't be an American-style federalist system. The cultures and societies of Arab/Islamic nation-states will put their own indelible stamp on their forms of democratic government. Perhaps it is presumptuous of us to believe that it can be done. But we won't know unless we make the attempt.
Friday, May 13, 2005
What did Donald Kagan do to Philip Kennicott?
Yale historian Donald Kagan gave the 34th Annual Jefferson Lecture for the National Endowment for the Humanities, titled "In Defense of History," last night. According to Philip Kennicott of the Washington Post, the speech was mostly "boiler plate" (the title certainly was!). No transcript is available, so it's tough to independently analyze the speech. However, Kennicott believes that much can be read into what Kagan didn't say, but was included in the prepared text of the speech that was presumably handed out to the media. (I don't doubt Kennicott, I'm just trying to lay it all out).
According to Kennicott, Kagan said, "The real importance of historians is in leading the charge against the 'mindlessness promoted by contemporary political partisanship.'" To which Kennicott commented
This is rich, coming from a beloved father figure of the ascendant neoconservative movement, and the co-author of "While America Sleeps," a 2000 book comparing the complacency of Britain after the First World War with the supposed complacency of the United States after the Cold War. Written with his son, Frederick W. Kagan, this book begins with the disturbingly alarmist line, "America is in danger." Prescient words, it might seem, given the events of September 2001. But Kagan's book had almost nothing to say about terrorism. He was stumping for a strong military and for not squandering the peace dividend. The book was very much a traditionalist argument about traditional military power.
No matter. No one with an overarching Gloomy Gus view of the world has ever been completely wrong. Granted, Kagan's book, and his life's work, have contributed to an environment in which fear of vague potential threats often overwhelms sane evaluation of real threats (Kagan went on and on about potential WMDs in Iraq in his book, though they were never found). The neoconservative worldview espoused by Kagan -- despite his protestations about the importance of history standing aside from political partisanship -- is airtight. Pessimists can always count on the accumulating tragedies of history to efface memory of one little mistake.
Moral certainty is today's big intellectual and political fetish. Politicians tell us of certainties big and small, the certain presence of certain dangers that threaten our society, which is certainly the best of all possible societies. Kagan has made a career of this business. Which makes one line left out of the speech (but included in the printed remarks distributed at his talk) fascinating. In his conclusion, he noted that as the power of religion to provide moral truth declined, people needed something else, and so they turned to history for moral object lessons. "History, it seems to me, is the most useful key we have to open the mysteries of the human predicament," read the omitted text. Has the ascendancy of religious fundamentalism among Kagan's allies in the conservative movement made the humanistic certainty of that line unsayable? Has history come full circle, and is the old dinosaur now an accidental outsider?
I'm not familiar with Kagan's work or politics. But it seems that Kennicott is reaching just a bit. I'll be interested to see what others think.
(via Ramesh Ponnuru at NRO).
Jonah Goldberg: "What is a Conservative"?
Jonah Goldberg (via Cincinnati Historian) ponders what exactly it means to be this creature--the conservative--that he (and I) call ourselves. Importantly, he notes that an American conservative is different from conservatives elsewhere:
As I’ve written many times here, part of the problem is that a conservative in America is a liberal in the classical sense — because the institutions conservatives seek to preserve are liberal institutions. This is why Hayek explicitly exempted American conservatism from his essay “Why I am Not a Conservative.” The conservatives he disliked were mostly continental thinkers who liked the marriage of Church and State, hereditary aristocracies, overly clever cheese, and the rest. The conservatives he liked were Burke, the American founders, Locke et al.
After additional ruminations, he concludes that being conservative means having "comfort with contradiction". As he puts it, "Think of any leftish ideology and at its core you will find a faith that circles can be closed, conflicts resolved." He offers Marxism and Freudianism as examples. In contrast, Dewey, calling on Hegel, reasoned that "society could be made whole if we jettisoned dogma and embraced a natural, organic understanding of the society where everyone worked together."
This is a point critics of so-called “theocons” like to make, even if they don’t always fully realize they’re making it. They think the rise of politically conservative religious activists is anti-conservative because it smells anti-liberal. Two conservatives of British descent who’ve been making that case lately are Andrew Sullivan and our own John Derbyshire. I think the fact that they’re British is an important factor. British conservatives, God love ‘em, are typically opponents of all enthusiasms, particularly of the religious and political variety. Personally, I’m very sympathetic to this outlook. . .it seems to me patently obvious that religion and conservatism aren’t necessarily partners. Put it this way, Jesus was no conservative — and there endeth the lesson.
Goldberg also stepped away from the philosophical plane and explained why Liberals and Conservatives (in America) operate from such different places.
Think about why the Left is obsessed with hypocrisy and authenticity. The former is the great evil, the latter the closest we can get to saintliness. Hypocrisy implies a contradiction between the inner and outer selves. That’s a Freudian no-no in and of itself. But even worse, hypocrisy suggests that others are wrong for behaving the way they do. Hypocrites act one way and behave another. Whenever a conservative is exposed as a “hypocrite” the behavior — Limbaugh’s drug use, Bennett’s gambling, whatever — never offends the Left as much as the fact that they were telling other people how to live. This, I think, is in part because of the general hostility the Left has to the idea that we should live in any way that doesn’t “feel” natural. We must all listen to our inner children.
Now look at the arguments of conservatives. They are almost invariably arguments about trade-offs, costs, “the downside” of a measure. As I’ve written before, the first obligation of the conservative is to explain why nine out of ten new ideas are probably bad ones. When feminists pound the table with the heels of their sensible shoes that it is unfair that there are any conflicts between motherhood and career, the inevitable response from conservatives boils down to “You’re right, but life isn’t fair.” Some conservatives may be more eager than others to lessen the unfairness somewhat. But conservatives understand the simple logic that motherhood is more than a fulltime job and that makes holding a second fulltime job very difficult. Feminist liberals understand this logic too, they just don’t want to accept it because they believe that in a just society there would be no such trade-offs.
In the end, Goldberg boils down his conservatism to patriotism:
. . . the devotion to a set of ideals, rooted in history, and attached to a specific place. And once again we are spun back to Hayek. To a certain extent patriotism is conservatism, in the same way that being a Christian involves some level of conservatism. It is a devotion to a set of principles set forth in the past and carried forward to today and, hopefully, tomorrow. (I wish it weren’t necessary to point out that this is a non-partisan point: Patriotic liberals are holding dear some aspects of our past as well.) What we call patriotism is often merely the content we use to fill-up the amoral conservatism discussed above. Axiomatically, if you are unwilling to conserve any of the institutions, customs, traditions, or principles inherent to this country you simply aren’t patriotic. . .
The belief that all good things move together and there need be no conflicts between them is, ultimately, a religious one. And — by definition — a totalitarian one. Mussolini coined that word not to describe a tyrannical society, but a humane society where everyone is taken care of and contributes equally. Mussolini didn’t want to leave any children behind either.
The attempt to bring such utopianism to the here and now is the sin of trying to immanentize the eschaton. . . the rub of my disagreement with Derbyshire (and another Brit, Andrew Stuttaford) and others who are touting the supposed incompatibility of conservative Christianity and political conservatism. Christianity, as I understand it, holds that the perfect world is the next one, not this one. We can do what we can where we can here, but we’re never going to change the fact that we’re fallen, imperfect creatures. There’s also the whole render-unto-Caesar bit. And, of course, the Judeo-Christian tradition assumes we are born in sin, not born perfect before bourgeoisie culture corrupts us into drones for the capitalist state.
In other words, while Christianity may be a complete philosophy of life, it is only at best a partial philosophy of government. When it attempts to be otherwise, it has leapt the rails into an enormous vat of category error. This is one reason why I did not like it when President Bush said his favorite political philosopher was Jesus Christ. I don’t mind at all a president who has a personal relationship with Jesus. It’s just that I don’t think Jesus is going to have useful advice about how to fix Social Security.
Any ideology or outlook that tries to explain what government should do at all times and in all circumstances is un-conservative. Any ideology that sees itself as the answer to any question is un-conservative. Any ideology that promises that if it were fully realized there would be no more problems, no more trade-offs, no more elites, and no more inequality of one kind or another is un-conservative. . .Contrary to all the bloviating jackassery about how conservatives are more dogmatic than liberals we hear these days, the simple fact is that conservatives don’t have a settled dogma. How could they when each faction has a different partial philosophy of life? The beauty of the conservative movement — as Buckley noted in that original essay — is that we all get along with each other pretty well. The chief reason for this is that we all understand and accept the permanence of contradiction and conflict in life.
I guess my larger point is that, as with historical theory, there is no political theory or ideology that can explain all and solve all. Someone will always want to opt out of whatever political utopia a given set of intellectuals or leaders would devise. Adherence to one historical theory is self-limiting and can lead to forming the wrong conclusion. The ideal cannot be met, but that doesn't mean we shouldn't try to discuss theories in the arena of ideas. We just shouldn't be so presumptuous as to believe that we have all the answers, be it in History or in life.
Thursday, May 12, 2005
"Working Class" Disconnect Theories Abound
Slate's Timothy Noah recently joined his liberal brethren in wondering why the Democrats aren't enjoying the support of the "working class" they once did. Noah pointed to the by-now familiar book by Thomas Frank blaming the decline of unions, Tom Edsall's contention that it's a "values gap," and even Arlie Russell Hochschild's "testosterone gap." For his part, Noah plays with the idea that the "working class" suffers from a psychological disorder but then dismisses the whole thing. To all of this, James Taranto offered this observation (emphasis his):
How come it never occurs to liberals or Democrats that the very terms in which they phrase the question are part of their problem? These, after all, are people who are obsessed with politically correct terminology, from "African-American" to "fetus." Yet somehow it never dawns on them that "working class" is an insult.
Think about it: Would you call a janitor, a secretary or a carpenter "working class" to his face? The term connotes putting someone in his place: Your lot in life is to work. Thinking is for the higher classes. The questions the Democrats ask about the "working class" reflect precisely this contempt: What's the matter with these people? Why don't they understand that we know what's good for them? Why do they worry about silly things like abortion and homosexuality? If they must believe in all that religious mumbo-jumbo, can't they keep it to themselves?
Every time the Democrats lose an election, they make a big show of asking questions like these. Then, the next time they lose an election, they once again wonder why the "working class" has forsaken them. Maybe it's as simple as: because they were listening.
The values issue that Taranto brings up is one valid, and probably the primary, reason for the middle class supposedly "not voting their interests" as framed by liberals. Yet, perhaps there is another factor that liberals miss because they take for granted the simple-mindedness of the so-called "working class".
Liberals are partially correct to pay attention to economic interests--Frank thinks they should resurrect the class-warfare model that won for them previously--but they are missing a key component of the more general "American ideology" if they think class warfare alone will work. Whether based on history or "myth", most Americans actually, really and truly do believe in the ideal of the individual making his own way in this nation where opportunity abounds. From this follows the belief that with a good idea, hard work, sacrifice and a little luck, everyone can be one of "the rich." In short, they may be a little jealous of the rich, but not too much: after all, they want to be one someday.
As such, at the risk of taking this too far, it could be reasoned that the "working class" are not "voting against their own economic interest" at all. Instead, perhaps they are showing foresight by sacrificing now for potential benefits later. It might be a stretch, but it's possible.
Regardless of this last, the main fact is that liberals would do well to stop criticizing the motivations of others based on the assumption that these are "ignorant" of their own "self interests." Many Americans consider liberals to be elitist know-it-alls and this perception is only reinforced by the latter's continuous attempts to blame others for the failure of their own liberal ideas. In all actuality, as suggested by Taranto, perhaps it is the liberal Democrats who are the truly disconnected class.
Buchanan Crosses the Line
So, Pat Buchanan wondered "Was World War II worth it?" and is now being properly blasted for his half-historical perception. (The aforementioned blast is tame; check this one or this one out.) Buchanan, who long ago lost his conservative credibility, attempted to play off of President Bush's recent statement that Yalta was a historical mistake that hurt many Eastern European countries, as it helped end their post-WWII self-determination by tacitly legitimizing the Soviet Union as the "steward" of these small, relatively powerless nations. (As we all know, that's an entire debate on its own.) From here, Buchanan has extrapolated that the U.S. shouldn't have gotten into the war at all. It would seem that Buchanan's own ultra-nationalistic xenophobia has finally affected his historical perspective.
When one considers the losses suffered by Britain and France – hundreds of thousands dead, destitution, bankruptcy, the end of the empires – was World War II worth it, considering that Poland and all the other nations east of the Elbe were lost anyway?
If the objective of the West was the destruction of Nazi Germany, it was a "smashing" success. But why destroy Hitler? If to liberate Germans, it was not worth it. After all, the Germans voted Hitler in.
If it was to keep Hitler out of Western Europe, why declare war on him and draw him into Western Europe? If it was to keep Hitler out of Central and Eastern Europe, then, inevitably, Stalin would inherit Central and Eastern Europe.
Was that worth fighting a world war – with 50 million dead?
The war Britain and France declared to defend Polish freedom ended up making Poland and all of Eastern and Central Europe safe for Stalinism.
Buchanan's view of the "real" historical motivation is questionable at best. Nonetheless, even if his interpretation were correct, we cannot judge history strictly on whether the initial goals were achieved, or even addressed, can we? We need to also take into account the results of the action, regardless of original motivation. Thus, when Buchanan asks, "Was that worth fighting a world war – with 50 million dead?", he fails to even acknowledge the inherently evil Nazi regime and the multitude of atrocities it committed. Without the intervention of Britain, France, and the U.S. et al., for whatever reason, how many Jews, Gypsies, and other non-Aryan "undesirables" would have been murdered? It is appropriate to show how historical actions, undertaken for one reason or another, often miss their initial goals. Buchanan does this, but the so-called rule of unintended consequences can result in good just as well as ill. In his piece, Buchanan ignores these positives in an attempt to buttress his contemporary isolationist nationalism with a historical half-truth.
Wednesday, May 11, 2005
Woods' Book Is Mainly an "Incorrect Guide to American History"
Writing for Claremont, John B. Kienker explains why conservatives should not embrace Thomas E. Woods's The Politically Incorrect Guide to American History as some sort of reasonable "untold" history. Yes, many things in it are correct, but there is a disturbing undercurrent.
At its core, The Politically Incorrect Guide to American History is just more wheezy propaganda from the Old Confederacy (the book's cover features a scowling Dixie general). "Thomas E. Woods, Ph.D.," as he refers to himself, a professor of history at the Suffolk County Community College in New York, rehearses all the familiar fictions: the "States had the right to secede," the so-called Civil War was really a "War of Northern Aggression," Abraham Lincoln was probably a racist and only "fought to 'save the Union'… and consolidate its power."
Granted, Abraham Lincoln wanted to save the Union, the Union that the American Founders had established, dedicated to the proposition of human equality and constitutional majority rule. When he was elected president, a minority of citizens refused to abide the results of his legitimate election. But they were in a bind. They hadn't suffered a long train of abuses as their founding forefathers had, and what's more, they couldn't invoke the laws of nature and of nature's God because they were seeking to strengthen and perpetuate a slave system that made a mockery of natural rights.
And so they denounced the central principle of the American Founding as a "self-evident lie" and invented a supposedly lawful "right" to secession—a constitutional right to overthrow the Constitution! This is absurd on its face. But not to Thomas E. Woods, Ph.D. He says that states like Virginia and Rhode Island actually reserved the right of secession when they ratified the U.S. Constitution. Though he admits that "[s]ome scholars have tried to argue that Virginia was simply setting forth the right to start a revolution," he finds this interpretation "untenable." Of course, the ratifying documents of those states make no mention of secession but do speak of "certain natural rights." The Constitution itself never condones secession, though it does insist that "No State shall enter into any Treaty, Alliance, or Confederation," and "No State shall, without the Consent of Congress...enter into any Agreement or Compact with another State."
Unable to distinguish between secession and the right of revolution, Woods blithely reproduces quotes from Thomas Jefferson and even Abraham Lincoln that refer to the latter, not the former. In discussing the nullification crisis that was the dress rehearsal for Southern secession, Woods claims that "nullification isn't as crazy as it sounds." James Madison was still alive at that time, and publicly affirmed that the Constitution "was formed, not by the Governments of the component States" and "cannot be altered or annulled at the will of the States individually." Woods suggests Madison's thought lacked "coherency." But then Madison wasn't a Ph.D. like Woods.
It's no surprise to learn that Woods is a founding member of the League of the South, which officially declares: "The people of the South must come to understand that they indeed are a 'nation,'" and may resort to secession if their demands are not met.
Though debunking him is fun, what's really at stake is the conservative movement's respectability and honor. As conservatives, we embarrass ourselves when we promote sloppy scholarship. We disgrace ourselves when we promote books, like PIG and others, that seek to discredit the principles of the American Founding.
Tuesday, May 10, 2005
Debating Hanson
Victor Davis Hanson penned a piece, "What Happened to History," that has driven a bit of commentary among webbified historians. On the one hand are those, such as Winfield Myers at Democracy Project, who agree with much of what Hanson said. On the other are the gaggle at Cliopatria, who essentially believe Hanson is too shallow, with Ralph Luker stating that the piece "seemed to challenge history bloggers to give it a fisk." Luker then proceeded to make something of Hanson not properly citing Emerson in an attempt to suggest that Hanson's critique of Ward Churchill was somehow undermined by such supposed hypocrisy. In general, the critics seem to have inferred that Hanson was calling for historians to do and teach History the way it used to be done, implying (to the critics) that he was calling for the return to preeminence of "dead white male" history. In this, they especially reacted to this portion of Hanson's piece:
...there is a radically new idea that most past occurrences are of equal interest -- far different from the Greeks' notion that history meant inquiry about "important" events that cost or saved thousands of lives, or provided ideas and lessons that transcended space and time.
The history of the pencil, girdle or cartoon offers us less wisdom about events, past and present, than does knowledge of U.S. Grant, the causes of the Great Depression or the miracle of Normandy Beach. A society that cannot distinguish between the critical and the trivial of history predictably will also believe a Scott Peterson merits as much attention as the simultaneous siege of Fallujah, or that a presidential press conference should be pre-empted for Paris Hilton or Donald Trump.
(In fact, the example of the pencil spawned its own debate generated by Dave Beito.) My guess is that Hanson's aforementioned cause/effect linkage really hit some historians where it hurt. In a field in which people are always searching for some new way to interpret past events, a critique of such usually-applauded innovations probably surprised and hurt the feelings of a few historians. However, I think they (perhaps willfully) misread Hanson. They particularly seem to have inferred that Hanson was being critical of social history, with Luker taking exception to "his denigration of the social history of ordinary lives as 'trivial'."
Meanwhile, Dave Beito first critiqued Hanson for what he didn't write and then claimed Hanson "betrays his ultimate reverence for the history of the American state, as personified by politicians he admires, over the history of how ordinary individuals ultimately provided the basis for American prosperity." While Beito agreed that Hanson properly "addresses the failure of historians to teach more about the founders and the great political issues of the past," he took exception to what he perceived as Hanson's denigration of the "little things":
Hanson's disparagement of the "history of the pencil" betrays a worldview that is fundamentally at odds with the tradition of freedom represented by Thomas Jefferson (at his best), Rose Wilder Lane, Friedrich Hayek, and Ludwig von Mises. Unfortunately it is a worldview that is rubbing off on libertarians who embrace the Bushian dream of entrusting the American state to bring "liberty" to every corner of the planet.
[Comments on this post concern the debate over whether or not Hanson was specifically "calling out" Leonard Read.]
As a historian, I would simply say that Beito's contention is but one way to interpret Hanson's piece.
In the rarified air of post-graduate History, studies of seemingly trivial (but not necessarily so) items/people/events are of value and can offer new and interesting, if sometimes a bit esoteric, interpretations. However, Hanson's main point was this:
Why do we not carry with us at the least the whispers of those who gave us what we have, from the Hoover Dam and Golden Gate Bridge to penicillin and relief from polio? In part, it is a simple ignorance of real history. The schools and university curricula today are stuffed with therapy -- drug counseling, AIDS warnings, self-improvement advice, sex education, women's/gay/Chicano/African-American/Asian/peace/urban/environmental/leisure studies. These are all well-meaning and nice -isms and -ologies that once would have been seen as nonacademic or left to the individual, family or community. But in the zero-sum game of daily instruction, something else was given up -- too often it was knowledge of the past.
Some historians seem to have forgotten that we all have that baseline of which Hanson speaks, but the students we teach don't. In the rush to make History appealing, to be more inclusive, the formerly compulsory knowledge of important past events has given way to the "nice-isms" mentioned by Hanson. To me, one commenter got it right when he wrote:
Um, guys, I think I'm going to have to defend Hanson here. I don't think he was denigrating the miracle of spontaneous orders, or the use of the example of a pencil by Read to illustrate it. His point is that education is in some sense a zero-sum game -- college students only take x number of classes, and so for every course on the cultural history of lingerie or whatever, that's one less chance to study Thucydides. And that's true. Now maybe he's being overly curmudgeonly, and I think there's probably some value in some of the stuff he's trying to discredit -- but some of it, surely, _is_ trivia or fluff, and meanwhile they end up with _no_ courses on Thucydides. And it's true, isn't it, that our society as a whole _does_ tend to fixate more on the Scott Petersons than on the Fallujahs. And both hawks and doves can agree, I should think, that this isn't healthy. He's being a bit hyperbolic, IMO, but just a bit. His overall point about the state of higher ed is a sound one.
Winfield Myers had a similar view of Hanson's piece:
That last paragraph is particularly important, because too many history professors who know better duck their moral obligations to both the subjects of their study and to posterity by silently acquiescing to the leveling of history by their colleagues who are more skilled at degrading past actors than in making history come alive for new generations. Is a colleague down the hall writing another article to prove that the marginalized should be lauded, the peripheral centralized? Are doctoral students directed to spend years examining ephemera while ignoring, or remaining ignorant of, events and persons who shaped our world?
When the professional intellectual class largely abandons its obligation to research and produce readable accounts explaining how we arrived at the present moment, and instead entertains itself with parlor games and careerist machinations, it's little wonder that a keen observer like Hanson will bemoan the arrogance that presentism fosters.
I agree with these last two. Historians have had the benefit of getting that foundation of the "usual" history and, once informed, are able to branch off, elaborate, or challenge these interpretations. I think Hanson is saying that we should ensure that everyone receives that basic history baseline before we introduce other, perhaps "sexier", ways of looking at the past.
As a coda, I find it interesting, and unsurprising, that those who are "conservative" tend to agree with Hanson while liberal historians don't. Could it really be because Hanson is advocating the study of one kind of history, associated with conservative or consensus views of history, over other kinds, generally falling under the umbrella of cultural or social history? Perhaps. For me, I find value in all branches of history but sincerely believe that there are some important "must-see-history" events that all should be aware of. Perhaps my interpretation of Hanson's piece is wrong, but that is what I took out of it. (Yikes! Am I being post-modernist and reading into the text!!!???)
Monday, May 09, 2005
I'm Caught in the 'Dustbin of History'?
This story offers a very brief summary of a recent AHA report (PDF) on the Master's Degree in History. In reading the report, I came to realize that I was pursuing a useless degree and then that I actually wasn't. Quite a roller coaster ride!
The reason I decided to pursue a degree in a field other than my profession (I'm an engineer by trade) was nothing more than pure intellectual gratification and the fun I have "doing history." And while the AHA report is worthwhile in pointing out the value of the MA in History, I still think that it would be more worthwhile if some institutions offered an "easier" way for professionals with backgrounds in fields other than history to work toward and attain a History PhD without having to give up their professional lives (read: incomes).
I believe that all History PhD programs are intimately tied to teaching, whereby PhD candidates are required to instruct and work at institutions as part of their program. This is fine for young students, but what about "older" students like myself? Are we out of luck because we came too late to the realization that we loved history and wanted to take our hobby more seriously and pursue it more "professionally"? I'm not crying "not fair" out of any sense of entitlement. If I feel strongly enough, I can take my chances: get accepted into a PhD program, sell the house, work for peanuts, send my wife to work, relocate the kids, get a degree, and work for half of what I make now. However, I do have to ask: is all that necessary to pursue and earn a History PhD? Is that all part of "paying my dues" because everyone else had to do it? If so, isn't that attitude a bit, well, infantile?
My point is that there should be at least a few respectable programs that offer an alternative PhD track for non-traditional students (i.e., older people who are somewhat removed from their undergraduate education) who have lived life a bit and can offer an outside-of-the-academy perspective before pursuing and attaining a History PhD. My guess is that most of us non-traditional wannabe historians enjoy research and writing more than the prospect of teaching. (I suspect most traditional PhDs feel this way, too, and I don't think this is really surprising.) As such, why should we be required to teach if that is not what we want to do? It used to be that one pursued a PhD and became tied to teaching at an institution because the latter vocation provided a means of support while the historian pursued his research and wrote on scholarly topics. But what if teaching as a vocation is not needed to support a PhD who is otherwise employed? Is my ability to support myself outside of the academy cause for suspicion? Or is it resentment? Is there a belief that, because I won't be academically affiliated, I will somehow be intellectually or scholastically compromised?
Some of these questions have been asked before, but usually within the context of asking why there are relatively few conservative historians represented on campus. I ask them because I wonder why the presumption is that "respectable" historians have to be represented primarily on campus at all. I see a lot of lip service paid to the independent historian, but my sense is that these folks are regarded somewhat as mavericks or, to use a less complimentary term, loose cannons. I hear and read a lot about the ideals of open dialogue, opportunity, and scholarship, but the History field I see does little to promote, encourage, or facilitate alternative paths. Nonetheless, perhaps there are PhD programs that are accommodating of non-traditional "professionals" like myself. If there are, they should do a better job of advertising the fact. If not, I think there is a void waiting to be filled. I'll be one of the first to sign up (especially if the program is located in Southern New England)!
The Good War
Continuing on the same theme from my last post, Geoffrey Wheatcroft writes about the myths that arose in the aftermath of World War II, both for good and ill.
Was it "a noble crusade"? For the liberation of western Europe, maybe so. Was it a just war? That tricky theological concept has to be weighed against very many injustices. Was it a good war? The phrase itself is dubious. No, there are no good wars, but there are necessary wars, and this was surely one.

Meanwhile, Theodore Dalrymple reminds us that Friedrich Hayek wrote The Road to Serfdom in 1944 in response to the growing belief among intellectual Britons, such as George Orwell, that social collectivism was a necessary good, a belief they saw justified and confirmed by the wartime experience.
Hayek believed that while intellectuals in modern liberal democracies—those to whom he somewhat contemptuously referred as the professional secondhand dealers in ideas—did not usually have direct access to power, the theories that they diffused among the population ultimately had a profound, even determining, influence upon their society. Intellectuals are of far greater importance than appears at first sight.

Hayek believed that British intellectuals were being influenced by the experience of World War II, during which British society was atypically united by war.
Hayek was therefore alarmed at the general acceptance of collectivist arguments—or worse still, assumptions—by British intellectuals of all classes. He had seen the process—or thought he had seen it—before, in the German-speaking world from which he came, and he feared that Britain would likewise slide down the totalitarian path. Moreover, at the time he wrote, the “success” of the two major totalitarian powers in Europe, Nazi Germany and Soviet Russia, seemed to have justified the belief that a plan was necessary to coordinate human activity toward a consciously chosen goal. For George Orwell, the difference between the two tyrannies was one of ends, not of means: he held up Nazi Germany as an exemplar of economic efficiency resulting from central planning, but he deplored the ends that efficiency accomplished.
Collectivist thinking arose, according to Hayek, from impatience, a lack of historical perspective, and an arrogant belief that, because we have made so much technological progress, everything must be susceptible to human control. While we take material advance for granted as soon as it occurs, we consider remaining social problems as unprecedented and anomalous, and we propose solutions that actually make more difficult further progress of the very kind that we have forgotten ever happened. While everyone saw the misery the Great Depression caused, for example, few realized that, even so, living standards actually continued to rise for the majority. If we live entirely in the moment, as if the world were created exactly as we now find it, we are almost bound to propose solutions that bring even worse problems in their wake.
Monday, May 02, 2005
World War II: History or Preferred Remembrance?
Adam Krzeminski writes that individual nations mythologized World War II to replace the older national myths the conflict had destroyed. One of the ironic twists of European unification, however, has been the debunking of these newer myths:
The war destroyed not just countries, but the whole edifice of traditional myths that supported the identity of the European nations before it began. Meanwhile, the effort to create some new myths fell foul of the shocking reality: millions of people had been killed or murdered, there was immense material destruction, and Europe had been politically and morally degraded. . . . To all intents and purposes there were as many Second World Wars as there were nations. . . . For six decades in Europe, the USA and Israel, monuments and mausoleums have been built, films have been made, and posters and postage stamps have been printed. Heroic tales of war heroes who "ducked the bullets" have been written, yet at the same time some of the legends began to be debunked very early on. Books that were praised one day were thrown on the rubbish heap the next. Monuments erected earlier were demolished, and heroes were scorned, while those who were once regarded as traitors were rehabilitated. . . .
Europeans will go on living with competing memories and competing myths for a long time to come. What is new is that these competing myths are no longer being fostered in confinement, but in constant dialogue between neighbours, besides which in each country as well as being fostered they are also being debunked. Time will tell if this clash of national myths will ultimately engender a common European view of the Second World War, without dropping the national experiences. Already in many countries the Europeans are gradually ceasing to be victims of autism, exclusively fixated on separate images of the past.
Why are academics so unhappy?
After reading this, I'm not sure if I ever want a teaching job:
Universities attract people who are good at school. Being good at school takes a real enough but very small talent. As the philosopher Robert Nozick once pointed out, all those A's earned through their young lives encourage such people to persist in school: to stick around, get more A's and more degrees, sign on for teaching jobs. When young, the life ahead seems glorious. They imagine themselves inspiring the young, writing important books, living out their days in cultivated leisure.
But something, inevitably, goes awry, something disagreeable turns up in the punch bowl. Usually by the time they turn 40, they discover the students aren't sufficiently appreciative; the books don't get written; the teaching begins to feel repetitive; the collegiality is seldom anywhere near what one hoped for it; there isn't any good use for the leisure. Meanwhile, people who got lots of B's in school seem to be driving around in Mercedes, buying million-dollar apartments, enjoying freedom and prosperity in a manner that strikes the former good students, now professors, as not only unseemly but of a kind a just society surely would never permit.
Now that politics has trumped literature in English departments, the situation is even worse. Beset by political correctness, self-imposed diversity, without leadership from above, university teachers, at least on the humanities and social-science sides, knowing the work they produce couldn't be of the least possible interest to anyone but the hacks of the MLA and similar academic organizations, have more reason than ever to be unhappy.
Sunday, May 01, 2005
History Carnival 7: Studi Galileiani
Studi Galileiani is hosting History Carnival VII and has a plethora of diverse, history-related blog posts to recommend. He also reminds us that the concept behind the original History Carnival is that the host should receive submissions, do a little digging of his own, and then post the results. It looks as if S.G. had to do most of the digging this time. Hopefully the next Carnival will see more submissions (I'm working on one myself). In fact, I hope that by the time I host one, in a month or so, I won't have to do as much work as S.G. did!