Tuesday, October 09, 2007
Quick Guide to History of Conservatism
1) Traditionalist conservatism (first appeared in the 8th century BC)...
Medieval: Venerated heroes and saints. Defended the honor of the family. Viewed the family as a community of souls, living and dead, stretching back to the primordial past. Lived in tightly woven communities of long continuance. The ancients were seen as giants in wisdom in contrast to their own modest understanding. Venerated the literary classics. Taught the seven liberal arts.
***
3) Christian conservatism (1st century AD)
Roman and Medieval: Marriage, family, church, community and government are ordained of God for our good and we are obliged to submit to these institutions. The government must fight evil. "Christendom," or Christ's kingdom, is gradually being formulated in society as God works through the church. The church has a mission for the spiritual formation of souls. It also has a mission to educate the people, to develop the leaders of society and to sponsor culture. Man is fallen and needs redemption, restraint, and holy fear. For "athletes of Christ," the potential for personal holiness is great.
***
4) Natural law conservatism (13th century)
Aquinas, Locke and Montesquieu. Also called "classical liberalism." Man has a nature according to the Creator's design. By nature he is entitled to certain freedoms and bound by certain duties. Human reason is the means by which we discover these rights and duties. The universal moral law and the laws of nature are binding upon man. The main role of government is to protect human rights. There are certain activities, such as the police and the military, which government can offer but men cannot provide for themselves. The legitimate role of government is limited. Men form a social contract with government — men will submit to government and government will protect their rights.
The missing are 2) Neoconservatism and 5) Libertarianism. Interestingly, of the 5, my own conservatism is a mix of the three above.
Congress Locks up the Kennewick Man, AHA Missing In Action Again
[U]nder the Native American Graves Protection and Repatriation Act (NAGPRA) — a well-meaning law passed in 1990 — tribes can lay claim to cultural objects and human remains locked away in federally funded museums or unearthed on federal land. In order to do so, they must prove a reasonable connection between themselves and the objects they wish to obtain.

While this is technically archaeology, shouldn't the AHA be interested? Where's their editorial arguing for safeguarding the profession and ensuring that we have timely access to items of the historical record (like Executive Orders that lengthen the duration of sealed Presidential records)? Would there be outrage if a bunch of Northern Europeans started putting a halt to bog-people autopsies?
When Kennewick Man came to light, a coalition of tribes in the Pacific Northwest demanded the remains under the provisions of NAGPRA. They said they wished to bury the bones, making further study impossible. The Army Corps of Engineers, which has jurisdiction over Kennewick Man, took steps to comply. But then a group of prominent scientists sued. In 2004, the U.S. Circuit Court of Appeals ruled in favor of the scientists, pointing out that the modern tribes had failed to demonstrate an adequate link between themselves and the skeleton of a person who died more than nine millennia ago.
So the tribes turned to Congress. Two years ago, Sen. John McCain proposed altering NAGPRA’s definition of “Native American” from “of, or relating to, a tribe, people, or culture that is indigenous to the United States.” The new language would add two words: “...is, or was, indigenous...” McCain’s efforts failed, in part because of public objections. But now the change has slipped through in a bill of “technical corrections” that the Senate’s Indian Affairs Committee has just approved.
Friday, October 05, 2007
Heroes
As one battalion commander complained to me, in words repeated by other soldiers and marines: ’Has anyone noticed that we now have a volunteer Army? I’m a warrior. It’s my job to fight.’ Every journalist has a different network of military contacts. Mine come at me with the following theme: We want to be admired for our technical proficiency--for what we do, not for what we suffer. We are not victims. We are privileged.

Kaplan led into this by explaining that our modern media coverage of the war "too often descends into therapy for those who are not fighting, rather than matter-of-fact stories related by those who are." Later, he offers a heroic example and portrays how media coverage has changed:
The first Medal of Honor in the global war on terror was awarded posthumously to Army Sgt. First Class Paul Ray Smith of Tampa, Fla., who was killed under withering gunfire protecting his wounded comrades outside Baghdad airport in April 2003. According to LexisNexis, by June 2005, two months after his posthumous award, his stirring story had drawn only 90 media mentions, compared with 4,677 for the supposed Quran abuse at Guantanamo Bay, and 5,159 for the court-martialed Abu Ghraib guard Lynndie England. While the exposure of wrongdoing by American troops is of the highest importance, it can become a tyranny of its own when taken to an extreme.
Media frenzies are ignited when American troops are either the perpetrators of acts resulting in victimhood, or are victims themselves. Meanwhile, individual soldiers daily performing complicated and heroic deeds barely fit within the strictures of news stories as they are presently defined. This is why the sporadic network and cable news features on heroic soldiers in Iraq and Afghanistan comes across as so hokey. After all, the last time such reports were considered "news" was during World War II and the Korean War.
In particular, there is Fox News's occasional series on war heroes, whose apparent strangeness is a manifestation of the distance the media has traveled away from the nation-state in the intervening decades. Fox's war coverage is less right-wing than it is simply old-fashioned, antediluvian almost. Fox's commercial success may be less a factor of its ideological base than of something more primal: a yearning among a large segment of the public for a real national media once again--as opposed to an international one. Nationalism means patriotism, and patriotism requires heroes, not victims.
In a post elsewhere about Ken Burns's "The War," I discussed how changes in the media had changed how we perceive war.
Yet the most striking thing [Katherine Phillips] said was that she didn't know how bad Guadalcanal was until after Sidney [her brother] came home. No one on the homefront did. The 5,000+ casualties weren't reported. The brutal fighting wasn't shown on Movietone.

In contrast, Katherine Phillips also talked about how the American public had been prepped for war against Nazi Germany for a few years prior to Pearl Harbor. The American public was shown some of the Nazi and Japanese atrocities on Movietone and they became convinced it was a moral imperative to act. When the time came, they were ready to go.
They also didn't equate Nazi or Japanese propaganda with U.S. war reporting. Looking back, there can be no doubt that the U.S. glossed over things. But even then, even if the American people had known more, I doubt that they would have considered the press releases of the enemy as just "another point of view." It points to how much faster and more accurate our wartime information has become since then, and that difference helps to explain, at least partially, why WWII is considered "The Good War" and why subsequent conflicts aren't.
But Kaplan's point about lost nationalism gets much closer to the difference between then and now. As we've internationalize and become more relativistic--sorry, we are--it is more difficult to choose sides, even if on one side is your own country. As Kaplan concludes:
[W]hile the U.S. still has a national military, it no longer has a national media to quite the same extent. The media is increasingly representative of an international society, whose loyalty to a particular territory is more and more diluted. That international society has ideas to defend--ideas of universal justice--but little actual ground. And without ground to defend, it has little need of heroes. Thus, future news cycles will also be dominated by victims.

The media is but one example of the slow crumbling of the nation-state at the upper layers of the social crust--a process that, because it is so gradual, is also deniable by those in the midst of it. It will take another event on the order of 9/11 or greater to change the direction in which we are headed. Contrary to popular belief, the events of 9/11--which are perceived as an isolated incident--did not fundamentally change our nation. They merely interrupted an ongoing trend toward the decay of nationalism and the devaluation of heroism.
The Fallacy of the unchanging Dark Ages
Yikes. How did we possibly, um, change then...if there was no change? Methinks the man should listen to Terry Jones before making those kinds of statements:

For most of human history, change has been the exception. Our ancestors for nearly a million years used one basic tool, a hand axe chipped out of stone. They made these axes the same way, every time. Theirs was a culture in neutral.
The Dark Ages were likewise unblemished by change. For a thousand years, there was almost no invention, no new ideas and no exploration. Literacy was actively discouraged. Anything that might pass for progress was outlawed.
Yup. And that's why I named this blog "Spinning Clio."

Q You write that our view of medieval life is unduly grim because historians maligned the period. It's easy to see why a nobleman might want to burnish his image by commissioning a writer to vilify a predecessor, but who would benefit from a campaign to disparage an era?
A A very interesting question. Well, in the first place, it would have been the thinkers of the Renaissance, who wanted to establish a break with the past. They also wanted to establish their own sense of importance by belittling what had gone before. This then gets taken up by the promoters of Renaissance culture who are keen to establish its supremacy over the medieval world -- particularly since the Renaissance is a backward-looking movement which harks back to the classical world rather than establishing something new.
In the 20th and 21st century, Renaissance values have been adapted to fit the modern capitalist world. The whole myth that there was no sense of human individuality before the Renaissance is part of this attempt to make the present day seem the culmination of human progress, which I don't think it is.
Q Then how did the unrealistic stereotypes of the noble knight and the ignorant, downtrodden peasant originate and why have they persisted?
A Well, undoubtedly you did have proud and unfeeling aristocrats who treated the peasants like dirt. Also, the Middle Ages is a wide span of time, and there were times and places where the peasantry would undoubtedly have been downtrodden and ignorant. So there is a basis for all that. But the little bit of history I'm interested in -- late 14th century England -- saw a rise in education and the pursuit of knowledge amongst ordinary people -- partly it was a result of the Black Death and the fact there were so few people around that everyone was questioning everything. But it was a time of intellectual activity amongst all classes. Much more so than today.
Q Washington Irving, who gave us "Rip van Winkle," apparently also contributed some fabrications that still distort our view of medieval life?
A Yes. He seems to have been responsible to a large degree for promoting the myth that people in the Middle Ages thought the Earth was flat and that this formed part of Church doctrine. It never did, and people didn't think the world was flat. Chaucer himself talks about "this world that men say is round." There's a fascinating book called "Inventing the Flat Earth" by Jeffrey Burton Russell, which sets the whole story out.
Q What does this tell us about the trustworthiness of historians, in general? Do you have any advice on how to spot a sound or flawed account of the past? Is there such a thing as history or only histories?
A Well, I think you're right that there is no such monolith as "history" in the singular. I think every age writes its own histories and I think it's important that they do. It's how we help to define ourselves and to know who and where we are. I don't think there is any rule of thumb to spot distorted history any more than there is to spot distorted news that we read today in the press or watch on TV.
The main thing is to be aware that the makers of "spin" are at work today just as much as they were in the Middle Ages or at any time in human history. It's all a bit like a detective story. We have to look for the motives behind what leaders do rather than take at face value the reasons that they give us. It's just the same with history.
Thursday, October 04, 2007
Why a Family Guy Won't Get a History PhD Any Time Soon
For those who attempt it, the doctoral dissertation can loom on the horizon like Everest, gleaming invitingly as a challenge but often turning into a masochistic exercise once the ascent is begun. The average student takes 8.2 years to get a Ph.D.... Fifty percent of students drop out along the way, with dissertations the major stumbling block. At commencement, the typical doctoral holder is 33, an age when peers are well along in their professions, and 12 percent of graduates are saddled with more than $50,000 in debt.

Well, since I started my MA at 34, I was already behind the curve! Oh well. I may not have been elite enough (via PhDinHistory) anyway:
First, doctoral students in history have come and continue to come disproportionately from relatively privileged family backgrounds. Second, the proportion and number of students in doctoral programs from first-generation college families is declining. This trend—if the weak data are sufficient to speak of a real trend—is relevant to both diversity and opportunity questions. And the data point to a third, more speculative point. The uncertainty of employment in history may be discouraging students from first-generation college families from pursuing history careers. One can understand their preference for more secure career paths, but the profession loses vitality and students of potential lose an opportunity to pursue what may be to them a substantively if not practically appealing life work.

Neither of my parents got their B.A., but all of their children have. For myself, I always loved history. Heck, in high school, I wanted to be a History teacher and a HS Soccer coach. But my parents steered me into engineering because, as they pragmatically pointed out, there is always a need for engineers. They were right. Sure, I still like History, but a PhD wasn't even on my radar coming out of High School.
Besides, being an engineer led to financial stability such that I could finally scratch that itch and get the MA. But the PhD just isn't practical for me right now, even though I'd absolutely love to do it.
PhDinHistory adds (check out his charts, which illustrate the points below):
I think we should be concerned by the way that history has suddenly become more elitist than almost every other major discipline in the humanities and social sciences. Like the authors in The Education of Historians for the Twenty-first Century, I worry that history faculty will have less and less in common with their students. I am concerned also that smart history majors from lower- and lower-middle-class households will be increasingly steered away from entering graduate programs in history. Lastly, I share the fear of the authors that this problem stems, at least in part, from the mismatch between PhD production and the demand for history faculty in the job market.

I think he's right on. Many of those "smart history majors from lower- and lower-middle-class households" already survived the temptations of following other, potentially more lucrative educational paths. They need to be encouraged to keep climbing the ivory towers. More diversity and more perspectives will strengthen the profession. And maybe someday, people who have lived a portion of their adult life outside of the ivy walls will be better able to open the gates and climb the tower stairs.
Thursday, September 20, 2007
I took the ISI History Quiz
What did I get wrong?

You answered 57 out of 60 correctly — 95.00%
Average score for this quiz during September: 75.3%
Average score since September 18, 2007: 75.3%
Answers to Your Missed Questions:
Question #19 - C. philosopher kings.
Question #54 - D. can be reversed by government spending more than it taxes.
Question #58 - B. An increase in the volume of commercial bank loans.
Here are the questions and all possible answers (including my wrong ones) for the ones I missed:
All in all, I was surprised I did so well, frankly. While some are arguing the test is unfair because it doesn't adequately deal with big concepts, a knowledge of basic historical facts is necessary to support the "big picture" takeaway that colleges aim to teach. Besides, many of the questions do deal with concepts. For instance:
19) In The Republic, Plato points to the desirability of:
That was a classic brain f**t on my part.
54) Keynesian economists conclude that the recession phase of a business cycle:
Eh. Economics ain't my strong suit...I suspect most historians can relate. ....And again....
58) What is a major effect of a purchase of bonds by the Federal Reserve?
There is a lot of reading that goes into answering these questions, no?
31) Which author’s view of society is presented correctly?
39) The question of why democracy leads to well-ordered government in America when disorder prevails in Europe is central to:
Or how about:
35) The Monroe Doctrine:
13) The struggle between President Andrew Johnson and the Radical Republicans was mainly over:
Doesn't understanding the "Monroe Doctrine" or why Johnson and the Radical Republicans were fighting display more than just knowing facts and figures? Aren't these the sort of historical concepts that many say should be tested?
44) The Gulf of Tonkin Resolution (1964) was significant because it:
And then there's the question about the Gulf of Tonkin Resolution. Heck, they even give the date to help you make an educated guess!
I can only surmise that the historians who take a negative view of multiple-choice tests--thinking "only" facts and dates can be tested--have probably not read the actual test. Maybe they should.
Monday, September 17, 2007
"Cultural history is written by dissenters"
While it is often said that history is written by the winners, the truth is that the cultural images that come down to us as history are written, in large part, by the dissenters -- by those whose strong feelings against life in a particular generation motivate them to become the novelists, playwrights, and social critics of the next, drawing inspiration from the injustices and hypocrisies of the time in which they grew up....

The social critics of the past two decades have forced on our attention the inconsistencies and absurdities of life a generation ago: the pious skirt-chasing husbands, the martini-sneaking ministers, the sadistic gym teachers.

I am not arguing with the accuracy of any of those individual memories. But our collective indignation makes little room for the millions of people who took the rules seriously and tried to live up to them, within the profound limits of human weakness. They are still around, the true believers of the 1950s, in small towns and suburbs and big-city neighborhoods all over the country, reading the papers, watching television, and wondering in old age what has happened to America in the last thirty years. If you visit middle-class American suburbs today, and talk to the elderly women who have lived out their adult years in these places, they do not tell you how constricted and demeaning their lives in the 1950s were. They tell you those were the best years they can remember. And if you visit a working-class Catholic parish in a big city, and ask the older parishioners what they think of the church in the days before Vatican II, they don't tell you that it was tyrannical or that it destroyed their individuality. They tell you they wish they could have it back. For them, the erosion of both community and authority in the last generation is not a matter of intellectual debate. It is something they can feel in their bones, and the feeling makes them shiver.
Rhode Island Historical Society Goes Online
The Rhode Island Historical Society is about to jump into the digital age.
No longer will local history buffs have to drive to Providence to thumb through the society’s 600,000-item card catalogue to do research.
The group plans to abandon its 185-year-old card catalogue system later this month, launching an electronic database that will allow anyone with Web access to search for items stored in the society’s museum and library.
“Now we’ll have a catalog on the Web that people from all over the world can search,” said Karen Eberhart, special collections curator for the historical society. “It’s like going from the 18th century right into the 21st all in one fell swoop.”
Digital reproductions of the historical items will not be available online, Eberhart said. But for the first time, computer users will be able to comb through the historical society’s extensive collection and determine the location of an item from the convenience of home.
***
When the database goes online on Sept. 27, about one quarter of the total collection — about 150,000 items — will be searchable through the historical society’s Web site. Among the items initially listed will be John Brown’s papers, paintings by the renowned American portrait painter Robert Feke, Rhode Island maps dating back to the 1700s and an extensive collection of textiles — bonnets, dresses, and a cashmere shawl imported in the 18th century.
The historical society also boasts a unique collection of news footage from WJAR-TV Channel 10 collected between the 1950s and 1980s, according to Eberhart.
The historical society plans to continue cataloguing the rest of its holdings with help from grants and individual donations. The next batch of artifacts to be catalogued includes genealogical objects, such as diaries and 19th-century books...
Tuesday, September 11, 2007
Goodbye AHA
Its object shall be the promotion of historical studies through the encouragement of research, teaching, and publication; the collection and preservation of historical documents and artifacts; the dissemination of historical records and information; the broadening of historical knowledge among the general public; and the pursuit of kindred activities in the interest of history. -- Article II, Constitution of the American Historical Association
=======================================================
Resolved, That the American Historical Association urges its members through publication of this resolution in Perspectives and other appropriate outlets:
1. To take a public stand as citizens on behalf of the values necessary to the practice of our profession; and
2. To do whatever they can to bring the Iraq war to a speedy conclusion.
-- Resolution on United States Government Practices Inimical to the Values of the Historical Profession, March 12, 2007.

When the second quoted item--the AHA's Iraq War resolution--was passed, I wrote:

I don't think the average person gives a crap about what the AHA has to say about Iraq. And I guess I don't either. My only decision is whether or not such an organization deserves my dues.

Since then, I've neither seen nor heard anything from the AHA regarding standing up for the profession other than when it is against the Bush Administration and an Imperial Presidency. Nothing about the Clintons' stonewalling the release of records or of their former crony Sandy Berger absconding with historical documents from NARA. No hue and cry about the history lost. Oh, they reported it in one of their "Inside Washington"-type columns, but didn't see fit to decide or "resolve" over it. I guess actually stealing and destroying documents isn't as bad as putting a hold on them while we are at war. I'm sure the AHA was all over FDR for the same things.
Anyway, enough is enough. I'm letting my membership lapse and am discontinuing my affiliation with the AHA. I'm fed up with their inability to resist immersing themselves in ideological politics under the veneer of safeguarding the "values necessary to the practice of our profession." Sure, there are other, practical ($) reasons why I'm checking out of the professional side of the, er, profession. Basically, the services the AHA offers an "Independent Historian" like me (essentially, access to book reviews and a few articles in the AHR) are easily found (for free!) here on the web. Frankly, because I wasn't going to be going for a PhD or teaching any time soon, it was never a perfect match to begin with. Face it, the AHA is of, for and by the PhD's, all of their wailing and gnashing of teeth about the "role of the MA" or "public historians" aside. And that's fine, but it ain't for me. No harm, no foul....and no more money from me.
Friday, September 07, 2007
Barone's Theory of History: Conformity Lost, or Conformity Redefined?
The natural state of America, in my theory, is decentralized toleration: We stand together because we can live apart. We are, most of the time, the nation described by Alexis de Tocqueville, made up of various ethnic, religious, and racial strands who believe fervently that we can live and triumph together if we allow one another to observe our local mores. We can embody David Hackett Fischer's "four British folkways" and at the same time be a united people. There's a tension in that, which threatens to come apart. In the midcentury America of the 1850s, the threat was that we would come apart: We had an explosive political conflagration over the issue of slavery in the territories and an explosive ethnic conflagration in the decade that had the largest immigrant influx, in percentage of pre-existing population, of any decade in our history. Citations: Kenneth Stampp's America in 1857; the opening chapters in James McPherson's Battle Cry of Freedom. And in fact, we produced a civil war.

I'm not sure about this. If nothing else, we have become more centralized governmentally and culturally, haven't we? Perhaps we are more polarized politically, yes, and perhaps we all live in our own fortresses, Bowling Alone, so to speak. I think Barone is correct insofar as he links the America seen by Tocqueville with that of today vs. the 1950's. But I think that what has happened is that a new definition of conformism has evolved. Conformism now implies "to each his own" and an underlying "don't be judgmental." Additionally, so many traditional cultural boundaries have been pushed or broken that nothing much surprises us anymore. Further, to be critical or judgmental about the wrong people is frowned upon. It may be going too far to say the new conformity is essentially what we today call political correctness, but there is some sort of alignment there, I think.

We had the opposite situation in the midcentury America of the 1950s. After the shared experiences of the Depression and World War II, with universal institutions like the comprehensive high school, the military draft, and the big factory workforces represented by giant industrial unions, we were a culturally more uniform country than we have been before or since. We were a nation of conformism, of the regular guy, of the average guy who gets along with his peers. Citations: David Riesman's The Lonely Crowd; William H. Whyte's The Organization Man. It was a society, to take one example, far more hostile to homosexuality: The midcentury society of the 1850s could evidently tolerate Ishmael and Queequeg sleeping together in Moby Dick and the poems of Walt Whitman, while the midcentury society of the 1950s cast its eyes away from the obvious gayness of the early Gore Vidal and Truman Capote and Roy Cohn.
The Civil War, the imposition of New England Yankee mores in the way described by Morton Keller, and the creation of national business and professional organizations described by Robert Wiebe in The Search for Order 1877-1910 reversed the extreme decentralization of the 1850s. The cultural rebellions, to the left and the right, described recently in neat form by Brink Lindsey's The Age of Abundance reversed the extreme centralization of the 1950s.
For those of us who grew up in the backwash of the 1950s, this decentralization seemed like an abandonment of American tradition. In the long line of history, I think it is more like a reversion to norm. The seeming inconsistency of currently prevailing attitudes on marriage and divorce, gambling and drinking, cigarette smoking and marijuana smoking, is part of the continuing turmoil of a decentralized society. The results don't cohere, but perhaps that is to be expected in a society like ours.
Wednesday, August 29, 2007
Medieval Gay Marriage? Not quite
Civil unions between male couples existed around 600 years ago in medieval Europe, a historian now says.

I've emphasized the caveats and qualifications, but notice how the story leads with an assertion-as-fact that "Civil unions between male couples existed around 600 years ago in medieval Europe..." OK, so what does Tulchin base his interpretation on:
Historical evidence, including legal documents and gravesites, can be interpreted as supporting the prevalence of homosexual relationships hundreds of years ago, said Allan Tulchin of Shippensburg University in Pennsylvania.
If accurate, the results indicate socially sanctioned same-sex unions are nothing new, nor were they taboo in the past.
[Tulchin] found legal contracts from late medieval France that referred to the term "affrèrement," roughly translated as brotherment. Similar contracts existed elsewhere in Mediterranean Europe, Tulchin said.

Medieval contracts aren't up my alley, but I'd bet that while affrèrement "contracts" can be likened to a marriage contract, there's a good chance they can also be likened to other medieval legal documents. But setting them up as "like a marriage contract" is laying the groundwork for Tulchin's theory.
In the contract, the "brothers" pledged to live together sharing "un pain, un vin, et une bourse," (that's French for one bread, one wine and one purse). The "one purse" referred to the idea that all of the couple's goods became joint property. Like marriage contracts, the "brotherments" had to be sworn before a notary and witnesses, Tulchin explained.
The same type of legal contract of the time also could provide the foundation for a variety of non-nuclear households, including arrangements in which two or more biological brothers inherited the family home from their parents and would continue to live together, Tulchin said.
But non-relatives also used the contracts. In cases that involved single, unrelated men, Tulchin argues, these contracts provide “considerable evidence that the affrèrés were using affrèrements to formalize same-sex loving relationships."

The MSNBC story doesn't provide all of Tulchin's reasoning; for that, we go to the press release (heh, imagine, promoting potentially controversial scholarship. Welp, it works!):
The effects of entering into an affrèrement were profound. As Tulchin explains: “All of their goods usually became the joint property of both parties, and each commonly became the other’s legal heir. They also frequently testified that they entered into the contract because of their affection for one another. As with all contracts, affrèrements had to be sworn before a notary and required witnesses, commonly the friends of the affrèrés.”

This is a bit more solid. But it isn't clear if the "They" he's alluding to are those in agreements between blood brothers (or other relatives) or those between unrelated men or both. It's also not clear to me how he knows that two men are unrelated. If they're cousins, would they have the same last name? What if one is a bastard? I don't think you can tell if they are unrelated for sure by just looking at the documents. Again, it's an assumption.
Back to the MSNBC story:
The ins-and-outs of the medieval relationships are tricky at best to figure out.

"I suspect that some of these relationships were sexual, while others may not have been," Tulchin said. "It is impossible to prove either way and probably also somewhat irrelevant to understanding their way of thinking. They loved each other, and the community accepted that.”

Let's call it like it is: there are weasel words aplenty there. Tulchin admits there is no proof; he only "suspects". Worse, he then says proving his suspicion or not is irrelevant because "[t]hey loved each other, and the community accepted that." No, it is relevant, because maybe the community accepted them because the "freres" weren't regarded as "married" homosexuals but as legally bound "buds." It's possible that men could be affectionate with each other but not be lovers, even in the gory middle ages, right?
Again, though, this was clearly a contractual, not a spiritual arrangement. No mention is made of the clergy attending the "ceremony". Besides, if it was a recognized marriage, why not call it, well, marriage?
From what I can gather from the press release (the whole article is in the September issue of the Journal of Modern History), Tulchin is making some pretty big assumptions about human motivation based on written documents only. His facts appear to be right, but they don't support his suppositions. Yet historical accuracy--supported by facts, not conjecture--is less important to him than grounding a contemporary political agenda in a wished-for past.
Even if that is not his intent, it is clearly the result:
Opponents of gay marriage in the United States state that nuclear families have always been the standard household form. Turns out this may not be true. While gay marriage itself may not have happened in medieval times there is evidence that homosexual civil unions did and that could lend important historical insight to the debate.

Of course, insight = support. Maybe I'm being too hard on Tulchin, but scholars have to be aware that what they say--whatever they theorize--will be picked up, expanded, expounded and hyperbolized by those who seek to benefit or "lose" from a particular spin. In the particular case of history, it's dangerous to go too far down the path of unsupported conjecture as opinion becomes fact and history becomes an idealized past.
ADDENDUM: Tulchin mentioned similar arrangements and is probably talking about adelphopoiesis (via TMI), which historian John Boswell attempted to liken to same-sex marriage in his book Same-Sex Unions in Premodern Europe.
Boswell maintained that they were celebrating romantic, indeed sexual unions between two men, and thus a forerunner of gay marriage. Boswell comments on the lack of any equivalent in the Catholic church; however, the British historian Alan Bray in his book The Friend, gives a Latin text and translation of a similar Roman Catholic rite from Slovenia, entitled Ordo ad fratres faciendum, literally "Order for the making of brothers"...
Alternative views are that this rite was used in many ways, such as the formation of permanent pacts between leaders of nations or between religious brothers. This was a replacement for "blood brotherhood" which was forbidden by the church at the time. Others such as Brent Shaw have maintained also that these unions were more akin to "blood-brotherhood" and had no sexual connotation...
It is worth noting that Boswell himself (Same-sex Unions, pp. 298-299) denies that adelphopoiesis should be properly translated as "homosexual marriage". He decries such a translation as "tendentiously slanted". This, however, has not stopped many gay activists from claiming (incorrectly) that Boswell's book purports to demonstrate that "gay marriage" was in fact sanctioned by Christian churches in the past.
At the same time, Boswell claims that "brother-making" or "making of brothers" is an "anachronistically literal" translation and proposes "same-sex union" as the preferable rendering. Boswell's preference, however, is not unproblematic. "Sex", for instance, while pointing to a seemingly "objective" characteristic of the participants involved in the rite, in fact draws attention to the physical condition or biological sex of the "brothers" -- whereas the rites for adelphopoiesis explicitly deny that the union itself is a "carnal" one.
"Union of spiritual siblings" is perhaps a more "neutral" translation than Boswell's "same-sex union."
Friday, August 24, 2007
Have Master's Degree, Will (Can)Teach
To encourage our best minds to become teachers, we should also change the qualifications for becoming one. Students should be able to pursue careers in teaching either by getting a standard teaching credential or by substituting a master’s degree in an academic subject. That way we will eventually end up with more instructors with real academic knowledge rather than prepped with theories about how to teach.

If I wanted to teach History in Rhode Island, I'd have to go back to one of the teacher mills to get all of the proper credentials over and above my MA. I'm sure the other facets of "doing my time" or "paying dues" wouldn't go away, but it would certainly save me a little money and time!
Wednesday, August 22, 2007
Anachronistic History: Ruth Simmons on George Washington
She touched upon the moral contradictions underlying the noble desires of past leaders who were eager to uphold freedom, despite an indifference to the injustice of slavery.

“We all know that these lofty and compelling ideals were largely omitted from discourse when it came to Africans and Native Americans.… In failing to apprehend the corrosive evil of slavery and the immoral inequities that it was to create for generations of descendants, Washington compromised his legacy as a moral leader,” she said.

This is simplistic. Historians agree that Washington's views on slavery certainly evolved from his early manhood up until he freed many of his slaves in his last will. For Simmons to opine that he "fail[ed] to apprehend the corrosive evil of slavery and the immoral inequities that it was to create for generations of descendants" betrays a blindered view of history. The fact is, Washington was hardly indifferent and fully recognized the evils of slavery.
In a letter to the Marquis de Lafayette on May 10, 1786, Washington wrote:
The benevolence of your heart my Dr. Marqs. is so conspicuous upon all occasions, that I never wonder at any fresh proofs of it; but your late purchase of an estate in the colony of Cayenne, with a view of emancipating the slaves on it, is a generous and noble proof of your humanity. Would to God a like spirit would diffuse itself generally into the minds of the people of this country; but I despair of seeing it. Some petitions were presented to the Assembly, at its last Session, for the abolition of slavery, but they could scarcely obtain a reading. To set them afloat at once would, I really believe, be productive of much inconvenience and mischief; but by degrees it certainly might, and assuredly ought to be effected; and that too by Legislative authority.

In September of that year, he wrote to John Mercer:
I never mean (unless some particular circumstance should compel me to it) to possess another slave by purchase; it being among my first wishes to see some plan adopted, by which slavery in this country may be abolished by slow, sure, and imperceptible degrees.

He wrote to Charles Pinckney on March 17, 1792:
I must say that I lament the decision of your legislature upon the question of importing Slaves after March 1793. I was in hopes that motives of policy, as well as other good reasons supported by the direful effects of Slavery which at this moment are presented, would have operated to produce a total prohibition of the importation of Slaves whenever the question came to be agitated in any State that might be interested in the measure.

Or to Lawrence Lewis, in August of 1797:
I wish from my soul that the Legislature of this State could see the policy of a gradual Abolition of Slavery...

Yet, amidst this, he very clearly did equivocate with regard to his own slaves. Consider, for instance, his letter to Tobias Lear on April 12, 1791, concerning the anti-slavery laws in Pennsylvania. Philadelphia, then the capital, would be Washington's home, and he was concerned with how to maintain his own slaves without having them freed (as they would be after residing in Pennsylvania for over six months). He went to secretive lengths to assure his hold on them:
...in case it shall be found that any of my Slaves may, or any for them shall attempt their freedom at the expiration of six months, it is my wish and desire that you would send the whole, or such part of them as Mrs. Washington may not chuse to keep, home, for although I do not think they would be benefitted by the change, yet the idea of freedom might be too great a temptation for them to resist. At any rate it might, if they conceived they had a right to it, make them insolent in a State of Slavery. As all except Hercules and Paris are dower negroes, it behoves me to prevent the emancipation of them, otherwise I shall not only loose the use of them, but may have them to pay for. If upon taking good advise it is found expedient to send them back to Virginia, I wish to have it accomplished under pretext that may deceive both them and the Public; and none I think would so effectually do this, as Mrs. Washington coming to Virginia next month (towards the middle or latter end of it, as she seemed to have a wish to do) if she can accomplish it by any convenient and agreeable means, with the assistance of the Stage Horses &c. This would naturally bring her maid and Austin, and Hercules under the idea of coming home to Cook whilst we remained there might be sent on in the Stage. Whether there is occasion for this or not according to the result of your enquiries, or issue the thing as it may, I request that these Sentiments and this advise may be known to none but yourself and Mrs. Washington. From the following expression in your letter "that those who were of age might follow the example of his (the Attorney's people) after a residence of six months", it would seem that none could apply before the end of May, and that the non age of Christopher, Richmond and Oney is a bar to them.

Clearly, Washington wasn't above subverting his ideals the closer the issue of slavery got to home. However, he also recognized the evils of slavery even if his actions failed to align with this recognition. One common defense of Washington's actions is encapsulated at the MountVernon.org website:
Washington did not lead a public fight against slavery, however, because he believed it would tear the new nation apart. Abolition had many opponents, especially in the South. Washington seems to have feared that if he took such a public stand, the southern states would withdraw from the Union (something they would do seventy years later, leading to the Civil War). He had worked too hard to build the country to risk tearing it apart.

Historian Dorothy Twohig elaborates:
For Washington, as for most of the other founders, when the fate of the new republic was balanced against his own essentially conservative opposition to slavery, there was really no contest. And there was a widely held, if convenient, feeling among many opponents of slavery that if left alone, the institution would wither by itself. Ironically, the clause of the Constitution barring the importation of slaves after 1808 fostered this salve to the antislavery conscience by imparting the feeling that at least some progress had been made.

Further, Twohig explains that Washington, essentially an aristocrat, was nervous about the emotionalism of many abolitionists (particularly Quakers). To that end, she observes:
...given his accurate conception of his own great and pivotal role in the infant country and his fears for the survival of the Republic itself, it is far from likely that he was ever sorely tempted to open as a national issue the Pandora's box that the Constitutional Convention appeared to contemporaries to have closed for the next twenty years.

It is a tragedy that neither he nor the other Founders took action sooner, but their primary concern was with safeguarding a nascent nation, even if that meant sacrificing the central American ideals of freedom and liberty in the process.
However, it is his last will and testament that probably indicates his final, and true, feelings on the matter of slavery.
Washington once told a visiting Englishman that slavery was neither a crime nor an absurdity, noting that the U.S. government did not assure liberty to madmen. "Until the mind of the slave has been educated to understand freedom, the gift of freedom would only assure its abuse," Washington explained.

Yet, as historian Dennis Pogue comments:
His will, drafted a year later, said otherwise. He wrote that he wished he could free all the slaves at Mt. Vernon, but couldn't because some belonged to his wife's heirs, and he didn't want to divide families. Unless Martha or her heirs freed the Custis slaves as well, families would be broken up. [Henry] Wiencek [author of The Hairstons: An American Family in Black and White] believes George was trying to persuade Martha to use her influence on her heirs to free the Custis slaves--but she did not. Washington also stipulated that the freed children be taught reading, writing and a trade.
"His will was a rebuke to his family, to his class, and to the country. He was well ahead of people of his time and place," Wiencek said. "This is George Washington's true legacy. He'd said the slaves weren't ready for freedom, but at last he said they must have it because of their humanity."
Washington's will swiftly gained the public attention envisioned by its author, appearing in print almost immediately, with no less than 13 editions published in 10 different cities in 1800 alone. And yet, if Washington hoped that the decision to free his slaves would compel large numbers of his countrymen to follow his lead, he was sadly mistaken.

His final act, though noble, didn't inspire the sort of change that he foresaw. He tried--if only fitfully and sometimes half-heartedly--to end slavery. He could have done more. Yet, Simmons' critique that Washington "fail[ed] to apprehend the corrosive evil of slavery" is clearly wrong. He knew it was immoral and that its existence ran counter to the claims of the American Revolution, but he felt his hands were tied by the practical politics of the day. Further, it is unfair of Simmons to expect that Washington could have had the Delphic vision to see "the immoral inequities that [slavery] was to create for generations of descendants." Like the other Founders, Washington believed that slavery would wither away. He was clearly wrong. Nonetheless, he recognized that to succeed, slaves (or former slaves) needed to be educated and prepared for a life of freedom before actually being set free.
Ultimately, Washington's failure was one that became more obvious as time went on. He and the other Founders kept the nascent Republic together by acceding to the political practicalities of the day. This meant acquiescing temporarily--as they truly believed--over the issue of slavery. Retrospectively, it is indeed a failure to uphold the American ideals of freedom and liberty for all.
Perhaps Simmons was trying to say that the failure to deal properly with the slavery issue shows that Washington and the other Founders weren't really as great as we should have hoped. Such an argument is hardly new, especially in academic circles. But Simmons has taken a now-common recognition of the acute failure of the Founders with regard to slavery--a critique that is deserved, if kept in context--and applied a layer of hyperbole that results in skewing the perspective too much the other way. It is both undeserved and inaccurate. Washington's writings indicate he was at times rueful, at times hypocritical, and at times idealistic about the issue of slavery and its eventual end. Such conflicting thoughts and actions made him all the more human and make it all the more remarkable that he was able to do what he did.
Friday, August 17, 2007
President Madison, kinda
The Ogden School District needs a big eraser. After naming a new school "James A. Madison Elementary School" in May, a history teacher pointed out this month that the fourth president of the United States didn't have a middle initial.
"I'm blindsided," school board member John Gullo said. "I hate being embarrassed."
Gullo heads the American Dream Foundation, which donated a large painting of the former president to the school. An accompanying plaque does not have the mystery initial.
Word of the mistake reached superintendent Noel Zabriskie, who verified it and called the company that was making a sign for the new school. The call came in time for the error to be fixed on the sign. It is set to be installed Friday.
Some school letterheads will need to be replaced.
The board voted May 23 to approve the school name as "James A. Madison." The majority of board members chose Madison because the school borders Madison Avenue. Several board members also said they feel James Madison was a great president.
Tuesday, August 14, 2007
Maybe Someday....
Medieval/Ancient World

The Department of History at Wheaton College (MA) seeks a tenure-track assistant professor with scholarly and teaching expertise in the fields of classical, late-antique, and/or medieval history. The History Department is especially interested in social or cultural historians whose thematic expertise includes gender, popular religion, material culture, cross-cultural contact, or the history of science or the environment. Geographic field open; preference for Celtic world, northwestern Europe, or southeastern Europe. Ph.D must be in hand at time of appointment. Send letter of interest, CV, and three letters of reference by November 26, 2007 to Alexander Bloom, Chair, Department of History, Wheaton College, Norton, MA, 02766. Preliminary interviews will be conducted at the 2008 AHA annual meeting. For more information, please contact hr@wheatonma.edu. AA/EOE. Wheaton College seeks educational excellence through diversity and strongly encourages applications from women and men from historically underrepresented groups.

Within an hour of my home. Possibility of working with a really interesting English professor. Damn the practical bent!!! Ok, vent over. Seriously, if I had the time and money to go for a PhD, I would. But I've got a couple kids to raise, and they come first! Besides, my engineering job ain't bad.
FOIA and Historians' Focus
Maybe that will change now that a certain former First Lady seems to be hiding behind FOIA...
Sen. Hillary Rodham Clinton cites her experience as a compelling reason voters should make her president, but nearly 2 million pages of documents covering her White House years are locked up in a building here, obscuring a large swath of her record as first lady.

Clinton's calendars, appointment logs and memos are stored at her husband's presidential library, in the custody of federal archivists who do not expect them to be released until after the 2008 presidential election.

A trove of records has been made public detailing the Clinton White House's attempts to remake the nation's healthcare system, following a request from Bill Clinton that those materials be released first. Hillary Clinton led the healthcare effort in 1993 and 1994.

But even in the healthcare documents, at least 1,000 pages involving her work have been censored by archives staff because they include confidential advice and must be kept secret under a federal law called the Presidential Records Act. Political consultants said that if Hillary Clinton's records were made public, rivals would mine them for scraps of information that might rattle her campaign.

I bet they would. Currently, it is political activists--and not historians....um....--who are calling for the release of the Clinton papers.
I'm trying not to play to the historians-are-liberal stereotype, but it sure seems that the motivation for hammering the Bush Administration on this issue--while partly altruistic--also nicely coincides with an ideological desire to dig up dirt on the current Bush as well as the past Bush, Reagan and Nixon Administrations (Republicans all). Hey, I'm sure there's dirt to be found! But any similar desire to dig into Carter or Clinton Administration records seems muted in comparison. At least that's my impression. Maybe historians need to wake up and realize that their relative silence on Clinton and Carter records feeds into stereotypes and undercuts the profession. If nothing else, they should bring Carter and Clinton into their arguments for professional (and political) cover.
Thursday, August 09, 2007
Post-Modern Conservatives
Kirk, like the best of the postmodernists, is calling not for a radical relativism—i.e., an assertion that truth doesn’t exist—but for a humility of the intellect. The postmodernist challenges us to learn from the aspects of the truth presented by today’s less powerful voices; the traditionalist Kirk asks us to show the same respect for those who are less powerful because they are voices of the past. (This idea is echoed in Chesterton’s phrase “democracy of the dead.”) Modernity viewed the past as an oppressor whose shackles must be removed from thought, but postmodernism can view the past the way a healthy new republic treats its former king: He is no longer king, but he must not be denied the full rights of citizenship—for he, too, has much to teach us.

Like the postmodernists, Kirk presents us with what has been called a “romance of the marginal.” Gerald Russello’s fine book demonstrates how looking at the margins can give us directions to the moral and intellectual center; how Lyotard’s “crisis of narratives” can yield a vision of the permanent things.

In an interview with John J. Miller at NR, Russello explained:

Kirk’s conservatism is “postmodern” in the sense that it was never modern, and therefore is not burdened as liberalism is with the weaknesses of the Enlightenment worldview. Kirk’s emphasis on imagination, his concern for the imagery a society creates of what it admires or condemns, his treatment of tradition and history as not objective but one in which we participate and can change, and his devotion to what Burke called the “little platoons” of society all have parallels in postmodern thought. Moreover, Kirk himself saw this. In 1982, he wrote in National Review, that “the Post-Modern imagination stands ready to be captured. And the seemingly novel ideas and sentiments and modes may turn out, after all, to be received truths and institutions, well known to surviving conservatives.” With liberalism moribund, it “may be the conservative imagination which is to guide the Post-Modern Age.”

Looking at the historical margins for the moral constants exhibited by the silent majorities of the past could be--and has been--a worthwhile endeavor for conservatives.
But perhaps this bit--from an interview with James G. Poulos at the American Spectator--goes further in explaining Kirk's post-modern leanings:
Reviewers since the 1950s have noted the internal dilemma of conservatives: once they start articulating what it is to be conservative, the battle is already half lost. Bernard Crick, in a review of The Conservative Mind, thought that Kirk was in an intellectual quandary, because "[h]aving no significant conservative tradition, Americans are put to the unconservative task of inventing one." In order to defend what they thought was worth conserving, many mainstream conservatives once believed that they had to engage liberalism on its own terms, in a "dialectic" mode that is foreign to the rhetorical, didactic, and imaginative modes more amenable to conservative expression. Kirk tried to overcome that difficulty by wrapping his arguments in a protective covering of narrative imagination. To state outright the traditions one wishes to preserve, and the means to do so, succumbs to the liberal temptation of reducing to reason things that are not always rational. To cast the same lessons through stories and autobiography, however, can leave enough room for the creation and preservation of tradition to take root. I think this is what Kirk was trying to do in the overall body of his work.

Now we can see the allure of post-modernism for anti-modern traditionalists like Kirk and many other conservatives. (Admit it, most of us can at least be partially described this way). Yet, I don't doubt it'll still rub conservatives the wrong way to be lumped in with the contemporary--and satirized--post-modernist. But they, like the po-mo social and cultural historians, may discover important truths (heh) if they pay a visit to the heretofore ignored masses.
Wednesday, August 08, 2007
A New Movie about....Calvin Coolidge?
When the idea was first suggested to me I barely could muster a yawn. As a "liberal" filmmaker, what little I knew of Coolidge came from New Deal historians who view him as a somnambulant "capitalist tool" whose presidency served only as a prelude to disaster.
What did Karol learn? Well, among other things:
"Why Coolidge?"
"Read his autobiography — 250 pages, large print."
I did, and was intrigued. I moved on to his speeches, all of which he wrote himself. A master at delegating duties, Coolidge was not one to delegate beliefs. His speeches read like lay sermons to the American public, revealing fundamental values and ideals any small "d" democrat should embrace. I was hooked.
Others may disagree, but I can't imagine Coolidge rising to political bait like flag burning, the Pledge of Allegiance, gay marriage, or school prayer. In my opinion, he would have viewed these "hot-button" issues as inappropriate, having nothing to do with presidential business.
What's in the film and what conclusions are drawn?
"Things of the Spirit" takes a deeper than usual look at the personal and political life of our thirtieth President. We already have completed the research, preproduction, production and story edit. Among other things, "Things of the Spirit":Regarding the Coolidge's economic record, Karol explains:
- Is the first fully researched film that has ever been made on Calvin Coolidge and the political and economic issues of the 1920's.
- Dispels the assumption of most American history textbooks that Coolidge was a small-minded materialist who served only as a handmaiden to business.
- Establishes clearly and finally that the Coolidge-Mellon tax cuts of the 1920's generated increased revenue to the federal government; that Coolidge ran surpluses in all his annual budgets; and that by the time he left office he had cut the national debt by one-third.
- Challenges the popular opinion of historians that the Coolidge Prosperity led inevitably to the 1929 stock market crash and the Great Depression of the 1930's.
- Illustrates America's leadership in post World War I European recovery, and prominence in worldwide economic and cultural development during the 1920's.
- Inspires viewers to give open-minded consideration to the political beliefs, moral character and spiritual values of perhaps our most misunderstood President, Calvin Coolidge.
Maybe forgotten Cal will become "cool" again.
Regarding Coolidge's economic record, Karol explains:
Harding, Coolidge, and Secretary of the Treasury Andrew Mellon sought to kick-start the economy by reducing the top marginal tax rate to 25%. They did. Revenues increased dramatically, presaging Arthur Laffer by half a century. Both presidents ran surpluses in all their annual budgets. By the time Coolidge left office, the national debt had been cut by one-third.
New Deal historians maintain that the tax cuts of the 1920s reversed the progressive tax policies of Woodrow Wilson. Far from it. Exemptions increased so much that by 1927 almost 98% of the American people paid no income tax whatsoever. When Coolidge left office in 1929, wealthy people paid 93% of the tax load. During Wilson's last year in office they had paid only 59%.
Less remembered, and less appreciated by contemporary politicians, was Coolidge's aversion to farm subsidies. At great political risk, Coolidge twice vetoed the popular McNary-Haugen farm subsidy bill. As Coolidge put it:
"If the government gets into business on any large scale, we soon find that the beneficiaries attempt to play a large part in the control … and those who are the most adroit get the larger part of it." Although some may wish otherwise, Coolidge was not one publicly to condemn private organizations. Rather than censure the Ku Klux Klan following its massive 1925 march in Washington, Coolidge chose to address the annual meeting of the American Legion in Omaha on "toleration and liberalism," concluding:
"I recognize the full and complete necessity of 100 percent Americanism, but 100 percent Americanism may be made up of many various elements … Whether one traces his Americanisms back three centuries to the Mayflower, or three years to the steerage … we are all now in the same boat … Let us cast off our hatreds."
I can't think of any other public figure who would have dared deliver that message to that audience at that time.
Via Powerblog.
Friday, August 03, 2007
House Revises....um, Erases?...History on the Fly
Details remain fuzzy, but numerous Republicans argued afterward that they had secured a 215-213 win on their motion to bar undocumented immigrants from receiving any federal funds apportioned in the agricultural spending bill for employment or rental assistance. Democrats, however, argued the measure was deadlocked at 214-214 and failed, members and aides on both sides of the aisle said afterward.
Breitbart has video. Will historians get up in arms if the Congressional Record is allowed to be "revised" like this? Would they tolerate similar actions by a GOP-controlled House?
One GOP aide saw McNulty gavel the vote to a close after receiving a signal from his leaders – but before reading the official tally. And votes continued to shift even after he closed the roll call – a strange development in itself.
***
The official House website did not show a record of the vote as of 1 a.m. Friday.
Thursday, August 02, 2007
Kennedy Assassination: Ideological Turning Point of Democratic Party
[Kennedy's] kind of liberalism -- "tough and realistic," as Piereson puts it, in the tradition of FDR and Truman -- was carried away in the riptide of his death. In a crucial and counterintuitive interpretive act, the nation's opinion elite made JFK a martyr to civil rights instead of the Cold War. Kennedy had been killed by a communist, Lee Harvey Oswald, who a few years before had tried to defect to the Soviet Union. Liberals nonetheless blamed the assassination on, in the characteristic words of Supreme Court Chief Justice Earl Warren, "the hatred and bitterness that has been injected into the life of our nation by bigots."
Thus, the assassination curdled into an indictment of American society: "Kennedy Victim of Violent Streak He Sought to Curb in Nation," read a New York Times headline. Until this point, 20th-century liberalism had tended to see history as a steady march of progress. Now, the march had been interrupted by the country's own pathologies. "Kennedy was mourned in a spirit of frustrated possibility and dashed hopes," Piereson argues, and that sense of loss came to define the new liberalism.
Fred Siegel disagrees:
Mr. Piereson's own argument is persuasive and well-presented, but liberalism was never as reasonable as he assumes. The irrationalism that exploded later in the 1960s had been a component of left-wing ideology well before. Herbert Croly, the liberal founder of the New Republic magazine, was drawn to mysticism. In the 1950s ex-Marxists fell over themselves in praise of Wilhelm Reich and the "orgone box," hoping that sexual therapy might replace Marxist theory as the toga of the enlightened. And in the very early 1960s a veritable cult of Castro, informed by Frantz Fanon's writings on the cleansing virtues of violence, emerged among intellectuals searching for an alternative to middle-class conventions.
It's not reason that is at the heart of modern-day liberalism but rather the claim to superior virtue and, even more important, to a special knowledge unavailable to the unwashed or unenlightened. Depending on the temper of the time, such virtue and knowledge can derive disproportionately from scientism or mysticism--or it can mix large dollops of both. "Camelot and the Cultural Revolution" lays bare the long-ignored failure of intellect that hastened the decline of American liberalism. If liberals can belatedly come to grips with their failure to acknowledge Oswald's political identity, they might be able to celebrate a revival that involves more than a Broadway show.
Ed Driscoll, too:
But the actual causes of liberal disorientation regarding Kennedy's death and the motives of his killer predate his assassination by several years. It was during the 1950s and early '60s that liberal elites declared America's nascent and disparate conservative movements to be a greater threat to the nation than the Soviet Union, as illustrated by films of the day such as Dr. Strangelove and The Manchurian Candidate. And the subtext of those films was very much based upon "a vast literature that developed in the '50s and early '60s about the threat from the far right," Piereson says, specifically mentioning Richard Hofstadter's The Paranoid Style in American Politics, and Daniel Bell's The Radical Right.
As Piereson writes, leading up to Kennedy's fateful trip to Dallas, there was a remarkable amount of violence in the south, caused by a backlash against the civil rights movement. In October of 1963, Adlai Stevenson, the Democrats' presidential candidate in the 1950s who had been appointed the ambassador to the UN by Kennedy, traveled to Dallas for a speech on United Nations Day. Stevenson is heckled, booed, spat upon, and hit over the head with a cardboard sign. Stevenson says publicly, there's a "spirit of madness" in Texas. And Kennedy's White House staffers believe that he should cancel his already announced November visit to Dallas.
Thus, at the beginning of November 1963 a framework has been established that the far right is the threat to American democracy, "and that they've moved from heated rhetoric to violent act," Piereson says.
"So when the news spreads that Kennedy has been killed, the immediate response is that it must be a right winger who's done it," Piereson notes. And while the Birch-era right definitely had severe issues, JFK's assassin on November 22, 1963 had, of course, a polar opposite ideology. "When the word is now spread that Oswald has been captured, and that he has a communist past, and they start running film of him demonstrating for Castro in the previous summer, there is a tremendous disorientation at this."
...
The shock that Kennedy was in reality a victim of the Cold War simply did not compute on a national level.