Wednesday, February 23, 2005
A Lull
I will be off line for a while as I have some personal matters that will preclude me from regular posting. I hope to be back in a week or two. Please come back then!
Friday, February 18, 2005
"Patriot's History of the United States "
Larry Schweikart, co-author of A Patriot's History of the United States, was interviewed by Kathryn J. Lopez at National Review Online. In it, he offered his and co-author Michael Allen's philosophy of history:
We assume that people usually mean what they say; that they don't always have hidden motivations; and that ideas are more important than "class" or "race" or "gender." Under more normal times, our book would simply be entitled, A History of the United States, because it is accurate. . . . we reject "My Country, Right or Wrong," but we equally reject "My Country, Always Wrong." I think you'll find us quite critical of such aspects of our past-such as the Founders' unwillingness to actually act on slavery on at least three separate occasions; or about Teddy Roosevelt's paternalistic regulations and his anti-business policies. On the other hand, as conservatives, we nevertheless destroy the myth that FDR "knew" about the Pearl Harbor attack in advance. Instead, we try to always put the past in the context of the time — why did people act then as they did, and was that typical?
It sounds like they have approached the topic responsibly and, hopefully, can join Paul Johnson in showing that conservatives can also write good, respectable academic history. (Unlike some others.) In that vein, Schweikart offers what is probably a summation of what most conservative historians believe:
regardless of America's faults, it has always aspired to be a "city on a hill" and, more often than not, has attained that goal. It remains a beacon of liberty throughout the world, so much so that people still risk their lives just to come here and, despite threats to do so by the Hollywood elites after every election, they do not leave. I only need ask these students, "Can you think of any other country, really, where you'd rather live today?"
In other words, America isn't perfect, it screws up, but it has basically trended to the good. Empire, Hegemony, or whatever you may call it, America clings to an ideal and believes that it can be attained. Impossible? Yes, but the effort produced, and continues to produce, benefits that the world can share.
Thursday, February 17, 2005
Black History Month
The South Bend Tribune has an editorial today regarding the dichotomy that is Black History Month:
Today, what was Negro History Week has evolved into Black History Month. In the midst of this annual celebration, it's clear that progress has been made in recognizing the role that blacks played in this nation's history.
But the idea that this month-long celebration is still necessary to fill in gaps that exist within the rest of the year suggests there's more to be done.
This uncomfortable reality was illustrated in a recent Associated Press story about high profile black Americans who refuse to accept speaking requests during the month of February. For many, the demands of the month highlight the fact that, come March 1, public interest in their work dries up.
As one noted, "Black people were visible during February, but the other 11 months of the year we became the invisible people.''
But that's not to suggest that Black History Month should be eliminated. Those interviewed for the AP story also say the celebration should continue, that black history would become even more marginalized without it.
That's a far cry from what Carter G. Woodson intended when he designated the second week in February Negro History Week. Woodson hoped the week could one day be eliminated when black history became fundamental to American history.
That goal means acknowledging the role of black Americans in this country's rich history -- not as a separate tale told once a year, but as an integral part of a larger story. It means that all of us, particularly those who teach and write about history, should take on the responsibility of telling these stories and recognizing their importance throughout the year.
During a conversation prior to a class, the subject came up of how to teach history while still squeezing in all of the "testable" (state-mandated) material. It was mentioned that it was extremely difficult to cover pertinent African-American historical topics during Black History Month because so many other topics had to be covered. Not being a teacher myself, I asked if it was possible to bring up such topics throughout the course of the year and within the context of the given topic. The response was an ambiguous sort of "Well, yeah..."
I bring this up because I'm not sure at what point these sorts of "mandated" celebrations or acknowledgements outlive their usefulness, if ever. It's not just Black History either; I believe next month is Women's History Month, created to serve the same purpose. The need for such designations was clear: for too long, the historical roles of women and minorities were either understated or, more likely, untaught and unrecognized. However, in my graduate-level experience, no matter what the topic, instructors go to great lengths to discuss the aforementioned roles within their historical context (say, medieval women throughout a survey of medieval history). However, I would venture that it is not for grad-level history students that such periods of topic-specific enlightenment are designated.
As such, I am ignorant of the degree to which blacks, women, and other minorities are woven into the historical narrative taught at the grade and high school levels. Additionally, and by extension, I'm unsure of how many average Americans know about both the unique contributions of minorities and women to American history and, perhaps more importantly, the degree to which these groups shared in the common experiences of American life throughout history. Thus, I think the "high profile black Americans" who refuse to speak in February are doing themselves and others a disservice. It is only through education and exposure that attitudes will change. Instead of sending a "message" meant to register their displeasure with the pace of historical enlightenment in these matters, they are simply self-censoring and ensuring that no message whatsoever is heard.
History Channel
Just for the record, I'd like to plainly state that I find at least 50% of what is on the History Channel to be total crap. Too much "Modern Marvels" and "World War II". (It's not referred to as the Hitler Channel for nothing). Occasionally, original programming like the specials on the French Revolution or the War of 1812 is good (or at least adequate), but for someone with an interest in Medieval History....blech. Outta luck. Unless of course I pay extra to my cable provider and then I can receive 14 more History channels. But I guess that's the point, isn't it? Lowest common denominator served on "regular" cable, intellectual appetite sated on premium channels at greater expense. C'est la vie.
Giving "Politically Incorrect History" the Boot
Max Boot, self-described neo-con, took offense when the NY Times book review labeled Thomas E. Woods Jr.'s Politically Incorrect Guide to American History a neo-con take on history. According to Boot, and many others, Woods's history is not only politically incorrect, it is also historically incorrect. Oh, and as far as being a neo-con work? Nope, try paleo-con, says Boot. Though I would aver that confedera-con may be more accurate once one learns of Woods's backstory. In short, stay away from Woods's work. Boot suggests that a conservative would be better served by reading Paul Johnson's A History of the American People, which I also recommend, or Walter McDougall's A New American History, which is currently underway.
Tuesday, February 15, 2005
How the Right and Left Handle Academic Controversy
Jonah Goldberg has contrasted the liberal reaction and "apology" campaign in the Larry Summers affair with the conservative effort to get rid of Ward Churchill. Goldberg marvels at how much more quickly the Left achieves its goals:
In the Summers affair, free speech and academic freedom barely came up, except among a few conservative commentators and one or two academics who were already known for their political incorrectness. Instead, Summers was a pinata to be bashed for material rewards and to send the message that some subjects — no matter what the evidence — are simply taboo even for serious scholars to discuss in closed-door, off-the-record meetings.
Meanwhile, Ward Churchill, whose scholarship is a joke, whose evidence is tendentious at best, and who called the victims of 9/11 the moral equivalent of a man who sent babies to the gas chambers, is a hero of free speech. He has refused to apologize. Many conservatives are forced to defend free speech and "diversity" in academia while liberals let the NOWers feed on Summers's flesh.
Liberals may despise what Churchill said, but it's a matter of principle now. The normally insightful and fair Mort Kondracke declared on Fox News, "I really think it's useful for universities to have people like this around, to show students and the rest of us just how odious some of the ideas of the far Left are." Would Kondracke punt on a professor who'd endorsed slavery? I somehow doubt it.
Hopefully — and, I think, probably — someone will find enough academic fraud to fire Churchill for cause. No doubt, we'll hear from many on the left about the "chilling effect" such a move would have on "academic freedom," and many conservatives will clear their throats in embarrassment. You really have to marvel at how the other side has mastered this game.
Truth
Douglas Groothuis, Ph.D., director of the Philosophy of Religion program at Denver Seminary, reviewed True to Life: Why Truth Matters by Michael Lynch.
Lynch defends four basic and interrelated claims in this brief but meaty book. First, truth is objective; it is not mere belief. Humans are fallible. We often hold beliefs that we later reject because they have been refuted by reality. Believing something does not make it true. Neither can two contradictory beliefs (such as 'There is a God' and 'There is no God') both be true.
Second, it is good to believe what is true. Therefore, third, truth is worth pursuing intellectually. Fourth, truth has objective and intrinsic value. That is, truth is not a means to an end, but an end itself. If we are thinking clearly, we don't use truth for something higher than truth itself.
'True to Life' addresses more deep philosophical issues than a short review can adequately accommodate. Especially noteworthy, though, are his arguments against relativism ('True for me, but not for you') and pragmatism ('What's true is what works'). These philosophies dominate popular culture and have infected much of the academy as well. Nevertheless, they fail to survive Lynch's careful scrutiny. For example, when Martin Luther King Jr. cried out against institutional racism (speaking truth to power), he based his arguments on objective-truth claims: that African-Americans were equal to whites, that African-Americans had been exploited and that they deserved freedom as equal citizens of the United States. King's power came not merely from his oratorical abilities, but because he was challenging the social consensus and the law itself on the basis of objective truth.
Appeals to relativism and pragmatism would have carried no persuasive power. As Lynch notes, 'Having a concept of truth allows us to make sense of the thought that a claim, no matter how entrenched in one's culture (such as racism), no matter how deeply defended by the powers that be, may still be wrong.'
Lynch rightly notes that if truth exists, and if we should pursue it, certain dispositions or habits of the mind are appropriate. This "involves being willing to hear both sides of the story, being open-minded and tolerant of others' opinions, being careful and sensitive to detail, being curious, and paying attention to the evidence. And it also involves being willing to question assumptions, giving and asking for reasons, being impartial, and being intellectually courageous - that is, not believing simply what is convenient to believe." How much of popular American culture - especially television - encourages these virtues?
Groothuis also questioned "Lynch's secular worldview" and believed that Lynch left unanswered the question of where humans derive this desire for truth. Nonetheless, without my having read the book, Lynch and Groothuis have provided a concise template for those of us who "believe" in truth.
Campos: Affirmative Action to Blame for Churchill
Paul Campos, a University of Colorado law professor and self-described liberal, says that affirmative action is to blame for the hiring of Ward Churchill.
One of the many ironies of this scandal that threatens to undermine academic freedom is that it couldn't have happened if those who decided to hire, tenure and promote Churchill had taken advantage of academic freedom themselves.
The privileges created by tenure are supposed to insulate faculty from political pressures in general and censorship in particular. Yet those of us in the academy, if we were candid, would have to admit that few places are more riddled with the distorting effects of politics and censorship than university faculties.
Academics claim to despise censorship, but the truth is we do a remarkably good job of censoring ourselves. This is especially true in regard to affirmative action. Who among us can claim to have spoken up every time a job candidate almost as preposterous as Churchill was submitted for our consideration? Things like the Churchill fiasco are made possible by a web of lies kept intact by a conspiracy of silence.
The University of Colorado hired Churchill onto its faculty because he claimed to be an American Indian. Anyone who has the slightest familiarity with research universities can glance at his resume and state this with something close to complete confidence.
Churchill thus represents the reductio ad absurdum of the contemporary university's willingness to subordinate all other values to affirmative action. When such a grotesque fraud - a white man pretending to be an Indian, an intellectual charlatan spewing polemical garbage festooned with phony footnotes, a shameless demagogue fabricating imaginary historical incidents to justify his pathological hatreds, an apparent plagiarist who steals and distorts the work of real scholars - manages to scam his way into a full professorship at what is still a serious research university, we know the practice of affirmative action has hit rock bottom. Or at least we can hope so.
As someone of generally liberal political inclinations, I support affirmative action in principle. (And I have surely benefited from it in practice: My parents came to this country from Mexico in the year of my birth, and I spoke no English when I started school.) In theory, the argument that aggressively seeking out persons of diverse backgrounds can enrich the intellectual life of the university has great force.
Affirmative action is based, in part, on the idea that it will help us understand the viewpoints of the conquered as well as those of the conqueror, of the weak as well as the strong, of those far from power as well as those who wield it.
Too often, these sentiments are abused by those who sacrifice intellectual integrity while engaging in the most extreme forms of preferential hiring. Ward Churchill's career provides a lurid illustration of what can happen - indeed, of what we know will happen - when academic standards are prostituted in the name of increasing diversity.
Tenure and academic freedom are hard to defend if they don't provide us who benefit from them with the minimal degree of courage necessary to say, when confronted by someone like Churchill, enough is enough.
If even the extraordinary protections of tenure don't lead us to condemn a fraud of this magnitude in unmistakable and unapologetic terms, then we don't deserve them. What else is academic freedom for?
Monday, February 14, 2005
Priscilla, a slave story
The Providence Journal is running a three-part story on slavery (part 1, part 2, part 3). In particular, it goes "behind the scenes" of Edward Ball's National Book Award-winning Slaves in the Family and tells the story of how Ball, a descendant of a plantation owner, discovered the family and descendants of one particular slave, named Priscilla, who was owned by Ball's ancestor. I'll leave it to you to read the series. There is much here that could be useful in the classroom. I particularly like the graphic of the triangle trade (PDF here) and the very useful page of related links. It is compelling reading.
"The best scholars see the world as children do."
Jeffrey Nesteruk was a "deep thinker." Then he had a daughter and things changed.
My den at home was always my intellectual sanctuary, the place to which I withdrew to write and, as my wife says, "think lofty thoughts." The change is the way those lofty thoughts now mingle more easily with the mundane ones.
Today I write with the door to my den open, something I never did before my daughter was born. She often wanders in while I'm working for a hug or a laugh or a quick spin in the big leather chair that sits in the corner. Though she's getting better at avoiding the carefully arranged stacks of books around my desk, she's knocked more than one weighty philosophic tome from its comfortable perch.
The increased commingling of the lofty and pedestrian that my daughter brought to my life has changed my writing. Slowly -- at first, imperceptibly -- both my scholarship and writing have become more personal, more revealing. I tell stories along with arguing doctrine. In making a philosophical point about the nature of obligation, I'm as likely to disclose how my wife and I share the washing of dishes as I am to cite a Supreme Court ruling.
I also say things more provisionally. I let more of my uncertainties onto the page. I'm less interested in winning an argument than in starting a conversation. . . . More than I did before, I want to give everyone, especially my critics, their due. As my daughter has taught me through her own initial refusal to fit neatly into my life, those who don't conform to your preconceptions about the world can teach you the most. . . .
The best scholars see the world as children do. They share with children an openness and wonder that too often fade in the course of an academic career. They are able to consider anew the unwitting assumptions that often guide our thinking. At some level, even the youngest kids recognize this, knowing that when they meet a genuine scholar they have found a kindred spirit. My daughter has discovered one in my colleague Kerry, an accomplished political theorist. The last time he stopped over, she did several laps around the house, screaming with delight.
Each day both my writing and my daughter take me to the edge of what I thought I knew -- and then push me over, sometimes gently, sometimes not. But, having survived the unsettled stacks of books in my den, I take my own toppling more in stride, too. I've learned that being knocked off a comfortable perch is the best education a scholar can hope for.
Philosophers Within the Context of their Personal Lives
Nigel Rodgers and Mel Thompson offer a few details about the personal lives of some noted philosophers as a reminder that they didn't think in a vacuum.
It is sometimes assumed that the life of reason will lead to a reasonable life. Socrates claimed that the unexamined life is not worth living, and tried to persuade his fellow Athenians to examine their lives and so change them. Plato expected the philosopher-ruler to concern himself with matters social and political, even though his mind might be on higher things, aware of the unreality of the shadows that most take for reality. Indeed, he argues in The Republic that only philosophers are fit to rule, since they alone are fully rational, capable of perceiving the good and controlling their baser passions. Stoics expect life to be lived in accordance with universal reason, and Epicureans may have made happiness the great motivator, but their idea of happiness entailed a life of simplicity. So we might expect philosophers to live well, in the broadest sense of that word.
This is not to say that philosophers are immune from the desires of the flesh. Among the ancients, Diogenes deliberately offends by masturbating in public places and even Plato acknowledges Socrates’ interest in contemplating the beauty of young men. Mediaeval misdemeanours include the extra-curricular interest that Abélard took in his young student Héloise, even if his resulting castration serves as a warning to libidinous teachers. In the modern period we know rather more about the lives and lusts of great thinkers. They are worth exploring, not just for tabloid entertainment, but because they raise questions about the role of rationality in human behaviour.
Artists, musicians, novelists or poets can behave outrageously badly and still be accepted as great exemplars of their art. Indeed, bad behaviour may enhance their reputations and give them a certain glamour. From philosophers, however, we expect nobler, wiser behaviour. Philosophers may not claim to lead lives of impeccable virtue, and their individual foibles do not automatically invalidate their arguments, but it is only fair to ask to what extent the lives of those committed to reason are shaped by that same faculty. Let us consider the behaviour of just a few philosophers of the modern period, with an eye to whether it has influenced, or been influenced by, their thought.
Rodgers and Thompson do just that, but they conclude with the reminder that "bad behaviour of philosophers does not invalidate their work, but sets it within a human context. It illustrates the way in which even the greatest thinkers find their lives governed by forces that are far from intellectual."
Heloise & Abelard: Love Hurts
In The New York Times, Christina Nehring offers a fine wrap-up review of the bevy of Heloise & Abelard books on the shelves.
The story of Abelard and Heloise hardly resonates with the spirit of our age. Not least, its origins in the classroom offend: teachers, we know, are not supposed to fall in love with their students. Heloise, moreover, is no feminist heroine, despite having been one of the best educated women of her age and writing some of its most affecting prose. Nobody who takes the veil on the command of her husband and swears ''complete obedience'' to him can hope to sneak into the bastion of feminism. Today, even the high romance of the couple's liaison strikes us as foreign: all that sacrifice and intensity! We live in a time of broad antiromanticism when teenagers, according to The Times Magazine, have given up on relationships altogether and adults write to the editor to salute their wisdom. ''Romance?'' scoffed one correspondent. It's just ''an excuse . . . to work off sexual energy.''
Small wonder, in this climate, that the anguish Abelard and Heloise suffered for each other renders them even more suspect. What with safe sex, prenuptial agreements and emotional air cushions of every stripe, we have almost managed to riskproof our relationships. The notion that passion might comprise not only joy but pain, not only self-realization but self-abandonment, seems archaic. To admire, as an early-20th-century biographer of Abelard and Heloise does, the ''beauty of souls large enough to be promoted to such sufferings'' seems downright perverse.
And yet there's a grandeur to high-stakes romance, to self-sacrifice, that's missing from our latex-love culture -- and it's a grandeur we perhaps crave to recover. How else to account for the flurry of new writing on these two ill-fated 12th-century lovers?
Hegemony vs. Empire
Lee Harris explains why Hegemony and Empire aren't the same thing. This is interesting to me because I'm always fascinated by how a "lost" word can be resurrected, become a part of the everyday lexicon of discourse and then, finally, turn into an overused piece of jargon. To me, hegemony has become such a word, especially among historians. According to Harris, it was the radical historian George Grote, in his History of Greece, who resurrected the word from obscurity:
Hegemony, according to Grote, was emphatically not empire. On the contrary, Grote used these two different words in order to demarcate between two radically different kinds of political organization, both of which had been illustrated by Athens during two different historical phases of its career. Hegemony had come first; and only afterwards did it degenerate into empire. . . .
The corrupting of an ideal, brought about by human greed and ambition, is always lamentable; but the corruption does not invalidate the ideal -- and it was the ideal of hegemony that George Grote wanted his readers to focus on. True, we may reasonably argue about whether hegemony inevitably degenerates into empire; but we may not reasonably argue that there is no difference between the two forms of political organization. Democracies have often degenerated into tyranny -- yet no one in his right mind would argue that, because of this melancholy fact, there is no difference between the ideals of despotism and democracy.
Hegemony, as Grote used the word, meant the leadership by a single stronger partner of other less strong, but still autonomous partners, undertaken for the mutual benefit of all parties concerned -- and in the case of the Delian league, a partnership that, as a matter of historical fact, brought peace and prosperity to those who were its members, and which, in addition, gave grave second thoughts to the vast and powerful Persian empire whose seemingly infinite resources perennially threatened the autonomy of each of the individual Greek city-states.
For Grote, the fact that the Delian League worked, and worked so well for so long, was a point that needed to be brought emphatically to his reader's attention. Hence, his insistence on reviving the concept of hegemony. There had to be some simple way of referring to mutually beneficial confederacies led by strong, but not overbearing leaders -- leaders who, while leading, continue to respect the autonomy of their partners -- and what better word to serve this purpose than the Greek word that had originally been intended to refer to precisely such a confederacy?
By a sublime irony, this once useful linguistic distinction has been completely lost in the intellectual discourse of contemporary politics, and lost due to the fact that the world's greatest living linguist, Noam Chomsky, has perversely chosen to conflate the two words as if they were merely synonyms for the same underlying concept. Thus, Grote's precise and accurate revival of the original Greek concept has been skunked forever by Chomsky's substitution of the word hegemony for the word empire, so that nowadays the two are used interchangeably, except for the fact, already noticed, that hegemony sounds so much more sophisticated than empire. Why use a word that ordinary people can understand, when there is a word, meaning exactly the same thing, that only the initiated can comprehend?
Yes, sometimes we historians do love to bandy terms about for the sake of sounding sophisticated, don't we? Nonetheless, no matter how "high falutin'" the term "hegemony" is, those who have succeeded in conflating empire and hegemony have taken away the ability to accurately apply the term, in its correct connotation, to the United States. So, in the "Grotian" sense, the U.S. is a hegemony. To many, though, it is a hegemony in the Chomskyite sense. When such a confusion exists, perhaps it is time to re-retire a term.
Saturday, February 12, 2005
Did Post-Modernism Spawn the Rhetorical Use of History?
Ralph E. Luker makes an interesting point in a recent post at Cliopatria:
What interests me about the way Churchill, Malkin, and some of Churchill's apologists use history is that if you can find a precedent for an action in the past (Malkin's Japanese internment; Churchill on Lord Amherst's use of smallpox) it becomes, on the one hand, a convenient excuse for similar action in the present; or, on the other hand, justification for blatant distortion of history because we know that there was holocaust intent anyway. Proyect makes his support of Churchill's holocaust argument quite explicit here. If you doubt it, you are a 'holocaust denier' and, yet, Proyect is finally persuaded that, in this case, the evidence denies it. Think about it. If past precedent justifies present action or blatant distortion of the historical record, we can repeat the 19th and 20th century's horrors; and we have, indeed, bought the post-modern notion that all the world's merely a text, to be construed as we will.
I think he's on to something, but I'd add that sometimes the political use of history is done not so much as a reinterpretive exercise as an obfuscating, or ignorant, one. (See below.)
Thursday, February 10, 2005
Sufism and the Future of Islam
Stephen Schwartz gives a history lesson on Sufi Muslims in an attempt to discover whether they could act as a potential "Islamic unifier." Schwartz seems to conclude that there are some possibilities, but that the West should be careful:
. . . it would be an error to believe that Sufism can or should become a direct agency of political change, or an implement of Western strategy, in the transformation of the Islamic world. "Official Sufism," whether financed by the U.S. or not, would no more be appropriate than the merging of the state and the Catholic religious orders, such as the Franciscans, Jesuits, or Dominicans, in such countries as Nicaragua, Poland, or the Philippines. The esoteric nature of Sufism, which is necessarily private and personal, must be respected. Those are the lessons of the Jerusalem Sheikh Bukhari's rejection of a seat on the Palestinian Waqf, and it is in their spirit that Westerners sincerely seeking the betterment of the Islamic world should approach him and the millions like him.
Selling History to Save It
The Rhode Island Historical Society has decided to sell off historical artifacts in an effort to keep a balanced budget and maintain funding levels. The decision is based on the recommendations (PDF) of the Financial Options Task Force of the RIHS.
According to the story published today in the Providence Journal:
In a letter sent to members earlier this week, the Rhode Island Historical Society says it may be forced to sell as many as 40 objects from its permanent collection, including a rare Colonial-era desk once thought to be from the famed Townsend-Goddard workshop.
The proposed sale, which society officials say can still be called off if enough money is raised over the next few weeks, follows last month's $8.4-million sale of a Townsend-Goddard tea table from a local private collection, as well as plans by the Providence Athenaeum to sell an original folio edition of John James Audubon's Birds of America.
"I hope people realize that nothing less than the survival of the historical society as an institution is at stake here," said executive director Bernard P. Fishman. "We've made all the staff and budget cuts we can make. Unless we can raise enough money from outside sources, we have no choice but to consider selling parts of the collection."
Fishman said that cost-cutting measures, including sharp reductions in staff and a decrease in hours at the society's library facility on Hope Street, had resulted in a balanced budget in 2004.
The potential price tag: about $10 million, most of which would be used to improve care for and access to the society's collections, and to shore up the society's $4.5-million endowment.
Fishman said he would gladly call off the sale if enough money could be raised to cover the society's expenses and bolster its endowment. Otherwise, the society, which has already solicited proposals from several auction houses, plans to move forward with the sale.
"We truly hope the sale can be avoided," Fishman said. "But before making our plans public, we did pursue private inquiries regarding potential donors. Needless to say, the results weren't encouraging."
According to Fishman, the bulk of the sale would involve objects that may be valuable in the eyes of collectors, but which have little connection to Rhode Island and its history. . . But the star of the sale would be a "nine-shell" block-front desk once attributed to the Townsend-Goddard workshop and now thought to be by Providence cabinetmaker John Carlile.
. . . Unlike other pieces under consideration, the desk has impeccable Rhode Island credentials: commissioned by Providence merchant Joseph Brown in the late 18th century, it remained in the Brown family until 1944 when it was donated to the historical society. . . "There's no question the desk is an outstanding piece," Fishman said. "But you have to weigh that against the future viability of the historical society itself. What point is there in holding on to the desk if the result is that the society ceases to exist?"
But others, including some prominent historical society members, aren't so sure.
Edward F. Sanderson, executive director of the state Preservation & Heritage Commission and a Rhode Island Historical Society board member, said he supported the decision to sell pieces not directly related to Rhode Island. But he expressed dismay over the potential loss of the Brown desk, which he called "a cultural artifact of the first rank."
"This isn't just a rare piece of furniture, it's the rarest of the rare," said Sanderson. "As far as I know, there's only one other nine-shell desk in existence. It's literally in a class by itself."
Former Rhode Island School of Design Museum decorative arts curator Thomas S. Michie called the Joseph Brown desk "arguably the most important piece [of furniture] in the historical society's collection and one of the greatest pieces of Colonial-era furniture" made in Rhode Island. . ."The fact that it was probably made in Providence and not in Newport is hugely important. . .While great Newport pieces are rare, great Providence pieces are virtually unheard of."
. . . Pieter Roos, executive director of the Newport Restoration Foundation, said the sale might even threaten the society's museum accreditation.
"Basically, you're selling off your birthright to pay your expenses," said Roos, who is also the Rhode Island representative of the New England Museum Association. "That's a very dangerous precedent."
Briefly summarized, the RIHS was hurt through the 1990’s and early 00’s when growth in programs was not matched by commensurate growth in the financial underpinnings of the organization. Long the possessor of excellent collections and for many years the beneficiary of an admirable level of state funding, in the late 1980’s and early 1990’s the Society saw its reach begin to exceed its financial grasp. Programs grew, but revenues and endowment did not keep pace. Significant effort was devoted to the Heritage Harbor project, which did not materialize. Development was under funded, and state aid ominously began a slow decline after 1997. The gap was filled by increasing the endowment draw. During the stock market boom of the late 1990’s this appeared to be sustainable, but with the end of the boom the increased draw coupled with declining asset values rapidly took their toll, and the endowment began to shrink at a precipitous rate. Between 1998 and 2004 the endowment fell from over $6,500,000 to $4,564,000, more than a 29% reduction.
In 2002 the Board took several decisive actions by changing the Society’s executive director and authorizing him to begin reducing the operating expenses to match the decreasing revenue stream. Much work was done pursuant to this decision, and by 2003 the Society was operating in a much more balanced financial state. This year, 2004, the endowment draw is back down to approximately 5.3%. Ominously, however, the RIHS had to spend four months fighting a 50% reduction in its state funding, equivalent to approximately 15% of the operating budget. While restored at the end of the budget process, the proposed cut was a clear sign that the Society’s days relying on automatic state funding are decisively over. . .
The unfortunate side effect of all of these actions has been to substantially reduce the Society’s programs, collections support, and public presence. The proposed state budget cuts are evidence of this. The Society’s ability to invest in new programs to generate additional revenues has been severely crippled – neither the funding nor the staff is available any longer for these purposes. (Financial Options Task Force Report, p. 2)
Selling some of its holdings is one way the RIHS can maintain financial security. To co-opt a marketing term, it also needs to make its product more appealing to the public, which would in turn increase revenues. After years of ad hoc management, the RIHS has implemented a long-term strategic plan (PDF) that lays out goals on a set timeline. It seems that the RIHS has righted its own sinking ship, though it is still bailing water. Unfortunately, to stay afloat, it needs to throw some of its collection overboard. The sooner it can stop the sinking, the better.
Wednesday, February 09, 2005
Niall Ferguson: Bush Should Learn from Wilson - Don't Withdraw Too Soon
Niall Ferguson approves of President Bush's refusal to set a timetable for troop withdrawal and calls on the historical example of Woodrow Wilson for support
Woodrow Wilson was a man who lived to have his illusions shattered, though not long enough to see the complete collapse of his new world order into a Second World War. President Bush needs to learn from his example. And the first lesson he needs to learn is that just getting people to vote is no more than a beginning. Get the follow-through wrong and you can easily end up with 'one man, one vote--once.'
Lesson No. 2 concerns the duration of American military interventions. Wilson finally took America into World War I in 1917. Yet by 1919 the troops were on their way home from Europe, leaving the Europeans--in effect the French--to police the peace treaty. Premature U.S. withdrawal from Iraq in the wake of last week's elections would run the risk of leaving no one to police the peace.
That is why the president is more right than he knows to reject calls for an arbitrary departure date. The price of liberty in Iraq will be, if not eternal vigilance on the part of the United States, then certainly 10 years' vigilance.
Ferguson also called President Bush the first idealist-realist. "Part of him understands very well that the success of American policy in the Middle East depends on tenacity and the credibility that comes with it. But another part of him is excited to the point of unrealism by his own grand visions of a democratic revolution throughout the Middle East." I think by "unrealism" Ferguson refers to Bush not following the "realist" school of foreign policy with regard to the Middle East. Hence, President Bush is being "unreal" (not of the realist school) when he believes in "a democratic revolution throughout the Middle East." It must be this, because if said revolution does occur, wouldn't it indeed be "real"?
Greenspan Praises Adam Smith
Alan Greenspan praised Adam Smith
. . . saying that 18th-century philosopher Adam Smith was "a towering contributor to the development of the modern world."
Greenspan, who this month began his final year as Fed chairman, delivered the Adam Smith Memorial Lecture at Fife College in Kirkcaldy, Fife, Scotland, where the early proponent of free-market capitalism was born in 1723.
"In his `Wealth of Nations,' Smith reached far beyond the insights of his predecessors to frame a global view of how market economies, just then emerging, worked," Greenspan said in a text of his remarks that was released in Washington.
"In so doing, he supported changes in societal organization that were to measurably enhance world standards of living," Greenspan said.
Smith, who died in 1790, argued that while free markets might appear chaotic, they actually were guided to produce the right amount and variety of goods by what he called the "invisible hand" of supply and demand. If a product were in short supply, its price would rise, prompting more producers to enter the market, Smith argued.
"It was left to Adam Smith to identify the more general set of principles that brought conceptual clarity to the seeming chaos of market transactions," Greenspan said. "Most of Smith's free market paradigm remains applicable to this day."
In his 17 1/2 years as Fed chairman, Greenspan has promoted his own free-market views that the economy works best with less government regulation.
Smith's arguments proved powerful support for proponents of free markets and free global trade, Greenspan said. These developments, he said, have helped boost average per capita global economic output by 1.2 percent annually since 1820, enough to double living standards every 58 years.
Greenspan said the Great Depression of the 1930s did provide support for a time to those who argued that communism, with its government control of economic decisions, represented a better approach.
But the argument between free markets and government-controlled economies ended with the fall of the Berlin Wall in 1989 and the collapse of the Soviet Union, said Greenspan, a Republican first appointed by President Reagan.
"There was no eulogy for central planning. It just ceased to be mentioned, leaving the principles of Adam Smith and his followers ... as the seemingly sole remaining effective paradigm for economic organization," Greenspan said. "A large majority of developing nations quietly shifted to more market-oriented economies."
Does Historian Rashid Khalidi Believe that "History Repeats"?
In a review article in Reason titled "Imperial Waltz: Is American power good, bad, or distressingly reluctant?", Michael Young points to the faulty "reasoning" of Rashid Khalidi, author of Resurrecting Empire: Western Footprints and America’s Perilous Path in the Middle East, as Khalidi falls back on "historical lessons of empire" with regard to America's attempt to foster democracy in the Middle East.
Rashid Khalidi, who holds the Edward Said Chair in Arab Studies at Columbia University (and who dedicates his book to “EWS”), is one of those who doubt the sincerity of this project. His Resurrecting Empire is a tribute to the headlock of history, the idea that the lessons of the past must somehow invariably apply in the same way today. Khalidi comes to readers from the commanding heights of expertise, arguing that what “seems so painful to those with any real knowledge of the region” is the unwillingness of the U.S. to accept that it is stepping into the boots of past imperial powers, and that “this cannot under any circumstances be a good thing and cannot possibly be ‘done right.’”
Khalidi thus offers a very different view from that of Perle and Frum, for whom the messenger can alter the nature of the message. Where the latter see American power as a force for good, Khalidi, who has no doubts about America’s imperial bent, rejects the possibility that it might represent something potentially constructive.
If this is the use to which history is put, it is stifling indeed. Khalidi is unimaginative when it comes to seeing the possible advantages of American power in the Middle East. Instead, he falls back on a standard template of Arab criticism, arguing that the Iraq war was part of “a new form of hegemony over the region, in collaboration with Israel.”
The first half of that judgment is possibly true, but the second is based on scrawny evidence—mainly that in 1996 a group of American neoconservatives, including Perle, helped write a policy paper titled “A Clean Break” for Israeli prime ministerial candidate Benjamin Netanyahu. They outlined a vision for the region very much to Israel’s advantage, including going after Iran and Lebanon’s Hezbollah, “weakening, containing and even rolling back Syria,” and bringing the Hashemites to power in Iraq.
The only problem with Khalidi’s theory is that the paper sought to influence Israeli rather than American behavior. Much in it was never implemented. This was not because neocons wouldn’t have liked to see a Middle East in that image but because policy is not made in the way Khalidi suggests. Position papers rarely have a direct influence on grand strategy; contending bureaucracies kick in to muddy the waters. As the neoconservative publicist Max Boot put it, describing the influence of his Bush administration brethren: “While neocons temporarily won the policy argument in some areas, the president and his inner circle are hardly marching in lockstep with their agenda. As in all administrations, there are competing factions at work, and no side will ever win all the policy arguments.”
The Israeli link to Iraq is important to Khalidi because he sees it as proof that the Americans are hypocritical democratizers. The pity is that Khalidi never asks how Arabs and Palestinians have benefited from the overthrow of Saddam, arguably the worst tyrant modern Arabs have known. Was it never conceivable that a democratic and multiethnic Iraq would provide Arabs with a contrast to their usual condition under dictatorship? Or that it would highlight Israel’s mistreatment of the Palestinians? Or that it would prove that Islam and democracy are compatible?
Alas, the know-how of Arab intellectuals has rarely generated democratic change in the Middle East during the last half-century. Many, like Khalidi, came to reject transformational fantasies about the region, over time becoming de facto guardians of the status quo. It was not a status quo they liked, but one they accepted after the failures of their preferred alternatives, the most obvious one being Arab nationalism. Frustration was palliated by a perception that the region was far more complex than the uninitiated suspected, and that to understand its dynamics one had to be an expert. And so Arab “expertise” slowly bred sterility—most flagrantly in Iraq.
Security is a word rarely seen in Khalidi’s text, nor does one ever get a sense from him how the 9/11 attacks shaped U.S. Middle East policy. If America’s war in Iraq is old-fashioned imperialism, then it cannot be a preliminary effort to change a region that, intentionally or not, dispatched 19 young men to kill 3,000 innocents. The administration botched its justifications for war in Iraq, and probably post-war revival, but underneath was a sensible view that Middle Eastern autocracy had generated frustration and much hatred for America, and that therefore it was necessary to change the situation.
Ironically, Khalidi and his comrades long blamed Washington for failing to do just that. So how does Khalidi react to America’s ambitions in Iraq? He says that if American support for democracy and human rights were “lasting and consistent throughout the region,” it would be welcomed. But sweeping change doesn’t occur in a flash; piecemeal progress is necessary. Yet accepting this would mean that Khalidi would have to embrace the transitory advantages of the U.S. invasion of Iraq (advantages that may be more difficult to discern today thanks to bungling post-war policy), which would imply that American imperialism might occasionally be a force for good. Yet he has already made clear that this proposition is unacceptable. The sage has boxed himself in.
FDR Believed in Social Security Privatization
Duane D. Freese uncovered an interesting quote from Franklin Roosevelt concerning his vision of Social Security.
In the important field of security for our old people, it seems necessary to adopt three principles: First, noncontributory old-age pensions for those who are now too old to build up their own insurance. It is, of course, clear that for perhaps 30 years to come funds will have to be provided by the States and the Federal Government to meet these pensions. Second, compulsory contributory annuities that in time will establish a self-supporting system for those now young and for future generations. Third, voluntary contributory annuities by which individual initiative can increase the annual amounts received in old age. It is proposed that the Federal Government assume one-half of the cost of the old-age pension plan, which ought ultimately to be supplanted by self-supporting annuity plans. [Franklin Roosevelt, Message to Congress on Social Security on Jan. 17, 1935]
In short, FDR believed privatization of a portion of Social Security was the proper course to follow.
Partisanship of the Founders
Pejman Yousefzadeh gives a history lesson to those who decry the current political climate and long for the stateliness of the political dialogue of the U.S. Founders.
In the past few years, there has been a revival of interest in the lives and political careers of the Founders -- a revival that is helped by the many books that have come out on the lives of the Founders. Two of the most recent books are Joseph Ellis's biography of George Washington and Ron Chernow's biography of Alexander Hamilton. Both books are instructive when considering the state of partisanship in America.
Ellis's biography of Washington expertly captures the reasons why upon his death, Washington was eulogized as 'first in war, first in peace, first in the hearts of his countrymen.' Washington was no intellectual but he was possessed of superb judgment, tact and discretion. He was also undeniably courageous and his feats of valor on the battlefield justified the respect and adoration he was given. Washington was the natural choice to chair the Constitutional Convention, and was considered indispensable as the first President of the United States. His near decision to quit after his first term raised alarm bells, and his decision not to stand for a third term -- which he surely would have won -- laid down the example of selflessness we expect of American politicians.
And yet, Washington was oftentimes subject to some of the most vicious calumny imaginable -- calumny that almost caused Washington's retirement from politics after his first term, and impelled him to gladly quit the Presidency after a second term. The Father of his country was accused of being senile, a puppet in the hands of Alexander Hamilton, a closet monarchist who sought to become an American Caesar, and so on. These were not just occasional jibes but part and parcel of a concerted campaign in the Jeffersonian Republican press that sought to demystify the first President so as to make it easier to campaign against initiatives like the Jay Treaty, or the financial reforms implemented by Alexander Hamilton as Secretary of the Treasury. Indeed, one of the newspapers most responsible for seeking to trash Washington's public standing was the Aurora, a Republican newspaper headed by Benjamin Franklin Bache, the grandson of Benjamin Franklin (the bitterness between the Founders apparently extended to their descendants). So rancid was the campaign against Washington that he ended up breaking off relations with two fellow Virginians -- Jefferson and Madison -- because of the part they played in seeking to ruin Washington's reputation. When the first President of the United States finally passed away during the Administration of John Adams, his funeral was mostly peopled by Federalists. Thomas Jefferson -- then the Vice President -- actually boycotted the funeral.
Chernow's biography of Hamilton reveals just how cutting and wounding the political debate became during the age of the Founders. With Hamilton as the de facto head of the Federalists, and with Jefferson and Madison commanding the Republican political machine, political vitriol reached nearly frightening levels. Federalists accused Republicans of being "Jacobins" who were all too willing to excuse the bloody excesses of the French Revolution and the diplomatic depredations of Citizen Genet and Charles Maurice de Talleyrand -- who prompted the XYZ Affair. Republicans on the other hand gleefully portrayed Hamilton and the Federalists of being "monarchists," "Anglomen" who were too easily seduced by Great Britain (and probably wanted to return America to Britain's political orbit), rapacious swindlers who wanted to use banks to oppress the agrarian population (creating a central bank was, of course, one of Alexander Hamilton's chief projects). It did not help that the angry political debate between Hamilton and Jefferson was colored by accusations of personal scandal. In one of his many pseudonymous writings, Hamilton -- who was an astonishingly prodigious writer and who probably would have celebrated the advent of the Blogosphere had he lived to see its development -- made what now appears to be clear and insulting reference regarding liaisons between Jefferson and his slave Sally Hemings.
In revenge, Republican pamphleteers -- also writing pseudonymously and often -- gleefully harped on Hamilton's involvement in an adulterous affair with one Maria Reynolds, who collaborated with her husband James to garner hush money from Hamilton to keep the affair secret. Initially, Republicans believed that Hamilton paid the hush money because James Reynolds possessed information relating to Hamilton's abuse of his position as Secretary of the Treasury for pecuniary gain. In order to clear his name, Hamilton was forced to make a public and humiliating admission that the hush money was to cover up the affair -- which naturally led to more vitriol directed at Hamilton. Needless to say, all of this anonymous pamphleteering caused political opposition to spill over into blind hatred between the Founders. Hamilton and Madison -- once collaborators on The Federalist -- eventually became bitter enemies. Because of his illegitimate birth, Hamilton felt that he had to be especially protective of his reputation, meaning that anytime that anyone attacked him, instead of ignoring the attacks, Hamilton fired back with even greater literary and oratorical pyrotechnics. Sometimes these pyrotechnics ended up leading to challenges to duels -- one of which, of course, ended Hamilton's life at the hands of Aaron Burr. Consider that: a former Secretary of the Treasury and the head of the Federalist Party ended up dueling with the sitting Vice President of the United States. And to think that we got excited when Vice President Dick Cheney told Senator Patrick Leahy to perform anatomical impossibilities upon himself.
Tuesday, February 08, 2005
Dresden: Don't apologise - understand
James Woudhuysen has written a scholarly article that attempts to remind us of the historical context surrounding the bombing of Dresden and to dispel any sense that it was a unique action -- one at odds with "standard" Allied bombing policy -- in order to destigmatize the event and convince us that an apology for the Dresden bombing is unwarranted.
Critics "strip the Allies' behaviour from its historical and social context, and, in place of that, give the unchanging mantra of 'man's inhumanity to man' an accent that concentrates always on individual emotions."
They propagate the meme that all war is absurd, and that the bombing of Dresden in particular was "over the top" and irrational.
They imply "a kind of moral equivalence between the Allies and the Axis."
He then provided some historical context. First, bombs in WWII were hardly "smart," and bombing was an imprecise venture. Second, it was the stated aim of Bomber Command to target labor sources (read: average German citizens and their families) to help cripple the Nazi war machine. Finally
The enormity of Dresden means that it deserves sober assessment. Yet the recent discussion and calls for an apology provide the opposite. The further Dresden retreats into history, the more it is viewed as a timeless allegory of human evil, for which current and future generations must feel guilt, and atone. The actions of one man, 'Bomber' Harris, are singled out as barbarities that epitomise the depths to which man can sink in the pursuit of war. Such a treatment sheds no light on the Second World War as an historical event, but is all too revealing about the crisis of self-doubt among the Allied elites today.

Woudhuysen also offered a few reasons as to why the Allied bombing of Dresden has become so negatively viewed:
The purpose of this essay is to examine what Dresden really meant in the context of the Second World War, and what makes the contemporary understanding of Dresden problematic.
. . . the growing worry about military adventures today finds echoes in the understanding of the Second World War in history. While the Nazis are always used as a sure symbol of Evil, there is less of a sure sense that 'our side' was quite so Good as it was assumed to be. The contrast between the Allies' willingness to bomb Dresden and its failure to bomb the railway lines to Auschwitz caused no small amount of breast-beating around the recent anniversary of the death camp's liberation. Of course, the Allies' motives were never quite as pure as they have been painted. But it says a lot about the profound sense of self-doubt in American and British society today that even the Second World War has now become subject to the same kind of moral relativism that informs discussion of more recent conflicts.
By focusing on the horrors of Dresden, too many critics in practice whitewash the rest of the Allies' actions in the war - not just the use of the atomic bomb, but also, for example, Churchill's manoeuvres in the Indian sub-continent, which cost millions their lives, or the betrayal of Partisans in southern Europe, or the fake 'de-Nazification' of Germany after 1945.

He also sees the change in perception as a result of domestic British and international politics and notes how confusion has crept in
Shortly after Dresden, a few British clerics and obscure Labour MPs issued feeble protests. Thereafter, it suited the postwar Attlee government to distance itself from Churchill. By criticising what it took to be his 'excesses', Labour could reinforce the mistaken perception that the Second World War was a just war, whose sole aim was the defence of democracy against fascism.

He concluded
By representing the strategic bombing campaign as questionable, Labour could confirm postwar British society in the view that the rest of the tactics deployed were therefore just, too. This, the most important and lasting domestic political legacy of the war, is the one that critics completely ignore. They are part of the problem of the Second World War, and will not aid any clarification of its nature.
It also suited Joseph Stalin to go on about Dresden. . . at war's end, Stalin had control of East Germany [and] he liked to make propaganda about the rapaciousness of his wartime partner.
Ironically, Jörg Friedrich sees Dresden as a diversion from an interest Britain in fact shared with Germany - that of ridding the world of Stalin. For their part, right-wing apologists for Dresden can agree that Britain declared war not just against Germany, but also against the Soviet Union. The upshot is that today, people remain baffled as to what the Second World War was really all about, or should have been about. Instead of clarity on the most important event of the twentieth century, confusion reigns.
An analysis of the context in which the Dresden bombing took place shows how simplistic most explanations of it are. The Second World War was not a straightforward moral battle of Good v Evil, in which Dresden formed a necessary part. Nor was Dresden an abomination that took place outside of British military policy at the time. And it should certainly not be read, as it tends to be today, as a tale of universal, unchanging human depravity - a depravity symbolised by one man's actions but one for which we all continue to remain culpable.

I would argue with Woudhuysen's attempt to downplay the degree to which WWII really was "a straightforward moral battle of Good v Evil," especially as it concerned the battle against Nazi Germany. However, I agree that Dresden was not a unique instance of a misbegotten Allied bombing policy formulated by one man.
Dresden needs to be understood, not apologised for. It demands a careful historical perspective about a unique set of circumstances, not an emotional spasm of wailing about the intrinsic aggression of all humankind.
Monday, February 07, 2005
Political Correctness: Enemy of Art
Maureen Mullarkey reviews Roger Kimball's Rape of the Masters: How Political Correctness Sabotages Art in which he provides a "pathologist's report" on Art History.
Art history was once an esteemed participant in the methods, values and goals of humanistic inquiry. Its purpose was to yield the broadened literacy that results from genuine scholarship and encounters with great art. But it has become a polemical tool for dismantling the concept of greatness and, with it, the conditions of civilized life. Roger Kimball puts it starkly: “Its enemy is civilization and the … assumptions on which civilization rests. Its aim is to transform art into an ally in the campaign of decivilization.”

Many non-scholars apparently enjoyed the work, though one believed that Kimball set up "straw men" by using the most extreme examples of "art-crit" to make his point. For more, ESR (Enter Stage Right) interviewed Kimball last month. (via Justin Katz and Political Theory Daily Review)
Specifically, it is Western civilization that draws fire from academics hostile to the source of their privileges and unmindful of the origins of their own cultural assumptions. As phrased by Keith Moxey, distinguished professor of art history at Barnard and Columbia: “All cultural practice is shaped by political considerations.” So it follows that art history is—must be—”a form of political intervention.”
We have heard this before. In 1963, Leonid Ilyichev, Khrushchev’s spokesman for the arts, declared: “Art belongs to the sphere of ideology.” Addressing a meeting of Party leaders and workers in the arts, he insisted that “art always has an ideological-political bent that … expresses and defends the interests of definite classes and social strata.”
He might have been addressing the College Art Association. Certainly, traditional art history still survives; but increasingly, it is practiced against the odds and the mental habits of tenured art appreciators. Kimball’s angry alarum is an extended postscript to “Tenured Radicals”, his 1990 chronicle of humanities departments corrupted by politicized agendas. Beneath the veneer of donnish rationality lies a drive that is, at heart, a mad endeavor: the compulsion to abuse and discredit traditional values—including standards of achievement—as they manifest themselves in art.
Kimball warns against the debasement of intellectual life by opaque theorizing that sets out to mystify, shunning all obligation to clear thinking. He concentrates on the visual arts because this is the prime arena where intellectual pretension joins rhetorical inflation to promote a crack-pot cleverness that denatures the object it studies. In the arts, there is no brake on confusion between the verbal and the visual. Ornate utterance intervenes to keep us from recognizing what we see with our own eyes.
Addendum: I've included a link to reviews by "regular" readers, above. Perhaps I should do this more often as there is value in reading how other, presumably non-academic (albeit self-selecting), readers viewed a book.
Addendum II: I just discovered that I wasn't the only one as Dartmouth scholar Mikhail Gronas has already done some research in the area.
Gronas, an Assistant Professor of Russian Language and Literature, is interested in literary tastes. He wants to know why people read certain books, what drives those reading decisions, and what lies behind readers' reactions. Sociological surveys are fine, he says, but the answers are shaped by the questions. With online book reviews, like those at Amazon.com, he can begin to get a quantitative measure of taste (from the number of stars assigned by readers to a book) along with a qualitative assessment (from the personal commentary provided by readers). . .(via Political Theory Daily Review)
"Amazon.com book reviews are not based on literary theory," he says. "They are written by everyday readers, not scholars, who bring a new perspective to the topic of taste. Since online reviews are voluntary, they offer honest opinions that aren't prompted by specific questions." [emphasis mine]
British Anti-Slavery Movement - Why Then?
Adam Hochschild's Bury the Chains is reviewed in the Economist.
Why an ancient practice, condemned neither by the New Testament nor by Christian tradition, was recognised as unacceptable by growing numbers of men and women in the second half of the 18th century has long puzzled historians. Mr Hochschild avoids big-picture answers and concentrates on the extraordinary characters involved. . .
His mainly British cast is a large one. . . The objects of their concern were by no means all helpless victims. Slave rebellions rocked the West Indies throughout the 1790s and beyond. After the French abandoned Santo Domingo to the British in 1793, the army's attempt to put down Toussaint L'Ouverture's slave revolt cost more soldiers than it lost in the American war of independence. At Westminster, even MPs who approved of slavery questioned its expense. . .
[I]t once was fashionable to explain the ending of slavery as an economic consequence, and to treat changing attitudes as secondary. Slavery, it was argued, was ceasing to be profitable. With industrialisation, investors in slave ships and plantations had better places to put their money. Reformers, in effect, were pushing at an open door. Even if the dates worked better—and there was money in slavery well into the 19th century—mechanical stories of this kind would explain at best lack of resistance, not anti-slavery pressure.
Opponents of the slave trade agitated not only for new laws. They badgered courts to look at old law in fresh light. . . [Hochschild] remind[s] us how a committed minority can persuade a majority to see what at first they cannot or do not want to see. In one of many vivid passages, Mr Hochschild describes a simple but electrifying piece of evidence that Clarkson placed before an enquiry into the slave trade by the Privy Council in 1788. It was a diagram of a slave ship, the Brookes, showing slaves tightly packed and chained in rows. For many people, this was perhaps the first time that the reality of the slave trade had impinged upon them: with their own eyes, they could see its cruelty.
Sunday, February 06, 2005
Enlightenment and Islam in the Arab World
George Shadroui analyzes a few questions while attempting to answer one big one: is democracy possible in the Muslim and Arab world? He concludes that it is possible, and comments that the man leading the way is indeed a surprise
Ironically, it has been George W. Bush, a man some criticize for wearing his Christianity on his sleeve, who has embraced the notion that the Islamic world is open to change, that Muslims do want to self govern and that Muslim women do crave full participation in culture and politics. Bush’s formulations can be crude and even troubling at times, but surely he is not wrong to encourage open-mindedness where the hopes of millions of Muslims and Arabs, not to mention Israelis, are concerned. It won't be imposed, but it can be encouraged if we are realistic and tough-minded. If not for their sake, what about our own? Though I wish the president was a little less starry-eyed in his idealism, in the aftermath of 9/11, it remains a fair question.
Himmelfarb on Trilling, Eliot and the origins of "neoconservatism"
Gertrude Himmelfarb remembers the effect that Lionel Trilling's 1940 essay on T.S. Eliot in the Partisan Review, "Elements That Are Wanted," had on her and her fellow young "neo-Trotskyites."
I had never read Eliot's essays or the journal he edited . . . I was, however, a faithful reader of Partisan Review, which was, in effect, the intellectual and cultural organ of Trotskyites . . . Many years later I remembered little about Trilling's essay except its memorable title, "Elements That Are Wanted," and the enormous excitement it generated in me and my friends. . . it was a revelation, the beginning of a disaffection not only with our anti-Stalinist radicalism but, ultimately, with liberalism itself. Trilling has been accused (the point is almost always made in criticism) of being, not himself a neoconservative, to be sure, but a progenitor of neoconservatism. There is much truth in this. Although he never said or wrote anything notable about the "practical sphere" of real politics (he was not a "public intellectual" in our present sense, commenting on whatever made the headlines), he did provide a mode of thought, a moral and cultural sensibility, that was inherently subversive of liberalism and thus an invitation to neoconservatism."

What struck Himmelfarb was that
Trilling . . . did not believe morality was absolute or a 'religious politics' desirable. But Eliot's vision of morality and politics was superior to the vision of liberals and radicals, who had contempt for the past and worshiped the future. Liberals, in the name of progress, put off the realization of the good life to some indefinite future; radicals put off the good life in the expectation of a revolution that would usher in not only a new society but also a new man, a man who would be 'wholly changed by socialism.' Marxism was especially dangerous, Trilling found, because it combined "a kind of disgust with humanity as it is and a perfect faith in humanity as it is to be." Eliot's philosophy, on the other hand, whatever its defects and dangers, had the virtue of teaching men to value "the humanity of the present equally with that of the future," thus serving as a restraint upon the tragic ambition to transcend reality. It was in this sense, Trilling concluded, that Eliot bore out the wisdom of Arnold's dictum. Eliot's religious politics, while maleficent in the practical sphere, contained elements wanting in liberalism--"elements which a rational and naturalistic philosophy, to be adequate, must encompass."

Further, Himmelfarb reminds us, particularly historians, that literature can help us understand the past.
Trilling . . . did not reflect much upon the kinds of moral questions, or "moral values," that occupy us today: marriage, family, sex, abortion. What interested him was the relation of morality to reality--the abiding sense of morality that defines humanity, and at the same time the imperatives of a reality that necessarily, and properly, circumscribes morality. He called this "moral realism." . . . Trilling wrote about "the dangers of the moral life itself," of a "moral righteousness" that preens itself upon being "progressive."
. . ."Moral realism" is Trilling's legacy for us today--for conservatives as well as liberals. Conservatives are well disposed to such realism, being naturally suspicious of a moral righteousness that has been often misconceived and misdirected. And their suspicions are confirmed by the disciplines upon which they have habitually drawn: philosophy, economics, political theory, and, most recently, the social sciences, which are so valuable in disputing much of the conventional (that is to say, liberal) wisdom about social problems and public policies.
The element that is still wanting, however, is the sense of variety, complexity, and difficulty--which comes, Trilling reminds us, primarily from the "experience of literature," and which at its best informs the political imagination as well as the moral imagination.
Friday, February 04, 2005
Carnival of History #2
Hm. Well, I missed Carnival of History #1, but thanks to Glenn Reynolds, I've been led to #2. As a medievalist (at least I'm minoring in it with an MA thesis pending...yeah, a bit different), I found Hugo Holbling's review of Inventing the Middle Ages by Norman F. Cantor very interesting. I've not read the book, but based on Holbling's review, I may have to pick it up. There's much more to dig into, and I intend to do just that. In short, these History Carnivals have something for everyone.
Toward a Secular Theocracy
Eric Cox reviews Paul Edward Gottfried's Multiculturalism and the Politics of Guilt: Toward a Secular Theocracy.
To Gottfried, multiculturalism is not a dry academic philosophy but rather the political tool of an identifiable ideology being used to redefine Western notions of normality for the purpose of seizing and exercising central state power. Examples of specific items on the multiculturalists’ agenda, Gottfried argues, are affirmative action; so-called hate crimes legislation and workplace anti-discrimination policies that exclusively target white males; gay marriage and adoption; educators’ inculcation of sensitivity toward all religious beliefs and lifestyles (except Christianity); and bans on expressions of Christian faith in public places. Gottfried acknowledges that people can support any of these policies without being multiculturalists—that is, without having the intent of denigrating whites, heterosexuals, and Christians—but he contends that multiculturalists promote them with the express purpose not merely of winning votes from various minority groups, but of the much larger goal of remaking Western societies, in the same sense, one might say, that the National Socialists and the Communists attempted to do so. . . Gottfried also disputes the commonly held notion that multiculturalism is merely a philosophy of moral relativism advocating “tolerance” for all categories of people. Instead, he writes, it is an ideology that views with utter contempt anyone who belongs to the wrong category of humanity, such as white-male-Christian-heterosexuals, and actively attempts to undermine institutions that support or reflect their values and lifestyles.
Psychoanalytical Critique Of Therapy Culture
Patrick Turner reviews Rob Weatherill's Our Last Great Illusion: A Radical Psychoanalytical Critique Of Therapy Culture, a Freudian look at the "therapy culture. . .[that] tackles the subject from a psychoanalytic perspective informed by postmodern cultural theory. Intended for a general audience, the book nonetheless assumes a fair degree of familiarity with a wide range of thinkers and critical concepts." Indeed, if the review is any reflection of the book, readers better prepare for some thick reading. Nonetheless, Weatherill's conclusion, as interpreted by Turner, is interesting.
What we now have in the progression from modernity to postmodernity is the absolute hegemony of a generalised, new age therapeutic ethos of 'care' and 'well being' that has dissolved all previous boundaries between private and public self and is impervious to the ideological divisions of an earlier age. For Weatherill, this triumph of a liberal, as opposed to radical, progress finds its apogee in the current emphasis within government public policy on promoting emotional skills and self-management rather than equality and control. In the commercial sector the dominance of the therapeutic can be seen in contemporary forms of marketing, customer care, product design and service provision that speak to a desire to be looked after, flattered and stroked. The explosion of personalised, 'new age' forms of expertise that offer eclectic strategies for gaining 'emotional intelligence', self mastery and overcoming barriers to achievement in any domain imaginable from sex to creativity to work is further evidence of the triumph of the therapeutic.

Is this an intellectual way of wondering whether it is more important to be free or well-cared for? Perhaps.
The Hidden Inequality in Socialism
David R. Henderson, Robert M. McNab, and Tamas Rozsas have a new article (pdf) for The Independent Review, "The Hidden Inequality in Socialism." According to the authors
In recent years, researchers on transition economies have concluded that income inequality increased in the former socialist countries of eastern Europe and central Asia despite the liberalization of political and economic life. This judgment, however, places too much credence in the data reported by socialist planners and underestimates the cumulative effect of the myriad inequalities present under socialism.
Bicycle History
Steve Weinberg takes a look at Bicycle: The History by David Herlihy.
[T]o call the book a traditional history is misleading. Herlihy uses brief boxed asides, artwork, photographs, cartoons, technical drawings and other tools to dazzle. The oversized format could qualify the tome as a coffee table book, except that I think of that term somewhat negatively, connoting something rarely read, and for good reason. 'Bicycle,' on the other hand, is compulsively readable.

In an interview about the book, Herlihy spoke of the technological and social impact of the bicycle
Q: What was the impact of the invention of the bicycle?
A: The bicycle had a substantial technological impact. It is not an exaggeration to say that the bicycle business of the 1890s spawned the automotive industry. During the peak year of production in 1896 some three hundred firms in the United States alone produced nearly two million bicycles, and many of these companies went on to make automobiles using the same highly advanced production systems. Many automotive pioneers, including Henry Ford, started out working with bicycles. And bicycle technology also helped produce the first airplanes. The Wright brothers were bicycle mechanics; they used bicycles for wind tunnel experiments and built the Wright Flyer in their workshop.
Q: What about the social impact of the bicycle?
A: The bicycle changed social life in all sorts of ways--for women in particular it provided a justification to dress more sensibly and a means to travel without supervision. And in the early twentieth century, when cars were still prohibitively expensive, millions of working-class people relied on the bicycle for everyday transportation. This is still the case in the developing world. And of course the bicycle has long provided healthy and fun exercise to people of all ages and backgrounds.

Seems like an interesting topic to consider within the context of Turner's Frontier school of History.
Thursday, February 03, 2005
"The Lessons of 1787"
Leslie H. Gelb fears that the newly-elected Iraqi Assembly doesn't adequately represent Kurds and Sunnis. Thus, he looks to the Lessons of 1787 learned by the U.S.: form a committee for the job.
. . .the new Assembly should forgo drafting the constitution and establish a special constitutional committee for that purpose. Such a committee would be selected to better reflect both Iraq's population and its power elites.
It's easy to keep the process legal and ensure it does not subvert the election. The election law gives the Assembly the responsibility for putting together the constitution. But it does not say the Assembly has to draft the document itself, or forbid it from assigning the duty to another body. Of course, the special body would still have to submit the draft for the Assembly's approval.
Members of this special constitutional committee would be chosen by the Assembly itself and could be Assembly members as well as appointees of the new government. The composition of the committee is critical. It should include Sunni Arabs in sufficient numbers; if they are not given a stake in the new Iraq, most will continue to help their vile insurgent brethren, willingly or unwillingly.
The committee must also engage Iraq's James Madisons and Ben Franklins. The constitutional committee has to include the real power brokers in religion, politics and commerce. It's not at all clear how many of these types were elected on Sunday. American officials probably don't know them all, but Iraqis do.
As a practical matter, these local leaders would provide the political cushioning necessary during the yearlong drafting process and would be essential to the final passage of the constitution. The public vote on its approval comes a year hence and requires a nationwide majority. But Iraqi leaders have agreed that the constitution can be blocked by a two-thirds vote in three of the nation's 18 provinces. That could happen in the three Kurdish provinces or in the four controlled by Sunni Arabs. With such stalemate would probably come civil war.

To assume that civil war naturally follows from such a "stalemate" is presumptuous and ignores or underestimates the foresight of those who will draft the constitution. Wouldn't one think that compromises would be accepted prior to risking civil war? Gelb may be correct in believing a committee is the only feasible way to construct a constitution, but he still underestimates the political acumen of the Iraqi people. After coming so far, it is hard to believe that any committee would allow any civil war "trip-wires" to be written into the constitution. But many in the West have been underestimating the Iraqis for quite a while now, haven't they?
"Gay-braham" Lincoln?
Cathy Young at Reason surveys the "was Lincoln gay" debate from left, right, and center.
Wednesday, February 02, 2005
Demographics and the Culture War
Stanley Kurtz reviews four books under the heading "Demographics and the Culture War" in the most recent Policy Review.
Bush=Wilson?
In 'W' is for Wilson?, James Pinkerton worries that a "Wilsonian" foreign policy will lead President Bush into other liberal policies.
Blinkered History
David Aaronovitch has written a commentary, "Potty history," dealing with the call in Britain to set some "History" standards. It is a good intro to what a historian deals with when analyzing events: one must balance the view from the top with the view from the bottom. Aaronovitch, an ideological "lefty" who tends to favor "people's history," has a good outlook on the value of teaching a shared history
I understand the need for a shared story that both is and isn't history. [Tim] Collins's speech also included this rather good passage: 'We cannot be surprised that some within the next generation do not value our parliamentary democracy if they know nothing of the English civil war, do not vote if they are not taught about the struggles to widen the franchise, and do not value any authority figures if they are not told the inspiring tales of the national heroes of our past.'

Again, he is talking about Britain, but the warning is worth remembering. However, I must confess that what I take from this is that History is a balancing act. Youth are best served if their historical instruction is in the form of a holistic narrative, as opposed to an ideological one. One can raise up heroes of all classes -- soldiers, diplomats, and farmers alike -- without denigrating any of them.
But such an approach also has dangers. Palestinian schoolchildren are not taught about the extermination of the Jews. The decision has been made, I imagine, that this story would be obliterating. It would over-shadow their own national myth and make the sudden disaster that befell them in 1948/9 seem somehow understandable. The result of this omission is, and can only be, a complete failure to comprehend what has happened. I imagine that Israeli education similarly denies the real experience of the Palestinian Arabs.
We could easily make the same mistake here. Mr Collins is asking the historian Andrew Roberts to draw up a prospectus for what children should know about British history. It seems to me that this remit is too narrow.
However bad it may be not to know what Nelson's ship was called, isn't it infinitely worse that virtually nobody in Britain would be able to correctly answer the question, 'After the Soviet Union, which country lost the most citizens in the Second World War?' It was China. The same China which is, at long last, beginning its rise to world prominence. So what will be more important for every schoolboy to know about: Trafalgar or the Cultural Revolution? Zài-jiàn.