Wednesday, July 27, 2011

Fourteen: Bourne in the USA



Randolph Bourne is long forgotten and so not much missed – well, at least by anyone besides me. But should someone who wrote the following about education be forgotten by anyone with an inkling of an interest in education – now, especially, when we live in these educational dark ages (EDA): “If the school is a place where children live intensively and expressively, it will be a place where they will learn”? Should such a person not be missed? Bourne continued, “The ideal educational system would continue with the adult all through his or her active life, sharpening skill, interpreting experience, providing intellectual tools with which to express and enjoy” (Education and Living, p. vi). How far is this from the practice in our schools? What school allows kids to live intensively and expressively? Where do people see education as something that is lived beyond the school and “school age”? Are schools not places that merely teach conformity and a few skills (of the narrowest kind)? Have we not, to quote Bourne, “clos[ed] off the school and box[ed] up learning,” and in the process “really smothered education” (p. vii)?

Back in 19whatever, he wondered aloud, or at least in print, “Are we not getting a little restless over the resemblance of our schools to penitentiaries, reformatories, orphan asylums, rather than to free and joyous communities?” (p. 3) Education, at its best, was a prescription for life – as if such things could be prescribed; how much better would it be if life were a prescription for education? What would that be; what would it look like? “The problem of American education,” indeed, “is now to transform an institution into a life.” It may seem like a cop-out not to answer the question of how this is to be done, but, in a way, Bourne, working in the early days of public education, had a better vantage point from which to see what education and living might look like. After about one hundred years of EDA, it is pretty hard to say now what it should look like.

One thing is certain, however: these words of Bourne constitute a prediction of where we have arrived today, and of the point from which we need to slam the machinery into reverse and back out so we can try again: “[T]here is a danger,” he wrote, “that we shall create capable administrators faster than we create imaginative educators. It is easy to forget that this tightening of the machinery is only in order that the product may be finer and richer.” Instead of the EDA maxim that education should make us “finer and richer,” we need to promote “the creative life.” As Bourne continued, “Unless it does so result in more creative life [education] will be a detriment rather than a good. For it is too easy,” he concluded, “to make the running of the machine, the juggling with schedules and promotions and curricula and courses and credits, the end.”

What we have is a system on autopilot that really doesn’t do anyone much good. Even if it were supposed to make us finer and richer by making us more competitive, it isn’t doing that. And why would it? In conformity, after all, we have a commitment to past practice. And past practice is merely the recipe for the replication of disasters we have already experienced. Thus, when you think about it, much of what one learns in a business school is how to make profit from prophecy – i.e., in the financial world. Wealth – social wealth – is not created as a result of business schools. Generally speaking, the same can be said for any vocational school – it teaches past practice and becomes, by and large, a recipe for redundancy, and for an inability to respond to experience because of trained incapacity.

The pillars of the American educational system are modes of transportation and businesses that are virtual dinosaurs – and when one uses the word dinosaur one is being unfair to these beasts, because while one certainly believes that these modes of transportation should be consigned to the dustbin of history (one could only wish), at least the real dinosaurs weren’t responsible for their own destruction. Oil and the automobile, the products of the combustion engine – human creations – are dragging humanity along on a suicidal mission (I write this on an airplane, so one can only marvel at one’s own ability to deny reality, but hey, who’s perfect?). But, instead of really doing something about this, really saying “STOP, STOP, STOP, we need to think about what we are doing,” the teaching institutions of the EDA merely prescribe more (testing) and better (results). C’est la vie, or c’est la mort, as they have the Gaul to say.

We have the death of the humanities occurring before our eyes, at almost every humanities-based institution in the United States, and along with it we can predict the end of humanity. Some of the most forward-looking institutions in the country, some of the most progressive and imaginative – at one of which I feel lucky to be employed – fight an uphill battle in the face of the forces aligned against all valuable and significant educational ventures.

In 19whatever, when Bourne was writing, it may have seemed that there was an opportunity really to attempt something different. Then came two world wars (one of which the Americans should not have been involved in, and the other the product of the former), a hot Cold War, and a plethora of other ideological wars that have constrained society in quite fundamental ways. So one is left feeling that that dream is over. Not even the 60s could bring it back; not the end of empire; not the fall of the Berlin Wall; not the Arab spring; nothing.

And nothing shall come of nothing. 

Saturday, July 23, 2011

Thirteen: Job on the Market



There was a man in the land of Uz, whose name was Job; and that man was perfect and upright, and one that feared God, and eschewed evil. 
The decade of the 1990s was an interesting time for the historian on the academic job market. And for no one was this more so than for the African-Americanist. Indeed, if someone wanted to understand many of the dynamics of race in the United States at that time there were few better ways of doing so than spending a few years searching for an African-Americanist position at a college or a university. Certainly, I am very grateful for having had this experience, regardless of its outcome. Had it turned out differently, had I been given a position at one of the many institutions to which I applied, I might well have lost as much as I would have gained. I wouldn’t be able to write what follows, for example.
When I was ending my graduate studies at the University of Pennsylvania and beginning the process of applying for a job, I was often told about the difficulty of getting such a position because I was not African American. Indeed, I had known this would be the case because my advisors had told me to rethink my thesis topic for this reason. But I had forged ahead with my thesis on Philadelphia’s African Methodists anyway, mainly because I remained under the illusion that I wasn’t really at graduate school for the purpose of getting a job. I had some rather dated and romantic notion that graduate school was about intellectual inquiry and that all of the students there were absorbed by their desire to understand ideas rather than preoccupied with crass material considerations like how to get a job. I had found a topic in which I was interested and that was good enough for me. 
But, if there had been some truth to this romantic picture of graduate schools in the early years of study, when it came close to time to consider what to do after graduate school everyone seemed to change, or at least I began to see that most students hadn’t really been that interested in the ideas per se, but rather in what those ideas might do for them, if they were suitably packaged and endorsed by the appropriate thesis advisor (one who had some pull on the dreaded market). Once I had recognized this change, I soon started to pick up on that constant banter relating to the job market. And what I heard most often from my white Americanist colleagues was that the deck had been stacked against them, or us. All the positions, they proclaimed loudly, almost shamelessly, were going to black candidates less qualified than they believed themselves to be. Interestingly, the ones I heard this from most frequently were all men, who, almost to a man, ended up in very good positions in solid research universities. Their white male patrons had made an extra effort on their behalf, seeing as how the deck had been so stacked against them.
I, however, either dissented from this opinion or listened to the banter from the sidelines. I had been studying race, after all, and it seemed the height of hypocrisy, on my part at least, to seek a job teaching about African-American history and experiences and not recognize the realities of the market, for which affirmative action was small compensation. I have never come to accept that affirmative action is reverse discrimination, and even at this early stage of my quest for employment I quietly reminded people that there may be a disconnect or disjuncture between what they (social historians all) taught and what they complained about. But in not joining this chorus of complaints, and in not endeavoring to resituate myself on the job market as an Americanist (though with my topic this was difficult, as one wasn’t supposed to think of African American history as American history), I ended up metaphorically on the outside looking in at the celebrations that would be held in honor of the complainants who secured their tenure-track positions. 
Nonetheless, one has to be prepared to face the fact, before simply lamenting the fates, that one contributes to this saga – we are not mere victims, we make our own history, we are not Job with God and Devil conspiring against us. There would be plenty of instances when I would make a misstep, when a choice was laid before me and I made the wrong one, or even refused to recognize the choice. I probably should not have made fun of cricket at my interview at Oxford University; and constantly bringing gender issues into my analysis of migration was not calculated to endear me to members of my audiences who wanted me to confine myself to the issue of race. Sometimes writing a less than honest and more positive book review of a person who might have supported my candidacy would have been more judicious; and ending presentations to historians with words from Toni Morrison’s Jazz was ill-advised indeed. In short, one pays a price for appearing to be an arrogant bastard! I knew these things, but persisted in asking myself, “Can that which is unsavoury be eaten without salt?” Perhaps we are Job after all.
But, even with all these marks of my agency, I still had some bizarre experiences on the job market. I was lucky initially to secure a temporary position as a lecturer at Princeton University from 1989 to 1991, so even I had friends with a little influence. And during those two years I returned again to the job market with a book contract in hand and Princeton University emblazoned on my lapel. Quite consistently, therefore, I would get strong initial responses only to then receive the cold shoulder as prospective employers learned more about who I was. There was no doubt that white professors were treating me in this way (I rarely received these responses from black professors, who seldom happened to be the ones in a position to decide whether to hire me). For example, there was one convention of the Organization of American Historians that I distinctly recall. A fellow student at the University of Pennsylvania came up to me and informed me that she had just had coffee with the chair of the Penn State history department. The chair had asked her whether she knew me and whether I was black. When given the negative response to the second question, the Comfy Chair replied, “That’s a shame. He would have been perfect for the position.” I received rejection letter number 47 (or thereabouts) later that spring after having heard nothing more from Penn State.
The strangest incident involved a two-year encounter with Bates College. After putting in my application one year, I received a hand-written letter on small Bates stationery from the chair of the history department.  It read thus:
Dear Mr. Gregg –
Thank you for your interest in Bates. Your teaching and scholarly interests suggest a very good fit with our opening.  I look forward to the complete dossier from Penn….We will attend the AHA in late December, and I’m already convinced by what you’ve sent so far that we shall want to meet you.
Good luck to us all.
OK, so this was not quite sufficient for me to immediately put down a deposit on a house in Lewiston, ME, but it certainly wasn’t your run-of-the-mill response to an application.
I happened to be visiting a friend up in Maine a couple of weeks later and so visited the college briefly, and wrote back to the chair mentioning this to him. A similarly effusive letter returned, noting that he was “very pleased re apparent level of interest.” I duly received my phone call from the Chair and a meeting was arranged for the AHA convention in Chicago. Of course, when I entered the hotel room in which the interview was held I saw the Chair do a double-take – giving me the same look I receive from some students as I walk into my African American history courses on the first day of each semester. Nonetheless, the interview went well, I recall, and I left the room feeling that I had compensated for my deficiency.
Perhaps I ought not to have been quite so sanguine about my chances of getting the position after having received “the look”, especially after I heard a colleague mention at the convention that the chair of the Bates search had said in his interview that his (the chair’s) job had been made so difficult because so many candidates “looked black on paper.” But I persisted in my fantasies about this position even while I was getting nibbles at other colleges. By February, however, after having had a less than enjoyable experience at Harvard University, I thought I should contact the chair to find out what the status of the search was. I received another hand-written response back. It read:
Dear Rob
I wish I had better news for so strong a candidate and so attractive a person. But the first hard news of any sort for me won’t come before the first week in March, and the signals aren’t good. Perhaps this is all moot by now, and you are signed up elsewhere. For your sake, I hope so. For our sake, if we do not get better news in March, I’ll let you know. 
Needless to say, I heard no more from him. But I did learn that the college did not hire anyone that year.
The next September the college advertised the same position and once again I applied, though not under any illusion that I would be likely to get the job. I did not hear back from the college until I received a letter, dated prior to the AHA convention, indicating that the college had already appointed someone to the position. In a two-page, single-spaced typed letter the new department chair wrote a very defensive piece accounting for the choice his colleagues had made (which ought to have been unnecessary), and explaining to a candidate such as myself (there must have been others) that I was too far along in my career and could no longer be considered “entry level.” Facing unemployment, one can still be too far along in one’s career.
Bates, it turned out, had learned from its debacle of the previous year. If the college was going to attract an African American to fill the position, especially because of the college’s location away from any major city, it would have to get into the market early and secure a person (someone who had not finished their dissertation) before another college came along to snatch her or him up. “Is there iniquity in my tongue?” I said to myself. “Cannot my taste discern perverse things?”
But the most annoying aspect of this, which made me begin to sour quite considerably on the predicament I was in, was the fact that in the first year the college had deliberately not hired anyone, rather than fill the position with someone who wasn't black. This showed me that in many cases the departments were not really concerned about teaching African American history. Rather, they merely wanted to fill some kind of quota, often with very negative implications for the person who was eventually hired. That person would often be hired without the PhD degree, would be dragged onto every committee as the “token” black, and would be required to represent and cater to a whole community, all the while receiving very little sympathy and support from his or her colleagues. Few, it seemed to me, would end up getting tenure under such conditions, or achieving what they might otherwise have done in terms of publishing. But life goes on, and the search for a job continues, for, after all, “Is there not an appointed time to man upon earth? Are not his days also like the days of an hireling?”
Before continuing on to our denouement at Columbia, I should digress and tell you about pleasantries at Harvard University, since it was here (as well as at Oxford University, courtesy of an Empire scholar) that I learned of the clear need for affirmative action; I shall elaborate on this briefly because this is germane to the main point I wish to make, that for all its problems a system of racial and gender preference is necessary to counter the prejudices of many people, who, left to their own devices, would only hire people who resemble themselves. 
For some reason, unbeknownst to anyone but himself, the chair of the Harvard search committee decided it was perfectly appropriate to express his opinions on all issues in front of all candidates. In the process of doing so, he quite openly declared to me (and his search committee) that there had not been any good black or women candidates for the position to which I was applying; the women’s topics in particular had all been rather “silly” and the candidates’ self-presentation in the interviews left a lot to be desired – apparently they were less forceful than the men, and he didn’t like the way they dressed. If he hadn’t done so already, he soon proved to me that his opinion was not worth listening to when at our on-campus lunch he turned to his colleague, who happened to be the one professor who taught about gender in his department (I had expressed interest in her courses, having been involved in teaching the same courses at Princeton), and said, “well, I suppose, that at the first class you ask the students who’s a feminist, and you give an A to all those who put up their hands.” I later learned that the Chair in question had been censured as a result of his behavior. But that didn’t stop the department placing him in charge of another African American search the following year (though this may have been his punishment, so low did they hold the position in the first place). Once again, note, a department hadn’t filled the position when it had failed to locate an African American candidate.
The year I was at Mount Holyoke, 1991-92, a position opened up in African American history at Columbia University. On my inquiring about the position, the chair of the committee was up front about the search, and for this I was very grateful. He did not want me to have high hopes of securing the position. But, he insisted, even if there were no qualified African American candidates Columbia University would definitely be making an appointment. He knew well (as I had told him) what I had been through during the previous year with Harvard, Bates, and some other colleges that I don’t remember anymore, and he did not want to be party to a search that ended in the same way.
But things don’t always turn out as we would wish. I made the short-list of three and was invited to an interview on campus. I spent many days leading up to the interview trying to get my presentation just right. This was made more difficult when, a few days before the interview, I broke my finger attempting to make a steal in basketball. But, as is the case with all such papers, it got written, and while it had some rough spots, I felt quite confident about it. The day before the interview, then, I drove down with my family to New York City to stay with my in-laws. The only problem was that, in grabbing all the accoutrements connected with toddlers necessary to bring my son to stay at his grandparents’ apartment, I had neglected to place my own workbag in the trunk of the car. It was sitting by the back door where it wasn’t going to do anyone any good, least of all me, sitting three hours away in a New York City apartment staring down on 1st Avenue wondering what I should do.
There was in fact only one thing to do: rewrite the paper. This I did, hand-writing on lined paper an outline and some notes for a forty-minute paper. In some ways, this was fortuitous. While I might otherwise have been lying awake trying to overcome a case of nervous anxiety, I was busily rewriting a paper. I may have slept very little in the end, but in the morning I went uptown on the bus towards Columbia feeling almost elated, sensing that everything which could have gone wrong had done so, and that I now had nothing to lose. I should just try to enjoy myself, let the French Fries fall where they may.
And the day turned out to be an absolute triumph for me. The paper I gave, entitled “Invisible Migration, Invisible Church: Gender and Religion in the Great Migration,” seemed to benefit from being presented from my notes. I seemed to communicate better with the audience than I no doubt would have done in reading from a text, and some of the potential deficiencies of the paper could be glossed over with flights of fancy. It was a great success. The question period was animated, and as we left the room I was warmly congratulated by scholars who seemed astonished when they learned the conditions under which the paper had been created. I would hear several weeks later from a graduate student I knew at Princeton that she had heard great things about this paper – especially regarding my comments about the relationship between gender and the study of migration. I even seemed to make a connection with the students. A week or so after the outcome of the search, I received a letter (which I still have) from one of them enclosing a paper he had written, saying how much he and the other graduate students had enjoyed meeting with me, and that the general consensus among them was that I should be given the job. 
But, it was not to be. A few days after returning to Mount Holyoke, I received a call from the chair of the search during which he indicated that he was extremely disappointed, but that the Provost had decided to close down the search, and that Columbia would not, after all, be making an appointment that year. I learned later from a member of the department, via a mutual connection, that the History faculty had in fact taken a vote and had decided to appoint me to the position, but when they had gone to present my name to the Provost he balked at hiring a white man for the job. Naturally, the search was reopened the next year, and my credentials were now insufficient to warrant an invitation to the university. I never received a token of their gratitude for my performance that day. It would be long forgotten, never to be repeated, except here. “My skin is [not] black upon me, [but] my bones are burned with heat.”
With so many departments coming up empty in their attempts to hire African-American African Americanists, colleges and universities devised some pretty interesting ways of ensuring that they would be able to hire the candidates they wanted. Just as Bates had done, they started to offer positions earlier and earlier in the year. I learned this again to my displeasure in another experience while I was still at Mount Holyoke. I came down to an AHA convention and was invited to meet with two professors from Hamilton College for breakfast. This was not to be a real interview, because they had already offered the position to someone else. However, they wanted to keep their options open and talk to several other candidates because, while they had made their offer very early in the season, they hadn’t been able to persuade their appointee to accept the position. Eventually, the members of this breakfast club would learn that their efforts had indeed been in vain.
The “free” breakfast was a bit of a farce, of course, and all I remember really about it was the concern that one of these two professors, a prominent white Africanist, had with the idea of white people teaching African American history. The political climate was such, he felt, that it was no longer possible for a white person to reach an African American student. He gave me a hypothetical, asking me how I would deal with the situation if a black student refused to accept the idea that Columbus discovered the New World, and argued that Africans had preceded him. I no doubt quipped in response that the New World had been discovered by someone from my neck of the woods, Bristol (I didn’t yet know about the Basques and the Chinese), and questioned whether this was in fact as much of a problem as he felt it was. The cantaloupe was nice; and in eating “the fruits thereof without money,” I noticed the “thistles grow instead of wheat, and cockle instead of barley.”
And this question about the kinds of people who were required to teach particular kinds of history was the central question of that moment. Could white people actually teach African American history? In the following years there would be articles in Perspectives – coming both from white professors, like my breakfast companions (who were throwing up their hands at the prospect of being confronted by militant black students), and from a few more Afrocentric professors – basically suggesting that it was not possible for white scholars to teach African American history courses.[1] This was also at the time that Leonard Jeffries had made certain pronouncements and had been removed from his position at CUNY, and at a time when Asante was beginning to make his mark at Temple University. While I have endorsed Anthony Appiah’s impression of the latter as an “Egyptianist” elsewhere, I nevertheless did not find all of Asante’s and Jeffries’ work particularly upsetting. It didn’t seem to be necessarily qualitatively different from the “propaganda of history” emanating from the mouths of their antagonists.
But could I teach African American history? Interestingly enough, I have always enjoyed teaching African American history courses more than others. It is true that I always look forward to it with some trepidation, as it is always uncharted to some degree. But, once under way, I always get a sense that I am alive and learning something when I am in front of a class of students who want to learn about African American history. Seldom are students in the class because it fulfills some requirement; they are there because they are intrigued by the subject matter, and they seem to be as alive as I feel. In addition, I have always had very mixed classes, racially, so that wherever I have been, Mount Holyoke, University of Pennsylvania (as a visiting professor), or at Richard Stockton College, my African American classes have been some of the most integrated on the campus. 
And generally the feedback has been very positive. I don’t really want to flesh this out too much, because after all I am not trying to get a job here. But suffice it to say that my evaluations have not been lower in African American courses than others that I have taught; rather the contrary, they have been higher. Of course, the question might remain, can an Englishman teach American history? Probably not – but I’m an American.
In the end, as we contemplate our experiences and ultimate good fortune, we are left with the nagging question, “Canst thou draw out leviathan with an hook?” The racial code was a leviathan and it was as clearly present in the liberal academy during the 1990s as anywhere else in American society – perhaps more so. Does it remain in place today? I would have trouble imagining that it isn’t the same for those entering the market today. For, have you seen anyone drawing out leviathan with a hook? Ai, there she blows! The doubloon is mine. Fate reserved the doubloon for me.

Twelve: Never the Twain



This was the text for a sermon I had used in applying to be included in the Internationalizing American History conference in La Pietra – run by history wallahs/operatives from NYU. Not surprisingly, I failed to gain entrance into that esteemed company, allowing me the opportunity to write one of the pieces for which I have the greatest fondness – “Making the World Safe for American History” (which I will no doubt refer to again). Fun, fun, fun – especially when Sam Clemens is around!

___________________

On arriving at a Bombay hotel in 1896, Samuel Clemens witnessed a German cuffing a “native,” his servant, on the cheek. This at once reminded Clemens of his youth: he had not seen any similar act for almost fifty years. He recalled his father cuffing a slave in this manner, and remembered another occasion, somehow connected, when he had witnessed the killing of a slave by his master. At the time, these acts had seemed to him quite natural, just as the act now seemed natural to this German. The connections between colonial India and the Old South, although separated by many decades, were brought together quite clearly in Clemens’ mind. He wrote:

It is curious – the space-annihilating power of thought.  For just one second, all that goes to make the me in me was in a Missourian village on the other side of the globe, vividly seeing again those forgotten pictures of fifty years ago, and wholly unconscious of all things but just those; and in the next second I was back in Bombay, and that kneeling native's smitten cheek was not done tingling yet! Back to boyhood – fifty years; back to age again, another fifty; and a flight equal to the circumference of the globe – all in two seconds by the watch!

Such is the nature of global interaction and connection. Two separate worlds inextricably linked, distance and proximity jumbled in perceptions of layered meaning, at their source inexplicable and defying assumptions of difference, but through contrast of now and then, West and East, providing both with some meaning. 

Historians witnessing events from Mumbai to Baghdad, from Sarajevo to Oklahoma City, now need to be as mentally agile as Clemens in this Bombay hotel, circumnavigating the world in seconds and traveling back and forth in time to provide meaning for events that occur in front of their eyes. But this has always been the case. While the end of the Cold War may have made political commentators and historians alike more sensitive to “globalization,” this phenomenon was evident even before the slave trade made it obvious.

Let this be a guide to those who would “internationalize” what is already internationalized, namely American History. It will not be done by adding to our well-rehearsed American History narratives; neither is it done by comparing us with them; it begins with the recognition that the connections and comparisons have been there all along, but like Mark Twain we had forgotten them until they slapped us around the face (well, not us, maybe a servant or graduate student).

Eleven: Eau Contraire (2003)



Now that it has been proclaimed abroad that “Ideology” is to be found exclusively at Strawbridge’s, readers of Karl Mannheim want to know where that leaves us in our search for Utopia.

With the release of a new fragrance called “Eau Contraire”, historians may have formulated an answer to this vexing question. This fragrance has been developed at one of our leading universities, in what must be considered a departure from its history faculty’s strict pursuit of social scientific endeavors.

A marketing strategy has also been developed. A picture of a man dressed in jeans and tweed jacket, sitting at his desk in a scholarly pose, is accompanied by the text: “Nothing comes between this historian and his Calvin Kleins.” Beneath this, emblazoned in bold red text, are the words, “Eau contraire!”

We wish this university great success in its new marketing venture and hope that the proceeds will lead to the creation of many more chairs for its senior faculty.

Ten: The Propaganda of History


W.E.B. Du Bois’s Black Reconstruction in America was published in 1935. It was a book that was largely ignored, at least until the 1960s. Even those willing to accept that an African American might have important things to say about the history of Reconstruction in the United States found the author’s adoption of Marxist terminology off-putting. But Du Bois’s text has certainly withstood the test of time. Eric Foner, for example, explicitly used it as a theoretical starting point in his writings on Reconstruction. Like Du Bois, Foner accepted the radical nature of this period, and he retained the earlier scholar's desire to account for the actions of African Americans – not merely recounting the deeds of major political figures. But where Du Bois perhaps remains preeminent is in his understanding that Reconstruction was part of a larger story. And this was where Marxism became an important aspect of his analysis, for it pushed Du Bois beyond his concern merely for issues of race in the United States, towards a recognition of the class basis and commonalities of all world conflicts across the color line. This poem, then, is founded on (with not much deviation from) the final section of the chapter entitled “The Propaganda of History,” with which Du Bois closes Black Reconstruction. It clearly highlights this sense of the transnational dimension to, and connection with, American conflicts and their interpretation in the American academy.
I wrote this poem in 1980, or thereabouts; I recently had occasion to read it aloud, along with several other poems I had written based upon the work of Du Bois, and the response was very positive indeed – more positive than I would have imagined (which perhaps explains why I had never read these poems in public before).

The Propaganda of History

The truer, deeper facts are read with a great despair;
it is at once so simple and so human
– and yet so futile.
There is no villain, no idiot, no saint;
there are just men:
            men who crave ease and power,
            men who know want and hunger,
            men who have crawled.

They all dream and strive with ecstasy of fear
and strain of effort, balked of hope and hate.
Yet the richer world is wide enough for all,
it wants us all and needs us too.
So slight a gesture – a word –
might set the strife in order
            – not with full content
            – but with the dawning of fulfillment.

Instead roars the crash of hell;
and after its whirlwind a teacher sits in academic halls,
learned in the tradition of its elms and elders;
he looks into the upturned face of youth
and in him youth sees the gowned shape of wisdom
            and hears the voice of God.
Cynically he sneers at “chinks” and “niggers.”

Immediately in Africa
            a black back runs with the blood of the lash;
in India, a brown girl is raped;
in China, a coolie starves;
in Alabama, seven darkies are more than lynched;
while, in London,
            the white limbs of a prostitute are hung with jewels and silk.

Flames of jealous murder sweep the earth,
while the brains of little children smear the hills.

Welcome to “History 12.”

Nine: Gatekeeping -- A Guide for our Profession's Newly-Appointed Officials (2003)



While I was staying at the convention hotel at a recent historians’ conference, a young waiter came up to me and handed me a document that he said had been left in the conference room at one of the previous evening’s sessions.  I am not sure why he thought I might be interested; perhaps he thought everyone at the convention was supposed to have a copy; perhaps he was a provocateur of some kind and thought that I looked disgruntled and disaffected in some way; maybe it was just serendipity.  But whatever it was, I found the document fascinating and thought that I should share its contents with as many of my fellow historians as possible.  It is printed below.
----------------------------------
Congratulations on rising to the highest levels of your profession.  This is a position of considerable responsibility so we hope that you will take a moment to study the following brief guide to help you fulfill your duties most effectively.
Acting as one of the profession’s gatekeepers requires tact and diplomacy.  Remember that appeals to civility and respect (the tense but tender ties of your profession) will contain and channel those who would advance alternative methodologies (advocates of revolutionary methods and subversive ideas, engagers in polemical critique, evil doers).  Remember also that your position of authority can be used even to fashion historiographical debates, helping to determine the answer to the question "What is history?" itself.  You now have considerable power and authority; use these wisely. 
Here are some rules to follow as you minister to the needs of this profession (note that you may be able to learn from the practices of colonial officials in the past, as well as other wielders of power and holders of privilege, so deploy your own grasp of History to supplement this brief guide).  The main principle underlying these rules is: 
Do not engage the opposition in direct debate but use various forms of subterfuge to achieve your objective.
Here are some examples to help you understand this general rule:
– You can use a position of authority within the Organization of American Historians or the American Historical Association, on committees perhaps, to shape aspects of the profession (and a presidency of the AHA or the OAH is in the future for one who serves the profession with great honor and distinction).
– As an editor of the profession’s leading journals you can push for the publication of particular kinds of articles meeting “appropriate” standards – E.P. Thompson's father once described these as “solid highways” and, while we perhaps should be moving on to imagery of a more environmentally appropriate kind, we feel that this is a helpful metaphor nonetheless.
– You can use the anonymity of the tenure or manuscript review to declare that only certain kinds of people and certain kinds of works reflect the proper standards of the profession. 
– You can also set demands for your own students to meet, molding them in your own image, securing positions for them at universities where they can mold others in that same image.
And
– You can then support candidates in the job market who come from graduate stables that you trust, and who have not, as far as you are aware, in any way rocked the boat. 
If all these things are carried through effectively, would-be scholars who fail to conform to the profession’s civil code will be isolated, their own pronouncements will be discredited as products of the prominent chip they bear on their shoulders.  “They don’t have graduate students,” may perhaps be the word put out about them.  Or perhaps they will be discredited because their excessive attention to theory makes them difficult to work with and results in what we would consider poor students with yet poorer dissertations.  Or we may collectively dismiss them as upstarts, easily discredited because they have lost touch with “real people” (the “masses”), and so on, and so forth. 
These methods are, it must be noted, far more effective than the direct assault in writing and open engagement with an adversary’s ideas, which only brings the person more attention than can possibly be helpful.  Let us provide some examples of where this has been shown to be true.
Officials who have been following attempts to stamp out subversive work by the method of direct assault will remember the essays by two of our British Associates criticizing the work of a South Asian interloper from a subversive Subaltern Studies cell (though it calls itself a collective).  They may also know of a lecture made by yet another such associate in England that endeavored to accomplish the same thing, largely through ridicule.  All these efforts were noticeably unsuccessful.  They brought sympathy to this man (we feel uncomfortable using the word scholar) as the potential victim of his senior colleagues’ assaults.  They contributed to making his ideas more widespread, especially since his rejoinders turned out to be more persuasive than the original assaults.  [It was a rather messy business, really.  There was some effort made to show that the interloper was endeavoring to ride two horses simultaneously, but this was very much turned back on the officials, who, much to our chagrin, found that it was their horsemanship that was being brought into question.] Not so long after these interventions, the same man was invited to contribute the first article on Subaltern Studies ever to be published in one of our flagship journals.  We must consider this a failure of intelligence of the highest magnitude.
Fortunately, however, the discipline was rescued by the handling of an agent working out of one of our mid-western offices.  S/he published an essay in our journal which did more to immunize historians against “French flu”, poststructuralist excess, and “post-foundationalism”, than all the British protestations and tirades combined.  Instead of the special issue bringing into widespread usage ideas employed by the members of this South Asian cell, this agent managed to show how they ought to be appropriated and used by proper, mainstream, historians. 
So that you may quickly develop the ability to pull off this maneuver for yourselves, let me elaborate on how this agent accomplished this for us.  On the surface her/his contribution appeared to be a friendly and complimentary appraisal of the output of this truant cell. [The agent, rather deftly we feel, alluded to the earlier assaults by saying that both sides made some very compelling points.  S/he thereby reestablished the value of the critique in a situation where it really wasn’t possible for both sides to be right – this was inserted almost unnoticed into a footnote.  Ah, the beauty of footnotes!]  Indeed, a quick read of this article would leave anyone with the impression that it was a fair and judicious essay, attempting to “rethink” history by incorporating into the study of the agent’s area of expertise the work of a previously neglected group of historians. There was a small problem owing to the fact that many who read the work also felt that the agent went a bit far in suggesting that s/he was in a position to make these pronouncements, but we feel (and think most others agree) that this was justified by her/his years in the profession and her/his “mastery” of an incredibly large body of literature.  Meanwhile, working almost at the level of subliminal messages (most especially in the footnotes, which our officials have found to be an exceedingly helpful tool over the years) and other subtle code, the agent finally revealed to the more diligent reader (those who needed convincing, rather than those who were merely happy that this cell had received the recognition it needed and now it was possible to return to more normal pursuits) that s/he had come, not to praise Caesar, but to bury him.
Another couple of examples can be given to you budding officials of the profession.  Let us take a hypothetical situation.  You are invited to a conference as a distinguished guest.  You find, however, that you are being unduly (and, given your present stature, we can be certain that this is the case) criticized in one of the papers being presented.  There are two approaches you may take to dealing with this unfortunate occurrence (though we should note that these are infrequent, as we do take pride in the quality of our officials).  If you feel that you are on pretty safe ground, and that most of the audience finds this paper unfair in its presentation of your ideas, or they just don’t really want to be associated with the attack on your scholarship (essentially your tutelage over the historiography), then you can take the direct approach.  In this case, use all your years of training to greatest effect and do not take prisoners. The danger here is that if your assault is not a surgical one, or if it should be misdirected so as to hit another panelist, or even an unsuspecting member of the audience who had not appreciated their connection to your target, then you will possibly do more harm than good.  There is no room for collateral damage in this profession.
On many occasions this will not be possible, and if you have reached the top of your profession you should be able to read the situation properly.  If you cannot take the direct assault, you need to develop a “Plan B”, as the lawyers might call it.  This is a kind of diversion play.  Let us say that you are faced with three people on the panel, only one of whom you find offensive.  Your best bet is to ignore entirely the person who is making comments relevant to your own work, and focus on the work of another of the panelists.  Since you are a person of considerable influence it should be quite within your powers to turn the questions in the direction of another issue that may be totally unrelated.  The additional benefit of this ploy is that by ignoring the real concern you effectively reassure everyone that you don’t consider it really worthy of your attention.  In so doing you convince members of the audience that it is not worthy of their notice either.
This is obviously not the end of the process.  Damage control needs to occur at other levels of the conference.  You need to lobby other officials to round up their disciples; you may even ask other scholars to make facetious and disparaging comments in their own presentations, ridiculing the offensive critique or the person who makes it in some way.  You possibly should bring pressure to bear also on the organizers of this conference.  They have invited you for a reason, perhaps to attract other people to the conference; they are in your debt.  You need to make sure that they are aware of this debt.  Let us say that there is to be a volume of essays coming out of this conference: you must make it clear to the organizers that your paper and the one that you have only just skillfully and publicly ignored cannot be published under the same cover.  If any other prominent official can back up your demand, then you are in a strong position to effectively silence the upstart.  Should this person persist towards independent publication, perhaps you may be able to intervene as a reader of the article or book, and failing that maybe take up the review (again, not to engage the ideas but to sit on it so that the book does not receive timely attention in your professional organ).
Remember, the subaltern will only speak if we let him (though let us remember that the subaltern may also be a woman).  Of course, the subaltern will always be able to speak (this is a free country and a liberal profession, after all), but has he (remember, she as well) really spoken if no one is there to listen?
Strength lies in numbers.  Go forth and multiply.

Eight: The Wonders of American Exceptionalism



When I first came to the United States I had occasion to interact with the American medical system a number of times. This was the best system in the world – so it and its representatives claimed – and yet it seemed entirely dysfunctional. One would go to a health service for some minor reason and the first thing one would get would be a referral to see a specialist; no one seemed to be able to do the basic nuts-and-bolts of medicine. A blocked ear passage needing to be cleared would result in a visit to an ear-nose-and-throat specialist, who would order up expensive tests; one would go away not wanting to have the tests and not getting the treatment, so that the problem would only get worse and require more expensive treatment later. A gastro-intestinal reaction to some food would result in all kinds of EKGs, fears of drug addiction, and a whole bunch of other problems “that needed to be ruled out” – all at great cost and all coming back negative. The problem could never be the simplest and most likely cause – or this could only be acknowledged if everything else that it might be, from the rarest and most dangerous to the most obscure and improbable, had been ruled out. What a crazy way to run a system!

What always stuck in the craw somewhat was that whenever one interacted with American doctors they could not treat you, or even just talk with you, without mentioning how bad the National Health Service in Britain was, and how great their medical treatment was by comparison. They had had no experience of the British system, so their comments merely reflected their ideological beliefs – or their nervousness that any suggestion that a public system could be effective might undermine them in some way. Whichever. I was in a position to correct their error, of course, having experienced both systems and finding the British one preferable, but didn’t do so – they were treating me, after all, so it didn’t seem sensible to challenge their misconceptions.

I was reminded of these things by an article I picked up that my father, Ian Gregg, had written back in the 1960s. He had been provided with various grants to go to the United States in 1965 to observe the way asthma and chronic bronchitis were being treated in American hospitals, and he wrote up some of his findings for the South London Faculty Journal (connected with the Royal College of General Practitioners). It should be noted that he was very much committed to the National Health Service, at one time being forced out of a practice in Kingston-upon-Thames because his partners wanted to practice some private medicine while he vehemently opposed this. He also noted that whenever he came to the United States and made favorable comments about the NHS, American doctors accused him of being a communist. Peculiar in many of his beliefs he may have been, but a communist he most certainly was not!

A couple of the points he made in his article were telling in this regard.  The first reads thus:

A good illustration of the differences between the British and American outlook was provided by one of the papers which was read at the Assembly [at a conference in Winnipeg] by a Professor of Medicine from Chicago. He described emphysema as “a major public health problem in the United States, a crippling and prevalent disease”. No mention was made of the fact that the disease is usually preceded by a productive cough for many years. Had the lecturer been a British physician, he would almost certainly have referred to chronic bronchitis as being the major public health problem and he would have described emphysema as one of its principal complications.

No prizes for guessing which of the two, emphysema or chronic bronchitis, is easier and less expensive to treat! It also doesn’t take a great medical practitioner to determine that treating chronic bronchitis would reduce the incidence of emphysema, while chronic bronchitis rates would not be affected by simply treating people who had emphysema.

The other contrast between the two systems was found in a story told largely for humorous effect:

The next centre which I visited was Buffalo, where I had been invited to visit the University of New York [he means SUNY] and to be a guest speaker at a Medical Grand Round. A patient with asthma was presented and I was asked to discuss her management. This was not without its amusing moments which were greatly appreciated by the students in the audience. An allergist, who had treated the patient privately, made the intentionally provocative remark that, under the National Health Service, British doctors have no time to explain to patients the details of the treatment which they were receiving. He implied that one advantage of private treatment was that he could spend a great deal of time with patients which was so important in the management of asthma. The effect of these remarks was rudely shattered some moments later when the patient, in answer to one of my questions, admitted that she had no idea what treatment she was having, and whenever she had an asthmatic attack she just took any tablets which were to hand. Later on, it emerged that the reason for her being so markedly Cushingoid in appearance was that for some weeks she had been taking a large dose of prednisone (about six times the usual maintenance dosage) because she had not been told to reduce the number of tablets!

Oh well, the wonders of ideology; no need for actually knowing anything about one system or another – just repeat what you believe, just as if you were a doctor in Soviet Russia parroting the party line.  Glad those days are over, and that we can move forward with a sane and rational system of health care available for all – a right, not just a privilege for a few. Oh no, right, we didn’t get that system, did we! 

Friday, July 22, 2011

Seven: Scattered Sparks


Mark Elder always seemed to me to be one of the coolest conductors, and it wasn’t just because I was friends at school with one of his brothers. His comments in 1990 about the traditional Proms selections – e.g., Land of Hope and Glory – being inappropriate in the context of the beginning of the first Gulf War certainly endeared him to me. Whether or not one supported the war, there was certainly nothing to relish in its outbreak, and the fact that he was dismissed from his gig as a conductor for one of the Prom performances only made him all the more revered as a martyr to common sense and decency. I am certainly glad that he rebounded from this setback and has had a very successful career since, earning a knighthood a few years back.
I just googled Elder as a result of thinking about classical music and war. I had happened upon descriptions of Sir Malcolm Sargent and his work during the Second World War using classical music (frequently German music, one might add) to rally the British in the face of the Blitz.  My grandparents wrote in their letters to their two sons (who were serving or about to serve in the armed forces) about the various concerts they went to, some of them put on by Sargent’s London Philharmonic Orchestra. That had led me to look up Sargent, whose music I had listened to (most particularly his version of Peter and the Wolf, narrated brilliantly by Sir Ralph Richardson) growing up. It turned out that when war broke out Sargent was in Australia and he had just been offered a contract to work for the Australian Broadcasting Corporation; he decided instead to return to Britain to use classical music to raise public morale. Sargent became a household name largely as a result of the reputation he gained during the war, playing on in defiance of bombs dropping, and so forth.
This connection between classical music and war intrigued me. Many of those to whom the LPO took their music would have been unfamiliar with classical music concerts – but the element of “we are all in this together” may have helped transcend class boundaries, in the way that the Royal Air Force – the main resistance to Nazi Germany at the time – was also doing with its much more open and less hierarchical structure and ethos (compared with the Army). Flash Harry, as Sargent became known, clearly presented a different image of Britain to the British public than that with which they had been familiar.
But what stuck with me was the significance of classical music at the time and the attempt being made by Sargent to use it for a larger public good. It was the sittlichkeit that would bind a fragmented civil society together in its hour of need. Thinking about this role for classical music reminded me of Mark Elder’s brush with the establishment in 1990. This brush might be viewed as a product of Elder’s concern (though of course he might see it altogether differently) that music might be used in inappropriate ways in the pursuit of jingoism.
Interestingly, I suppose, Elder’s biography had elements of Sargent's in it. Elder made his name first in Australia and has also been the conductor of the Hallé orchestra, based in Manchester, as Sargent was for a brief time during the war. Be that as it may – a mere coincidence – I googled Mark Elder, partly to see what had become of him recently, but also to find out what more I could learn about the 1990 events.
What I came upon was a rather interesting Guardian article from October 2001, three weeks after 9/11.  He had put together a series of concerts for Bridgewater Hall in Manchester comprising compositions that had been influenced by, or composed during, wartime; his article discussed the various pieces and the context in which they were now to be performed. The composers included Shostakovich, Prokofiev, Britten, Elgar, and Richard Strauss, among a host of others. This was a timely series – prescient even – given the attacks on the World Trade Center and the Pentagon that occurred on September 11th. Elder ended his article with a flourish:
Whenever I conduct a piece of music, it is important for me to understand the circumstances that brought it to life. But I am still puzzling over the "Leningrad" symphony: Shostakovich was such an enigma as a person, and he conceived his music to contain more than one layer of meaning. This past three weeks, looking at the pictures in the papers of the rubble that was the World Trade Centre, I have found myself wondering what else those attacks will leave us. How will our creative life respond, over the next 20 years, to what happened there?
This seems an interesting question to me.  Ten years later, how are we responding to those events in our creative lives?  While Sargent would certainly be seen as someone who was brought to prominence by WWII and then symbolized in many ways a Britain that was trying to stave off decline, I wonder what Elder might symbolize for Britain, for the West, and so on, in the aftermath of 9/11.

One other thing that occurs to me, though, is the need for relevance. Understanding “the circumstances bringing music to life” does seem important; but its corollary is just as important – ensuring that music has life in current circumstances.  One of the things that Sir Mark apparently does in Manchester is ensure that all area children have access to free concerts (and he has also managed to break the tradition of evening wear for audiences, which strikes at some of the elitism of classical music culture). 

Music, History – it’s all very important stuff. There’s a time and place to be a Sargent, calling for courage in the face of a Blitz; and a time to be an Elder statesman, calling for reason in the face of blind allegiance; but there is always the need to make in-roads into the lives of people – whether it is “Classical Music in the Slums” or El Sistema, to open the doors to opportunity – to provide a bit of sittlichkeit for those caught in the quagmire of civil society. Scattered sparks, perhaps, but important ones without a doubt.

Thursday, July 21, 2011

Six: Narrative is Theft


There goes the sad stuff.  The bad stuff.  The-things-nobody-could-help stuff.  The way everybody was then and there.  Forget that.  History is over, you all, and everything’s ahead at last.
                               – Toni Morrison, Jazz (NY: Knopf, 1992), p. 7.


“– I create my first principle.”  Narrative is theft. 

I will demonstrate this principle by adapting an old formula thus: narrative = property = theft.

Some historians have a special attachment to narrative, so much so that they label themselves “Narrative Historians.”  These people range from the immoderate (those who think professional historians should always tell a good yarn) to the moderate (who feel that historians should use whatever methods they can to tell their histories, and one of these may involve telling stories). Of course, calling oneself a Narrative Historian does rather situate one in the immoderate camp, because one is cutting off the possibility of using other methods of approaching a subject that might be more useful. So, a moderate Narrative Historian is an oxymoron to some extent; a historian who employs narrative is pretty much standard issue.

We all, as historians, use narratives of different sorts.  There will be moments when we turn to give an account of some event (whether or not we involve ourselves in the telling of this story). This may take the form of recounting a discussion, telling the story of the event from a particular perspective, or the use of any number of other narrative devices. But the extent to which a historian relies on narrative will depend on several things – audience, politics, and pragmatism.

If a historian is writing primarily for fellow members of the profession, with the intention of persuading such people to think more in line with how that historian feels they ought to think, then the analysis will move pretty steadily into the historiographical, the theoretical, and the tendentious. Such historians will employ narrative along the way to make some points, but unless they are going to rely heavily on footnotes to make their arguments, narrative will not predominate.  If, by contrast, a historian is writing for the lay person, from dentist to anxious patient sitting in the waiting room, hoping also that the book will be picked up by the browser at Barnes and Noble or Borders, then narrative is more likely to predominate over theory. 

Politics may also affect the use of narrative.  Even James Goodman, who promotes narrative history with great persuasiveness, will have a radically different understanding of narrative from that of a narrative historian like James McPherson.  In Stories of Scottsboro, for example, Goodman provides many narratives, examining a particular set of events from the many actors’ different vantage points. This is not, as he would be quick to point out, the classic nineteenth-century narrative typical of the novel in its early years. And many, wedded to this earlier narrative form, would even dispute Stories of Scottsboro’s claim to be narrative history. And to some extent, this dispute would be framed by politics. Goodman is aware that the classic narrative structure is one that provides aid and comfort to the conservative historian. It seems to suggest a degree of certainty about historical events with which the radical historian ought to feel uncomfortable. Consequently, Goodman argues that by using narratives, and playing these off against each other, he may be able to emulate the post-modern novelist, using multiple vantage points, sometimes even questioning the position of the author, thereby destabilizing the singular, positivistic, sometimes reactionary, historical account. But perhaps such an assertion misses the fact that the literary figures using these techniques have often themselves been trying to disrupt narrative, and so emulation of such methods might more justifiably be called Anti-Narrative History.

And then there is the pragmatic attachment to narrative forms. As historians, we may have political objections to narrative. We may find it deeply problematic when we see simple narrative used from textbooks to museums, restating imperial or nationalist mythologies, but we continue to write textbooks and consult with museums.  We may sometimes feel that we should be making a historiographical argument, but, hey, this is a neat story that seems worth the telling. We may, in our darkest moments as historians, yearn to communicate with others through the telling of nifty anecdotes, especially those that have spiced up some dinner party conversations of late. We may have a great screenplay within us, and after finishing up some historical tome, weighed down by impressive footnotes, we now want the release of reaching for a wider audience. We may even want to reap the financial benefits of story-telling, which, if it brings the browser to the checkout desk, will also improve our bank balances immeasurably (and attract the possibility of a producer snatching up the film rights). We toil in the classroom trying to raise students’ consciousness; is it not time, we ask ourselves, that we do something that will enable us to enjoy our leisure time?  We imagine the cameras zooming in for the close-up: 

Voice-over: “Struggling History Professor, now that you have written your best-selling book, what are you going to do next?”
Struggling History Professor: “I am going to the Colonial Theme Park.”

The fact is there are many reasons for turning to narrative, a lot of them perfectly reasonable. Few historians will, in the final analysis, in the manner of the Monty Python skit, plead again and again, “No, not the bestselling book. No, not the comfy chair!”

This said, however, there are certain things that need to be remembered about narrative, which returns us to our original proposition or formula. How might the narrative be stealing? Here are a number of aspects of the theft.

Anyone who has read about photographers from the wealthiest nations of the world going to less economically prosperous regions will know of the quite common reaction of people to being photographed.  Many people are reluctant to have their photograph taken, not because they know little about this magical technology and fear its powers, but because they object to the kind of control that is being placed over their image (even narrative) by some unknown person or organization. It is not necessary to capture “indigenous” peoples and cart them back to the Old World, or build a display for them in the Natural History Museum. Now all you need is to snap a few shots and put them in some photographic display or album, or place them straight onto the web. You have captured the subject of the photograph and turned him or her into an object of your narrative. Generally speaking, it is your creation; it becomes your narrative. You, the itchy-fingered one holding the camera obscura, have almost total control over when you will load your instrument, what you deem important enough to shoot, and how many snapshots of the thing or person you will take. You then have almost total control over what you consider a good picture, worthy of inclusion in your album or on your site, or a bad one, which you discard.  All these things will be shaped by your assumptions and your understanding of ‘proper’ narrative structure. The right of the subaltern ‘to speak’ will be denied, stolen from him or her. You may promise to send a photograph to the individual through the agency that escorts you around the slum or village, if that is where you happen to be.  But you will probably forget, or believe the picture wasn’t that good so it isn’t worth sending, or any number of things (“will they really be able to find that poor young person?”), and so not do so.

Anthropologists worry about these things constantly, I believe, partly I suppose because they are reliant on ‘native informants’ to a great extent. But it seems to me that historians ought to be equally concerned about those they are describing. Which images will they decide are too blurred to be incorporated into the narrative?  Which ones will seem to tell the story so well that they should be accentuated, blown up to five times the size of all the others, and placed on the front cover? 

What this points towards is the idea of narrative as a bundle of silences, as Michel-Rolph Trouillot has argued in Silencing the Past – those silences around the edges of the narrative that give it shape and (often political) meaning. In this light, the creation of narratives becomes an act of stealing from silences to add to a particular noise, stealing from darkness to give to the light. And as historians’ roles become more mercenary or “propagandist” (however legitimate), we find them stealing narratives from history to create something to which readers in abundance can relate – something they find satisfying, uplifting, cathartic.

Can the Subaltern speak through narrative history? Generally historians have a hard time ‘rescuing’ particular kinds of experience from the historical record. They can, as E. P. Thompson showed for the English working class, recover the life experiences and political aspirations of certain artisans. But recovering the thoughts of the woman who is being subjected to the experience of widowhood in India, or rape, or any number of other indignities that have occurred in history, may be more difficult. The weight of the “recovered” narrative will most likely bear down on the narratives of those beneath it. These narratives, in effect, will become the silences that give that recovered narrative its form.

Another related aspect of this is the idea of narrative as a property. There is something very Anglo-American about the obsession over agency and narrative. The Anglo part is probably something post-colonial. What does it mean to be English when the tide has gone out on the British Empire? Is there something that can be learned from the “peculiar” history of the English, when their position in the world has diminished? If it is to be found in working-class history also, in “a liberty tree” (what, according to some, the English-speaking peoples brought to the world), how has it been expressed through the agents of that history?

This line of thinking (derived from E.P. Thompson) has also had particular appeal for those who have written about immigrants in American history.  One reason for this is, in my view, the fact that some earlier immigration narratives, particularly that presented by Oscar Handlin in The Uprooted, made immigrants appear altogether too much like slaves and their descendants. In response to this “lack of respect” for the immigrant, social historians tended to make agency into a tag that they could attach to a particular group, indicating that it had somehow achieved success. There is something slightly anachronistic about this: a particular group made it, so it must have had agency. It is the marker of movement out of, away from the ghetto – the historical equivalent of a nicely landscaped suburb. Narrative, the agent of agency, becomes property. The historian accentuates the right decisions made, the purposeful sacrifices, and everything (like property itself to the property holder) makes a lot of sense. Those who died through misfortune on the boat going over to the United States are not summoned up to characterize the larger group. Nor are others who became prostitutes, starved on the street, or died in a mining accident (except as martyrs in the larger narrative of industrial protest).

If, then, narrative is a property, how might it be a theft? It is theft, because history is relational and comparative; a suburb cannot exist without a city to make it distinctive. There is no meaningful statement that can be made about one group (or section thereof) that does not, implicitly or explicitly, rely on a comparison with others. The very construction of “the group”, in fact, relies on a comparison of members within that group with other members, privileging one set over another. Moreover, the group comparison will depend on one group being configured positively (accentuating the successful over the unsuccessful, or the “moral” over the “immoral” in a veritable felicific calculus) and another being configured negatively (inverting the above). Through particular use of narratives, some will be accorded greater cultural capital while others will be left at the level of the dysfunctional or pathological. In this process, narratives are taken from one group and given to another, historians robbin’ one ’hood to provide for another.

If narrative is a property is a theft, how important is this? Should we not eschew narrative altogether? This is rather like the question of whether we should opt out of engaging in political activity because a political system is corrupt and undemocratic. The puritans among us will decide to do so, no doubt. But the rest of us live in the real world and are continually making compromises of one sort or another. There is no categorical imperative that will lead us to be cast into the hell fire for making such compromises. But we do, surely, have to recognize the limits of our analysis, the limits of our narratives, their sometimes self-serving aspects? Recognizing that our narratives are suspect – that they steal from the poor to give to the rich, from the inner-city to the suburb, from Brewster Place to Linden Hills, from the slum to Malabar Hill, from third world to first, from South to North, and so on – should give us pause, especially as we make that leap from historical analysis to public policy.

This is not a recipe for quietude. It is a call for more comparison and interrogation of narratives as well as of ourselves. Introspection can be quiet, certainly for the observer looking on, but the noise of self-examination is loud indeed. Recounting narratives can be loud to those listening, but it will often be the kind of noise that silences other narratives and other voices.