Monday, December 20, 2010

Theory, Method, Style

Google must love how we all immediately jump on their new toy and blog about it. (I got it from Brayden at OrgTheory.) Well, it is a cool app.



The query "theory,method" gives a not very surprising, but still intriguing, result. There's an interesting “event” from about 1860 to 1875, where method seems to reverse its declining fortunes by hitching itself to the rise of theory. Method drops off around 1960 (after rising much faster than theory and holding steady since the 1920s) and, very predictably, theory overtakes method in the mid-1960s.

I agree with Andrei (in the comments at Orgtheory) about the recent trends. It just looks like the corpus is smaller in the database. That's confirmed by this chart, too:



It also shows that while "theory" and "method" are subject to the caprice of fashion, style is a rock, a stable foundation.

Sunday, December 12, 2010

Early Christmas Break

I broke my wrist on the weekend. So this blog is going to be resting for at least two weeks. The cast comes off in four. Merry Christmas and Happy New Year to all who observe such things! And happy holidays to those who observe other traditions.

Friday, December 10, 2010

The End of Work

Jonathan has found an excellent post from 2008 at 37signals called "Fire the Workaholics". It is a response to this post by Jason Calacanis, the founder of Mahalo, who originally advised start-ups to "fire people who are not workaholics" and told those people to "go work at the post office or Starbucks if you want balance in your life." An appropriately caustic response from TechCrunch, which interpreted the message to be "fire people who have a life", forced Calacanis to rethink his position somewhat, though not nearly enough. He now says you should "fire people who don't love their work". This morning, I want to address the system of values that seems to inform Calacanis's advice.

"Judge people by how much they get done," Jonathan reminds us, "not by how over-worked they seem to be." I would add that you should judge yourself by the same standard, and that you should judge a workplace (in our case, the department you work at, or the PhD program you're in) not by how hard it works you but by how much it lets you get done. If you are working under the impression that your workaholism is part of your job qualification, quit. Indeed, if you are a workaholic, just as if you are an alcoholic, your first problem is precisely quitting. You must stop engaging with the forces that are taking over your life.

If you "love your job" you have, I'm sorry to say it, a twisted sense of love. Not a few commenters in the discussion offered some version of "Mahalo is a joke", i.e., it does not necessarily deserve anybody's love. (I don't know anything about it, but it's in any case just a product.) What's the gold standard of love? How you feel about your family. And it is telling that Calacanis (at least at the time of writing) didn't seem to have much of one:

I DON’T expect folks to check their family at the door. In fact, some of the most productive folks on staff have families, spend tons of time with them, and ARE workaholics. It seems to me that folks with families somehow get much more focused and do more in less time, or find strange hours to work. I can’t explain it (anyone with kids want to check in?!).

Truth be told, I’ve never asked anyone to work harder than I do, and I work seven days a week. I never stop thinking about whatever project I’m working on, and I don’t consider what I do work–never have. Sure, I’ll go on vacation, but that’s when I get my inspiration and when I do a ton of thinking about solving problems. In fact, the entire post was around how to make folks lives BETTER by bringing in food, getting them great equipment, providing resources, and buying the good coffee.

Notice the complete lack of logic in what he's saying. "I’ve never asked anyone to work harder than I do," he says. But he works seven days a week, he says; and he never stops thinking about a current project. You can't work harder than Calacanis! He then says that people with families "do more in less time" and therefore find lots of time for their kids. But because he won't admit that he's wrong, he nonetheless describes them as "workaholics". Either they are workaholics and their families are suffering for it, or they're not. They sound like they're not, so if they seem that way to him, it's probably because they think he'll fire them if they don't pretend to be all stressed out. He doesn't sound like someone I'd like to work for. (I wonder if Bob Sutton has studied this guy's style of "bossing".)

Good work is done by people who can stop working. These are people who understand that the purpose of work is to get something done. That is, work must by definition be completed, not merely performed. The end of work is the end of work. Get the thing done and stop thinking about it. Go home. Relax. Every once in a while, take a vacation. You don't need to think about your "problems" while on vacation because, precisely, your work is done. All this goes for scholars too, and especially for PhD students. Don't try to prove that no one has ever cared more deeply about your subject than you. Don't try to prove (to your peers or your supervisor) that you "love" it. Don't be a workaholic. It's a stupid waste of your time, and of everybody else's.

Important note to department heads, deans, and PhD supervisors: Do not give the impression that you'll fire people who aren't workaholics. Do not valorize mindless commitment to the job. Valorize thoughtful work completed.

Wednesday, December 08, 2010

A Clear and Present Grammar

May may be Mary. Mary may be at stake. Mary may Mabel Mabel may be fairly May Mary.
Grammar returned for instance.
Account for it.
Grammar. Spindles audacious a reading desk copies an obstacle to interesting him here.
Leaving a sentence.

Gertrude Stein
How to Write


Well, clearly, if everyone wrote like Gertrude Stein, deconstruction would not be necessary. Grammatology would never have been invented. There would be no logocentrism (logos would live at the periphery). The metaphysics of presence would hold no sway.

Writing would not make a différance.

Grammatology, as Derrida conceived it, is the science of writing, the study of significance through traces of différance and supplément. Grammatology is a reading, for instance. It may be Mary, however. Reading is at stake. It accounts for it. Grammatology, we might say, presumes a text and that a text has already been read and that the reading has already given the text a meaning, a more unitary meaning than necessary. It presumes a presumption of meaning. Reading reveals a supplement. It tells us that there is more to the text, but not outside it. It shows us what writing does.

Writing ordinarily represents. Ordinary writing represents. Representation writes order. Then there is deconstruction. Be open to it, but do not let Gertrude tell you "how to write". (She was being ironic.) Let others deconstruct what you have written. Let others expose your "dangerous supplements". Get in there and make the difference. Go.

Too many grammatologists fail precisely to acknowledge the danger of what they are doing. They are, in fact, a clear and present danger to "academic writing", writing that attempts to represent. They are utopians who would (they mistakenly think) prefer to live in an always already deconstructed world (they have made too much of Rilke's lament over an "interpreted world"). They would prefer to live without clarity and presence. Without metaphysics, if you will. So they think.

Someone has to write in the ordinary way. Someone has to say that this is the case, damn you, and this. Then others can come along and say that there is so much more to it than we think. I hope it is clear that I respect their work too. I think deconstruction is necessary—or almost necessary. Perhaps it is precisely never quite necessary but always already happening anyway. I am not against grammatology; I am for grammar. I am for writing.

I presume the botanist is not against plant life. The geologist has no objection to stones. Nor does the gardener fear the botanist, nor the quarrier the geologist. Plants grow. Stones lie there. In any case. Indifferent. We write. How Mary.

Monday, December 06, 2010

Academic English

I have been invited to participate in a workshop about the role of the English language in organization studies next week. The general theme of the workshop seems to be "critical"; it questions the wisdom of using English as the default language for international conferences and international publication. My approach (my wisdom, if you will) has always been "practical" in this area. So, before answering the question of why we should work largely in English (even in Denmark), which I will write a blog post about next week as I prepare for the workshop, this morning I want to write something about how to write in English.

Many people whose first language is not English feel a "language barrier" in their struggle to write for publication. For obvious reasons, they have an easier time expressing themselves in their native language and so they imagine that the difficulty lies in their lack of mastery of English. What they forget, in my opinion, is that the difficulty of writing for international publication cannot be felt in, say, Danish. Exaggerating a little, we can say that they are experiencing what a hockey player might call a "skating barrier". "This business with the stick and puck would be so much easier if I didn't have to wear these skates!" Of course, they forget that the ice is a given. What they are really saying is: "If only I didn't have to play on ice, I could use my native talent for walking and running to play this game."

There is only one way to break through a competence barrier like this: practice. People sometimes ask me whether I give courses in "How to improve your academic English" or whether I can recommend a good book of English grammar. I try not to. Instead, I tell people to read published work in their field (in English, of course) every day and to write for at least half an hour in English every day as well. I presume that the person who is asking knows enough English to read and understand what is being written in the field. After all, if the problem is learning English in the first place then I would give the same advice, but on a different level: listen to English every day and speak it every day for at least half an hour.

Also, expose yourself to criticism of your language on a regular basis. It doesn't have to be every day, but as often as every week can be quite useful. To this end, I offer our PhD students "piano lessons". They work on a paragraph once a day for half an hour, which is to say, they write a paragraph of prose to support a key sentence we have decided upon in advance. They write that paragraph five times over the course of a week and then bring their best version to me. I then read it out loud for them and critique the language. Depending on the progress they've made, we either assign the same sentence for the coming week or pick a new one. It is always about something they know, something related to their research project.

The reason I don't offer courses and books is that I don't want to do anything to foster the illusion that language skills are a kind of knowledge. Rather, language is a way of knowing—not a form of knowledge, if you will, but the form of knowledge. You don't acquire language skills, you get yourself into linguistic shape. You shape your self linguistically. Those who reject English as their working language (those who lament the fact that English is the default language for academic work) are rejecting this project of self-fashioning (a term Jonathan Mayhew taught me in his comment to this post earlier this year). Some try to present their resistance in this regard as a "political" stance, a critical posture. I try to get them to see that it is a practical issue. What they are rejecting is not a particular regime of subjugation (or, more technically, subjectification) but a particular kind of labour. Mastering a language is hard work.

Friday, December 03, 2010

Critique and Scandal

There is an aspect of the Frank Fischer plagiarism case that intrigues and disturbs me. It has to do with the role that journal editors played in turning it into the Fischer-Petkovic affair—though I hasten to add that both Fischer and Petkovic had a hand in that as well. As background, keep in mind that Fischer’s “sloppy” scholarship existed prior to Petkovic’s discovery of it; also, it turned out to be serious enough to warrant being brought to the attention of the public. (It will be interesting to see what actions Fischer’s university and publishers take after their investigations are completed. But, as I said after looking into it myself, ignorance of these errors is in any case not to be preferred to knowledge of them.) But instead of merely correcting the (many) errors that seem to characterize Fischer’s work, as critical scholarship should, the publication of Petkovic and Sokal’s report has caused a minor scandal.

It seems to me that the correspondence that Petkovic and Sokal published in their report (a longish PDF file) identifies the exact moment when a critical engagement turned into an academic affaire. The crucial decision was made by neither Fischer nor Petkovic but by the editors of Policy Studies Journal, Peter deLeon and Chris Weible. Petkovic sent them his paper on May 18, 2010, and received a mail from Frank Fischer on May 20, 2010, warning him that if he did not drop the issue then his journal’s publisher, Routledge/Taylor & Francis (which publishes Fischer’s journal Critical Policy Studies), would take legal action against him. As a quick aside, I want to note that if this is true it does not reflect well on Routledge, though the threat seems to have been empty. My issue here, in any case, is with the actions of deLeon and Weible at PSJ, who apparently sent both Petkovic's paper and the identifying information about the author to Fischer.

When I asked him about it by email, Peter deLeon explained that after they had decided to desk reject the paper (quite understandably, I would add) they contacted Fischer “to alert him of the emerging possible confrontation, in hopes that [Petkovic] and Dr. Fischer could reach an amiable resolution.” According to the published correspondence, however, they seem to have informed Petkovic that they would not publish his paper fully eight days after alerting Fischer. Indeed, they appear never to have informed Petkovic themselves that they would pass (or had passed) his submitted manuscript along to Fischer. I'm not sure how common that is, but it seems very irregular for a journal to circulate a rejected manuscript without the permission of the author.

Moreover, Fischer quotes deLeon and Weible’s description of his paper as a "jeremiad" in an email that he cc's to Petkovic on May 26, 2010, which also states PSJ’s intention not to publish. But this is still two days before any reply has been made to Petkovic even confirming his submission, let alone the rejection of his paper. Notably, this very critical assessment of Petkovic’s paper is not mentioned in their rejection letter. "While your paper is interesting and potentially of value to the public policy community," they say instead, "it is beyond the scope of PSJ." PSJ seems to have been rather more direct with Fischer about why the paper wouldn't be published than with Petkovic.

Also, it should be noted that Petkovic had been quite open about his motives when he submitted the paper to PSJ, and had even asked for some "initial editorial thoughts", or advice about how to proceed with this sort of critique:

What I am submitting to you as an attachment to this letter is not an orthodox academic paper, although it contains a part which might be labeled more or less as such. It is simply a bad experience I had with the new public policy journal called Critical Policy Studies. I want to share that experience with the community of scholars devoted to policy analysis and public policy research. I have read in your short web mission statement that you “welcome initial exchanges if a potential contributor has an idea and would like some initial editorial thoughts.”

If you would be so kind, I would like some of initial thoughts on this experience of mine, or at least on my interpretation of that experience. I really want this story to get out in public. (Page 65 of Petkovic and Sokal's report)

Petkovic is a (presumably) young and (demonstrably) cantankerous PhD student and, it seems to me, obviously in need of a great deal of advice about how to develop (or, some would argue, abandon) his position. This "advice" was of course offered anyway—by Fischer—in the form of the thinly veiled legal threat I mentioned, which was perhaps the least constructive way to tackle the issue we can imagine, especially since Petkovic had not yet made his critique public. He was at this point still looking for a journal that would publish it.

I have written two similar papers, one of which was rejected a number of times by a number of journals before finally being published. I have found the rejections I got much more constructive than what Petkovic has experienced. At no point did the subject of the critique contact me directly, and I assume this is because he had not seen the manuscript until the decision was finally made to publish. When he did see it (to be given an opportunity to respond in the same issue), I was fully informed that that is how the editors had chosen to proceed. I would be quite troubled (and somewhat angry) if I discovered that the journals that had rejected my work had also "alerted" the author I was critiquing and, especially, if they had in any way passed around the unpublishable "jeremiad" I had composed.

I normally encourage the authors I work with to send papers to journals even against their better vanities. That is, they are sometimes worried that if their paper is not extremely brilliant, journal editors will begin to develop an opinion of them as mediocrities instead of offering them ways of improving their work and leaving it at that. It has never seriously occurred to me that another possible risk of submitting work for publication is that it will be passed around and mocked by peers without our consent or knowledge.

It is true that the anonymity of peer review is intended mainly to encourage the frankness of the reviewers. But I have always believed that part of the reason for concealing the identity of the author is to encourage submissions. We imagine that even if our manuscript is deemed highly defective in some way our public reputations will not be tarnished. It is only if the paper is deemed good enough to be published that we must also face the public criticism of peers. (Edward Johnson rightly says that authors want to be "protected from criticism", i.e., irrelevant criticism that needlessly interferes with the message.) To have our reputations depend on what happens to our papers after they are rejected, i.e., in private, not public, circulating essentially as rumours about our ideas, rather than our own public statements about them, and without any way of engaging with the "critique", is a troubling prospect.

In my view, a paper is either worth talking about or not. If not, then it should be rejected and forgotten. But if it is worth "alerting" anyone about then it is also worth at least some serious "initial editorial thought". It may even warrant some suggestions for revision, and even ultimately publication (depending on how those revisions go). If this simple procedural principle is not observed, you get the situation we in fact got: a now very disgruntled, very minor scholar seeks (and finds) the support of a sufficiently major one to go up against a "clique" that unfortunately seems to be not wholly a figment of the minor scholar's imagination. It is important, after all, to keep in mind that if Petkovic’s critique had been treated with greater respect by the policy studies community (represented by its editors, including Frank Fischer) then he would probably not have sought out Alan Sokal for support. And if Fischer hadn't (unwisely, to my mind) chosen to threaten Petkovic directly, but simply kept the information about Petkovic's discovery to himself, Fischer would have gained a distinct advantage over Petkovic in future confrontations. Indeed, he would probably be in a position to affect Petkovic's career trajectory without his knowledge. Again, it is troubling that an academic journal would facilitate this possibility.

Peter deLeon assures me that it was the hope of the editors of PSJ that “Dr. Fischer and Mr. Petkovic could resolve their dispute in an amiable manner” and that they “regret that their ‘resolution’ has turned so sour.” I wonder if they also regret the act of alerting Fischer without Petkovic’s consent.* PSJ here had an opportunity to mediate a dispute within the policy studies community that now risks becoming a serious embarrassment for it. It did not seize that opportunity, to say the least.

_____________
*I gave the editors of PSJ the opportunity to comment on this post. They politely declined.

Wednesday, December 01, 2010

More Dreams from Papa

(See also this post.)

Ernest Hemingway is the archetypical modern writer. He famously spoke of his art as "work"; he described himself as a "professional". "How does he write?" asked Robert Harling in 1954 (CWH, page 83) and got this answer:

In the early morning. Much of my life has been lived in the early mornings. You get going early for hunting or fishing and get into the habit. In any case, my eyelids are thin, I'm told, and it's better for them in the morning. I get up around six, six-thirty, and start work—or to try to—by eight. I work until ten-thirty, perhaps even midday. Then the day's my own. I can forget work.

As an aside, Hemingway clearly inspired Mordecai Richler's approach, even down to this idea that one works, or rather, that one tries to work. Does Hemingway also forget his characters after his work is done for the day? asks Harling. Yes:

Put 'em right out of my mind. I must—if they're to come alive again the next day. Every writer has his own way of working. That's mine. I take a drink before dinner. Afterwards I try not to. That can spoil things. Then, through the night, through sleep, the subconscious works with the characters. They're alive again in the morning. You understand? Ready for work.

Maybe it's true that there are other ways to work. I doubt it. Certainly, everyone should try Hemingway's approach for a few months. (16 weeks is a good test period.) Obviously, as a researcher, you can't put your subject matter out of your mind after lunch every day. But you can forget the particular argument that you are working on, its particular claims and concepts, its particular empirical materials, its particular theoretical themes. (Hemingway might have been writing a story about marlin fishing or the patrons of a bodega, for example; that did not prevent him from going on a fishing trip or from going to the café.) Or you can try to forget them, anyway. Then try to work the next day. Let them come alive.

What Hemingway understood is that writing takes energy ("juice", he sometimes called it). And as his policy of not drinking after dinner suggests, he understood also that mental energy is connected to other forms of energy. You need to manage your energies intelligently if you hope to write well and with reasonable ease. Finally, Hemingway understood that good writing is not based wholly (or even mostly) on conscious mental activity. Most of the "work" gets done by the subconscious.

I have noticed this in my children's increasing mastery of sports. My son plays hockey; my daughter figure skates. It is always remarkable to see the improvement that seems to take place between practices, i.e., when they are not on the ice. Clearly they skate—try to skate—one day, and then their subconscious "works with" the moves they have been practicing, also through the night. Two days later, they make the same moves easily that they could barely execute at the last practice.

I think all competence develops in this way. And progress can be hindered by working yourself too hard, by not taking breaks, by not letting your subconscious catch up. In fact, competence can be destroyed by not shifting between the effort to work and the effort to forget the work. I'm not here just talking about your ability as a writer, by the way. I mean that your knowledge of your subject matter will develop in a more robust and healthy way if you write consciously about it for a few hours every day, and then stop thinking about it (what you're writing about) and turn your mind to other things (like reading about related but different matters, or going back into the field, or teaching). The whole trick is to engage with very complex materials every day after having made a serious effort, for a few hours in the morning, to find what Ezra Pound called "the simplest possible statement" of your understanding of a small portion of those materials.

Monday, November 29, 2010

Show Me the Type

Happy writers are all alike; unhappy writers are unhappy in their own way.* This wasn't quite how I responded to a question at a Writing Process Reengineering seminar on Friday, but it was what I was trying to say. The question was based on the idea, apparently promoted by the writers of some writing manuals, that there are different writing temperaments, different types of writers (the artistic type, I presume, the scientific type, the engineering type, the military type?), and that they should approach the task differently. Didn't my program of "outlines" and "schedules" assume a particular kind of writer (a very orderly and "linear" one)? I was asked. Shouldn't other kinds of writers do things differently?

"There may be different kinds of unhappy writers," I said. "All happy writers do it this way." That is, there may be many different reasons that people are not productive (expressed with sentences that begin "I'm the type that..."), but there is only one reason that they are productive: they are working on a regular schedule, writing paragraphs that fill out an outline. I suppose what I was really saying, however, is that the academic writer is already a "type" and if that's what you want to be, and be it happily, then you will have to experience the joy of writing paragraphs that defend claims one at a time. That's the only way.

If you are the "type" of writer who needs inspiration to write, or the type of writer who needs to read more before you begin to write, or if you are the type of writer who can't write for a half hour or an hour at a time but needs several days to get started, or if you are the type of writer who can't write when you're also teaching, or if you're the kind of writer who worries about how "original" you are or does not know (or want to know) who your readers are, or if you're the kind of writer who can only write under the pressure of an immediate deadline, well, then, you will be unhappy (as an academic writer) in exactly that way. But if you write every day, always to a thesis and for a readership of your peers, one paragraph at a time, then you will be the "typical" happy academic writer.

Unhappy writers have their own approaches to writing. Happy writers do it my way.

_________
*An aside for those who may or may not recognize this sentence. I am famously a stickler about plagiarism. Why don't I reference this allusion to the first sentence of Anna Karenina? I would argue, and I think rightly, that what I have done here is as obviously not my own (i.e., an obvious allusion to Tolstoy) as if I had written "To write or not to write: that is the question. Whether it is nobler in the mind to suffer the slings and arrows of outrageous critics..." (Nor is my allusion, as a bit of Googling will show, very original either.)

Friday, November 26, 2010

The Crisis of Representation 2

The other day, while casually reading The Foucault Effect, I paused when I got to this sentence by Daniel Defert: "Wage-earners liked having the right to find employment where they pleased" (FE, p. 227). It is an odd sentence because the reader suddenly thinks, "How does he know?" Or, to put it in the terms of my last post, "Who is he to speak for the wage-earners?" Consider Deleuze's remark in "Intellectuals and Power":

A theorising intellectual, for us, is no longer a subject, a representing or representative consciousness. Those who act and struggle are no longer represented, either by a group or a union that appropriates the right to stand as their conscience. Who speaks and acts? It is always a multiplicity, even within the person who speaks and acts.

And Foucault's:

[U]nder the ancient theme of meaning, of the signifier and the signified, etc., you have developed the question of power, of the inequality of powers and their struggles. Each struggle develops around a particular source of power (any of the countless, tiny sources—a small-time boss, the manager of "H.L.M.," a prison warden, a judge, a union representative, the editor-in-chief of a newspaper).

And here is Deleuze again:

A theory is exactly like a box of tools. It has nothing to do with the signifier. It must be useful. It must function. And not for itself. If no one uses it, beginning with the theoretician himself (who then ceases to be a theoretician), then the theory is worthless or the moment is inappropriate.

What puzzles me, then, is how a historian who has been "affected" by Foucault can unproblematically say, first, that wage-earners worked "where they pleased" and, second, that they "liked" having the right to do so.

Defert, it seems to me, does not feel "the indignity of speaking for others" when he writes that sentence, which does not make him feel sufficiently ridiculous. Indeed, there is nothing in the text to suggest that he even appreciates the difficulty. It is entirely unclear where he gained access to the likes and dislikes of nineteenth-century workers. It's probably not really a problem in the text we're talking about, of course, but, like I say, something about exactly that sentence made me stop and question Defert's authority (after having taken it for granted up until then). That's probably very much a consequence of the "crisis of representation". After all, before 1968, "under the ancient theme of meaning", historians simply had the right to say this sort of thing. Today, we expect them to struggle a bit more for the right to speak.

What I wrote a couple of years ago still applies:

"Modern thought," said Deleuze in 1968 (in his preface to Difference and Repetition), "is born of the failure of representation, of the loss of identities, and of the discovery of all the forces that act under the representation of the identical." Today, we normally call this "postmodern" thought. Deleuze probably meant "contemporary" or "our thinking today"; he was drawing attention to something that was only just becoming clear to philosophers at the time.

What we call "modern" (sometimes "classical", here "ancient") thought is born of a faith in representation, of the maintenance of identities, and of the repression of all the forces that act under the representation of the identical. "Repression" is a strong word. "Discipline" might be better. Modern thought takes representation for granted as an orderly process. It assumes that the forces at work under a representation are well-organized, that they can be trusted to dependably make one thing (the sign) take the place of (signify) another thing (the signified).

Wednesday, November 24, 2010

The Crisis of Representation 1

I've noticed that quite a few people come to this blog searching for information about "the crisis of representation". This morning I want to write the first of two posts on that subject, picking up on what I said in some posts from early 2008 (this one and this one).

In March 1972, Gilles Deleuze and Michel Foucault recorded a conversation that was then published in a special Deleuze issue of L'Arc as "Intellectuals and Power". At one point in the conversation, Deleuze says the following:

In my opinion, you were the first—in your books and in the practical sphere—to teach us something absolutely fundamental: the indignity of speaking for others. We ridiculed representation and said it was finished, but we failed to draw the consequences of this "theoretical" conversion—to appreciate the theoretical fact that only those directly concerned can speak in a practical way on their own behalf.

There is much to think about in this description of the intellectual transformation that took place in the 1960s in France. First, Deleuze describes the problem of representation as "fundamental"; second, he suggests that representation (speaking on behalf of others) is ridiculous; third, he rightly describes this as a "theoretical conversion" that implies a "theoretical fact"; fourth, that this theoretical crisis, however, must ultimately be faced in practice. Finally, notice that this last point is a consequence that they had originally (before Foucault "taught" them otherwise) "failed to draw" from their critique.

Since 1972, of course, much has happened; almost forty years have passed. Indeed, I was one year old when this conversation took place and have therefore, let's say, lived my whole life in "the crisis of representation". Reflecting on it now, it seems very accurate to describe representation as "ridiculous" or at least "ridiculed". It lacks dignity. Intellectuals much more often balk at the idea of representing things and people in their writing than they actually critique (in a Kantian sense) "the conditions of possibility" of speaking for them. It is as if they know, first and foremost, that representation is laughable or, more specifically, that the desire to represent and the presumption that we can do so are laughable. The crisis of representation is not so much a thought as a feeling. One feels ridiculous.

I feel ridiculous even now as I bring it up. Most of my adult life has been lived in awareness of the crisis of representation, i.e., against the background of "the indignity of speaking for others", and even of speaking for things. But I have always liked Bertrand Russell's description of Wittgenstein's central question in his introduction to the Tractatus Logico-Philosophicus. "What relation must one fact (such as a sentence) have to another in order to be capable of being a symbol for that other?" He wrote that in May of 1922, almost exactly 50 years before Deleuze and Foucault sat down to talk. Russell called it a "logical" question (as opposed to the psychological, epistemological, and scientific questions that also pertain to language). We might say that the "crisis of representation" in its modern form, i.e., the form that is familiar to us, arose when this logical problem was converted into a practical one. I think we still too often fail to appreciate "the consequences of this conversion"; in a sense, it is what Richard Rorty very aptly called simply "the consequences of pragmatism".

Foucault later says to Deleuze: "The two themes frequently encountered in the recent past, that 'writing gives rise to repressed elements' and that 'writing is necessarily a subversive activity,' seem to betray a number of operations that deserve to be severely denounced." As someone who struggles with the problem of writing every day, I have come to appreciate not so much the moral indignity (ridiculousness) of representation, nor even its theoretical impossibility (crisis), but, again and again, its practical difficulty, which was precisely what Deleuze and Foucault were talking about. "What relation must one fact (such as a sentence) have to another in order to be capable of being a symbol for that other?" I don't think Wittgenstein's question is ridiculous. But it is certainly not an easy one to answer, just as the problem it marks is not easy to solve. I hope my operations are not going to be too often denounced.

Monday, November 22, 2010

Errata

When I was an undergraduate I was a very poor scholar. I would reproduce ideas from classes and public lectures without consulting the books that were being talked about, which led me, for example, to misspell Gödel when talking about the incompleteness proof (it wasn't quite as bad as "Göbbels", but it tended in that direction). I once answered an exam question about the three major developments in military technology in the eighteenth century by carefully describing the three major developments in military technology in the nineteenth century (or vice versa; I don't recall). One that will always make me cringe was my attempt to compare mental processes to bodily functions, choosing the sufficiently shocking business of going to the toilet to illustrate it, but spelling "bowel movement" less shockingly as "bowl movement". My professor did not pass this mistake over in silence.

I once rushed to see a professor about a paper idea that had just come to me—the philosophical aspects of Macbeth's "is this a dagger I see before me?" He listened patiently to the idea for a few minutes and then interrupted me to ask what recent work in philosophy I was going to draw on to position my analysis in the literature. (That was the assignment.) "What have you read?" he asked plainly. I hadn't gotten to that part of the research yet and I was no doubt vaguely imagining I could skip the library on the strength of my "idea". I will never forget the pained look on his face as he buried it in his big hand and groaned. "This is so irritating," he said, and sent me on my way.

Formative experiences. They are easier to talk about at this remove, but remain with me as small pangs in my memory. They basically reveal my impulse to fraud and sham, to passing myself off as more knowledgeable than I really am. Cynical observers of academic life would of course have predicted a great career in scholarship for me. The ability to speak with confidence on a given subject, after all, i.e., the ability to overcome one's cognitive and epistemic insecurities and say something, is crucial to scholarly success. One must try to get things right, of course, but one must not be paralyzed by the possibility of being wrong.

Cyril Connolly talked about being "too vain to do something badly"; he said that "vanity inhibits us from facing any fact which might teach us something". I agree with him about this, although there is a sense in which my youthful arrogance had to be checked, later in life, by a middle-aged vanity. So now I cringe about other things I do badly, or at least not well enough.

In a recently published paper, for example, I reference a book published in 1999 as one that was published in 1993. A few years ago, in a working paper, I attributed an E.M. Forster quote to Lewis Carroll and then accused Karl Weick (who had not cited Forster or Carroll) of being unaware of the source. Last week, I told a seminar that David Braybrooke and Weick were colleagues at Cornell in 1964, though Weick was not there at the time and Braybrooke has never been. (I was thinking of Lance Sandelands and Karl Weick who are colleagues at Michigan and had been so when Weick published a paper in 1993 in the Administrative Science Quarterly, which is based at Cornell.) Finally, not long ago I read a paper that failed to reference a story by Franz Kafka properly. Because the story shared certain themes with another Kafka story, I assumed that they were getting that story very wrong (to the point of simply making it up), and I began to talk informally about the paper as yet another piece of poor scholarship in the managerial sciences. As it turns out, the story the paper was referring to does exist, so I found myself writing a few emails to retract my slanderous remarks.

Notice the different forums in which I made these mistakes—from published papers to water-cooler talk—and notice that the errors get less and less serious the more "published" my work gets. I'm not actually defending any of these mistakes as excusable "given the circumstances"; I am only saying that we should expect errors when our work is "in progress" and make our best effort to remove them as we go along. We need our peers and colleagues to help us. We must tell them what we think before we can see that we're wrong. This process, then, will hopefully keep the worst mistakes out of the journal literature. But when they turn up there as well, we must not be so vain that they cannot be corrected.

Friday, November 19, 2010

How to Tell the Truth

A PhD student recently explained to me that her difficulties with academic writing stem from her being "too honest". She was certain that the relevant journals would not publish what she "really thought" about the central issues of her field. This elicited what I still think is a profoundly insightful remark from me.

Maybe, I proposed, she simply does not know how to tell the truth. She may be as honest as she likes, but how does she know she is competent to speak truthfully about what she knows? She may want to tell the truth, that is, but she may not be any good at it. After I said it, it occurred to me that good academic writing is simply the truth well-told. That is, the art of academic writing is the art of telling the truth. Research, after all, is intended to discover truths and research writing is intended to write them down.

The naturalistic fallacy is the error of sliding from "is" to "ought" and we might say that the "intentional fallacy" is the error of imagining that there is an easy path from "ought" to "do". In literary criticism, after all, we make this mistake when we imagine that what the author intended to say, rather than what the author actually said, determines the meaning of a text. That is, we assume that the author succeeded in realizing the intention—that we can pass easily from "do" to "succeed". What we forget is precisely that writing is a difficult business and that we may fail to express what we honestly set out to say.

The fact that we know what the right thing to do is does not guarantee that we will do it, and certainly not that we will do it well. Honesty, i.e., the desire to tell the truth, does not guarantee mastery, i.e., the ability to tell the truth. You spend your studies developing that ability. Indeed, even late in life, when working on a particular set of problems, researchers are only in part setting out to "discover" the truth. Much of the work required of a particular research project goes into finding a way to speak it. It is by no means easy.

Here's a simple way to appreciate the difficulty. Suppose you are visiting a foreign country and you don't speak the local language. Obviously, your honesty will not get you by. That is why it is so useful to think of research as a second language; it reminds you that you are learning how to talk in a way that you do not already know how to talk. The honesty you manifest in your first language must now find expression in another one. It will take time before you can confidently say what you mean. Whether you intend to lie or speak the truth, you will need a new set of skills.

It is of course true that our personal opinions are not necessarily "publishable", but I think it is a mistake to think that our honesty, as such, is a barrier to publication. Rather, think of telling the truth in a particular area as a difficulty that your doctoral training is equipping you to face.

Wednesday, November 17, 2010

Strength in Prose

The downside of books—and blogs—about writing is that they leave the impression that there is something important to know about writing, that we (who know it) can tell you how to write well. People who have difficulty expressing themselves in writing come to feel, by the very existence of so much good advice about how to do it, that their problem amounts to not having been let in on the secret. Underneath their inability to write, that is, they imagine a profound ignorance. After I show them some little trick to getting it done, my authors sometimes say "Nobody ever told me that!"

While it is true that writing instruction (especially in Denmark) leaves much to be desired, it is important to emphasize that you do not learn what you need to know about writing by reading a book or listening to a teacher explain to you how a sentence, paragraph, or journal article "works". You learn how to write well by writing regularly, revising often, and presenting your writing to its intended audience for critique. Good writing is not something you learn but something you train; it is not so much knowledge as discipline that counts. People who "can't write" are not primarily stupid or ignorant. They are weak.

Your prose style, like your physique and posture, emerges from your training. People notice that you "write well" much like they might notice that you walk and stand with a certain kind of dignity, or that you are able to lift and move things with ease. Grace in everyday motion depends on having much stronger muscles than one "needs" for simple tasks, i.e., on being far from the limit of one's power when doing ordinary things, and these virtues of physical comportment (dignity, ease, grace) are of course virtues of style. Good prose, similarly, has a certain kind of strength.

The purpose of a sentence and a paragraph is to affect the reader's mind in some way, to "move" it. The writer pushes against the mental comportment of the reader, and the reader pushes back. While there are a lot of "tricks" and "moves" you might learn to "handle yourself" in this situation (to "write with power", as Peter Elbow famously put it), there is simply no substitute for the strength you develop by training, i.e., by practicing this ability to push against the mind of another. A strong prose style develops by repeatedly writing with a relevant audience "in mind", imagining how it will push back, and by presenting it to that audience often, i.e., letting it actually push back.

Though you may have much to learn, your experience of not "standing up" in this shoving match is not primarily an experience of your stupidity or ignorance. It is an experience of weakness. And weakness is relatively easy to overcome. You simply have to work on it every day. Easy does it.

Monday, November 15, 2010

Care and Control

In Canada, the law against drunk driving says that you are not allowed to be in "care or control" of a vehicle while intoxicated. This allows the police to enforce the law against people even if they are not actually driving. That is, it allows them to intervene before the driver actually becomes a danger. And, interestingly, though "care or control" does have something to do with the intention to drive, it can be attributed to a driver even if he or she insists on having been in the car for some other purpose.

Why is this important for writers? Well, if my hunch from last week is right, i.e., some writers are to their writing as some drinkers are to their drinking, namely, "out of control", then there is also a sense in which this is an issue of broader public concern. The writer who is unable to control the writing process is a "danger" to other writers. Drunks, after all, are notoriously "careless" about their surroundings, whether social or material. They hurt other people's feelings, and therefore not only their own social relationships, but the ability of others to form them. And they sometimes, of course, hurt other people physically. That's where the problem of drunk driving comes in.

Is undisciplined writing really just as bad as drunk driving? Of course not. Let's keep in mind that undisciplined drinking is not in and of itself a problem; the problem arises when undisciplined drinking is combined with serious activities (like driving) that can have consequences for other people. Writers who produce texts for publication are putting themselves "in care and control" of a text that other writers will have to deal with. If they are not actually very careful, or very much in control of their writing process, they are likely to make mistakes that will affect their peers. The most obvious example is plagiarism, arguably the most serious form of carelessness in writing, but we can also talk about everything from misreading to misspelling. All mistakes "count" in academic writing because they undermine the reliability and usefulness of a text.

This is why it is so important to give yourself conditions (in time and space) under which to write carefully. It is today illegal in many countries to talk on the phone while driving, for reasons similar to those behind the laws against drunk driving. It is about being a careful, controlled participant in traffic, which is a matter of public concern. The same goes for writing. While you are working on a text for publication, you should be thinking about that text very seriously, and not a lot of other things. Don't talk on the phone while writing a paper, needless to say. And you should not be under the influence of pressures (like promotion and tenure) in the moment (put that out of your mind while writing). You also shouldn't be drunk, which, unfortunately, does not go without saying. The best way to ensure these conditions is to control the space-time coordinates of your writing process. Know when and where you will be committing words to the page for publication.

That qualification ("for publication") is important. The writing that you want to have "care and control" of in this serious sense is your scholarly writing, i.e., the writing you are doing for others. Here you are building a particular kind of relationship, both social and material, with your surroundings, and you want to do this reliably, deliberately, carefully. There is a lot of writing that you can do much less carefully without causing too much trouble, mainly because it is read in the same spirit. Consider the difference between emails, blog posts, journal entries, and "thought writing", on the one hand, and the work you do on a book or a journal article, on the other.

The reason drunk drivers don't always cause damage is that they often have a very "routine" drive home, which offers nothing they need to react to quickly. There are no critical moments, so they don't need their critical faculties to be functioning at the top of their game. The same is true of a text you are writing for publication. The weaknesses of a sloppily written text may not be apparent on a first reading or even during the review process. But once other scholars try to engage with it in a critical, detailed manner, issues will more than likely turn up. It may not cause any harm, of course, mainly because the writers around you may just start keeping their distance. Indeed, you are making an extra demand on their critical faculties.

Keep in mind that when you are writing with the intention to publish you are "in care and control" of something that other people have an interest in. Take care. Be in control. Or you will make others wary.

Friday, November 12, 2010

Control

I tell scholars that they should write every day, and that they should do so according to a schedule that is made well in advance. They should know for how long they are going to be writing and what they are going to be writing about every day of the week to come. Today it is Friday, so before the day is through, I should have looked at my calendar and determined exactly when (Monday, 9-10, Tuesday, 9-10, etc...) I'm going to write, as well as what section of what paper I will be working on. Ideally, I should know what claim(s) I will be supporting in prose during each of a total of between 1 and 15 hours next week.

When they hear this, people sometimes tell me I am naive about the amount of control one can have over one's time. Some say that the requisite amount of control is only available during a sabbatical, others say that it is certainly not available while you're engaged in field work, or during periods of intense teaching. To that, I say, "Nonsense!"

Most of the people who tell me they can't control when they will have time to write lead perfectly bourgeois, perfectly decent, perfectly middle-class lives. They get up in the morning. They get showered and dressed. They bring their children to daycare or school, they spend the day at work, where they meet their social obligations with a high degree of commitment and professionalism (they make appointments and they arrive at the appointed times and in the appointed places). "And when the day is through [they] always hurry to a place where love waits, [they] know." They pick up their children from school or daycare. They bring them home. They eat dinner at a decent hour. They get their children to bed on time. And then they go to sleep. Nonetheless, they claim that they "can't control" when they are going to write and what they are going to say.

The other day, I discovered a passage in Karl Weick's Social Psychology of Organizing (1979) that gives me a bit of insight into why people say this. "G. W. Bateson," he says, "has argued that one of the major insane premises of Western thought is that we have self-control. He illustrates this by discussing alcoholics" (page 87, my emphasis). Now, that isn't quite true. Bateson's paper offers "a theory of alcoholism" (that's its subtitle), not a theory of Western insanity "illustrated" with an account of alcoholism. But it is quite telling that Weick's general claim, namely, that self-control is an illusion, should be modeled on the delusions of self-control that alcoholics have. Leaving aside whether Bateson's theory still holds (it's forty years old), I would hesitate to generalize the fact that alcoholics can't control their drinking into a general "[falsification of] the idea that one is the captain of [one's] soul" (Weick 1979: 88). A theory of alcoholism is not a theory of mind—except, of course, for the alcoholic.

But here's the troubling thought. What my authors are telling me is that they are as much in control of their writing as alcoholics are in control of their drinking. It raises an intriguing but disturbing possibility. Some people (alcoholics) should stay away from drinking altogether because they simply can't control it once they begin. Can the same be true of writing? Do bad writing habits develop like a drinking problem? Do they develop into a lifestyle and then a pathology? We normally begin while we're in school. All this is worth thinking about. At some point, after years of binge writing and weekends spent letting our texts spiral out of control, the would-be scholar may have to give up being a writer. It may simply no longer be possible to control the process.

Wednesday, November 10, 2010

Trenchant, Salient

Some words I use so rarely that when I finally put them in a sentence I'm unsure what exactly they mean, though I feel vaguely like they belong there. This happened yesterday when I was writing an email. The sentence went something like this: "After deciding that the critique is trenchant, the editors should send it out for review." What I meant was some combination of "relevant and supported by argument" or, more colloquially, "spot on", and I suppose the context of the mail will make that clear. But then I looked it up and (as I had suspected) it turns out that "trenchant" means something quite different: "1 (of a style or language etc.) incisive, terse, vigorous 2 archaic or poet. sharp, keen" (Concise Oxford Dictionary). My use of the word arguably cuts across the two senses: something like "incisive and keen". Etymologically, "trenchant" can be traced back, through "truncate", to the Latin for "maim". A trenchant critique is one that seriously wounds the object, not just one that hits it squarely, as I had hoped.

There's a song that uses the word correctly, though without teaching us its meaning:

Once I said, and I quote, I just read this thing that you wrote in college,
A trenchant critique of anthropology being accepted as a social science
And not the art of educated observation.

If one did not look it up, one might take "trenchant" to mean simply "solid", "on target", "relevant" or "timely". It means "cutting" but not quite "scathing", assertive but not yet shrill.

Another word that is sometimes used as though we know what it means, but which we haven't quite understood, is "salient", which means "jutting out; prominent; conspicuous; most noticeable". Here, again, because of the way we hear the word used, we sometimes think it is merely a term of praise, not also an objective characterization. We talk of making "salient points"* in a discussion or constructing a "salient argument". Again, our impression is that the word simply means "relevant" or "good" or "valid". A salient point or argument is actually one that "jumps out at you" or "sticks out". The etymology is useful here: from the Latin for "leap" (salire).

"If you plan to use 'colubriform' in public," Hugh Kenner warns, "you'd best devote fifteen minutes to making sure it really means what you want it to." The same goes for words like "trenchant" and "salient". Look them up in a dictionary and try to find out how they are used by people writing in your field. Use Google Scholar and try to notice how these words appear in context. Sometimes they are, of course, used incorrectly even by scholars. But often, with the dictionary definition in mind, you can suddenly see that they don't just mean what you thought they meant. When Jones says of Smith's argument that it is "salient", he doesn't mean that it is good (though he often thinks so too). He means the argument leaps off the page, that it stands out from the other arguments that might be cited.

____________
*Note that this can have a technical meaning in statistical analysis too. That's not the sense I'm after here.

Monday, November 08, 2010

Textual Promiscuity

To be promiscuous is to "mix forth" in your relationships. This morning I want to talk about how scholars sometimes mix their writing forth, or "go forth mixedly" when writing, if you will. Last week, we looked at the Frank Fischer plagiarism case, which I think offers a good example.

Suppose it is true, as Fischer and his peers say, that he was merely "sloppy". What does that mean in this case? Well, it means that he has mixed together words and ideas from multiple sources without marking them properly. As Petkovic and Sokal show in their report (PDF, p. 44), Fischer takes two paragraphs from Giandomenico Majone's Evidence, Argument, and Persuasion in the Policy Process (1989) without even citing them, let alone quoting them. Here's what Majone wrote:

Argumentation differs from formal demonstration in three important respects. First, demonstration is possible only within a formalized system of axioms and rules of inference. Argumentation does not start from axioms but from opinions, values, or contestable viewpoints; it makes use of logical inferences but is not exhausted in deductive systems of formal statements. Second, a demonstration is designed to convince anybody who has the requisite technical knowledge, while argumentation is always directed to a particular audience and attempts to elicit or increase the adherence of the members of the audience to the theses that are presented for their consent. Finally, argumentation does not aim at gaining purely intellectual agreement but at inciting action, or at least at creating a disposition to act at the appropriate moment. (p. 22-23)

(…) A logical or mathematical proof is either true or false; if it is true, then it automatically wins the assent of any person able to understand it. Arguments are only more or less plausible, more or less convincing to a particular audience. It has also been pointed out that there is no unique way to construct an argument: data and evidence can be selected in a wide variety of ways from the available information, and there are several alternative methods of analysis and ways of ordering values. (p. 32)

In Reframing Public Policy, Fischer writes:

Whereas a logical or mathematical proof is either true or false (and if it is true, purportedly accepted by those who understand it), practical arguments are only more or less plausible, more or less convincing to a particular audience. There is, moreover, no unique way to construct a practical argument: data and evidence can be selected in a wide variety of ways from the available information, and there are various methods of analysis and ways of ordering values.

Practical argumentation thus differs from formal demonstration in three important respects. Whereas formal demonstration is possible only within a formalized system of axioms and rules of inference, practical argumentation starts from opinions, values, or contestable viewpoints rather than axioms. It makes use of logical inferences but is not exhausted in deductive systems of formal statements. Second, a formal demonstration is intended to convince those who have the requisite technical knowledge, while informal argumentation always aims to elicit the adherence of the members of a particular audience to the claims presented for their consent. Finally, practical argumentation does not strive to achieve purely intellectual agreement but rather to provide acceptable reasons for choices relevant to action (such as a disposition to act at the appropriate moment). (p. 190)

Notice that, in addition to appropriating the content of whole passages from Majone's book, Fischer appropriates Majone's style. For example: "...practical arguments are only more or less plausible, more or less convincing to a particular audience." That's nicely written, and flows much like speech. It is also exactly the way both texts put the point. Fischer takes the writing credit for himself. (All references to Majone are made well away from this passage in Fischer's book, and never to the pages he here draws on; that is, he does not cite Majone for these words in any way.)

This way of writing cheapens the relationship between Fischer's body of work and Majone's. It shows that Fischer is mixed up about his textual identity, that he goes from one text to another and learns from it in a merely superficial way. He uses other people's writing in his own texts, but he doesn't let the encounter transform his understanding of the subject. Any sincere reader of Fischer, i.e., one who is going to let the encounter with a text "kick [his or her] ass with its transformative power", as Jonathan Mayhew once described his love of literature, will feel cheapened too. Cheated. After all, consider the reader who thinks Fischer just "nails it" here, i.e., really captures the difference between argument and proof. Suppose he or she develops a profound respect for Fischer ("falls in love" with his work, let's say). Now suppose s/he comes across Majone's text. What will happen to the respect s/he had developed for Fischer's style? S/he would feel like taking a shower, I suspect.

It does not matter, of course, what Majone thinks about this case. He may be "fine with it". Indeed, he may be as promiscuous as Fischer (as Petkovic and Sokal in fact suggest in a footnote). And the whole field may even have loose textual morals (as both Fischer and his supporters find themselves almost arguing in his defense), but that does not affect the point that the textual morals in this case are, well, loose. Majone may have known about Fischer's plagiarism for years and simply not cared. They may even have laughed about it. Or they may look at each other, today, like two colleagues who had a drunken fling many years ago at an office party, whose spouses are the best of friends, etc. Promiscuity is an issue not just for those who engage in the proximal textual act, the two writers involved. It is an issue for the whole community of readers because it interferes with our ability to form deep, lasting relationships with texts. That ability is called trust.

(More thoughts on why this matters here and here.)

Friday, November 05, 2010

To Discipline and Bully

Frank Fischer is a plagiarist. This is not something that Kresimir Petkovic and Alan Sokal are merely "alleging", nor is it something that they are "accusing" him of. It's a fact that they have demonstrated. It is a fact, for example, about pages 26-7 of his 2000 book Citizens, Experts, and the Environment, easily verified by reading them alongside page 139 of Alan Sheridan's Michel Foucault: The Will to Truth. That's the example that Tom Bartlett cites in his article on the case in the CHE, based on Petkovic and Sokal's report of their investigations. It clearly shows that Fischer has passed off Sheridan's work as his own.

Frank Fischer is also a highly respected scholar of public policy. Predictably, therefore, his reaction to the publication of the facts was shameful. I use that word advisedly: he is clearly full of shame and rightly so. He ought to be ashamed of himself. In fact, his reaction to the mere possibility that the facts about his scholarship would be published was already shameful. When Petkovic submitted his findings to a public policy journal, which then sent them along to Fischer, Fischer threatened to sue Petkovic. As Petkovic rightly pointed out, it would be much more appropriate to sue the journal that made the editorial decision than the scholar who made the discovery. In any case, we'll see whether he makes good on that threat. If he does not, he has exposed himself as a desperate bully who hoped that a clearly weaker party would cower before his (phantasmagorical) projection of power.

Fischer's immediate reaction after publication, which was primarily to call the motives of Petkovic and Sokal into question, is also shameful. The facts speak for themselves and no personal motives are needed to explain going public with them. I, for example, didn't know who Fischer was until last night. I am now confident in calling him a plagiarist. My motives in doing so are strictly impersonal. I simply don't care why Petkovic and Sokal went after Fischer. They have chosen a perfectly good target (a scholar of some stature and influence in a field of some importance) and made a solid case for the shoddiness of his work. Petkovic and Sokal's work (which has taken a considerable amount of their time) makes us more informed readers of Fischer's writings, and, indeed, more informed analysts of policy. Their work is especially useful for PhD students who are struggling with the relationship between Fischer's analyses and the vast and often perplexing influence of Foucault on the social sciences.

Also predictably, the reaction of his scholarly community is shameful. Instead of adjusting their opinion of him in light of the evidence, they have distorted the evidence with a typical "nothing to see here, move along" sort of response, intended to assure Fischer's readers and students that everything is in order. "The essence of plagiarism is passing off someone else's work as your own," they say, "and Mr. Fischer did nothing of the sort: He clearly named the authors whose work he was drawing on." As Sokal has rightly pointed out, this is straightforwardly untrue. Fischer clearly did something of the sort, and there can in fact be little dispute that he did something exactly like, say, passing off Sheridan's reading of Foucault as his own. Shame on them for lying in his defense (or, if this is not a lie but just a mistake, shame on them for defending him without looking at the basis of the charge). Shame on all sixty of them.

Fischer and his colleagues believe that Petkovic and Sokal should have dealt with this behind closed doors, using "due process". But Fischer had every opportunity to do all this in the privacy of his own office when he was informed about Petkovic's discovery; at that point he could have begun the painstaking process of checking through his self-admittedly "sloppy" work instead of making someone else do it for him. He could then have published a full re-evaluation of his own work. Certainly, no argument can be made for keeping what Petkovic and Sokal discovered about Fischer's books to themselves, or between them and Fischer.

This idea that charges of plagiarism are a "distraction" from real scholarship is truly outrageous. Identifying plagiarism, i.e. one kind of relation between two or more texts, is an act of practical criticism. To see this, consider the claim made by his defenders: "Fischer is an innovative thinker who has made a major contribution to the analysis of policy." This is a claim about Fischer's body of work relative to his field. It is therefore a claim one can only make on the basis of an adequate reading of his work that evaluates not just the quality of the thoughts he expresses but their originality, i.e., the "innovation" they suggest and "contribution" they make. The kind of detailed textual analysis Petkovic and Sokal provide is exactly the sort of reading one is (implicitly) committed to when one assesses the originality of a scholar.

What they have done is what I believe I have done (and am doing) in the case of a scholar in my own field. I have described this work as an attempt at what Harold Bloom calls "an adequate practical criticism" of "the anxiety of influence". In that light, it is interesting to note Fischer's alarm over the existence of this sort of scholarship:

[It] may do serious damage to academic culture. These are actions that could create a new environment of distrust and fear, in which self-appointed arbiters of scientific integrity initiate witch hunts against certain individuals and ideas in the name of righteousness. Many find it regrettable that The Chronicle of Higher Education has unwittingly facilitated this approach to academic judgment. One hopes that the anxieties about the impact of such attacks will not become a regular part of academic life. (My emphasis)

There is nothing arbitrary or self-appointed about what Petkovic and Sokal have done. They have discovered something about a number of texts and presented those results to the public. That's simply scholarship. Indeed, it would have resembled a "witch hunt" much more closely if they had gone directly to Rutgers University with their charges. Sokal has taken the opportunity to make a very important point, with which I fully agree:

Why did Kresimir Petkovic and I reveal our evidence to The Chronicle of Higher Education rather than, say, to the president of Rutgers University? The answer is that plagiarism is not principally an offense against one's employer—or even against the person whose words are plagiarized—but is rather an offense against the ethical norms of the scholarly community as a whole.

In fact, one hopes that this "anxiety" will provoke stronger readings (even if they must be misreadings) of Foucault in the future. Passing off another's twenty-year-old paraphrase as one's own "innovative" reading of the master will simply not do. It is shameful that a full professor at Rutgers, who threatened a graduate student in Zagreb with legal reprisals, now casts himself as the victim of "righteousness" when an equally established and respected professor steps in. Fischer is quite right that this is an "academic judgment", and Sokal and Petkovic are simply right to make it. Those who live in "fear" of such criticism do not belong in the academy.

Sorry Dr. Fischer, we cannot allow bullies to present themselves as victims. It makes for bad policy.

(See also Jonathan's post on this. And here's a follow-up on the anatomy of the "scandal".)

Wednesday, November 03, 2010

XXX

What thou lovest well remains,
the rest is dross
What thou lov’st well shall not be reft from thee
What thou lov’st well is thy true heritage. (Ezra Pound)

I don't know if you've ever noticed how difficult it is to find an honest, useful representation of human sexuality on film. Mainstream movies, of course, coyly cut away at the decisive moment (wherever its sought-after rating draws that line), leaving the rest to our so-called imaginations, albeit only after they have been dimmed, like the lights, with romance. Pornography is certainly "explicit" about the sexual act, but it is of course just as much a lie about what it's really like. The internet and developments in personal videography have fostered a large quantity of amateur pornography, which is either amateurish about sex or about making pornography or both. (It's either honest incompetence or incompetent lying or both.) The point is, it is very difficult to find a movie that tells you "how to do it" with any credible authority.

This blog is an attempt to be explicit about our textuality. It offers XXX, hardcore representations of writing. But it is also an attempt to be useful and accurate. In a word, I am trying to be honest. But is honesty really possible about such a personal matter as textuality? Will I not always offer only a fantasy (and worse, my fantasy) about what it is like to write? Indeed, am I even describing how I write? Moreover, is honesty desirable? Is our enjoyment of text not undermined by too much explicit talk about how it gets made? Should writing be approached as a craft, something one can be good at through practice? Is it not necessary to respect the romance of writing?

Leonard Cohen once wrote a poem inspired by the passage in Pound's Pisan Cantos in the epigraph to this post. It is about the reunion of two lovers, after years apart, in a hotel room. He talks about how their "outrageous hope and habits in the craft ... embarrass us slightly as we let them be known". He notes that they "no longer follow fashion" (in dress) and that "we own our own skins". This is the expertise, the craftsmanship, of "the perfect inflammatory word". Interestingly, Cohen is here re-appropriating the craft of love for lovers, which Pound had borrowed for artists, i.e., lovers of craft.

I don't believe that the moralist's and the pornographer's visions, the two opposing banks of mainstream "romance", are helpful, though they are unavoidably involved in our sex lives. (Do I believe the "theorist" is a moralist and the "methodologist" is a pornographer of academic writing? More later.) But there will always, of course, be some important difference between how you "do it" and what others tell you it is like. The trick is to appropriate your own skin. That is the sense in which textuality will always be a personal matter.

Monday, November 01, 2010

How to Write for Whom

"To know whom to write for is to know how to write."

I just noticed that Jonathan has added this sentence from Virginia Woolf's "The Patron and the Crocus" to his list of maxims. It is something that I've had the occasion to emphasize to the PhD students I work with many times. Some of them leave the question of what field (and often fields) they are making a contribution to open far too long. This no doubt stems from certain anxieties about the "originality" of their contribution, but it forgets that their research question didn't emerge in a vacuum. It arose from a set of problems that others are already working on. Those others are, provisionally, the "who" that you are writing for.

I usually say that the problem of writing, whether it is a paper or a dissertation, is defined by the one-sentence answers to two questions. What do you want to say? Who do you want to say it to? Much of what Robert Graves called "the huge impossibility of language", i.e., the difficulty of its use, stems from not knowing these answers.

"But how to choose rightly? How to write well? Those are the questions."

They are by no means easy to answer. Woolf's maxim is interesting because it suggests that once you know what you want to write, the "who" gives you your "how". Truly knowing who you are writing for, however, is not merely a matter of learning a few key references, naming the field, choosing a handful of journals, and attending a few annual conferences. You don't just have to know who your readers are, you have to know how they will read your work. To know how you will be read, we might say, is to know how to write. And you know how you will be read by familiarizing yourself with the sorts of readers you will have.

Unlike the public writer that Woolf's remark was addressed to, academic writers have a straightforward way of learning what their readers are like. Their readers, after all, are themselves writers, and they are exactly the kind of writer that the academic writer aspires to be. They are not all master stylists, and they are not all equally interesting to the writer, but their writings give us access to the expectations of readers, and understanding those expectations is the key to learning how you will be read. Your style develops as your understanding of your audience develops.

You don't have a paper to write if you have nothing to say, but you don't know how to write the paper if you don't know who you will say it to. The luxury of my job as a writing consultant (not a supervisor) is that I can demand answers to the questions of "what" and "who" before I can offer my services. I can't really help people who haven't settled those issues. (A supervisor's job, by contrast, is to help you settle them: to help you clarify what you want to say and help you understand what your readers expect of you.) All I can do is make it clear to writers that the reason they are struggling in their writing, the reason they feel they don't know how to write, is that they are unfamiliar with their reader.

A word of caution, however. One of the reasons that some people never get their dissertations written is that this problem of who the reader is remains the problem throughout the writing process. By equating the "how" and "who" so strongly, Woolf might mislead us into thinking that the only suitable response to not knowing how to write is to go off and read some more, i.e., to familiarize ourselves with our readers, i.e., to not write. This is a mistake for two reasons.

First, it too often sends us off into the reading of works by people who are not even possibly our readers: namely, the major (and often dead) figures of our field or, worse, of the Western canon. Telling yourself that you are developing your knowledge of how to write by reading people like Deleuze and Foucault, Kant or Leibniz, Weber or Marx, is, well, kidding yourself. What you should do is read current work in the journal literature. That's where your audience is to be found. Second, and more importantly, how do you know that you don't know how to write? The only way to know that is to be writing every day. It is by writing every day that you come to understand the difficulty of writing, and it is by reading published work in your field that you learn ways of facing this difficulty.

That is, Woolf's maxim works both ways. To know how to write is also to know whom to write for. You write yourself towards your reader, and you read yourself towards your writer.

Friday, October 29, 2010

A Critique of Pure Ressentiment

One of the most joyful messages available in the work of Deleuze and Foucault urges us to judge power, not by what it oppresses, but by what it produces or fails to produce. To value that which power oppresses is precisely the formula of ressentiment. It gives the game to power because it focuses our attention on the obsessions of power. Thus, "free love" did not so much liberate love from oppression as valorize the perversions that power, in its clumsy way, produced. Do not understand that too quickly, friends. Think about it.

If we instead looked at what power actually produces, at its effects, we would be in a much better position to choose our authorities in accordance with our own values. We would subject ourselves only to those configurations of power that gave us the strength to accomplish our goals. We would engage joyfully with alternative powers in becoming, not simply resent the "powers that be". We would understand, that is, that power is multiple, always in a position to be challenged, transformed, possessed, if always only in part.

Scholars who think that "the professional journals" are one thing, with one power, under the auspices of one authority, live very impoverished intellectual lives, doomed always to resent the "demands" that "academic life" makes of them. Scholars who engage with the community of scholars more joyfully, respecting both its authority and its multiplicity, learning both from acceptance and from rejection, always developing the talent for thinking clearly, and speaking truthfully, well, needless to say, such scholars live happier lives, writing more productively, and, I would argue, much more effectively, which is to say, better.

This week I have been trying to push back against Levi Bryant's advice for writers, the core of which is aptly summarized in his title "You Can’t Write Before You Write". It is strange that I should object to advice that I would seem, at first pass, to agree whole-heartedly with. Here's what Levi says:

Writing produces the imperative to write more. This is because, as you write you discover new themes, new concepts, and things that need to be worked through. Like a growing crystal, writing expands. In my view, one of the biggest mistakes aspiring writers make lies in trying to write before you write. By this, I mean that many writers, myself included, try to have their ideas before they write their ideas. But things just don’t—at least for me—work this way. Now, of course, just as you need a seed to form a crystal in a supersaturated solution, you need a seed to start writing. However, the seed is not the idea. The idea is something that only comes into being in the process of writing. It is not something that is there prior to writing. The point is not to have the idea before you write, but to allow the idea to emerge in writing. And once you’ve produced a lot of chaff, you then get to the arduous work of polishing and organizing. In this regard, it is a necessity to write obsessively and all the time. This is where ideas are born, not before the act of writing.

Don't wait to have an "idea", says Levi; write all the time. Why does someone (me) who says "write every day" (like every writing instructor) object to this? That is what I would like to explore this morning.

First, do not write "obsessively" and do not write "all the time". Write for a few hours every day according to a plan; write in a calm and collected way. Write responsibly. Second, it is not as true as it sometimes seems that ideas "only come into being in the process of writing". Ideas come into being when you least expect it, often quite unconsciously, and are always there in advance of the writing. Your writing simply presents your ideas. When writing, you are writing your ideas down, ideas you already have. It is true, as Levi says, that you shouldn't wait until you are aware of your ideas to write (you should write simply when your schedule tells you to) but the writing does not "give birth" to your ideas, it merely shows them to you.

This is especially true of academic writing, where you write, quite literally, "what you know". If you sit down every day and write down what you think for two hours, i.e., write about the ideas you already have, instead of forcing yourself "obsessively" through the barrier of your ignorance, then your knowledge will grow in a natural way. The next day, you can sit down, without ressentiment, and do it again. The tree does not "overcome" itself when it grows; it just grows.

Trying to have ideas as you write (trying to give birth to them in writing) is as unproductive as trying to "write before you write". Levi replaces one joyless imperative with another: don't try to write before you write but do "write obsessively and all the time". Write, he says, to give birth to ideas. No, I say, write to make your ideas clear to you, and to your peers.

The "demand" of your peers is not that you write something but that you present your ideas clearly. This is part of the quality control system of the academy. When you are not writing, after all, you are an authority on your subject. Levi is an authority on Deleuze's philosophy and no doubt teaches his students what Deleuze thought. When Levi writes as an academic (for publication in academic journals) he is presenting his ideas to people who are qualified to correct him where he is wrong, people who know roughly as much as he does about the subject and are therefore able to contribute to the development of his thinking on the subject. To resent this "demand" for clarity, this requirement that we open our thoughts to qualified critique, is to resent the basis of our own authority. It is to abhor the sound, if you will, of our own voice as scholars. We should write to find out what our ideas look like and then submit those ideas, once clarified to our satisfaction, to review by our peers. If they pass the preliminary review, they are thereby exposed to critique.

Deleuze and Guattari say somewhere in Anti-Oedipus that "there are no contradictions, only degrees of humour," and somewhere in my reaction to all this, I hope it will be clear, there is some good-natured disagreement, not just a surly rejection of Levi's position. While we no doubt have different senses of humour, my issue here is with the advice he gives as a writer. I don't think it is sound at all. It is largely the opposite of the advice I give, mainly because I've seen what his attitude towards writing can do to perfectly promising writers and scholars. He is proposing, not to dismantle your ressentiment, but to ratchet it up into a full-blown obsession.

I'm not unaware of the difficult rhetorical space this topic occupies. Writing processes are highly personal matters, and Levi's readers, some of whom we meet in the comments, are right to admire his forthrightness about how his process works. Levi has made a certain attitude (even philosophy) of writing available for critique, and I have exploited that opportunity, now, for all it's worth. But I would caution against taking criticism of his approach (with which a lot of people identify) personally. Indeed, Joseph Goodson's comment (#14) offers a good indication of how difficult this conversation can be. Jonathan had said (#11), as I have, that he "totally disagrees" with Levi (albeit only on a particular point). In response, Joseph offers the following sarcastic retort:

The best way to start a conversation on the internet:

“Totally disagree with this.”

Also good are: “You’re completely wrong,” “what were you thinking?” and “you’re an idiot.”

Ah, the internet.

It is, of course, a version of "Who let this asshole into the conversation?" Notice that Goodson here equates the rhetorical effect of "I totally disagree with you" with the rhetorical effect of "You're an idiot". Jonathan was starting from a position of genuine, if complete, disagreement. Goodson is saying that Bryant should take offense rather than engage with this disagreement. Ah, the internet, indeed!

The "tics and phobias" Levi has bravely, although, I suspect, somewhat self-righteously, presented for us have helped to bring an object lucidly before us. He has, if you will allow me to wax Kantian for a moment, made it possible to investigate the conditions of the possibility of academic ressentiment. I hope only that I've made some small contribution to a critique of this pernicious sentiment, which, as Levi rightly says, seems to be founded on a kind of "transcendental illusion".

PS: Much of the traffic on my blog this week has come via this post at Perverse Egalitarianism. Thanks for the plug, Michael.

Wednesday, October 27, 2010

Authority and Originality

Kierkegaard wrote "without authority", "proprio Marte, propriis auspiciis, proprio stipendio". He never held an academic post and produced one of the most original bodies of work in the philosophical canon. He is rightly celebrated as an outsider who had enormous influence, a "scoundrel", if you will, and he himself identified strongly with Socrates, who of course also worked without institutional authority. What is sometimes forgotten, however, is that Kierkegaard strongly resented his exclusion from academia.

In continuation of Monday's post, I want to talk about the relationship of Levi's ideas about authority to his ideas about originality, staying focused all along on the problem of writing. Levi rightly talks about his views here as "tics and phobias" and as "an impediment to writing", but he manages to reconstruct these "fractal like symptoms" as signs of deeper virtues. Such a reconstruction (of weakness as virtue) is, of course, a classic operation of ressentiment. Let's see how it works here.

"In my core," Levi says, "I am profoundly anti-authoritarian, suspicious of any groups, and resistant to any demands." But he is not, of course, arguing that in order to become a more productive writer, a better writer than he, and a happier one, one should get over one's anti-authoritarianism. Rather, he is arguing that one should avoid writing in genres that are governed by authority. One should write blog posts and letters, and blog-posts-cum-letters. In a slogan, epistles not articles. (That, by the way, is what we're doing right now.) When one does write an article or a conference paper, i.e., when one does pretend to be an "author", one should "trick" oneself into thinking that one is really writing a long letter. If it is published, so be it, one still wrote it without authority.

(I am, of course, trying to draw the standard connection between "authority" and "authorship", which is also a running theme, if I recall, in Kierkegaard's The Point of View for My Work as an Author.)

The same insistence on not treating the disease beneath the symptoms of ressentiment can be seen in how Levi talks about originality. He begins with a clear sense of the weakness of his attitude (qua point of view for the work of writing):

[T]he biggest issue I struggle with when it comes to writing is originality. Am I saying something original? Do I have something original to say? The pursuit of originality, I believe, is one of the most paralyzing things for writers and among the greatest impediments to writing.

But after a digression on kudzu (a weed, whose growth he likens to writing), he reconstructs this impediment (namely, the pursuit of originality) as something altogether more productive:

[W]e suffer from a sort of transcendental illusion. We (or I) think to ourselves that if we have an idea it can’t possibly be original precisely because the idea is familiar to us. It is not new to us. But writing is not for us, but for others, whether those others be our own future selves or the self we are becoming in the act of writing (writing has the magical power to remake you) or for the others who might read our scratchings on bit of napkins. On the other hand, originality cannot be anticipated. If originality could be anticipated it wouldn’t be originality. Rather, originality follows the logic of Lacan’s tuche or chance encounter. Originality is something that occasionally takes place, but if it does take place it can only be known as having had taken place, it can never be experienced in the moment. We only ever know that originality has taken place retroactively. As a consequence, it’s important to surrender the desire to anticipate originality so as to clear a space in which the event or chance occurrence of originality might take place.

Don't worry, that is; you are probably more original than you think. In fact, don't think about it too much. Let it happen. If it does, it will happen "by chance". If people take your "scratchings on bits of napkin" as sure signs of genius, let them. But don't try to make that happen. Rather, trust that it will. I'll grant that that last bit of "faith" is not made explicit in Levi's post, but it is, I think, part of the attitude I'm trying to get at here.

There is another view of originality, of course. It ties the novelty of one's contribution directly and constructively to a respect for authority. In your field, especially as you enter it (note that Levi was unable to respect authority even in school), there are people who do the things you want to be able to do well, people who do those things much better than you. Your aim should be, first and foremost, to master the skills that they master, to learn what they know, to develop your talents in imitation of theirs. Why would you worry about, or even valorize, originality before you have attained basic competence? Indeed, the desire to be original, which, like I say, is transubstantiated by ressentiment into the presumption that you already are original ... or not, but the question is in any case out of your hands ... is too often simply an unwillingness to learn, to study, to pass through the humble (and, for some, humiliating) experience of apprenticeship.

Notice the problem with Levi's position: "originality" is relativized entirely to "the other" who reads your work. To be original, then, all you have to do is find a sufficiently ignorant, sufficiently incompetent audience. This may include your "future self", i.e., your own self once you have forgotten what book you just read your most recent brilliant idea in. The other view of originality holds you to a higher standard—the standard that is defined by the best work currently being done in your field. You should seek out those living masters, these authorities, and study, yes, under them, if only virtually, by reading them, and, importantly, by submitting your work humbly for review in the journals where their work is published. For most people at an early stage, enrolling in a middling university will do. The "trick" is to have some respect for those who already know what you are just beginning to understand. Do your assigned reading out of respect for this knowledge. Don't resent the accomplishments of those you aspire to become.

[Update: "The path to originality is to forget about originality, like Pierre Menard. Originality is tiresome if it is sought after, courted, forced.//Be derivative, like Robert Duncan" (Jonathan Mayhew).]