Saturday, November 2, 2013

How Not to Be an Asshole: Search Committee Edition


I wrote before with some advice for job candidates on surviving the search process while preserving a modicum of sanity. Now that the work of my own search committee is underway, I've been thinking about what search committees can do to help job candidates in their quest for sanity. There's a lot about the search process that inevitably produces anxiety for candidates. When every job ad pulls in 100-250 applications (more for some 20th-century jobs), it's inevitable that good people will get cut even in the first round, many for reasons that have less to do with the quality of their work than with their ability to demonstrate that quality in the c.v. and/or in the language of the job letter. This isn't the fault of individual search committees: it's built into the current structure of the academic job market. But, like so much else about academic labor, a large part of the suffering experienced by candidates on the market is a product of numerous small indignities inflicted on them, too often needlessly, by search committees. It's the death of a thousand cuts. Small changes in the way those of us on search committees interact with candidates could help those candidates a lot, just by making them feel like we view them as human beings.


Communicate with candidates. When you're searching for jobs, a million anxieties about procedure fill your mind. Did I send that letter to the right address? Did I address it to the right person? Did I send everything that the ad asked for? In the digital age, it's often easier for candidates to double-check these things: you can look back through your email or check your Interfolio receipt. But the digital age adds an additional fear: did the files I sent actually get there? At bare minimum, then, search committees should acknowledge all materials they receive. It's not hard to write up a quick, 2-3 sentence form letter that can be copied and pasted into response emails. In case anyone thinks this is too much work, I'll make it easy for you:

Dear [Candidate],

Thank you for your interest in our department's job opening in [name of field]. We have received all required materials for your application/In order for your application to be complete, we still require your [name of missing materials]. The search committee will be contacting candidates from whom we would like additional materials within [timeline].

Sincerely,
[Whoever's in charge of the emails]

Cut, paste, rinse, repeat. Similarly, if you know you are no longer interested in a candidate, let them know, so that they can stop hoping. While you know pretty well by the time March has rolled around with nary an email from a search committee that you probably won't be getting that job, there's always something particularly demoralizing about the flood of rejection letters that pours in after the campus visit window has closed. I know that some HR departments forbid search committees from sending out rejections until a candidate has accepted an offer, but I suspect this isn't the case at the majority of institutions. If your school doesn't have a rule against sending out rejections, then you can make it easier for job candidates to manage the emotional toll of rejections, and the terrible anxiety of not knowing, by sending a simple rejection email as soon as you know you will not be pursuing a person's candidacy.
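For committees whose pool makes even copying and pasting feel onerous, here is a minimal, purely hypothetical sketch (in Python) of how those form emails could be generated in bulk. It assumes a spreadsheet the committee would have to maintain--an applicants.csv with name, email, status, and missing columns--and it only prints the messages so a human can review them before anything is sent; it doesn't reflect any particular HR system.

import csv
from string import Template

# Hypothetical applicants.csv columns: name, email, status, missing.
# status is "complete" or "incomplete"; missing lists any absent materials.
TEMPLATES = {
    "complete": Template(
        "Dear $name,\n\nThank you for your interest in our department's job opening "
        "in $field. We have received all required materials for your application. "
        "The search committee will be contacting candidates from whom we would like "
        "additional materials within $timeline.\n\nSincerely,\n$sender"
    ),
    "incomplete": Template(
        "Dear $name,\n\nThank you for your interest in our department's job opening "
        "in $field. In order for your application to be complete, we still require "
        "your $missing. The search committee will be contacting candidates from whom "
        "we would like additional materials within $timeline.\n\nSincerely,\n$sender"
    ),
}

FIELD = "20th-century American literature"  # example values only
TIMELINE = "the next six weeks"
SENDER = "Whoever's in charge of the emails"

with open("applicants.csv", newline="") as f:
    for row in csv.DictReader(f):
        body = TEMPLATES[row["status"]].substitute(
            name=row["name"], field=FIELD, timeline=TIMELINE,
            sender=SENDER, missing=row.get("missing", ""),
        )
        # Print for review rather than sending automatically.
        print("To: " + row["email"] + "\n" + body + "\n" + "-" * 40)

Even if nobody on your committee ever runs a line of code, the point stands: the acknowledgment costs seconds per candidate and means a great deal to the person on the other end.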

For those of you who perhaps ask your department's administrative assistant to handle emails to candidates: make sure that you communicate frequently with that person to be sure that the emails are being sent out in a timely fashion. In fact, ask to be cc'd on the emails, so that you know that they are going out, and going out to everyone. Your administrative assistant likely has no first-hand experience of the horrors of being on the academic job market. This means that they might not understand the importance of those emails, and therefore might not make them a priority. More importantly, your administrative assistant is likely overworked and underpaid. This is a good reason to do the work of sending out these emails yourself, but if you choose to ask your administrative assistant to do the work, it's still your job to make sure that the emails get sent. And if your administrative assistant helps you with this, make sure to bring them a bottle of wine or a gift card to thank them for doing the extra work--think of it as a gift from you and the candidates.

Be reasonable in your demands. This means a couple of different things. First, it means that you shouldn't ask for specialized materials at early stages of the process. The first cut in the search process is brutal: a committee wants to get a pool of 150-200 candidates down to the 30-40 that it will look at closely. A committee does not have time to read a job letter, c.v., teaching philosophy, 2 syllabi for job-specific courses, a research statement, a 25-page writing sample, and a teaching portfolio in the first round: thus, if you're on a search committee, you shouldn't ask for these things. I was once on a committee that asked for a lot of material upfront, with the reasoning that it would save the committee time after the first cut--they wouldn't have to ask for the materials. But in trying to save the committee the week or so it would take the candidates to send their additional materials (and usually it was a lot less than a week), the committee was creating more work for the candidates, both by asking them to have at the ready writing samples and recommendation letters (which always come in late) earlier in the process than is necessary, and by adding more materials that the candidates had to assemble (and thus worry about) in the early stages. In the search process, the committees have almost all the power, and the candidates almost none. For those of us who actually care about this power imbalance--which, in the humanities, probably should be all of us--here's a good rule of thumb: it's never okay to make less work for yourself by making more work for the candidates. It's just asshole-ish behavior. I don't think most search committees who ask for lots of materials necessarily think about the fact that they're creating more work for the candidates, but that's kind of the problem. If you're on a search committee, it's your responsibility to think about how your requests are going to affect the candidates. If nothing else, those of us on search committees should always ask ourselves: (1) why do I want this? (2) am I actually going to read it? and (3) how much work does this create for the candidate? Given the answers to (1) and (2), is that work really necessary?

Second, being reasonable also means looking at the materials you do request with a certain sympathy for the candidate's situation. By this I don't mean that, because searching for jobs sucks, we should keep every candidate alive out of pity--that obviously won't work. But it does mean that we should be careful about the kind of criteria we use to bounce candidates out of the pool. I have known people to throw out applications for things like incorrectly addressed job letters (i.e., job letters accidentally addressed to a different school) and other typos. I've also known people to throw out applications for not sufficiently "responding" to the language of the job ad. Here's why I disagree with this kind of practice: in a good year, a job candidate is sending out 20-30 applications (my biggest year was 42), all due between October 15th and November 1st. In the process of customizing each letter for each job, mistakes are bound to happen. If a person otherwise looks like a viable candidate, I'm just not too worried if they accidentally addressed the letter to a school that precedes mine in the alphabet. It's not a sign that, as one person told me, the candidate "is not serious about the job"; it's just a sign that they're applying to a lot of jobs. I'm generally going to start from the assumption that everyone who submits themselves to the tortures of the job market is serious about every (any) job. As for the demand that candidates respond specifically to the job ad, it runs afoul of the rule of thumb above--it asks the candidate to do a lot of extra work (given the number of letters they have to send out in any given one-month period) to make the job of the search committee easier. This demand also creates pressure for candidates to misrepresent themselves in order to show that they "fit." Recently, Rebecca Schuman devised an excellent plan for taking some of the time out of the search process for candidates by moving to a "common dossier clearinghouse" in which candidates could simply upload their materials, and departments could access the materials of those who fit their criteria. This would do away with job letters that respond to specific job ads; it would also make the whole process a lot more straightforward for everyone involved.

Drop the "best person for the job" bullshit. Academics in the humanities are experts at exposing faulty assumptions, subjecting "common sense" to critique, and demystifying ideologies. (This is why we're so much fun at parties.) And yet, somehow, that critical mind flies out the window once we're on search committees, and suddenly we accept as truth the vague, ideologically-freighted assumption that we're looking for "the best person for the job." (For the record, I'm not innocent of this: I've used this phrase more than once, though I'm learning to hate myself a little bit every time it escapes my lips.) We would not accept a first-year paper that argued that people should watch hockey simply because it's "the best sport," without a carefully articulated explanation of what "best" means in this context. Yet somehow we don't hold ourselves to the same standard. The fact is that, if we actually knew what we meant when we said we wanted the "best" candidate, we wouldn't say that we wanted the "best" candidate: we'd say we wanted the candidate who fulfilled a specific set of criteria. Of course, I doubt any search committee could actually agree on those criteria, since we all tend to have very different ideas about what constitutes good scholarship and good pedagogy, so instead we resort to the language of "the best candidate" to hide that fact that we really mean a candidate who makes the most number of people on the committee/in the department happy while pissing the least number of people off. To be less cynical about it, there's an inevitably subjective component of the job search that is a product of the fact that a search committee is made up of people with specific perspectives, biases, preferences, and aversions. Even the best-intentioned of us can't remove all personal bias from a search, no matter how mindful we try to be of them. I've been on two committees in subsequent years searching in the same field for two different hires. We had a lot of repeat applications, and in some cases candidates who were cut in the first round in the first search made it significantly further in the second. The work didn't magically get better from one year to the other; the composition of the committee changed. So the language of "the best person for the job" really doesn't make a lot of sense.

We can argue that this is "just semantics," but I think this attitude actually has a negative impact on the search process and on the psyches of job seekers, because it allows those of us on search committees to justify our cuts by saying that candidates aren't "good enough": I find it hard to believe that this attitude doesn't affect the way we treat eliminated candidates in a variety of intangible ways. So I think we can do a service to job seekers by adjusting our perspective on the situation. In any given job pool, probably 1/4 - 1/3 of the candidates would be perfectly suitable for the job, and maybe half of those would be great at it. We might be able to make fine distinctions between them--it's in fact the work of the search committee to do so--but that doesn't change the fact that a lot of candidates have done the work and have the skill to do a great job in the position. The sad part of being on a search committee (and in part why, I think, we desperately try to convince ourselves that the search is a straightforward meritocracy) is that we have only one job to give when there are so many deserving candidates. This is what we should focus on when we are searching. We're not trying to find the "best" candidate: we're trying to make sure that the one job we have to give goes to one of the many people who will do a great job at it. When we reject people, it's not because they aren't the best; it's because the realities of the situation require us to, and so for a variety of reasons that have less to do with outright merit and more to do with the particularities of our situation, we've decided that we prefer one very qualified candidate over another.


Being on a search committee is hard--not anywhere near as hard as applying for jobs, of course, but hard because otherwise well-meaning people are put in a position where we have to say "no" to people who don't deserve to be rejected, and because we have to put people through a process that (if we got our jobs in the last decade and a half) we know to be dehumanizing and cruel. And because we're generally pretty smart people, we're very good at coming up with rationalizations for the process to protect us from the fact that we're participating in this cruelty, that we're inflicting suffering on others. Within the context of the search committee itself, there's only so much we can do about the structural evil. But the one thing we can do in that context is to be critical of our own rationalizations, assumptions, and process, in order to treat our candidates as humanely as the situation permits. We need to recognize that the people whose materials we look at are not initiates beneath us, only the best of whom will ascend to our ranks: they are colleagues, many of whom are as skilled (some more so) in their fields as we are in ours, distinct from us only by virtue of one search committee's decision. Ultimately, then, when we look at candidates we need to have our own search-committee version of the golden rule in mind: do unto candidates as you wish search committees had done unto you.

Saturday, September 14, 2013

Surviving the Academic Job Market

I've been lucky enough to have been off the job market for the last several years, but the debut of each year's MLA job list somehow still produces a thrill of anxiety, like the sound of the alarm clock that woke me up for school as a child. I spent 5 years "on the market," and I've watched many of the people I care about go multiple rounds with the academic beast. As a result, when the list comes out I feel bad about being part of a profession that makes its promising new members suffer through a process whose mechanisms are often mysterious and whose rewards often seem to be doled out at random. The systemic problems with the academic job market are myriad and well worth our concern, but the larger questions of how to "fix" the market don't offer much to the individuals who are trying to find academic jobs at present. So, for what it's worth, I've assembled here a few pieces of advice, based on my experience searching for jobs, talking to people searching for jobs, and serving on search committees, for surviving the process as it currently exists.

Remember that it's not personal. Not getting a request for more materials, or not getting an interview, always feels like a gut-punch, but the fact is that, a lot of the time, the reasons search committees don't pursue specific applications have little to do with the worth of the candidate. I have seen job applications thrown out for a lot of reasons, and only rarely has the reason been that the committee members felt the work was sub-par. Sometimes, people are thrown out because they are ABD and there's already a large pool of candidates with a degree in hand. Sometimes, applications get tossed simply because the work of the candidate is too close to the work of someone already working in the department. I've also seen candidates passed over because the work was too different from what was being done in the department (as in, "This person is too theoretical for this department"). On the more sinister side, I've seen applications thrown out because of ideological commitments and petty internal politics. As a graduate student serving on a search committee, I saw (what I thought was) a promising application scrapped with only the comment, "Ugh, a real formalist!" In that same search, I saw Committee Member A argue (successfully) against the candidate preferred by Committee Member B simply because Committee Member B had successfully argued against the candidate preferred by Committee Member A. This isn't to say that all search committees are insane; but even healthy search committees generally find themselves with more qualified, interesting candidates than they have opportunities to interview them, and so, even in well-meaning, self-aware search committees, decisions get made not so much on the individual worth of the candidate as for more nebulous reasons of "fit." Everyone who goes on the job market gets more rejections than interviews, and so, while rejections are never easy to take (particularly when that school you'd give your right index finger to work at doesn't even ask to see your writing sample), it's important to remember that rejections have a lot more to do with the state of the job market than they do with your worth as an academic. Given that fact, it's important that you...

Focus on what you can control; try to ignore what you can't. This bit of advice applies to all areas of life, but I think it's particularly tricky on the job market, where the line between what can be controlled and what can't sometimes seems a little blurry. It's easy to get drawn into trying to anticipate what a search committee is looking for, whether at the application stage or the interview stage. I've seen people work themselves into fits trying to personalize job letters for specific schools, trying to word everything just right to please this member of the department or to avoid pissing off that other member. But the fact is that, even if you know who is in the department, you don't know (at least at the application stage) who is on the committee. You also usually don't know the internal politics of the situation, which, as I mentioned above, can have an unfortunate impact on the choices that get made. You cannot read minds, and you cannot be all things to all people. What you can do is put together application materials that present the main ideas of your project clearly, that explain why these ideas are interesting, and that demonstrate your commitment to your classroom. In other words, to adapt a phrase from Jersey Shore, you do you, and search committees do them. In the job market process, the only thing you have control over is your work and your presentation of that work, so focus on doing that well, and, as much as possible, try to avoid imagining what the search committees might be thinking. That way madness lies.

Have a plan B. Everything I've just said points to one of the biggest problems with the academic job market for those who are suffering through it: candidates have very little control over their fates. To a large extent, I think maintaining some measure of sanity on the academic job market requires finding ways to get a bit more control over your own future. And the more I think about it, the more I believe one of the best ways to do this is to actively search out other possible career paths that would make you happy. On a campus visit my second year on the market (for a job I did not get), I was exchanging job market stories with a junior faculty member, and he told me about how close he'd come to simply leaving the academic game altogether. After a few years of searching without a tenure-track offer, he'd made up his mind: if he didn't get a job on the next cycle, he was going to become a substance abuse counselor. Of course, in the next cycle, he got his job, but what struck me about his story was how much more relaxed he said he'd felt that last year on the market. It wasn't just that he'd decided, "It's this year or nothing"--that, I think, would have been more anxiety-producing than anything else, since it would have put all the eggs in one basket and then set that basket on top of explosives wired to a timer. Instead, he'd come up with a concrete, alternative career path, and started looking into what it would take to follow that path, even while he was continuing his search for academic jobs. One of the most disempowering aspects of the academic job search is the extent to which we feel trapped by it. Our fates are held hostage to a year-long search process during which we have no control over what the "gatekeepers" of the profession will do, and for a lot of people on the market, it feels like there's no choice--or at least no good choice--but to subject ourselves to this mysterious machine. But the fact is that we are all talented individuals capable of doing many different kinds of jobs with our academic training: we shouldn't have to be held hostage to the dehumanizing process of the academic job search. Thinking about what else you can do--and what else you'd want to do--is a good way to take back some measure of control because it allows you to decide when enough is enough. Looking into alternative career paths does not mean giving up your dreams of being a college professor: it's simply a way (1) to remind yourself that you have worth regardless of whether the academic job market recognizes it, and (2) to give yourself options in a situation where choice is power. The ability to choose to go another round on the job market, rather than feeling forced to do so, makes a big difference for your mental and emotional well-being. It also, I think, makes a big difference to your self-presentation, which can sometimes improve your chances on the academic job market.


Tuesday, September 10, 2013

The World's End

This post has absolutely nothing to do with academic issues; but it's my blog, and I'll digress if I want to...

My spouse and I recently went to see Edgar Wright and Simon Pegg's latest film, The World's End. Having been fans of Shaun of the Dead and Hot Fuzz (the swan kills me even in memory), we were looking forward to another successful parody. What I liked best about the first two was that they managed to be send-ups of a genre (the horror film and the cop movie, respectively) while also being rather successful examples of that genre. This has always been the hallmark of good parody to me. The World's End wasn't quite the same kind of thing, however. It might be called a parody of a sci-fi film, but, particularly given the similarities between The World's End and Shaun of the Dead, that seems like a stretch. The film also wasn't as laugh-out-loud funny as the previous two. Somehow, however, I walked away from The World's End liking it best of the three.

My musings on the film might have ended there, were it not for the fact that, the day after I saw the movie, Simon Pegg tweeted this essay by A. D. Jameson that provides a fairly involved and interesting interpretation of The World's End. The link he makes between the film and the medieval quest narrative in many ways solved the generic confusion I felt after the movie, and his analysis of the references within The World's End to the previous films makes me think that, if the movie is a parody of anything, it's a parody of the other two films. But what most struck me in Jameson's reading was his analysis of the main character:

King begins the film a tragic character, his many flaws all apparent. Only he recalls the past as glorious. Everyone else is glad to have left it behind, and now thinks him mad—a loser unable to function in the world of 2013. King’s biggest mistake, his error, is that he never moved on, never shaped up, never got with the program—he never grew up. As such, he’s treated like a child—as he later cries, complaining about the rehab center, “They told me when to go to bed!”

The message would appear simple: This is going to be a film about learning to mature. “You can’t live in the past, Gary King!”

But what if it turns out that one can? What happens if we take Gary King seriously?

Jameson proceeds to do just this--to take Gary King seriously--and in doing so he gets at what I now realize was for me the central attraction of the film: for once, the fuck-up doesn't have to either (1) "grow up" or (2) die.

I have always liked main characters who are a little bit (sometimes a lot) bad--who are subversive in some way, unwilling to "behave" according to rules determined more by social forces than by common sense or necessity. (This is probably why I've spent a good part of my academic career on Milton's Satan.) But the consistent trajectory of such characters in our culture's narratives leads them either to be destroyed by their own destructiveness (think Cool Hand Luke, or Milton's Satan), or to be domesticated, tamed, and reformed into someone capable of maneuvering within a pre-existing framework (think Rebel Without a Cause, SLC Punk, or, as Jameson points out, Shaun of the Dead). Both options convey the same message: deviance will not be tolerated. But in some ways the second--the reform narrative--is worse because it tends to relegate rebellion to the realm of youthful exuberance, a phase that we all inevitably have to grow out of to take our place in a world made by others--a place within what The World's End calls "the Network." The hero can live, but rebellion must die.

There's a part of me that prefers rebels without causes. It's not that I have anything against causes: quite the contrary. But to an extent the channeling of rebellious energy into a cause is just a remaking of rebellion into a palatable form. We can accept the desire to fuck shit up only if there's ultimately a goal, the promise of a new order that will arise out of the ashes of the destruction. This requirement that we place on rebellion to an extent echoes the tired refrain of Occupy critics. All this youthful energy is fine and well, but where's the message?, talking heads whined: what are their goals? Leaving aside the fact that the Occupy movement did in fact have a message that the media mostly ignored, the specious demand that Occupy articulate a specific, unified message was just a way to invalidate the anger. Being pissed off that the world sucks, apparently, isn't okay: you're only allowed to be angry about it if you have a plan to fix it. We seem incapable as a society of imagining the possibility that the appropriate response to a problem might simply be to burn everything to the ground, and it seems to me that this refusal to consider such a thing even possible grows more out of fear than reason.

Which brings me to the other, closely related feature of The World's End that I liked so much. The apocalypse in the film isn't presented, as it is almost universally in films, as the worst thing ever. Things are harder now, Andy Knightley tells us at the end, but it's really not so bad. Most of us live in fear of order collapsing: both, I think, for the very real reason that the sudden demise of the structures that govern our world would likely cause immediate physical peril--how do we feed ourselves if the world goes dark in winter?--and for the reason that the structures by which we make sense of our lives would cease to exist. I am a college professor, and have always wanted to be: what happens to me when "college professor" isn't a thing any more? There's something brave, therefore, in the willingness to look at the prospect of total chaos, shrug, and say, "whatever--it's not so bad." In The World's End, the world's end is a comedy, not a tragedy. And the fuck-up hero gets to keep on being a fuck-up, only now in a world where that just doesn't matter. The rules end when the Network collapses, and suddenly what looked like a weakness in Gary--his utter refusal to conform to the norms of the "grown-up" world and his childish desire to lead a band of pseudo-knights--is a strength. He's not afraid to let the world burn for the right to keep being a fuck-up, and somehow, after it burns, being a fuck-up just isn't the problem it used to be. And the world keeps turning.

None of this is to say that I think it's a good idea to live like Gary King (or Cool Hand Luke, or Milton's Satan), or that this is what the film advocates. It's a film: I don't think it's advocating anything. But I think part of the charm of the movie for me is the way it stands our fear of rule-breaking on its head. We think that, if rebellion or aimless antagonism towards social boundaries goes unchecked, the world will descend into chaos. The World's End entertains exactly that notion, but it shows us an end-result where the chaos isn't the horror show we all think it will be. It asks us to look at the fear of losing order--of losing control--that is so pervasive in our culture (as evidenced by the recent popularity of apocalyptic and post-apocalyptic narratives) and laugh at it.

Wednesday, July 17, 2013

Random Thought - Faculty Exchange

Anyone who has spent any time in an academic department knows that departments have their own culture, their own character. A group of 10-30 people who all work together, who make decisions together, forms a kind of community that develops a distinctive way of thinking about problems and solving them. This leads to all kinds of good things, but the downside is that departments can develop a kind of myopia, particularly among the members who have been there the longest: if you've been part of a department for 10-15 years, you will be acculturated to that department, and I imagine it gets pretty easy to forget that other places may think about things differently--or, at the very least, it becomes easy to forget how other places think differently.

Departments recognize this fact: it's the reason (or one of the reasons) why we generally don't hire our own graduate students to tenure-track positions within the department. The practice of hiring one's own is called "incest" for a reason: a "healthy" department needs "genetic diversity"--which is just to say that it needs to bring new ideas and new perspectives into the community in order to avoid becoming so insular and self-affirming that the department simply loses touch with what the rest of the world is thinking and doing. In frowning upon hiring those we've trained ourselves, we recognize implicitly that people from outside our own department--our own small community--bring in, not just fresh critical perspectives (the hiring process too rarely looks for a candidate doing really different work), but different paradigms and assumptions about everything from decision-making processes to curricular issues that will enliven and improve an academic community by diversifying it intellectually.

But hiring people trained elsewhere isn't enough to overcome the myopia that comes with the other, positive aspects of departmental community. It's not enough, first of all, because hiring has slowed so much. I've interviewed in departments that were conducting their first searches in 10 years. Administrative stinginess with tenure-track lines (at least in my field), combined with a disinclination among the most senior of the senior faculty to retire, means that departments are bringing people in from the outside far less frequently than is salutary. Relying on hires to bring in new perspectives is also not enough because it doesn't do enough to address individual acculturation; that is, while new hires may bring new perspectives to a department as a whole, they don't necessarily have much leverage when it comes to convincing the tenured ranks of the department to think in different ways. New faculty have the least power (particularly if we include the non-tenure-track faculty, among whom we are likely to find the highest turnover and therefore the most "new" people). Speaking for myself, I have more than once been new in a department and felt frustration at what seemed to me a limited way of thinking about a problem the department was facing, but also felt incapable of persuading my senior colleagues to change the terms of the debate (despite the fact that these colleagues have generally been lovely people whom I feel lucky to work with). Institutional memory is a powerful force. It is extremely valuable insofar as it helps us remember why our department is where it is and allows us to learn from past successes and failures; but institutional memory can also be limiting: it can prevent people from seeing that an approach that didn't work before may work now, simply because the times have changed.

I am not criticizing senior faculty for this: I think it's an inevitable result of being part of a single community for an extended period of time, and particularly of having worked to help build that community into what it is. In other words, this limited perspective is an unavoidable side effect of having done a lot of good stuff. So the solution to this problem is not, I think, for professors to seek out new jobs every 5-10 years. A functional department needs stability, needs the kind of community that has developed ways of handling problems over time, and high turnover would work against community-building. But what about faculty exchange programs? What if universities worked together to "swap" a certain number of faculty every semester? What if it were written into tenure contracts that every, say, 10 semesters or so, a tenured faculty member must spend one semester at a different institution, while the faculty member's department welcomes a tenured professor from another institution to serve in that person's place for the semester?

An exchange like this would mean that departments would have a steady but not overwhelming stream of "senior visiting" professors running through them--not junior colleagues who need to be trained, but seasoned professors who can bring a different set of assumptions and experiences into the department. It would also mean that tenured faculty members would get to immerse themselves in another department's culture regularly, but not so frequently that it would be unproductively disruptive to their work or lives. Such an exchange could also allow departments to seek out partner departments that offer something very different from themselves: for instance, a department with high research productivity that thinks it needs to work on its teaching profile might seek out an exchange with a more teaching-oriented department, and vice versa. A department unsure how to approach a curricular revision might seek out exchanges with departments that have recently undergone radical revisions in new directions, both to see what those revisions look like on the ground and to get input from members of the partner department in the home department's discussions.

I have no real idea what the logistics of such an exchange would look like--I imagine it would be complicated to coordinate across an entire university, let alone multiple universities--but it seems to me that the benefits of such a system would be immense. A limited amount of faculty exchange already happens at well-funded, high-prestige universities, where departments have "visiting scholar" lines through which they can bring in other high-prestige faculty members. But what I'm imagining here is not just a study-abroad program for the academic elite: I think it would be beneficial to think about faculty exchange for all faculty and all departments, for the purposes of curricular and community enrichment rather than just prestige-building. Not only would individual departments decrease the myopia that comes with living in a specific, local community, but such a program would also expand a sense of community within the disciplines beyond institutional boundaries, and potentially foster greater unity within our disciplines at a time when the value of a college education is increasingly under attack.

Sunday, June 2, 2013

How Much Bibliography Is Enough?

Recently, I was reading a chapter on a specific text by a very famous scholar in my field, and, because I am thinking of writing on the same text, I went to check on some of the endnotes to see if the sources might be worth reading. What I noticed when I flipped to the endnotes in that chapter is that (1) there were in fact amazingly few endnotes for a 40-page chapter, and (2) the scholar cited only a single critical work that was published after 1970 (and nothing published in the previous decade). My initial reaction was confusion--not over how someone of this scholar's caliber could get away with so little engagement with recent scholarship (I know perfectly well how a famous scholar can get away with this kind of thing), but over whether I should think that this scholar was derelict in his scholarly responsibilities, or whether I should think that this was completely awesome. The more I've reflected on it, the more I'm coming down on the side of "completely awesome."

As a junior scholar, I have routinely had the experience of sending work out for review and being told, whether the review was positive or not, that I need to engage with this scholar or that kind of criticism before the work can be published. The best of these reviews are specific ("the author should look into the work of X, Y, and Z") and the worst are dismissive and unhelpful ("this author has a light grasp of the large body of criticism on X," without providing any specifics on where to look to find this supposedly "large body of criticism" that I had in fact already looked for and been unable to find before submitting the work). But in all cases, the demand is the same: I am supposed to show wide-ranging engagement with the relevant bodies of criticism throughout my work. Speaking anecdotally, this has also been the experience of many of my junior colleagues. I understand the utility of the demand for wide citation. Certainly, as scholars, it's important to do due diligence in ensuring that we're not simply making arguments that have already been made, wasting paper and other people's time by creating an echo chamber; and the best way to ensure this is to make sure that we're well acquainted with what others have had to say on the subject about which we're writing.

But there's a difference between doing due diligence and showing it. Famous Scholar X can get away with not showing all the research that may have been done before writing because there's an assumption that she knows her stuff already and doesn't need to prove herself anymore. The rest of us, it seems to me, seem to operate under a cloud of suspicion, wherein if we don't show all the research we've done, there's an assumption that we simply didn't do it. This would make sense, I suppose, if the reviewers of our work weren't experts in the field about which we were writing--they'd need to see the research to be sure that we weren't just recycling old ideas. But that's not how journals and book publishers operate: the people reading our work are people who also work in the field--are also published in the field--which means that they've done the research for themselves. Which should also mean that they know, looking at an essay in their field, whether an author has also done his research, without needing a bibliography the length of my arm. 

Beyond due diligence, the other (good) reason for citing is to show how our work speaks to the work of others--that is, to help foster dialogue, rather than each of us writing as if we're the only ones in the conversation. This is a very important aspect of scholarly work, which, as I've said before, tends to feel too insular. But if this is a good reason to cite, it's also a good reason not to cite too much. In one essay I sent off for review, I spent a large part of the opening situating my ideas in relation to what other scholars had said; I did this largely because I was aware that other work I had sent out had received the criticism that it was not citing enough. But the reviewers came back saying that my attempts to engage with all of these scholars were dragging my argument astray--and the reviewers were right. I was engaging with handfuls of critics to show that I'd done my due diligence, and the result was something disorganized. What I needed to do with the essay was to compose my argument and engage with other works of criticism when they were relevant to what I had to say, rather than scrambling to construct my argument around what everyone else had to say. This effectively means, however, that I needed to cite less.

Ultimately, the demands of scholarly publishing create a double bind. On the one hand, my reviewers (rightly) want to see a direct, clear, comprehensible argument; on the other, reviewers (perhaps, as I'm suggesting, not so rightly) want me to engage with a very broad array of other critics, a practice that can get in the way of writing direct, clear arguments. Sacrificing the quality of the essay as an argument to critical engagements that don't clearly advance the argument seems to me like a poor trade-off for our profession, particularly since, as I've been suggesting, there isn't clearly a good reason for this citational demand. There are, of course, bad reasons for it. Scholarly egos and the critical biases we all develop might account for some requests for wider critical engagement (I once witnessed someone on a search committee argue against a candidate on the grounds that the candidate's published writing sample didn't cite an article that this committee member was particularly fond of). Tenure requirements, which often ask us to document the "impact" of our scholarship--which often means how frequently we've been cited--also potentially drive an unhealthy attachment to footnotes. But these kinds of considerations are not good for us as critics, and are not good for literary study on the whole, insofar as they make it harder to construct lucid, clear essays. Furthermore, this demand introduces a level of conformity--everyone writing on the same topic must bend their arguments so that they engage with the same set of authorities at all times--that can lead (some might argue has led) to stagnation in our fields. It seems to me that there's nothing to be gained by lists of endnotes that extend page counts by 15-20%.

The book chapter that precipitated this reflection is really, really good: it could perhaps have been trimmed down a bit, but on the whole it makes a powerful, persuasive argument full of insights and amazing readings of the text under discussion. It is not, I believe, an essay that would be improved by adding a bunch of citations, or by more direct engagement with other critics. This isn't to say that the chapter never engages with what others have said: it simply draws on earlier work when it is directly relevant to the argument at hand. Ultimately, I'm jealous of the freedom this scholar's fame has bought him--the freedom to write his argument as he sees best, without worrying about "proving" his scholarly chops in the bibliography. This freedom is not, I'd argue, something that should be reserved for the very famous (or the very connected), and we should perhaps recognize that, if great scholars can do great work without lengthy lists of notes, perhaps the rest of us can, too.

Friday, May 3, 2013

Collaboration Among the Cats

My spouse recently sent me a link to a very informative and thought-provoking piece on alt-ac careers at Scholars' Lab. The essay, "Humanities Unbound: Careers & Scholarship Beyond the Tenure Track," provides some interesting statistical information about those who have pursued careers outside of the professoriate, suggesting how we in the humanities might do a better job both of preparing graduate students for the world and of removing the stigma from such career paths. It's worth a read for many reasons, but one passage in particular caught my attention:

"It’s not surprising that employers find that alt-ac employees need training in skills like project management and collaboration. Employees themselves also recognize that these are by and large not skills that they acquire in graduate school. Even among those who felt that their skills in these areas were strong, they noted that they gained them outside of their graduate program—for instance, through jobs or internships."

It's significant, I think, that the author of the essay, Katina Rogers, can assert that the lack of management and collaboration skills among humanities scholars is "not surprising" without further comment: it suggests that all of us in the humanities know that this is a hole in our training. Given the way scholarship in the humanities is carried out, the fact that the hole is there is also not surprising. At least in literature, scholarship often feels like a very solitary endeavor. We conduct our research alone in libraries or behind closed doors in our offices. When we write, it's us, the computer screen, and however many cups of coffee (or glasses of bourbon) it takes to get through. While we share our work publicly at conferences, and appreciate conversations with colleagues engaged in similar work, these moments of contact rarely lead us to work on anything together. I remember being somewhat startled in my first academic job, which was at a small school where I regularly spoke with folks in psychology, sociology, biology, and other data- and experiment-driven fields, to discover how often articles and books in those disciplines are co-authored. I don't think it had ever occurred to me before then that scholarly work could be accomplished that way.

It seems to me that this way of carrying out our intellectual work invades our habits of working more generally, and to our detriment. We see this largely in committee work, in the dysfunction of so many committees and the reluctance to participate on them. In my own experience, in two tenure-track jobs at two different universities, committee work tends to be extremely unproductive: we talk a lot, and we're good at debating things like procedure or larger "philosophical" issues behind proposed changes or actions, but I've sat on committees that in some cases have accomplished literally nothing over the course of two or three years, simply because we were better at discussing ideas than at acting on them. Faculty meetings sometimes seem to go the same way. (I remember a colleague at my former job, after a particularly unproductive faculty meeting, saying to me, "Why does every decision we try to make devolve into a debate about the nature of democratic decision-making?") If I had a dollar for every time I'd heard--or used--the phrase "herding cats" to describe attempts at academic decision-making, I'd be able to retire.

The fact that we're so willing to describe ourselves as cats says something, I think, about how we think of ourselves. First and foremost, we tend to see ourselves as independent. This is largely a good thing--our job is to challenge assumptions, to think new ideas, to chart new paths in our fields. But the downside is that a lot of us (myself included) also tend to resent or avoid situations that require us to depend on others or to make compromises. This makes us avoid collaboration. It's a peculiar facet of the humanities that so many of us are so bad at working with--or simply reluctant to work with--other humans. And it shouldn't be this way.

Recent adjunct unionization efforts across the country suggest one way in which academic collaboration is crucial to our professional survival. Quite simply, there's strength in numbers, and what we can learn from these adjunct struggles is what labor organizing has been demonstrating for over a century: we can all get better working conditions if we're willing to come together and pull in the same direction. The attacks on academic standards and academic freedom coming from both legislatures and university administrations--the greatest threat to the survival of higher education in this country--can best be fended off by working together.

From a pedagogical perspective, however, there is also a strong case to be made for greater willingness to collaborate in the humanities. As Rogers' piece points out, the humanities do not teach collaboration skills, but these are some of the skills most desired by employers. One way to serve our students better, then, is to build more collaboration among students into our curricula, at both the graduate and undergraduate levels. But the lesson for students perhaps goes beyond the workplace. For a long time, I resisted all group projects in my courses because I hated such projects when I was a student. Already a "cat," whether by nature drawn to a discipline of cats or trained into cat-like behavior by the discipline I chose, I resented not having total creative control over my work and having to pick up others' slack. What I realize now, however, is that group projects were an opportunity both to learn from the ideas of others and to learn how to work with others--to motivate them but also to back down, to change my mind, to be persuaded by others. However much I hated them at the time, collaborative assignments tried to teach what is in fact true about the world: that no idea I have amounts to much if I can't engage others with it, and that no engagement with the world can--or should--leave me untouched. I am not an independent being: my existence depends on others, and, if I'm honest about it, I've never really had the control my seemingly solitary work gives me the illusion of having. My own work has always been about and been informed by the work and ideas of others; why should I be reluctant, then, to engage others more directly, to produce work with others rather than just about others?

As a scholar in the humanities, I know well the oft-quoted words of John Donne: "No man is an island, entire of itself; every man is a piece of the continent, a part of the main." It's only recently that the real import of these lines for my own work has started to sink in. If Donne was right when he reflected, "any man's death diminishes me, because I am involved in mankind," then it seems to me that the converse must also be true: that what other human beings add to this world also augments me, makes me more, and what I add to this world has the same effect on others. But how much more can we add, by how much more can we be augmented, if we embrace our connectedness, if we work with it rather than pretend it doesn't exist? And how much more do we teach our students, for their professional lives but also for their ability to engage with others meaningfully in the world, when we model that collaboration for them and encourage them to develop those skills themselves?

In practical terms, I don't have a good sense of what professional collaboration would mean for the work I do. But when I look at sites like Scholars' Lab, MLA Commons, and others, I realize that a lot of work is already being done to enable greater collaboration in the humanities, and so I don't feel like I need to reinvent the wheel (which is, incidentally, one of the great benefits of collaborative endeavors). In terms of classroom design, I'm still doing a lot of thinking about how best to engineer group projects, but I think I've had moderate success with a few things. One change I've implemented in all of my upper-level classes, both undergrad and grad, is some form of presentation/workshopping of final projects. The key thing here is that this takes place before the final paper is due. Students present on their research and explain the main arguments of their project, but also discuss the difficulties they've encountered and the questions that still remain for them. Their classmates and I then provide feedback, suggesting passages the presenter might look at, ways of handling the difficulties, and sources that might be useful. The idea is both to have the class learn from the individual work done by each student and to have each student get help on their work from the rest of the class. Every student who has ever commented on this exercise, whether to me or on course evaluations, has described the experience as useful. I've also experimented with the various collaborative tools offered by our course management software. I've used the wiki tool to have students create annotated bibliographies as a class, and to create a reference work on a variety of authors of their choice. I've used blogs and discussion boards to have students comment on each other's reflections. Currently I'm looking into websites that allow me to upload a document that everyone in the class, or in a specific group, can annotate while seeing one another's annotations.

But the best outcome I've ever had from group work came from a semester-long project that broke students into groups that were responsible for a specific section of the course material. In individual projects, they were asked to research beyond the syllabus; as a group, I asked them to develop a study guide for the final exam and lead a review session of that part of the syllabus in the final week of class. Students did not necessarily like the project, but they clearly learned from it; I saw a nearly 10-point average increase on the final exams compared to previous years. And this goes back to the way in which our connectedness is a general asset. When students aren't just responsible for their own learning, but for others' as well--and when they will be held responsible in front of those others--they learn more. Collaborative work doesn't just teach students how to work with each other; it also potentially teaches them why they should work together, by showing them that their individual outcomes improve when they work as a group.

Saturday, April 27, 2013

The Email Problem

A colleague of mine recently had the frustrating experience of having spent time commenting on a draft of a student paper, only to have the student later confess that they'd never looked at the colleague's comments. For those of us who teach writing-intensive courses, the experience is of course a familiar one, but in this case it was particularly frustrating because the draft was unassigned, and had been sent to the colleague via an email soliciting comments. My colleague had gone out of their way to do some extra work in order to comply with a student request--but it was a request the student had apparently not made in earnest.

The situation reminded me of what was at the time for me a "nightmare" exchange with a student over email. Back when I was still very new to the classroom, I encouraged students to send me drafts outside of those assigned for the course, with the offer to email them comments on their work. This seemed like a good idea at the time and worked well enough for a couple of semesters; but then I encountered a student I will call "Jimmy." Jimmy was highly engaged and seemed inclined to work hard at the class, and so I was more than happy to encourage his efforts by commenting on any work he was willing to send me. One fateful Saturday, however, Jimmy went a little overboard--and I jumped right after him. Over the course of about 5 hours, Jimmy sent me something like 4 drafts of the same paper. With each draft, I sent a response--usually within about half an hour of receiving it. As you might expect of work "revised" so rapidly and in such a compressed period of time, each draft was worse than the last. This would not have been so bad if the exchange between us had not also grown increasingly heated--when I tried to tell him that the quality of the drafts wasn't improving, Jimmy grew belligerent, blaming me for being unduly harsh in my criticism. This got my dander up, and in return I accused him of not recognizing that he still had things to learn. All this over email, on what might otherwise have been a lovely Saturday afternoon. It was ultimately the opposite of a productive exchange, and it was only by the intervention of my truly wonderful mentor that I was able to right the ship with that student and sail back into more pedagogically sound waters.

The experience with Jimmy gave me two insights (not unique to me, but new to me at the time) into the problem with email as a pedagogical tool. The first is a problem with "work" email generally: it makes us over-connected, which can foster potentially unhealthy expectations (as the interaction with Jimmy illustrates). For anyone who uses email in the workplace, email creates the well-known problem of blurring the distinction between work and non-work time, making it difficult ever to "clock out." In an educational setting, however, email additionally gives our students the illusion that they can have their questions answered at any time of day or night. Firing off an email at 1am is easy; why shouldn't the response be as easy to fire back? This in turn, I think, enables poor time-management decisions on our students' part. If students could ask questions only during scheduled face-to-face time, like class and office hours, there would be an incentive for them to get started on assignments early: there's a specific time-window during which they get to ask questions, so they know in advance that, if they want to ask questions, they have to be prepared. If they know they can reach us whenever they have a question, however, then the need to work ahead of time to make effective use of classes and office hours seems to disappear. Email makes it easier to procrastinate because it creates the (often false) sense that it's never too late to ask a question or to get clarification on the assignment.

The second insight the situation with Jimmy gave me is closely related. In the same way that email makes asking questions so "easy" that it encourages bad time-management habits, it also makes the draft-commenting process so "easy" from the student's perspective that the process can come to seem frivolous. In my experience, sometimes (though by no means always) students will email drafts not because they want comments to help them revise further, but because they're checking to see whether the draft is "good enough" yet--whether they can stop working on it. The work of revision is something they already have to do for the assignment, but it's the easiest thing in the world to put the draft in an email and send it off in the hope that the instructor will say, "not bad!", so they can be done with it (this was very much what Jimmy was looking for, and he was frustrated when he didn't get it). These are the situations in which students don't read our comments: frequently, I think, they wake up after a good night's sleep, realize on their own that the paper isn't "good enough," and get back to work before they've seen any feedback from us. Email just makes it too easy for them, in a moment of weariness, to send us their work because, at that moment, they either don't want to work on it anymore or don't know what else to do with it. The problem is that we have no way to know which students are earnestly seeking comments and which are sending the email as a way to stop working for the night. This creates extra work for us that is of no use to our students--it's lose-lose.

So I've adopted two policies to deal with the email problem. The first is that I have a "24-hour" clause in the email section of my syllabus, which tells students to expect up to 24 hours to pass before I respond to their emails (and up to 48 on weekends and holidays). This is both to create some breathing room for me--as a human being I have a right to an evening in which I do nothing work-related--and to cut down somewhat on last-minute emailing: if my students know that they can't rely on getting a response from me 12 hours before a paper is due, they know that they have to plan ahead a bit (whether they actually plan ahead or not is their choice).

The second policy I adopted, not long after the Jimmy situation, is that I no longer comment on drafts via email: I provide comments only in office hours. I tell my students that I am very happy to read extra drafts of their work, and that they should (1) make an appointment to see me in office hours, and (2) email me the draft they want to discuss at least 24 hours before their appointment. What this does is balance the equation of effort: if I'm going to put in the time to read over extra drafts and provide feedback, they must also put in the time to meet with me. As a result, I almost never see "please just tell me it's good enough" drafts coming my way anymore. Students who genuinely want the extra help are also willing to put in the time to meet with me, and the requirement that drafts be sent 24 hours in advance of a scheduled appointment removes the "I'm just sick of working on this, maybe I'll send it to my teacher" option that produces non-serious requests for comments. It's no longer so "easy" to get extra comments from me: students must exercise a little planning and forethought, which I hope also teaches them better time-management skills than my previous "email me anything anytime" system had done. It has the added benefit of getting a few more students into my office hours, where it's a lot easier to assess the difficulties they are having and to address them.

At the end of the day, balancing effort--making sure that effort on my part is matched by some level of effort on their part--is a key element, I think, of teaching effectively. If we're doing all the work, we're not teaching our students to try. If we show them that, when they do work for us, we'll do work for them, we're rewarding them for their efforts, which seems like the right lesson to teach. In the electronic age, this balancing act becomes much more difficult but also, I think, much more important.




*In case anyone objects to the use of what is considered a plural pronoun to refer to a singular antecedent, I'll just quote The Cambridge Grammar of the English Language (eds. Rodney Huddleston and Geoffrey K. Pullum [Cambridge, 2006]): "The use of they with a singular antecedent goes back to Middle English, and in spite of criticism since the earliest prescriptivist grammars it has continued to be very common in informal style. In recent years it has gained greater acceptance in other styles as the use of purportedly sex-neutral he has declined" (493). Having one pronoun serve multiple functions is not uncommon in language--see, for instance, the German pronoun sie--and I prefer to avoid both the clunkiness of "s/he" and the invention of new pronouns (like "hu") when we have a pronoun that has served this purpose for centuries.

Thursday, April 25, 2013

Bad Idea

I know a project is a bad idea when I come up with it while I have other things I'm supposed to be doing. In my younger life, these projects were entirely and obviously inconsequential. If I had an essay to write, it suddenly became necessary for me to organize my CDs. But at least then I knew I was wasting time. As I've grown older, my "procrastination projects" have developed a patina of professional justification, which makes the illusion of their significance much harder to see through. When I have an article or conference paper deadline, this always seems like the best time to start work on a book on the epic poem from Homer to Pound. If I have a mountain of grading to do, this is clearly the moment for me to collect and collate data on the job listings at the academic jobs wiki.

Or, when it's the end of the semester and I have final papers to grade, final grades to calculate and submit, and enough committee work to choke a bureaucrat, this is obviously the best time to start a blog.

This is a bad idea. My rational brain knows that I am spending the time playing with blog templates and color schemes and imagining topics for future posts simply because I'm tired of looking at essays and grade books and applications and proposals and everything else I've been staring at for the last several months. And my rational brain knows that projects begun under such circumstances tend to get dropped as quickly as they are taken up (see, e.g., my last attempt at blogging). But my rationalizing brain says that it will be summer shortly, and having an outlet for some structured-but-not-entirely-formal writing could prove useful; and, besides, says the rationalizing brain, I have lots to say about teaching, about research, and about academic life in general. My rational brain is either too exhausted or too apathetic to win this fight, and so shrugs, hands the keys to the rationalizing brain, and wanders off to find a glass of scotch.

So this is a bad idea I'm running with for now, against my better judgment and without a clear sense of direction beyond what the title of this blog, "Reflections on an Academic Life," suggests.