Googling Peer Review, Part Two

Talking with a friend about peer review, it occurred to me that the stuff which has been most influential in my intellectual life, the stuff that’s been most profound and useful, is profound and useful in ways that have nothing at all to do with peer review.

Was Foucault’s Discipline and Punish peer reviewed? It sure doesn’t read as if it was. History of Sexuality, v.1? I’m guessing no. Both books had a profound influence. Both books are deeply flawed in terms of evidence, etc., but both had powerful and provocative things to say. Is Judith Butler’s work provocative and useful because of double-blind peer review? Did blind peer review dramatically improve the core assumptions and frame of The Making of the English Working Class? Peer review is to those books as pea shooters are to aircraft carriers.

Was Geertz’s essay on cockfighting in Bali dramatically improved by peer review? I seriously doubt it–what’s valuable about that famous essay is the clarity of his prose and the nature of the insights. Maybe peer review pushed him to make it a little better, but the value comes from the method, the intellectual core, not some fine tuning on Balinese village customs forced by Geertz’s disciplinary rivals.

A few years ago the JAH published a round table on the anniversary of the Brown v. Board of Education decision. There were a couple of articles in there that showed me an entirely new way to think about the Brown decision and civil rights. Really good stuff which I had not considered before.[1. Mary L. Dudziak, “Brown as a Cold War Case,” and Daryl Michael Scott, “Postwar Pluralism, Brown v. Board of Education, and the Origins of Multicultural Education,” Journal of American History 91, no. 1 (June 2004).]

Here again, what was valuable about those articles had zip to do with peer review, and everything to do with the fact that the authors had found a new frame for a familiar topic. They had seen evidence lying in plain sight in a new way.

So when you think back on the books and articles that most influenced you, is your first thought “hell of a job on the peer review?”

Now the obvious objection is that peer review is supposed to be invisible, and present us, the general public, with a reliable, vetted, accurate product. I suppose one could argue that in the examples I cited, it worked as it was supposed to. But again it’s not the fact that they were peer reviewed that makes these pieces worthwhile: peer review is to their worth as the parsley garnish is to the blue plate special.

I’m tempted to advance “O’Malley’s law”: an inverse relationship between peer review and enduring intellectual value. Good work is generally good because it has something valuable to say, not because it has appeased other professors. Work that is good because of peer review is probably not very good.

Now of course most of us are not brilliant thinkers, and even brilliant thinkers get help. No doubt Geertz, Foucault and other postmodern worthies worked in a community, and benefited from exchange with their peers.  We all want that input on our work: we want to clarify our thinking and gain from the insights of people we respect. We want to make that easier, not harder: more fluid and less cumbersome.

The biggest objection I hear when I talk about this is from academics who worry about having to wade through a bunch of junk to find something valuable. I don’t think that will be a problem, and the scenario below explains why.

Imagine a website devoted to your field–I’ll use mine as an example, the Gilded Age US. The website is transparent to Google–that is, Google can “crawl” it and index its contents. I post an article, a piece of research, to the website. I “tag” it with subject words, and I make commenting available.

Interested persons have registered at the site. As registered members, they can make comments, post links, suggest revisions: they can do what we ask our friends to do. They can also “tag” the article in various ways–perhaps rating it as “must read” or “needs work” or “in progress” or “unpersuasive.” They would not be anonymous, which ideally would restrain nastiness, and they would be your key audience–motivated scholars who care about the subject. Your piece of research would accumulate a tag cloud of things scholars said about it, the things they liked, the things they found problematic. As the author, you could tag other people’s comments, and comment on them in turn–if somebody writes an unfair review, you could tag it as “hostile” or “antagonistic.” Scholars visiting the site could rate others’ comments–“useful” or “not useful.” So a nasty comment that added nothing would quickly be isolated.
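To make the mechanics concrete, here is a minimal sketch, in Python, of what such a data model might look like. The class names, sample tags, and scoring are hypothetical illustrations of the behavior described above, not an existing design.

```python
from collections import Counter
from dataclasses import dataclass, field

# A rough sketch of the site's data model as described in the post.
# Every name here (Article, Comment, the sample tags) is illustrative only.

@dataclass
class Comment:
    author: str                                       # registered members only, never anonymous
    text: str
    tags: list = field(default_factory=list)          # e.g. "hostile", "unpersuasive", "must read"
    votes: Counter = field(default_factory=Counter)   # readers' "useful" / "not useful" ratings

@dataclass
class Article:
    author: str
    title: str
    subject_tags: list                                # author-supplied subject words
    comments: list = field(default_factory=list)

    def tag_cloud(self):
        """Accumulate every tag readers and the author have attached to the piece."""
        cloud = Counter(self.subject_tags)
        for c in self.comments:
            cloud.update(c.tags)
        return cloud

# Example: one reader flags a weak section, the author sees a hostile review tagged as such.
article = Article("the author", "A piece of Gilded Age research", ["gilded age", "currency"])
article.comments.append(Comment("A. Reader", "The second section needs work.", tags=["needs work"]))
article.comments.append(Comment("B. Rival", "Worthless.", tags=["hostile"]))
print(article.tag_cloud())   # subject words plus everything readers added
```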

You could choose to revise and resubmit the piece. Or you could let your commentary stand as revision, and present the piece as a dialogue. Or you could decide to live with it as a finished piece, as we do upon publication now. Scholars and the general public searching for your research could find it by going directly to the site, or by a search engine like Google. It’s cheap, it’s effective, it requires no “too busy to get to the MS” blind peers, and systems like it are in place now, all over the web.

Academics always worry about the great unwashed flocking to such a site, but really, let’s face it, there are probably not 1000 people interested in your latest research, or mine. There are probably not 100: there may not be a dozen who are interested enough to read and comment. Too much information is not going to be the problem most of the time.

The specter of Amazon alarms scholars: people would post unsupported, half-assed crap and we’d have to wade through it, wasting time, whereas now we know that some other guy has already waded through it for us. But really, it’s not hard to figure out whether the evidence is lacking or an argument is muddled and unclear. How long does it take to read an article in your specialty and figure that out? And again, the site I’m imagining is extremely unlikely to have really large, Amazon-like numbers of people visiting it. And even if it did, so what? We want a larger audience, no? The mechanism I’ve described is self-policing–lazy and ill-conceived work is quickly called out; needlessly hostile and unproductive comments are flagged, and in time they would simply be filtered out, buried beneath more useful posts and comments.
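Hedged strictly as an illustration, here is what that self-policing filter might look like in code. The threshold and the scoring rule are arbitrary placeholders of mine; the point is only that the mechanism is mechanical, not editorial.

```python
# A sketch of the self-policing idea: readers' "useful" / "not useful" votes
# decide what rises to the top and what drops out of view.

def rank_comments(comments, hide_below=-2):
    """Sort comments by net usefulness; drop any voted well below zero."""
    def score(c):
        return c["useful"] - c["not_useful"]
    visible = [c for c in comments if score(c) >= hide_below]
    return sorted(visible, key=score, reverse=True)

# Hypothetical thread: two substantive comments and one drive-by insult.
thread = [
    {"text": "Have you looked at the 1873 coinage debates?", "useful": 5, "not_useful": 0},
    {"text": "Needs work: the second section drifts.",       "useful": 3, "not_useful": 1},
    {"text": "This is garbage.",                             "useful": 0, "not_useful": 6},
]

for c in rank_comments(thread):
    print(c["text"])   # the substantive comments surface; the insult disappears
```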

What would it take to start a site like this? I’m not entirely sure. I need to ask at CHNM. How hard would software like that be to mount and maintain? If I were editing The Journal of the Gilded Age and Progressive Era this is the model I’d look to. If you go to the website of that journal, you find that they are trying to adjust to the digital age, but their model remains the print journal and the pre-vetted article. The result is an uncomfortable fit, neither fish nor fowl.

Who knows how to set up a site like that?

11 Comments

  • […] This post was mentioned on Twitter by Dan Cohen and Alex Gil, mabel rosenheck. mabel rosenheck said: RT @dancohen: http://bit.ly/aiNThd // "think back on the work that most influenced you, is your 1st thought “hell of a job on peer review?”" […]

  • This is a fascinating topic. I used to work for an academic journal in the sciences, where peer review was essential. It was also a free service expected of doctors, who at times looked at it as a chore akin to getting a root canal. Do you foresee any reluctance among academics, drowning in their own work, to go further than skimming an article for its general overview and take the time to do a systematic, thoughtful review? Or will it be more of a crowdsourcing technique? Can anyone register?

    Also, I’m wondering exactly what your definition of “academic” is. Would PhDs, professors, professional historians be submitting their work? Would PhD students be included? MA students?

  • Well, I’d like to see the definition of “academic” broadened, to include serious students of history of all kinds. I’m imagining crowdsourcing peer review, and to people who worry about too many people being involved, I say “you’re flattering yourself!”

  • They’re also worrying because they flatter themselves that grifting graduate students and predatory professors are standing by to poach their precious research (on toothpick factory workplace conditions, which actually sounds great).

  • You could get pretty close to what you need with a wiki of some sort, though you’d probably want to moderate membership to some degree. Certainly it would be reasonable to start it as an experiment and then see what you needed to change about the software to make it more inclusive, and allow better “peer commenting on peer” functionality.

  • […] 15, 2010 by rosendof Leave a Comment I thought this article on the future of peer review was particularly relevant to this class. Namely, the issues of open-source scholarship and how it […]

  • […] claims not to believe this. He writes, Talking with a friend about peer review, it occurred to me that the stuff which has […]

  • K. Hering wrote:

    Of course Foucault had an editor — many of his works were published by Éditions Gallimard, like The Order of Things (in French) in the mid-1960s, and Foucault’s editor was Pierre Nora, who developed Gallimard’s social science division… Whether you call it peer review or not, you can’t understand any of his works without situating them in the intense intellectual culture and exchanges about structuralism in France at the time, and Gallimard and its (paid) editors played an important role in advancing these debates.

  • The argument was not that Foucault had an editor–he clearly did. The point was about whether or not the process of “peer review” as practiced now–double-blind review by readers chosen by an editor–happened at all, and if it did, whether it made a material improvement in the work.

    There’s no doubt that all persons are embedded in a social context and that they benefit from the exchange that context enables.

  • […] initial post,  but as Michael O’Malley (among others) has argued in a pair of blog posts, peer review itself imposes significant costs on the scholarly […]
