Who is not ambivalent about peer review? On the one hand, it establishes a basic, reliable level of quality in argument and in evidence. On the other, it grinds everything down to a bland sameness. Peer review ensures that professional standards are met, and it also enforces orthodoxy. Anonymous peer review prevents intimidation: anonymous peer review allows irresponsible, spiteful criticism. Peer reviews can be extremely helpful: peer reviews can crush the spirit. They take forever, and they impose a significant burden of uncompensated labor.
I often enjoy doing peer review, since I get to read new and interesting work, and I almost always learn from it when my own work is reviewed. But just as often the criticisms reflect stubborn genre conventions more than free thought. Everyone sees reviews which just do not “get” it, and, more significantly, never will, because the intellectual apparatus isn’t compatible: a social historian who finds “linguistic” evidence unpersuasive; a cultural type who finds empiricism naive and “undertheorized;” a political historian who sees analysis of popular culture as trivial and pointless; an historian of popular culture weary of political history. In cases like this, no actual “review” takes place: rather, the author and his or her reviewer talk past each other.
Of course academic publishing was never supposed to be easy. It was always supposed to be rigorous, demanding, and difficult, and for good reason. Good work takes hard work: there’s nothing unreasonable about that. But there’s a strong sense in which peer review isn’t always about good work; it’s about familiar work, work which asks questions to which the answers are always already known.
The profession’s origins lie in class privilege: “peer” has its roots in “peerage” and class exclusivity. As Rob Townsend recently pointed out, the AHR did not start doing “double blind” peer review until the mid twentieth century. Before that, the editorial board handled things itself, ensuring a closed and clubby atmosphere.
It was an example of what Marx called “the illusion of scarcity.” Plenty of people had opinions about history, and expertise: “peers” kept the unwashed out.
Peer review also reflected the relative scarcity and expense of print media. Journals were expensive to produce and expensive to buy, and the vetting of peer review raised their value. Peer review also increased the tendency of journals to specialize in one kind of thing. How many of you have started a research project thinking not “is this interesting?” but rather “where could I publish this?” That’s the illusion of scarcity.
Earlier I argued that the era of scarcity in evidence was coming to a close, because so much previously hard-to-get material now exists online. Maybe it’s time for the era of scarcity in peer review to end as well. We ought to be able to rethink peer review in ways that make it more effective and less “clubby.”
The first objection modern academics make to changing peer review is that they don’t want to have to wade through a lot of junk to find something good. Very early on, that was everyone’s experience of the internet and digital technology: a very poor signal-to-noise ratio.
Google, the search engine that fixed that problem and remade the internet, is in effect a gigantic peer review. Google responds to your query by analyzing how many other people (your “peers”) found the same page useful.
When you enter search terms, Google looks for web pages that contain instances of those terms, and then from that set it looks for those pages most often linked to or referred to by others. In other words, it’s a massive form of peer review: it foregrounds web pages which others have vetted and cited.
And it’s extremely good at getting useful information in a hurry. If you enter “green breast of the new world” as a search term, you get pages devoted to The Great Gatsby, not outer-space-themed pornographic sites. The results are extremely specific. Google has peer-reviewed the web, but the peers are virtually every internet user in the English-speaking world. The more specific your search terms, the smaller and more specific the set of “peers.”
That’s a simplification: Google guards its methods very closely. But it describes the principle. Google turns peer reviewing into “crowdsourcing.”
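To make the principle concrete, here is a toy sketch in Python. It is emphatically not Google’s actual algorithm, which is proprietary and vastly more sophisticated; the pages, links, and scores below are invented to illustrate ranking by “citations” from other pages.

```python
# A toy illustration of the idea above. This is NOT Google's real algorithm
# (which is proprietary); pages, links, and text are invented for the example.

# Each page has some text and a list of other pages that link to it.
pages = {
    "gatsby-essay": {
        "text": "the green breast of the new world in The Great Gatsby",
        "linked_by": ["syllabus", "lit-blog", "book-review"],
    },
    "gatsby-notes": {
        "text": "lecture notes: Fitzgerald and the green breast of the new world",
        "linked_by": ["lit-blog"],
    },
    "space-site": {
        "text": "green breast new world space pictures",
        "linked_by": [],
    },
}

def search(query, pages):
    """Return pages containing every query term, ranked by how many other
    pages link to them -- a crude stand-in for 'vetted and cited by peers'."""
    terms = query.lower().split()
    matches = [
        (name, len(info["linked_by"]))
        for name, info in pages.items()
        if all(term in info["text"].lower() for term in terms)
    ]
    return sorted(matches, key=lambda pair: pair[1], reverse=True)

print(search("green breast new world", pages))
# -> [('gatsby-essay', 3), ('gatsby-notes', 1), ('space-site', 0)]
```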
There has been a lot of talk about how to change peer review, usually involving online versions of what we have now–journals which use an editor, editorial board, and peer reviewers to “gatekeep.”
But why preserve a structure of gatekeeping that evolved early in the twentieth century, specifically for early twentieth-century social and economic conditions?
We could certainly “crowdsource” peer review. Imagine going to a site where scholars in history and history-related fields post new work, work in progress, research findings, and queries. The site would be transparent to Google. You could enter a set of search terms, and instantly get the results your fellow academics, searching the same terms, found most useful. That’s peer review in action. And the group of “peers” would be larger, more representative, more up to date, and less inclined to disciplinary orthodoxies. Such a site would require only minimal editing and minimal maintenance.[1. Google has sort of already done this, with “Google Scholar.” I’ve not found Google Scholar to be very useful or different, because it mostly searches databases like Jstor, which have already been peer reviewed, and in effect gives basically the same results for me as searching Jstor directly. I believe that’s because there is not enough scholarly work online to search, outside of the subscription databases.]
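As a thought experiment, here is a minimal sketch of how such a site might rank work: it simply remembers which posts scholars searching a given set of terms actually ended up using, and surfaces those first for the next searcher. Everything here (the class name, the sample query, the post titles) is hypothetical, not a description of any existing system.

```python
# A minimal sketch of "crowdsourced" review: rank posts for a query by how
# often fellow scholars searching the same terms found them useful.
# All names and data below are invented for illustration.
from collections import defaultdict

class CrowdReviewedIndex:
    def __init__(self):
        # query -> post -> number of scholars who found that post useful
        self.usage = defaultdict(lambda: defaultdict(int))

    def record_use(self, query, post):
        """A reader searched `query` and spent real time with `post`."""
        self.usage[query.lower()][post] += 1

    def search(self, query):
        """Posts for this query, ranked by how often peers used them."""
        counts = self.usage[query.lower()]
        return sorted(counts.items(), key=lambda item: item[1], reverse=True)

index = CrowdReviewedIndex()
index.record_use("gilded age monetary policy", "specie-and-sentiment")
index.record_use("gilded age monetary policy", "specie-and-sentiment")
index.record_use("gilded age monetary policy", "greenback-party-notes")

print(index.search("gilded age monetary policy"))
# -> [('specie-and-sentiment', 2), ('greenback-party-notes', 1)]
```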
Work at such a site could find a much larger audience than it would by simply appearing in a prestigious journal. Let us imagine that journal “X” has eight to ten thousand subscribers, including libraries. That’s a lot: that’s a big journal. But of those thousands of subscribers, how many actually read your article? How many read it and cite it? How many cite it, but never read it? You don’t really know.
As of 10/14/2010, just two weeks after my first post, this humble blog has had 1352 visits. That’s tiny in proportion to the size of the internet, but large in academic terms. Google Analytics lets you track the usage of a website: the number of visitors, how they got to your page, the countries they visited from.
It can also tell me which individual pages and posts drew the most attention, the average time visitors spent on the site, which pages they spent the most time reading, and where my pages are linked from. While this is far from an accurate assessment of how the material on my blog is being used, it’s far more information than I get about an article published in a journal. In many ways it would give a tenure review committee a much more accurate account of how someone’s work is used.
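For readers who have not used these tools, a small sketch of the per-page arithmetic involved may help. The visit log below is entirely made up, and nothing here touches the real Google Analytics service or its API; it only shows how pageviews, average time on page, and referrers can be tallied from raw visit records.

```python
# Toy example: tallying per-page statistics from an invented visit log.
# This does not use the real Google Analytics API; it only illustrates
# the kind of per-page figures (views, average time, referrers) it reports.
from collections import defaultdict

# Each record: (page, seconds spent on the page, referring site)
visits = [
    ("/colored-me/",            412, "twitter.com"),
    ("/colored-me/",             95, "google.com"),
    ("/colored-me/page/5/",      40, "google.com"),
    ("/googling-peer-review/",  280, "profhacker.com"),
]

pageviews = defaultdict(int)
total_time = defaultdict(int)
referrers = defaultdict(set)

for page, seconds, referrer in visits:
    pageviews[page] += 1
    total_time[page] += seconds
    referrers[page].add(referrer)

for page, views in pageviews.items():
    avg = total_time[page] / views
    print(f"{page}: {views} views, {avg:.0f}s average, via {sorted(referrers[page])}")
```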
So far, what I see on Google Analytics is not all that encouraging: the average visitor stays for just under five minutes. The most “academic” of the posts, “Colored Me,” runs over five pages, and the number of people who spend time on page five is far lower than the number who view page one. But I’m inclined to say that’s my fault, not the fault of digital technology: I need to learn how to present scholarly material effectively on the web.
We all do. Sooner or later, the economics will catch up with the paper journals and with the structure of peer review. Peer review was born in a specific historical moment–it’s not the final product of human progress, objectively understood: we don’t need to preserve it in its present form. We need to make sure “peer review” in the future works for our needs and the needs of the public that wants good history.
So I’m thinking: enough talking, time to actually do it. In the next few days, I’m going to post a poll, asking people to vote for which online project I should pursue. Then I’ll post the equivalent of a journal article online and see if I can manage to get it peer reviewed by crowdsourcing.
[…] This post was mentioned on Twitter by Links About Google, Sheila Brennan, Sean Gillies, Mark Sample, Mark Sample and others. Mark Sample said: There's "a strong sense in which peer review isn’t always about good work, it’s about *familiar* work." – Mike O'Malley http://bit.ly/9HxmgC […]
[…] blog post on open access). Google itself might be “in effect a gigantic peer review” (The Aporetic). As Johnson says, “By creating fluid networks of words, by creating those digital-age […]
[…] I suspect this will hold true for many new kinds of scholarly communication that are liberated from traditional peer review. Due to their more open and freewheeling nature, these genres, like blogging, will undoubtedly contain much dreck, and thus be negatively stereotyped by many in the professoriate, who (as I have noted in this space) are inordinately conservative when it comes to scholarly communication. But in that sea of nontraditionally reviewed material will be many of the most creative and influential publications. I’m willing to bet this pattern will be even more pronounced in the humanities, where traditional peer review is particularly adept at homogenizing scholarly work. […]
[…] about something I found just now on the net. The historian Mike O’Malley has written a post worth reading, in which he reflects on the pros and cons of the peer review system; he also puts it in […]
[…] made a longish post about digital publishing and peer review, and now I have to put up or shut up. I want to conduct a research […]
Great post. It seems odd to me that academics, who should be on the cutting edge of technology, continue to go back to the 19th Century for a methodology of knowledge dissemination. That said, I don’t think it’s as simple as “crowdsourcing” — some of Wikipedia’s early mis-steps demonstrate that there is still room for “experts” in this Brave New internet World. So what we need is a way to allow everyone to have a say, but also perhaps to let the experts have a bit more of a say. This is exactly what we’re working on.
[…] O’Malley has a splendid essay up about peer review in the Google era: Peer review was born in a specific historical moment–it’s not the final product of human […]
[…] Over the course of last week, a huge number of friends and colleagues of mine posted links and notes on Twitter and around the blogosphere about Mike O’Malley’s post on The Aporetic about crowdsourcing peer review. […]
[…] of a journal article online. To read Mike’s full report and track his progress, visit The Aporetic. // […]
[…] get lost in the bulleted list. Last week, I linked to Mike O’Malley’s post on “Googling Peer Review“; ProfHacker’s own Kathleen Fitzpatrick responded on her own blog with a consideration […]
[…] O’Malley, Googling Peer Review part 1 and part […]
[…] O’Malley’s recent essay, “Googling Peer Review,” raises some really interesting questions about the peer review as we move from an age of […]
[…] O’Malley’s provocative essay, “Googling Peer Review,” raises interesting questions about the shift from an era of scarcity of information to an […]
[…] One model for this might be termed the “crowd source” model: collect all the digital work done by humanities scholars, and allow “ranking in use” to emerge on its own. For example, Prof. X writes a blog post, and American History Now notes the post, adds a link, and sends information about the post out to its readers. Interested readers respond: in very little time, blog posts that attract a great deal of interest would “rise to the top” of the rankings at American History Now. No editors, no designated formal “peers,” no boards of review. […]
[…] O’Malley, M. (2010, October 19). [Web log message]. Retrieved from http://theaporetic.com/?p=446 […]