Googling Peer Review

Who is not ambivalent about peer review? On the one hand, it establishes a basic, reliable level of quality in argument and in evidence. On the other, it grinds everything down to a bland sameness. Peer review ensures professional standards are met, and it also enforces orthodoxy. Anonymous peer review prevents intimidation: anonymous peer review allows irresponsible, spiteful criticism. Peer reviews can be extremely helpful: peer reviews can crush the spirit. They take forever, and they impose a significant burden of uncompensated labor.

I often enjoy doing peer review, since I get to read new and interesting work, and when my own work is reviewed I almost always learn from it. But just as often the criticisms reflect stubborn genre conventions more than free thought. Everyone has seen reviews which just do not “get” it, and, more significantly, never will, because the intellectual apparatus isn’t compatible: a social historian who finds “linguistic” evidence unpersuasive; a cultural type who finds empiricism naive and “undertheorized”; a political historian who sees analysis of popular culture as trivial and pointless; an historian of popular culture weary of political history. In cases like this, no actual “review” takes place: rather, the author and his or her reviewer talk past each other.

Of course academic publishing was never supposed to be easy. It was always supposed to be rigorous, demanding, and difficult, and for good reason. Good work takes hard work: there’s nothing unreasonable about that. But there’s a strong sense in which peer review isn’t always about good work; it’s about familiar work, work which asks questions to which the answers are always already known.

The profession’s origins lie in class privilege: “peer” descends from “peerage,” and from class exclusivity. As Rob Townsend recently pointed out, the AHR did not start doing “double blind” peer review until the mid-twentieth century. Before that, the editorial board handled things itself, ensuring a closed and clubby atmosphere.

It was an example of what Marx called “the illusion of scarcity.” Plenty of people had opinions about history, and expertise as well: “peers” kept the unwashed out.

Peer review also reflected the relative scarcity and expense of print media. Journals were expensive to produce and expensive to buy, and the vetting of peer review raised their value. Peer review also increased the tendency of journals to specialize in one kind of thing. How many of you have started a research project thinking not “is this interesting?” but rather “where could I publish this?” That’s the illusion of scarcity.

Earlier I argued that the era of scarcity in evidence was coming to a close, because so much previously hard-to-get material now exists online. Maybe it’s time for the era of scarcity in peer review to end as well. We ought to be able to rethink peer review in ways that make it more effective and less “clubby.”

The first objection modern academics make to changing peer review is that they don’t want to have to wade through a lot of junk to find something good. Very early on, that was everyone’s experience of the internet and digital technology: a very poor signal-to-noise ratio.

Google, the search engine that fixed that problem and remade the internet, is in effect a gigantic peer review. Google responds to your query by analyzing how many other people, your “peers,” found the same page useful.

When you enter search terms, Google looks for web pages that contain instances of those terms, and then from that set it looks for the pages most often linked to or referred to by others. In other words, it’s a massive form of peer review: it foregrounds web pages which others have vetted and cited.

And it’s extremely good at getting useful information in a hurry. If you enter “green breast of the new world” as a search term, you get pages devoted to The Great Gatsby, not outer-space-themed pornographic sites. The results are extremely specific. Google has peer-reviewed the web, but the peers are virtually every internet user in the English-speaking world. The more specific your search terms, the smaller and more specific the set of “peers.”

That’s a simplification: Google guards its methods very closely. But it describes the principle. Google turns peer review into “crowdsourcing.”
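To make the principle concrete, here is a minimal sketch in Python. Everything in it is invented for illustration: the toy three-page corpus, the page names, and the link structure. Ranking matches by raw inbound-link counts is a drastic simplification, a stand-in for whatever Google actually computes.

```python
from collections import Counter

# Hypothetical corpus: page name -> (text, pages it links to).
pages = {
    "gatsby-notes": ("the green breast of the new world", ["gatsby-essay"]),
    "gatsby-essay": ("fitzgerald on the green breast of the new world", []),
    "space-fan-site": ("green planets of outer space", []),
}

def search(query: str) -> list[str]:
    terms = query.lower().split()
    # Step 1: keep only pages containing every search term.
    matches = [name for name, (text, _links) in pages.items()
               if all(t in text.split() for t in terms)]
    # Step 2: count inbound links -- how often other pages "cite" each page.
    inbound = Counter(target for _text, links in pages.values()
                      for target in links)
    # Step 3: the page most often vetted and cited by peers ranks first.
    return sorted(matches, key=lambda name: inbound[name], reverse=True)

print(search("green breast new world"))
# -> ['gatsby-essay', 'gatsby-notes']: the cited page outranks the citer,
#    and the space page never matches at all.
```

The point of the sketch is only that relevance filtering plus citation counting already behaves like review by peers: the ranking is produced by what other people linked to, not by an editor.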

There has been a lot of talk about how to change peer review, usually involving online versions of what we have now–journals which use an editor, editorial board, and peer reviewers to “gatekeep.”

But why preserve a structure of gatekeeping that evolved in the early twentieth century, for early twentieth-century social and economic conditions?

We could certainly “crowdsource” peer review. Imagine going to a site where scholars in history and history-related fields post new work, work in progress, research findings, and queries. The site would be transparent to Google. You could enter a set of search terms and instantly get the results your fellow academics, searching the same terms, found most useful. That’s peer review in action. And the group of “peers” would be larger, more representative, more up to date, and less inclined to disciplinary orthodoxies. Such a site would require only minimal editing and minimal maintenance.[1. Google has sort of already done this, with “Google Scholar.” I’ve not found Google Scholar to be very useful or different, because it mostly searches databases like JSTOR, which have already been peer reviewed, and in effect it gives me basically the same results as searching JSTOR directly. I believe that’s because there is not enough scholarly work online to search outside of the subscription databases.]
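In the same hedged spirit, here is a toy sketch of how such a site might rank posted work, assuming a hypothetical click log; the queries and post names are invented, and “usefulness” is reduced to how often peers searching the same terms opened a post.

```python
from collections import Counter

# Hypothetical click log: (query entered, post the searcher opened).
click_log = [
    ("populism 1890s", "kansas-farmers-draft"),
    ("populism 1890s", "kansas-farmers-draft"),
    ("populism 1890s", "currency-debates-note"),
]

def rank_for_query(query: str) -> list[tuple[str, int]]:
    # Count which posts peers entering this exact query found useful
    # enough to open; the most-used work rises to the top.
    clicks = Counter(post for q, post in click_log if q == query)
    return clicks.most_common()

print(rank_for_query("populism 1890s"))
# -> [('kansas-farmers-draft', 2), ('currency-debates-note', 1)]
```

A real site would need to handle near-matching queries and guard against gaming, but the reviewing labor itself would be done by the searchers.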

Work at such a site could find a much larger audience than it would by simply appearing in a prestigious journal. Let us imagine that journal “X” has eight to ten thousand subscribers, including libraries. That’s a lot: that’s a big journal. But of those eight thousand or more subscribers, how many actually read your article? How many read it and cite it? How many cite it, but never read it? You don’t really know.

As of 10/14/2010, just two weeks after my first post, this humble blog has had 1352 visits. That’s tiny in proportion to the size of the internet, but large in academic terms. Google Analytics lets you track the usage of a website: the number of visitors, the way visitors got to your page, the countries they visited from.

I can also tell which individual pages and postings drew the most attention, get a reading of the average time visitors spent on the site, and see which pages they spent the most time reading. Google Analytics can also show where your pages are linked from. While this is far from an accurate assessment of how the material on my blog is being used, it’s also far more information than I get about an article published in a journal. In many ways it would give a tenure review committee a much more accurate account of how someone’s work is used.
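For readers who have not used such tools, here is a small sketch of the kind of aggregation behind those numbers, computed from an invented visit log rather than the real Google Analytics reporting interface; the page names, durations, and countries are all hypothetical.

```python
from collections import defaultdict
from statistics import mean

# Hypothetical visit log: (page, seconds spent on page, visitor country).
visits = [
    ("colored-me-page-1", 310, "US"),
    ("colored-me-page-1", 150, "UK"),
    ("colored-me-page-5", 45, "US"),
]

seconds_by_page = defaultdict(list)
for page, seconds, _country in visits:
    seconds_by_page[page].append(seconds)

# Per-page visit counts and average reading time, most-visited first.
for page, times in sorted(seconds_by_page.items(),
                          key=lambda kv: len(kv[1]), reverse=True):
    print(f"{page}: {len(times)} visits, average {mean(times):.0f}s")
```

Crude as it is, even this much per-page detail exceeds what a print journal can report about any single article.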

So far, what I see on Google Analytics is not all that encouraging: the average visitor stays for just under five minutes. The most “academic” of the posts, “Colored Me,” runs over five pages, and the number of people who spend time on page five is far lower than the number who view page one. But I’m inclined to say that’s my fault, not the fault of digital technology: I need to learn how to present scholarly material effectively on the web.

We all do. Sooner or later, the economics will catch up with the paper journals and with the structure of peer review. Peer review was born in a specific historical moment: it is not the final product of human progress, objectively understood, and we don’t need to preserve it in its present form. We need to make sure “peer review” in the future works for our needs and the needs of the public that wants good history.

So I’m thinking: enough talking, time to actually do it. In the next few days, I’m going to post a poll asking people to vote on which online project I should pursue. Then I’ll post the equivalent of a journal article online and see if I can manage to get it peer reviewed by crowdsourcing.
