The Future of Peer Review

This guest post was written by Richard Price, founder and CEO of Academia.edu — a site that serves as a platform for academics to share their research papers and to interact with each other.

Instant distribution

Many academics are excited about the future of instant distribution of research. Right now, the time lag between finishing a paper and the relevant worldwide research community seeing it is between 6 months and 2 years, because during that time the paper is being peer reviewed, and peer review takes an incredibly long time. Two years is roughly how long it took to send a letter abroad 300 years ago.

Many platforms are springing up that make research distribution instant, so that the time lag between finishing a paper and everyone in the relevant research community worldwide seeing it is measured in hours and days rather than months and years. Some of the strongest platforms are Academia.edu, arXiv, Mendeley, ResearchGate, and SSRN.

What about peer review?

One question many academics have is: in a future where research is distributed instantly, what happens to peer review? Will this be a world where junk gets out and there is no way to distinguish good research from bad?

Content discovery on the web

Instant distribution is a characteristic of web content, and the web has thrived without a system of formal peer review in place. No one thinks that the web would be enhanced by a panel of formal peer reviewers who had to verify each piece of content before it could be posted.

The web has thrived because powerful discovery systems have sprung up that separate the wheat from the chaff for users. The two main systems that people use to discover content on the web are:

  • Search engines (Google, Bing)
  • Social platforms (mainly sites like Facebook and Twitter, but also generic communication platforms like email, IM, etc.)

Both search engines and social platforms are peer review systems in different ways. One can think of these two systems as “Crowd Review” and “Social Review” respectively:

  • Crowd Review: Google’s PageRank algorithm looks at the link structure of the entire web and extracts a number (PageRank) that represents how positively the web thinks about a particular website (see the sketch after this list).
  • Social Review: Twitter and Facebook show you links that have been shared explicitly by your friends, and people you follow.
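
To make Crowd Review concrete, here is a minimal sketch of the PageRank idea in Python. The tiny four-page web graph and the damping factor of 0.85 are assumptions invented for this example; Google’s actual system runs over billions of pages with many refinements.

    # Minimal PageRank sketch: scores emerge from the link structure alone.
    # This toy graph is a made-up four-page web; 0.85 is the conventional damping factor.
    links = {
        "A": ["B", "C"],   # page A links to pages B and C
        "B": ["C"],
        "C": ["A"],
        "D": ["C"],
    }

    def pagerank(links, damping=0.85, iterations=50):
        pages = list(links)
        n = len(pages)
        rank = {p: 1.0 / n for p in pages}              # start with equal scores
        for _ in range(iterations):
            new_rank = {p: (1 - damping) / n for p in pages}
            for page, outlinks in links.items():
                share = rank[page] / len(outlinks)      # a page splits its score across its outgoing links
                for target in outlinks:
                    new_rank[target] += damping * share
            rank = new_rank
        return rank

    print(pagerank(links))  # pages with more, and better-connected, inbound links score highest (here, C)

The point is that no individual gatekeeper is consulted: the score for each page is aggregated from how the rest of the web links to it.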

One can think of the peer review system in the journal industry as “Two Person review”:

  • Two Person review: Two people are selected to review the paper on behalf of the entire possible audience for that paper.

The drawbacks of the Two Person review process are that it is:

  • expensive: $8 billion a year is spent on subscriptions to journals, money that could instead be spent on more research.
  • slow: the Two Person review process takes about 6 months to 2 years to complete, and sometimes more.
  • of questionable quality: the two people selected as peer reviewers may be biased against the paper, unqualified, or simply in a bad mood when reviewing it.
  • unchanging: the judgement is fixed and doesn’t change as the impact of the paper changes.
  • a lot of work for the reviewers: reviewing a paper takes a lot of time, and because the review is not published, the reviewer receives no credit for the work.

Academics are increasingly discovering research papers via the web, and in particular via search engines and social platforms:

  • Search engines: Google, Google Scholar, PubMed
  • Social platforms: Academia.edu, arXiv, Mendeley, ResearchGate, blogs, conversations with colleagues over email or IM, Facebook and Twitter.

As research distribution has largely moved to the web, the discovery engines for research content have become the same as those for general web content. The peer review mechanism is evolving from the Two Person review process to the Crowd Review and Social Review processes.

But has the research been done to a high standard?

People often say that the formal peer review process helps ensure that all the accessible research is above a certain minimum quality. The fear is that if this quality floor were removed, things would start falling apart: an academic reading a paper would have no idea whether or not to trust it.

The experience of the web is that this fear is overblown. There is no quality floor for content on the web: there is bad content and there is great content. The job of search engines and social platforms is to ensure that the content you discover, whether via Google or Facebook, is of the good kind. The success of the web shows that these discovery engines generally do a good job.

Discovery and credit systems are powered by the same metrics

Peer review in the journal industry has historically played another interesting role beyond powering research discovery: it has helped academics build up academic credit, which is required to get grants and jobs. People on hiring and grant committees have historically focused on how many peer-reviewed publications an academic has in order to gauge the academic’s level of achievement, and to judge how deserving the academic is of the grant or job in question.

The peer review system has historically played this dual role, powering both the discovery system and the credit system, because research discovery and research credit are ultimately about the same question: which is the good research? Whichever systems are good at answering that question will drive both the discovery system and the credit system.

One new metric of academic credit that has emerged over the last few years is the citation count. Google Scholar makes citation counts for papers public, so now everyone can see them easily. Citations between papers are like links between websites, and citation counts are an instance of the Crowd Review process.

Legend has it that Larry Page came up with the idea of PageRank after reflecting on the analogy between citations and links. Citation counts nowadays play a dual role: they drive discovery on Google Scholar, where they help determine the ordering of the search results, and they also feed into academic credit.
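
To illustrate citation counts as an instance of Crowd Review, here is a small sketch that derives citation counts from a citation graph and orders papers by them. The papers and citation links are invented for the example, and Google Scholar combines citation counts with other signals when ranking results.

    # Illustrative only: count citations in a made-up citation graph
    # and rank papers by how often the rest of the literature cites them.
    citations = {
        "paper_a": ["paper_c"],              # paper_a cites paper_c
        "paper_b": ["paper_a", "paper_c"],
        "paper_c": [],
        "paper_d": ["paper_a", "paper_c"],
    }

    citation_counts = {paper: 0 for paper in citations}
    for paper, cited_papers in citations.items():
        for target in cited_papers:
            citation_counts[target] += 1

    ranking = sorted(citation_counts, key=citation_counts.get, reverse=True)
    print(citation_counts)  # {'paper_a': 2, 'paper_b': 0, 'paper_c': 3, 'paper_d': 0}
    print(ranking)          # ['paper_c', 'paper_a', 'paper_b', 'paper_d']

Counting citations is just the in-degree of each paper in this graph, which is why the analogy with links between websites is so direct.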

Academic credit from social platforms

In the case of social platforms, the metric that drives discovery is how much interaction there is with your content on the platform in question. Examples of such interaction include:

  • the number of followers you have
  • the number of times your content is shared, liked, commented on, or viewed.

These metrics show how much interest there is in your papers and how widely they are being read right now, and thus provide a sense of their impact.
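
As a purely hypothetical illustration, the sketch below rolls such interaction metrics up into a single discovery score. The engagement_score function, the choice of metrics, and the weights are assumptions invented for this example; they are not Academia.edu’s or any other platform’s actual formula.

    # Hypothetical example: combine interaction metrics into one discovery score.
    # The weights are arbitrary illustrative choices, not a real platform's formula.
    def engagement_score(followers, views, shares, likes, comments):
        return (
            0.5 * followers
            + 1.0 * views
            + 5.0 * shares      # sharing is weighted more heavily than passive viewing
            + 2.0 * likes
            + 3.0 * comments
        )

    # Two made-up papers: the second is read and shared more, so it scores higher.
    print(engagement_score(followers=120, views=400, shares=10, likes=30, comments=5))   # 585.0
    print(engagement_score(followers=120, views=900, shares=60, likes=90, comments=25))  # 1515.0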

One drawback of citation counts as a metric of academic credit is that they are a lagging indicator: they take a while to build up. If you publish a paper now, it will take several years for a body of papers that cite it to emerge. This leaves academics with a credit gap, where papers they have published in the last 3-4 years hardly affect their academic credit.

The advantage of the kinds of metrics that social platforms like Academia.edu, Mendeley, and SSRN provide is that they are real-time, and they fill this credit gap. Academics are increasingly including these real-time metrics in their applications for jobs and grants. The competition for jobs and grants is intense, and having more data that speaks to the impact of your work helps.

Funding bodies are also eager to see more data about the impact of research, as it helps them make better decisions.

Instant Distribution and Peer Review

The prospect of instant distribution of research is tremendously exciting. If you can tap the global brain of your research community in close to real time, as opposed to waiting 6 to 24 months to distribute your ideas, there could be a wonderful acceleration in the rate of idea generation.

The web has shown that you can remove this 6-to-24-month distribution delay, which occurs while research is undergoing the Two Person review process, and still get high-quality filtering of content from the new peer review mechanisms, Crowd Review and Social Review, which are faster, cheaper, and more personalized.

The web is also an incredible place for new ideas to be invented and to take hold. No doubt new peer review mechanisms will emerge in the future that will advance beyond Crowd Review and Social Review.