Great Experiments in Science Publishing

It’s a great time to follow scientific publishing: right now there are some innovative and even radical experiments happening. Open access has, of course, been the biggest (and most successful) experiment. But figuring out how to run a good journal without a paywall is more of an economic innovation than an innovation in how we communicate science. There are other fascinating experiments underway that go beyond open access.

PLOS One goes all-in on post-publication peer review, publishing papers after a review for methodological soundness and letting the community decide whether the work is significant. eLife tries to make the traditional publishing approach less wasteful by having editors and reviewers talk to each other to produce a consensus review. Faculty of 1000, PeerJ, and The Winnower are trying various more radical experiments in peer review. And Academia.edu and ResearchGate are both trying to harness the power of social media to help researchers communicate their work to each other.

These are fascinating experiments, but do they work? It’s a hard question to answer, but in my latest Pacific Standard column, I take a look at a recent study by Academia.edu, which found that papers posted to their site had a citation advantage — on average, 83 percent more citations five years after publication. The study is not published in a peer-reviewed journal (for now), but it’s out there for the community to review: the authors have released all their code and data alongside the report.

The question of whether there is a citation advantage for certain types of publications (e.g., open access journals) has been controversial and hard to resolve. There are clearly many potentially confounding variables that have to be controlled for if you want to make a convincing case. The Academia.edu study takes a stab at this, and it is a provocative attempt to get the scientific publishing community to focus not just on the question of open access in general, but specifically on how it’s implemented:

Beyond Academia.edu, our work raises questions about how characteristics of venues matter for open access citations. To our knowledge there has been no research on what features of open access repositories or databases make articles easier to discover, and to what extent that leads to increased citations.
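To make the confounding problem concrete, here is a minimal sketch (not the study's actual code, and using made-up toy numbers) of how one might estimate a citation advantage while controlling for a single confounder, publication year, by comparing median citation counts within year strata:

```python
# Hedged sketch: estimating a citation advantage for papers posted to a
# platform, stratified by publication year so that older (more-cited)
# papers are only compared with papers of the same age.
# The data below are hypothetical, for illustration only.

from statistics import median

# Each record: (publication_year, posted_to_platform, citation_count)
papers = [
    (2009, True, 22), (2009, True, 30), (2009, False, 12), (2009, False, 18),
    (2010, True, 15), (2010, True, 25), (2010, False, 10), (2010, False, 14),
]

def citation_advantage(papers):
    """Average, across years, of the ratio of median citations
    (posted vs. not posted), expressed as a percent advantage."""
    years = sorted({year for year, _, _ in papers})
    ratios = []
    for y in years:
        posted = [c for year, p, c in papers if year == y and p]
        not_posted = [c for year, p, c in papers if year == y and not p]
        ratios.append(median(posted) / median(not_posted))
    mean_ratio = sum(ratios) / len(ratios)
    return (mean_ratio - 1) * 100

print(f"{citation_advantage(papers):.0f}% median citation advantage")
```

A real analysis would need to control for many more variables (field, journal, author seniority, self-selection of which papers get posted), which is exactly why these citation-advantage claims are so hard to settle.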

As Academia.edu’s founder, Richard Price, told me, we need to explore whether savvier use of social media tools will make for a better publishing system, one that helps people find work that otherwise would have gone unnoticed:

Certain open access platforms are push networks: articles are pushed out to followers on upload, and sometimes there are viral properties where followers can re-share the article with their followers. A tentative conclusion is that push networks with viral properties generate more exposure for papers, and this exposure leads to citations.


2 responses to “Great Experiments in Science Publishing”

  1. http://openpsych.net/ is using open forum peer review. So authors submit a paper by starting a thread on the forum. Reviewers will then engage the author just like on any regular forum. It seems to work quite well.

    Disclaimer: I am the co-founder. We are non-profit and I pay for all expenses and made the website.

    • Thanks for letting us know about another publishing experiment. We can’t know how well things like open peer review work until we try them.
