Essay and Opinion

# Principles of the Self-Journal of Science: bringing ethics and freedom to scientific publishing

Version 1 Released on 24 January 2015 under Creative Commons Attribution 4.0 International License

## Authors' affiliations

1.  SJS - The Self Journal of Science

# Abstract

I present the core principles of the “Self-Journal of Science” (SJS), an open repository as well as a new paradigm of scientific publication. Rooted in Science ethics, a full and consistent solution is proposed to address the many flaws in current systems. SJS implements an optimal peer review, which itself becomes a measurable process, and builds an objective and unfalsifiable evaluation system. In addition, it can operate at very low costs. One of the essential features of SJS is to allow every scientist to play his full role as a member of the scientific community and to be credited for all contributions – whether as author, referee, or editor. The output is the responsibility of each scientist, and no subgroup can dictate scientific policy to all. By fully opening up the process of publication, peer pressure becomes the force that drives output towards the highest quality in a virtuous self-regulating circle. SJS also provides a self-organizing and scalable solution to handle an ever-increasing number of articles.

## Introduction: Science ethics

When addressing the problems inherent in the system currently used for publishing scientific results, it is important not to lose focus on what Science is. Science can be described as a two-step process. First comes the “research” part – one produces a scientific thesis according to an appropriate methodology. To be considered scientific, this thesis must fulfill some conditions associated with clarity, transparency, and verifiability; it must include well-defined statements that can be proven wrong, and experiments that can be reproduced. Then comes “publication” – results and conclusions are made public so they can be tested, verified, and debated by the scientific community. The ultimate goal is to:

• reach a global consensus that promotes a scientific thesis to a scientific fact (peer review)
• assess the importance of the thesis and fact by fitting them into the broader scheme of scientific knowledge (evaluation)

This process is necessarily community-wide and inevitably takes time. Moreover, the peer review should also be transparent: counter-arguments must be public and well-defined, counter-experiments must be reproducible. The uniqueness of Science, not to say its glory, comes from the fact that the knowledge it produces can withstand such a demanding process.

I suggest that all shortcomings in the current publication system are rooted in the fact that it has drifted away from Science ethics, with publication – peer review, evaluation and dissemination – being privatized. A process whose rationale is to be open, transparent, and community-wide has become trapped in editors' mailboxes. The validity and value of a scientific work are both decided once and for all time, by two or three people, in a process that is confidential, private, anonymous, undocumented, and subject to short deadlines. Here, I use the term “privatization” not to mean that the process is conducted by private companies, but that it is concentrated in a few hands. Whilst some may consider that private publishers charge exorbitant (and unaffordable) prices for their journals, my arguments would still stand if the current system were entirely run by public institutions, learned societies, or any non-profit organization.

Science is both a collective endeavour and the responsibility of all scientists. Therefore, Science can only be published in a scientific agora (defined in the Oxford English Dictionary as “a public open space where people can assemble, esp. a marketplace, originally in the ancient Greek world”), where every scientist plays his natural role in all its dimensions – researcher, referee and editor. In earlier times, scientific conferences used to be a good approximation of it, where most actors in a given field could meet and debate, at least briefly. However, because of the growth in scientific output and globalization, conferences can never now be large or long enough to achieve this. Fortunately, the Internet now provides the necessary connectivity, which SJS will use.

I now describe some shortcomings of the current system addressed by SJS (solutions will be explained in Material and methods, and the Discussion).

### Privatization of peer review: peer trial

If we agree that peer review – to be deemed scientific – must be open, transparent, and community-wide, and that sufficient time should be given to allow a global consensus to form, we need to use a different word to describe what currently happens in academic journals. As currently applied by such journals, “peer review” is in fact a selection process whose goal is to let an editor make a binary decision regarding the fate of an article (accept or reject). The editor will generally listen to the opinion of one, two, or three selected people (whose identity will not generally be disclosed), and who are believed to be peers of the authors. Consequently, those people hold temporary power to impose any modifications they want on the article. I think this process is closer to a trial than to a scientific debate, and I propose to refer to it as “peer trial”. All deliberations of this trial are generally hidden from the public, to the point that – as readers – we cannot even be sure it happened [1,2]. Of course, such a trial may be fair, with editor and anonymous jury genuinely committed to scientific principles. But it is not relevant here to discuss whether this is often the case or not; it is simply unscientific, undocumented, and not open – which inevitably undermines credibility and leaves the system open to a wide range of criticisms including:

(i) Misdoings of referees:

• Bias towards the authors: friends, foes, competitors, “wrong” gender, fame of the affiliation [3], nationality...
• Bias towards the thesis/results (which might differ from the referee's) [4]
• Incompetence (often unavoidable in interdisciplinary fields), lack of commitment...
• Conflict of interests [5]
• Abuse of power (e.g. requiring that his/her own work be cited)
• Delay (so the referee can publish first)
• Loss of confidentiality
• ...

(ii) The impossibility for two or three people to check for all possible problems in a limited time, which leads to [6]:

• errors
• frauds
• fake data
• plagiarism
• redundant publication
• ...

(iii) Economic inefficiency:

• many sound (and unsound) articles go through cycles of reviewing and rejection that start from scratch each time, which takes the time of many referees to no avail.

All these shortcomings disappear in a community-wide, open, transparent and time-unlimited peer review.

## References

1. Richard Van Noorden. Publishers withdraw more than 120 gibberish papers. Nature, 2014.
3. Douglas P Peters and Stephen J Ceci. Peer-review practices of psychological journals: The fate of published articles, submitted again. Behavioral and Brain Sciences, 5(02):187–195, 1982.
4. Mohammadreza Hojat, Joseph S Gonnella, and Addeane S Caelleigh. Impartial judgment by the “gatekeepers” of science: fallibility and accountability in the peer review process. Advances in Health Sciences Education, 8(1):75–96, 2003.
5. Sheldon Krimsky, LS Rothenberg, P Stott, and G Kyle. Scientific journals and their authors' financial interests: A pilot study. Psychotherapy and psychosomatics, 67(4-5):194–201, 1998.
6. Retraction Watch. retractionwatch.com.
7. M. Ware and M. Mabe. The STM Report. 2012.
8. Reed Elsevier. Reed Elsevier Annual Reports and Financial Statements. 2013.
9. Junguk Hur, Kelli A Sullivan, Adam D Schuyler, Yu Hong, Manjusha Pande, HV Jagadish, Eva L Feldman, et al. Literature-based discovery of diabetes-and ROS-related targets. BMC medical genomics, 3(1):49, 2010.
10. David F Horrobin. The philosophical basis of peer review and the suppression of innovation. JAMA, 263(10):1438–1441, 1990.
11. S. Greaves, J. Scott, M. Clarke, L. Miller, T. Hannay, A. Thomas, and P. Campbell. Overview: Nature's peer review trial. Nature, 2006.
12. Peter M. Rothwell and Christopher N. Martyn. Reproducibility of peer review in clinical neuroscience: Is agreement between reviewers any greater than would be expected by chance alone? Brain, 123(9):1964–1969, 2000.
13. Cyril Labbé. Ike Antkare, one of the great stars in the scientific firmament. International Society for Scientometrics and Informetrics Newsletter, 6(2):48–52, 2010.
14. http://fr.arxiv.org/help/support/2012_budget.