Publishing in academia still bears the imprints of the book age.
Just as in the story of the QWERTY keyboard, a system of academic publishing has prevailed that works but is suboptimal. Measured against today's socio-technological opportunities, the established process, from submission through review to publication, is outdated: it takes too much time, it is too expensive, and it creates an artificial scarcity of content. It no longer reflects the zeitgeist.
The Biological Records Centre, which supports more than 80 wildlife recording societies and schemes, is celebrating its 50th anniversary.
Its data, submitted by volunteers, is used by scientists for purposes such as monitoring the spread of invasive species.
It has also helped researchers gain insight into ecological concerns, such as the demise of pollinating insects.
Journal, Research Network or Preprint archive?
Or simply a "Research and Publishing Network"?
A few words about Open Science and Peter Murray-Rust.
Here is an editorial essay by Gerald F. Davis that appeared recently in Administrative Science Quarterly.
The Web has greatly reduced the barriers to entry for new journals and other platforms for communicating scientific output, and the number of journals continues to multiply. This leaves readers and authors with the daunting cognitive challenge of navigating the literature and discerning contributions that are both relevant and significant. Meanwhile, measures of journal impact that might guide the use of the literature have become more visible and consequential, leading to “impact gamesmanship” that renders the measures increasingly suspect. The incentive system created by our journals is broken. In this essay, I argue that the core technology of journals is not their distribution but their review process. The organization of the review process reflects assumptions about what a contribution is and how it should be evaluated. Through their review processes, journals can certify contributions, convene scholarly communities, and curate works that are worth reading. Different review processes thereby create incentives for different kinds of work. It’s time for a broader dialogue about how we connect the aims of the social science enterprise to our system of journals.
It's open access.
Blog post on data sharing in a publication-driven academic system:
Would more researchers share data if they got more for it? Possibly. The currency would not even have to change. What is missing in the academic system is recognition for intermediate outputs, data among them. Those who publish well get cited; the h-index rises, and with it the chances for professional advancement. Good articles are good for the career. Good data, however, still do not count as much as they should.
Anupam Chander, Director of the California International Law Center, was recently our guest at the HIIG Open Journal Club. You can listen to his talk and watch an additional interview.
Here is the Resonator podcast from Germany's Helmholtz Gesellschaft with an episode on Open Science. Host Holger Klein interviews Hans Pfeiffenberger from the AWI.
James Evans on science in the age of computing and search engines. Interview for Zeit Online (in German).
Check out the working paper that two of our HIIG colleagues, Sascha Friesike and Benedikt Fecher, published on the barriers to data sharing. Feedback is welcome!
Despite widespread support from policy makers, funding agencies, and scientific journals, academic researchers rarely make their research data available to others. At the same time, data sharing in research is attributed a vast potential for scientific progress. It allows the reproducibility of study results and the reuse of old data for new research questions. Based on a systematic review of 98 scholarly papers and an empirical survey among 603 secondary data users, we develop a conceptual framework that explains the process of data sharing from the primary researcher’s point of view. We show that this process can be divided into six descriptive categories: data donor, research organization, research community, norms, data infrastructure, and data recipients. Drawing from our findings, we discuss theoretical implications regarding knowledge creation and dissemination as well as research policy measures to foster academic collaboration. We conclude that research data cannot yet be regarded as a knowledge commons, and that research policies that better incentivise data sharing are needed to improve the quality of research results and foster scientific progress.
Grand et al. on data issues in open science:
Open science is a practice in which the scientific process is shared completely and in real time. It offers the potential to support information flow, collaboration and dialogue among professional and non-professional participants. Using semi-structured interviews and case studies, this research investigated the relationship between open science and public engagement. This article concentrates on three particular areas of concern that emerged: first, how to effectively contextualise and narrate information to render it accessible, as opposed to simply available; second, concerns about data quantity and quality; and third, concerns about the skills required for effective contextualisation, mapping and interpretation of information.