Towards a scientific process freed from systemic bias
- Date: January 26, 2015
- Source: Springer
- Summary: Research on how science works -- the science of science -- can benefit from studying the digital traces generated during the research process, such as peer-reviewed publications. This type of research is crucial for the future of science and that of scientists, according to experts.
Large-scale analysis of bibliographic data can help us better understand the complex social processes in science and provide more accurate evaluation methods.
Research on how science works -- the science of science -- can benefit from studying the digital traces generated during the research process, such as peer-reviewed publications. This type of research is crucial for the future of science and that of scientists, according to Frank Schweitzer, Chair of Systems Design at ETH Zurich in Switzerland. Indeed, quantitative measures of scientific output and success already influence how researchers are evaluated and how proposals are funded. Schweitzer shares his views in an editorial introducing a thematic series of articles entitled "Scientific networks and success in science," published in EPJ Data Science. There, he notes, "it is appropriate to ask whether such quantitative measures convey the right information and what insights might be missing."
As the studies in this thematic series demonstrate, data science is uniquely positioned to leverage large data sets and the latest statistical techniques to empirically validate and quantify phenomena in scientific evaluation and publishing practices. For example, Alexander Petersen and Orion Penner from the IMT Lucca Institute for Advanced Studies, Italy, found a strong cumulative-advantage effect whereby a researcher's early publishing success is amplified over time; a simple illustration of such a rich-get-richer process is sketched below.
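To make the idea of cumulative advantage concrete, here is a minimal toy simulation, not the model used by Petersen and Penner: each new citation goes to a paper with probability proportional to the citations it has already accumulated plus a small baseline. The paper count, citation count, and baseline value are arbitrary illustrative choices.

```python
import random

def simulate_cumulative_advantage(n_papers=100, n_citations=5000,
                                  baseline=1.0, seed=42):
    """Toy rich-get-richer process: each new citation is assigned to a paper
    with probability proportional to (citations already received + baseline).
    Illustrative sketch only; parameters are hypothetical."""
    rng = random.Random(seed)
    citations = [0] * n_papers
    for _ in range(n_citations):
        weights = [c + baseline for c in citations]
        chosen = rng.choices(range(n_papers), weights=weights, k=1)[0]
        citations[chosen] += 1
    return citations

counts = sorted(simulate_cumulative_advantage(), reverse=True)
top_decile = counts[:10]  # top 10% of 100 papers
print(f"Share of all citations held by the top 10% of papers: "
      f"{sum(top_decile) / sum(counts):.0%}")
```

Even with identical papers at the start, small early differences snowball, so a handful of papers ends up holding a disproportionate share of the citations.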
In a separate study, Christian Schulz from ETH Zurich and colleagues show how distinct authors who share the same name in a publication repository can be told apart by analyzing the similarity of their respective citation networks; a simplified sketch of this kind of disambiguation follows below.
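The following sketch illustrates the general idea rather than the Schulz et al. pipeline: papers attributed to one ambiguous name are represented by their sets of cited references, papers whose reference sets overlap sufficiently are linked, and the connected components are treated as candidate distinct authors. The `papers` data, the Jaccard threshold, and the helper names are hypothetical.

```python
from itertools import combinations

def jaccard(a, b):
    """Overlap between two sets of cited references."""
    return len(a & b) / len(a | b) if (a | b) else 0.0

def disambiguate(papers, threshold=0.1):
    """Group papers carrying the same author name into clusters that plausibly
    belong to distinct authors, by linking papers with similar reference sets.
    `papers` maps a paper id to its set of cited-reference ids (hypothetical)."""
    ids = list(papers)
    adjacency = {pid: set() for pid in ids}
    for p, q in combinations(ids, 2):
        if jaccard(papers[p], papers[q]) >= threshold:
            adjacency[p].add(q)
            adjacency[q].add(p)
    # Connected components of the similarity graph = candidate author clusters.
    clusters, seen = [], set()
    for pid in ids:
        if pid in seen:
            continue
        stack, component = [pid], set()
        while stack:
            node = stack.pop()
            if node not in component:
                component.add(node)
                stack.extend(adjacency[node] - component)
        seen |= component
        clusters.append(component)
    return clusters

# Hypothetical example: two papers citing overlapping literature, one unrelated.
papers = {"p1": {"r1", "r2", "r3"}, "p2": {"r2", "r3", "r4"}, "p3": {"r9", "r10"}}
print(disambiguate(papers))  # two clusters: {p1, p2} and {p3}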
Finally, Emre Sarigöl from ETH Zurich and colleagues address the question of whether quantitative, citation-based measures of scientific impact can be regarded as objective. They show that the position of scientists in the collaboration network alone is, to a surprisingly large degree, indicative of the future citation success of their papers (see the sketch below). Citation-based measures may therefore not be the most appropriate way to quantify scientific impact.
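The snippet below is a simplified sketch of the underlying idea, assuming the networkx library: authors are placed in a coauthorship graph, simple measures of their network position (degree and k-core index) are computed, and these can then be compared with the citations their papers later receive. The coauthorship edges and citation counts are hypothetical; the actual study works on far larger data and uses a machine-learning classifier rather than a direct comparison.

```python
import networkx as nx

# Hypothetical coauthorship data: (author, author) pairs from joint papers,
# plus later citation counts per author. For illustration only.
coauthorships = [("A", "B"), ("A", "C"), ("B", "C"), ("C", "D"), ("D", "E")]
citations = {"A": 120, "B": 95, "C": 150, "D": 30, "E": 10}

G = nx.Graph()
G.add_edges_from(coauthorships)

core = nx.core_number(G)      # k-core index: how deeply embedded an author is
degree = dict(G.degree())     # number of distinct coauthors

for author in sorted(G.nodes()):
    print(f"{author}: degree={degree[author]}, k-core={core[author]}, "
          f"later citations={citations[author]}")
```

In this toy data the most central author (C) also has the highest citation count, which mirrors the paper's finding that network position alone already carries predictive information.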
Journal References:
- Alexander Petersen, Orion Penner. Inequality and cumulative advantage in science careers: a case study of high-impact journals. EPJ Data Science, 2014; 3 (1): 24 DOI: 10.1140/epjds/s13688-014-0024-y
- Christian Schulz, Amin Mazloumian, Alexander M Petersen, Orion Penner, Dirk Helbing. Exploiting citation networks for large-scale author name disambiguation. EPJ Data Science, 2014; 3 (1): 11 DOI: 10.1140/epjds/s13688-014-0011-3
- Emre Sarigöl, Rene Pfitzner, Ingo Scholtes, Antonios Garas, Frank Schweitzer. Predicting Scientific Success Based on Coauthorship Networks. Submitted to arXiv, 2014