Of the various areas of impact of Open Science (academic, societal, and economic), academic impact is probably the most well-established and most researched. In particular, indicators of [citation impact](citation_impact.qmd) and [collaborations](collaboration_intensity.qmd) have a long history of development and are well established. At the same time, more recently developed indicators, such as the [reuse of data](reuse_of_data_in_research.qmd) and [code](reuse_of_code_in_research.qmd) in research, are becoming relevant in this area and are still actively evolving. The rise of reference managers has made it possible to track [readership](academic_readership.qmd) in addition to citations, and thus to distinguish papers that are read widely from papers that are cited widely. At the same time, aspects of academic impact such as "quality", ["novelty" or "interdisciplinarity"](interdisciplinarity.qmd) remain difficult to build robust indicators for from existing data sources, and therefore require more in-depth manual assessment.
sections/5_reproducibility/introduction_reproducibility.qmd
city: Athena
country: Greece
title: Introduction to Reproducibility
repo-actions: true
---
Large-scale computation and the rise of data-driven methodologies have transformed the way scientific research is conducted in many disciplines. Open Science, with its overarching goals of sharing research outcomes (resources, methods, or tools) as well as the actual research processes, has become a key enabler of scientific discovery and faster knowledge spillover, contributing to or leading these major shifts in science.
Against the backdrop of these changes, reproducibility and replicability have raised critical concerns about the development and evolution of science and the way we generate reliable knowledge. Open Science can streamline the processes needed to address reproducibility challenges and accelerate the uptake of good practices in research integrity.
In PathOS, reproducibility refers strictly to computational reproducibility and computational non-reproducibility. Concretely, we define reproducibility as a continuous, “ongoing” process, ranging from
- systematic efforts to computationally regenerate/reproduce a previous study,
- studies that re-use, build upon, or expand (part of) the research outputs of a previous study,
- studies that verify or confirm the results of a previous study by collecting and analysing new data,
- studies that provide evidence supporting or refuting scientific claims and inferences from a previous study, to
- studies conducting meta-analyses and research synthesis by consolidating, evaluating, interpreting, and contextualizing findings from previous studies on a particular topic.
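The first point on this continuum, computationally regenerating a previous study, can be pictured as a simple check: re-run the original pipeline on the same inputs and compare the outputs. A minimal sketch in Python follows; the `analysis` function, the fixed seed, and the hashing scheme are hypothetical illustrations, not part of PathOS or its indicators.

```python
# Illustrative sketch of a computational reproducibility check.
# The "analysis" below stands in for any pipeline whose randomness
# is controlled by an explicit seed (an assumption for this example).
import hashlib
import json
import random


def analysis(seed: int) -> dict:
    # Hypothetical computational study: draw a random sample and
    # summarize it. With the same seed, the result is deterministic.
    rng = random.Random(seed)
    sample = [rng.gauss(0, 1) for _ in range(1000)]
    return {"mean": round(sum(sample) / len(sample), 6)}


def fingerprint(result: dict) -> str:
    # Canonical JSON serialization, so identical results always
    # produce identical SHA-256 fingerprints.
    payload = json.dumps(result, sort_keys=True).encode()
    return hashlib.sha256(payload).hexdigest()


# Reproduction attempt: re-run the study and compare fingerprints.
original = fingerprint(analysis(seed=42))
rerun = fingerprint(analysis(seed=42))
assert original == rerun  # the study is computationally reproducible
```

In practice such checks are rarely this clean: floating-point nondeterminism, library versions, and hardware differences often require comparing results within a tolerance rather than by exact hash, which is one reason reproducibility is treated here as a continuum rather than a binary property.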
In the following sections, we delve into several aspects at the intersection of OS and reproducibility, providing relevant indicators (summarized in the table below) while keeping a pragmatic approach to what is feasible in measuring, monitoring, and evaluating reproducibility.