A major principle of the scientific method is replication: the ability of an experiment or study to be repeated, typically by independent researchers collecting new data. A closely related idea is “reproducible research”: the ability to reproduce the results of a study given its raw data and analysis protocol (usually in the form of code, scripts, or statistical syntax). With full transparency, the reader can follow and evaluate every analytical decision that led to the study’s conclusions.
The concept of reproducible research has become even more important with the introduction of AI. The availability of tools like ChatGPT that generate data analyses and code, the growth in the number of published articles, the increasing complexity of data analysis, and the alarming recent rise in retraction rates have all sharpened the focus on reproducibility and transparency.
This workshop covers ethical issues in data analysis, including concepts such as HARKing (hypothesizing after the results are known), P-hacking, Fishing Expeditions, and the File Drawer Effect; the short simulation below illustrates P-hacking. It also covers possible solutions to these issues, including strategies that a research team can adopt locally, as well as publishing practices that promote reproducible research across entire fields.
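To make the P-hacking problem concrete, here is a minimal simulation sketch in Python (using NumPy and SciPy; the sample size and number of outcomes are illustrative assumptions, not figures from the workshop). It models a study with no true effects that tests many outcomes and reports the study whenever any single test comes out “significant”:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

n_studies = 5_000   # simulated studies, each with NO true effect anywhere
n_outcomes = 20     # outcomes "fished" per study
n = 30              # participants per group (illustrative assumption)
alpha = 0.05

hits = 0
for _ in range(n_studies):
    # Both groups are drawn from the same distribution, so every
    # "significant" result below is a false positive by construction.
    a = rng.normal(size=(n_outcomes, n))
    b = rng.normal(size=(n_outcomes, n))
    _, p = stats.ttest_ind(a, b, axis=1)  # one t-test per outcome
    if (p < alpha).any():  # report the study if ANY test "works"
        hits += 1

print(f"Per-test false-positive rate: {alpha:.0%}")
print(f"Studies with at least one 'significant' outcome: {hits / n_studies:.0%}")
# Expected: roughly 64%, matching 1 - 0.95**20
```

With twenty independent null tests at α = 0.05, the chance of at least one false positive is 1 − 0.95²⁰ ≈ 64%, which is why selectively reporting “the tests that worked” inflates the published false-positive rate far above the nominal 5%.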