Date
2021-02-03
Article
Null and Void? Errors in Meta-analysis on Perceptual Disfluency and Recommendations to Improve Meta-analytical Reproducibility
Abstract
In the 2018 meta-analysis published in Educational Psychology Review entitled “Null effects of perceptual disfluency on learning outcomes in a text-based educational context” by Xie, Zhou, and Liu, we identify errors and inconsistencies in both the methodological approach and the reported results regarding coding and effect sizes. While from a technical point of view the meta-analysis aligns with current meta-analytical guidelines (e.g., PRISMA) and conforms to general meta-analytical requirements (e.g., considering publication bias), it exemplifies certain insufficient practices in the creation and review of meta-analyses. We criticize the lack of transparency and the neglect of open-science practices in the generation and reporting of results, which complicate evaluation of meta-analytical reproducibility, especially given the flexibility in subjective choices regarding the analytical approach and in creating the database. Here we present a framework, applicable to pre- and post-publication review, for improving the Methods Reproducibility of meta-analyses. Based on considerations of the Transparency and Openness Promotion (TOP) guidelines (Nosek et al. Science 348: 1422–1425, 2015), the Reproducibility Enhancement Principles (REP; Stodden et al. Science 354: 1240–1241, 2016), and recommendations by Lakens et al. (BMC Psychology 4: Article 24, 2016), we outline Computational Reproducibility (Level 1), Computational Verification (Level 2), Analysis Reproducibility (Level 3), and Outcome Reproducibility (Level 4). Applying reproducibility checks to TRANSFER performance as the chosen outcome variable, we found Xie and colleagues’ results to be (rather) robust. Yet, regarding RECALL performance and the moderator analysis, the identified problems raise doubts about the credibility of the reported results.
Citation
In: Educational Psychology Review, Volume 33, Issue 3 (2021-02-03), pp. 1221-1247; eISSN 1573-336X
Funding note
Funded as part of Project DEAL
Cite
@article{doi:10.17170/kobra-202109144763,
  author={Weißgerber, Sophia Christin and Brunmair, Matthias and Rummer, Ralf},
  title={Null and Void? Errors in Meta-analysis on Perceptual Disfluency and Recommendations to Improve Meta-analytical Reproducibility},
  journal={Educational Psychology Review},
  volume={33},
  number={3},
  pages={1221--1247},
  year={2021},
  doi={10.1007/s10648-020-09579-1}
}
Keywords: disfluency effect; transparency; meta-analytical standards; open science; reproducibility
Article DOI: 10.1007/s10648-020-09579-1 (published version, open access)
License: Creative Commons Attribution 4.0 International (CC BY 4.0)
Handle: http://hdl.handle.net/123456789/13272
The following license terms apply to this resource: Creative Commons Attribution 4.0 International (CC BY 4.0), http://creativecommons.org/licenses/by/4.0/