Show simple item record

dc.date.accessioned: 2021-09-29T13:43:09Z
dc.date.available: 2021-09-29T13:43:09Z
dc.date.issued: 2021-02-03
dc.identifier: doi:10.17170/kobra-202109144763
dc.identifier.uri: http://hdl.handle.net/123456789/13272
dc.description.sponsorship: Funded as part of the DEAL project [ger]
dc.language.iso: eng
dc.rights: Attribution 4.0 International
dc.rights.uri: http://creativecommons.org/licenses/by/4.0/
dc.subject: disfluency effect [eng]
dc.subject: transparency [eng]
dc.subject: meta-analytical standards [eng]
dc.subject: open-science [eng]
dc.subject: reproducibility [eng]
dc.subject.ddc: 150
dc.title: Null and Void? Errors in Meta-analysis on Perceptual Disfluency and Recommendations to Improve Meta-analytical Reproducibility [eng]
dc.type: Article
dcterms.abstract: In the 2018 meta-analysis in Educational Psychology Review entitled “Null effects of perceptual disfluency on learning outcomes in a text-based educational context” by Xie, Zhou, and Liu, we identify some errors and inconsistencies in both the methodological approach and the reported results regarding coding and effect sizes. While from a technical point of view the meta-analysis aligns with current meta-analytical guidelines (e.g., PRISMA) and conforms to general meta-analytical requirements (e.g., considering publication bias), it exemplifies certain insufficient practices in the creation and review of meta-analyses. We criticize the lack of transparency and the neglect of open-science practices in the generation and reporting of results, which complicate evaluation of meta-analytical reproducibility, especially given the flexibility of subjective choices in the analytical approach and in creating the database. Here we present a framework, applicable to pre- and post-publication review, for improving the Methods Reproducibility of meta-analyses. Based on considerations of the transparency and openness (TOP) guidelines (Nosek et al., Science 348:1422–1425, 2015), the Reproducibility Enhancement Principles (REP; Stodden et al., Science 354:1240–1241, 2016), and recommendations by Lakens et al. (BMC Psychology 4: Article 24, 2016), we outline Computational Reproducibility (Level 1), Computational Verification (Level 2), Analysis Reproducibility (Level 3), and Outcome Reproducibility (Level 4). Applying reproducibility checks to TRANSFER performance as the chosen outcome variable, we found Xie and colleagues’ results to be (rather) robust. Yet, regarding RECALL performance and the moderator analysis, the identified problems raise doubts about the credibility of the reported results. [eng]
dcterms.accessRights: open access
dcterms.creator: Weißgerber, Sophia Christin
dcterms.creator: Brunmair, Matthias
dcterms.creator: Rummer, Ralf
dc.relation.doi: doi:10.1007/s10648-020-09579-1
dc.subject.swd: Transparenz [ger]
dc.subject.swd: Metaanalyse [ger]
dc.subject.swd: Open Science [ger]
dc.subject.swd: Reproduzierbarkeit [ger]
dc.type.version: publishedVersion
dcterms.source.identifier: eissn:1573-336X
dcterms.source.issue: Issue 3
dcterms.source.journal: Educational Psychology Review [eng]
dcterms.source.pageinfo: 1221-1247
dcterms.source.volume: Volume 33
kup.iskup: false

