Examining the replicability of online experiments selected by a decision market

Nature Human Behaviour (IF 21.4), Pub Date: 2024-11-19, DOI: 10.1038/s41562-024-02062-9

Felix Holzmeister, Magnus Johannesson, Colin F. Camerer, Yiling Chen, Teck-Hua Ho, Suzanne Hoogeveen, Juergen Huber, Noriko Imai, Taisuke Imai, Lawrence Jin, Michael Kirchler, Alexander Ly, Benjamin Mandl, Dylan Manfredi, Gideon Nave, Brian A. Nosek, Thomas Pfeiffer, Alexandra Sarafoglou, Rene Schwaiger, Eric-Jan Wagenmakers, Viking Waldén, Anna Dreber
Here we test the feasibility of using decision markets to select studies for replication and provide evidence about the replicability of online experiments. Social scientists (n = 162) traded on the outcome of close replications of 41 systematically selected MTurk social science experiments published in PNAS 2015–2018, knowing that the 12 studies with the lowest and the 12 with the highest final market prices would be selected for replication, along with 2 randomly selected studies. The replication rate, based on the statistical significance indicator, was 83% for the top-12 and 33% for the bottom-12 group. Overall, 54% of the studies were successfully replicated, with replication effect size estimates averaging 45% of the original effect size estimates. The replication rate varied between 54% and 62% for alternative replication indicators. The observed replicability of MTurk experiments is comparable to that of previous systematic replication projects involving laboratory experiments.
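The selection rule described in the abstract is concrete enough to sketch. The snippet below is an illustrative reconstruction, not the authors' code: it assumes hypothetical inputs (a mapping from study identifiers to final market prices and a boolean replication indicator per study) and shows how the bottom-12, top-12 and 2 randomly drawn studies would be picked, and how a group-wise replication rate would then be computed.

```python
import random

def select_for_replication(final_prices, n_extreme=12, n_random=2, seed=0):
    """Pick studies for replication from final decision-market prices.

    final_prices: dict mapping study_id -> final market price (hypothetical data).
    Returns the 12 lowest-priced, the 12 highest-priced, and 2 randomly drawn
    remaining studies, mirroring the selection rule described in the abstract.
    """
    ranked = sorted(final_prices, key=final_prices.get)   # lowest price first
    bottom = ranked[:n_extreme]                 # 12 studies with lowest prices
    top = ranked[-n_extreme:]                   # 12 studies with highest prices
    remaining = ranked[n_extreme:-n_extreme]    # studies not already selected
    rng = random.Random(seed)
    randomly_drawn = rng.sample(remaining, n_random)
    return bottom, top, randomly_drawn

def replication_rate(outcomes, studies):
    """Share of the selected studies whose replication succeeded on the chosen
    indicator (outcomes: study_id -> bool, hypothetical)."""
    return sum(outcomes[s] for s in studies) / len(studies)
```

With 41 candidate studies this rule yields 26 replications in total (12 + 12 + 2); the 83% and 33% figures reported above correspond to the replication rate computed separately for the top-12 and bottom-12 groups under the statistical significance indicator.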