
Amanda Kvarven
Postdoctoral Scholar, SCRDP / Heart Disease Prevention
Bio
I am a postdoctoral fellow at the Meta-Research Innovation Center at Stanford (METRICS), focusing on the validity of meta-analysis and other research methodologies, particularly within the social sciences.
I completed a PhD in Economics at the University of Bergen in 2022; my dissertation focused on bias in meta-analysis. Beyond meta-analysis, my work tests and advances methods and practices that improve the generalizability, robustness, and reproducibility of scientific research.
Stanford Advisors
- John Ioannidis, Postdoctoral Faculty Sponsor
- John Ioannidis, Postdoctoral Research Mentor
2023-24 Courses
- Meta-research: Appraising Research Findings, Bias, and Meta-analysis
CHPR 206, EPI 206, MED 206, STATS 211 (Win)
Prior Year Courses
2022-23 Courses
- Meta-research: Appraising Research Findings, Bias, and Meta-analysis
CHPR 206, EPI 206, MED 206, STATS 211 (Win)
All Publications
- The intuitive cooperation hypothesis revisited: a meta-analytic examination of effect size and between-study heterogeneity
JOURNAL OF THE ECONOMIC SCIENCE ASSOCIATION (JESA)
2020; 6 (1): 26-42
DOI: 10.1007/s40881-020-00084-3
Web of Science ID: 000516954200001
- Comparing meta-analyses and preregistered multiple-laboratory replication projects
NATURE HUMAN BEHAVIOUR
2020; 4 (4): 423+
Abstract
Many researchers rely on meta-analysis to summarize research evidence. However, there is a concern that publication bias and selective reporting may lead to biased meta-analytic effect sizes. We compare the results of meta-analyses to large-scale preregistered replications in psychology carried out at multiple laboratories. The multiple-laboratory replications provide precisely estimated effect sizes that do not suffer from publication bias or selective reporting. We searched the literature and identified 15 meta-analyses on the same topics as multiple-laboratory replications. We find that meta-analytic effect sizes are significantly different from replication effect sizes for 12 out of the 15 meta-replication pairs. These differences are systematic and, on average, meta-analytic effect sizes are almost three times as large as replication effect sizes. We also implement three methods of correcting meta-analysis for bias, but these methods do not substantively improve the meta-analytic results.
DOI: 10.1038/s41562-019-0787-z
Web of Science ID: 000507725500002
PubMed ID: 31873200
PubMed Central ID: 1182327
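
The comparison described in the abstract above can be illustrated with a minimal sketch: pool published study effect sizes with a standard random-effects model (DerSimonian-Laird), then test whether the pooled estimate differs from a single, precisely estimated multi-laboratory replication effect. This is not the paper's code or data; the effect sizes, standard errors, and the z-test shown here are hypothetical and purely illustrative of the general approach.

```python
# Illustrative sketch (hypothetical numbers, not the paper's data):
# compare a random-effects meta-analytic estimate with a
# multi-laboratory replication estimate for the same effect.
import numpy as np

def random_effects_estimate(effects, ses):
    """DerSimonian-Laird random-effects pooled estimate and its SE."""
    effects = np.asarray(effects, dtype=float)
    variances = np.asarray(ses, dtype=float) ** 2
    w = 1.0 / variances                          # fixed-effect weights
    fixed = np.sum(w * effects) / np.sum(w)
    q = np.sum(w * (effects - fixed) ** 2)       # Cochran's Q
    df = len(effects) - 1
    c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
    tau2 = max(0.0, (q - df) / c)                # between-study variance
    w_star = 1.0 / (variances + tau2)            # random-effects weights
    pooled = np.sum(w_star * effects) / np.sum(w_star)
    pooled_se = np.sqrt(1.0 / np.sum(w_star))
    return pooled, pooled_se

# Hypothetical inputs: per-study effect sizes and standard errors from a
# published literature, plus one precisely estimated replication effect.
meta_effects = [0.42, 0.35, 0.55, 0.30, 0.48]
meta_ses = [0.12, 0.10, 0.15, 0.09, 0.11]
replication_effect, replication_se = 0.15, 0.03

pooled, pooled_se = random_effects_estimate(meta_effects, meta_ses)
diff = pooled - replication_effect
diff_se = np.sqrt(pooled_se ** 2 + replication_se ** 2)
z = diff / diff_se                               # z-test for the difference
print(f"meta-analytic estimate: {pooled:.2f} (SE {pooled_se:.2f})")
print(f"replication estimate:   {replication_effect:.2f} (SE {replication_se:.2f})")
print(f"difference z-score:     {z:.2f}")
```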