About Meta Student
Meta Student is a student-centred platform for collaborative, transparent meta-research. Each year, thousands of undergraduate research projects are completed at institutions worldwide — yet the vast majority of this data is never published or shared. Meta Student exists to change that, transforming individual student projects into meaningful contributions to cumulative science.
The undergraduate research waste problem
Undergraduate research generates an enormous volume of data each year, yet the majority of student projects are never disseminated beyond the host institution. Discipline-specific surveys put publication rates for student research in the single digits to low tens of percent: psychology doctoral dissertations are published at roughly 25% (Evans et al., 2018), social-work dissertations at a comparable rate, and undergraduate work typically falls well below those figures (Walkington & Jenkins, 2008; Spronken-Smith et al., 2013). Most undergraduate datasets — collected with real effort, ethical approval, and academic supervision — are effectively lost to the literature. This is research waste on a large scale, and it belongs to a category of waste widely identified as both substantial and avoidable across the wider research enterprise (Chalmers & Glasziou, 2009; Glasziou et al., 2014).
Much of this data has genuine scientific value; individual projects simply lack the sample size to stand alone. That is exactly where Meta Student comes in.
By aggregating well-documented student datasets into pooled meta-analyses, Meta Student converts individually underpowered studies into statistically robust, generalisable findings. Students justify their own study, design their own analysis, contextualise their own findings, and still write up an independent project, all within a shared, standardised data-collection protocol that makes their contribution compatible with other students' work. The science is theirs; the platform makes it count.
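The statistical point can be illustrated with a quick back-of-the-envelope power calculation. This is a minimal sketch, not part of the platform: the effect size, group sizes, and the normal approximation to the non-central t distribution are all illustrative assumptions.

```python
from math import erf, sqrt

def normal_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def approx_power(d, n_per_group, alpha_z=1.959963984540054):
    """Approximate power of a two-sided, two-sample comparison for a true
    effect of Cohen's d with n_per_group participants per arm, using the
    common normal approximation (alpha = 0.05 by default)."""
    ncp = d * sqrt(n_per_group / 2.0)  # non-centrality parameter
    return 1.0 - normal_cdf(alpha_z - ncp)

# One typical student project: d = 0.3, 20 participants per group.
single = approx_power(0.3, 20)

# Ten harmonised projects pooled: 200 participants per group.
pooled = approx_power(0.3, 200)

print(f"single study power: {single:.2f}")  # well under the 0.80 convention
print(f"pooled power:       {pooled:.2f}")
```

Under these assumptions a lone project detects the effect less than one time in five, while the pooled sample clears the conventional 80% power threshold.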
Why we exist
Across many empirical disciplines, published effects are often smaller — and sometimes absent — when tested in larger, more robust replication studies. Landmark cross-laboratory projects in psychology (Open Science Collaboration, 2015; Klein et al., 2018), preclinical cancer biology (Errington et al., 2021), and economics and the social sciences (Camerer et al., 2018), together with systematic analyses of the nutrition literature (Schoenfeld & Ioannidis, 2013), have repeatedly converged on the same conclusion: single small studies are unreliable, and replication effect sizes are typically smaller than the originals. The need for routine replication, larger samples, and more open and collaborative research practices is now broadly accepted across fields.
Undergraduate training environments can play a crucial role in shaping long-term research quality. Questionable research practices (QRPs) are prevalent in academic research broadly — driven less by intent than by systemic incentives that reward publication quantity over rigour (Smaldino & McElreath, 2016; Gopalakrishna et al., 2022). Embedding open-science practices into the undergraduate experience from the start helps establish transparency and reproducibility as the default, not the exception.
Our mission
- Empower students to contribute data to pooled analyses and meta-analyses.
- Teach transparent, reproducible methods through hands-on experience.
- Increase statistical power by combining well-documented student datasets.
- Provide an ethical, credit-bearing route to co-authorship for student contributors.
What we do
We accept student project datasets together with a short methodology description and a de-identified data file. Submissions follow a standard template so that datasets can be harmonised, validated, and included in pooled meta-analyses, regardless of discipline. Where appropriate, student contributors are offered co-authorship on any resulting publications, along with support for reproducible reporting.
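As an illustration of why a shared template matters, the sketch below shows the kind of automated validation a standard submission format makes possible. The column names and rules are hypothetical, not Meta Student's actual schema.

```python
import csv
import io

# Hypothetical required fields -- the real submission template will differ.
REQUIRED_COLUMNS = {"participant_id", "condition", "outcome"}

def validate_submission(csv_text):
    """Return a list of problems found in a de-identified data file.
    An empty list means the file passes these (illustrative) checks."""
    problems = []
    reader = csv.DictReader(io.StringIO(csv_text))
    missing = REQUIRED_COLUMNS - set(reader.fieldnames or [])
    if missing:
        problems.append(f"missing columns: {sorted(missing)}")
        return problems
    for i, row in enumerate(reader, start=2):  # row 1 is the header
        if not row["participant_id"]:
            problems.append(f"row {i}: empty participant_id")
        try:
            float(row["outcome"])
        except ValueError:
            problems.append(f"row {i}: non-numeric outcome {row['outcome']!r}")
    return problems

sample = "participant_id,condition,outcome\nP01,control,4.2\nP02,treatment,oops\n"
print(validate_submission(sample))  # flags the non-numeric outcome in row 3
```

Checks like these are what allow datasets from different institutions and disciplines to be harmonised into one pooled analysis without manual cleaning of every file.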
Core principles
- Openness: metadata and analysis scripts are shared as part of any report.
- Transparency: each dataset includes a short methodology and a codebook for reproducibility.
- Education: contributors learn about meta-synthesis, open research, and reproducible practices.
- Credit: students who contribute meaningful data are recognised, and offered co-authorship where appropriate.
The replication problem
The Sports Science Replication Centre's first multi-lab project ran preregistered replications of 25 studies from quartile-1 sports and exercise science journals published between 2016 and 2021. Roughly 28% of replications met the success criterion, the median replication effect size was 75% smaller than the original, and only 36% of original and replication effect-size estimates were compatible (Murphy et al., 2025a; Murphy, Caldwell, & Warne, 2025b). Similar patterns have been observed in psychology (Open Science Collaboration, 2015), preclinical cancer biology (Errington et al., 2021), and economics and the social sciences (Camerer et al., 2018), and suggested in nutrition (Schoenfeld & Ioannidis, 2013). The lesson is consistent: single small studies are unreliable. Pooled, well-documented datasets help solve this problem.
Large-scale many-laboratory collaborations — such as Many Labs 2, which collected 125 samples from 36 countries (15,305 participants across more than 60 collaborating laboratories) to replicate 28 classic and contemporary findings — have demonstrated that pooling data across sites substantially improves statistical power and the ability to estimate generalisability (Klein et al., 2018). Meta Student applies the same principle to undergraduate research: by combining standardised student datasets we maximise the utility of work that would otherwise go unused, reduce avoidable research waste, and contribute to cumulative knowledge (Munafò et al., 2017; Nosek, Spies, & Motyl, 2012).
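For readers curious how pooling works mechanically, here is a minimal sketch of a random-effects pooled estimate using the standard DerSimonian-Laird method. The study numbers are invented for illustration; this is not the platform's analysis code.

```python
def dersimonian_laird(effects, variances):
    """Random-effects pooled estimate via the DerSimonian-Laird method.
    effects: per-study effect sizes; variances: their sampling variances.
    Returns the pooled estimate and its standard error."""
    k = len(effects)
    w = [1.0 / v for v in variances]                      # fixed-effect weights
    fixed = sum(wi * yi for wi, yi in zip(w, effects)) / sum(w)
    q = sum(wi * (yi - fixed) ** 2 for wi, yi in zip(w, effects))
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (k - 1)) / c)                    # between-study variance
    w_re = [1.0 / (v + tau2) for v in variances]          # random-effects weights
    pooled = sum(wi * yi for wi, yi in zip(w_re, effects)) / sum(w_re)
    se = (1.0 / sum(w_re)) ** 0.5
    return pooled, se

# Three illustrative student studies (made-up effect sizes and variances).
est, se = dersimonian_laird([0.40, 0.10, 0.25], [0.04, 0.05, 0.03])
print(f"pooled d = {est:.2f} +/- {1.96 * se:.2f}")
```

Each study contributes in proportion to its precision, so even small projects sharpen the pooled estimate; the between-study variance term keeps the model honest when sites genuinely differ.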
For supervisors
Supervisors face real challenges: final-year projects must be original, motivating, and achievable within tight timelines. Meta Student offers a ready-made, educationally rich alternative. By directing students to join collaborative pooled analyses, you can:
- Offer realistic project scopes based on existing study templates and clear submission checklists.
- Increase student motivation — projects have immediate impact and the possibility of co-authorship.
- Reduce supervision workload: we provide summary methods and guidelines for students, without compromising their independence.
- Give students exposure to robust reproducible methods and transparent reporting — valuable professional skills.
Supervisors can direct individual students or small groups towards open studies, get involved by proposing and leading new studies for the pool, and use the platform to aggregate independent student projects into a pooled, publishable analysis. This pathway supports student learning and produces higher-quality, higher-impact research outcomes.
Learn more
For the original project and full reports, visit the Sports Science Replication Centre: ssreplicationcentre.com.
Get involved
If you're a student or supervisor interested in contributing data, get started here. We provide templates, codebooks, and step-by-step guides to make submission straightforward.
References
- Camerer, C. F., Dreber, A., Holzmeister, F., et al. (2018). Evaluating the replicability of social science experiments in Nature and Science between 2010 and 2015. Nature Human Behaviour, 2, 637–644. https://doi.org/10.1038/s41562-018-0399-z
- Chalmers, I., & Glasziou, P. (2009). Avoidable waste in the production and reporting of research evidence. The Lancet, 374(9683), 86–89. https://doi.org/10.1016/S0140-6736(09)60329-9
- Errington, T. M., Mathur, M., Soderberg, C. K., et al. (2021). Investigating the replicability of preclinical cancer biology. eLife, 10, e71601. https://doi.org/10.7554/eLife.71601
- Evans, S. C., Amaro, C. M., Herbert, R., Blossom, J. B., & Roberts, M. C. (2018). “Are you gonna publish that?” Peer-reviewed publication outcomes of doctoral dissertations in psychology. PLOS ONE, 13(2), e0192219. https://doi.org/10.1371/journal.pone.0192219
- Glasziou, P., Altman, D. G., Bossuyt, P., et al. (2014). Reducing waste from incomplete or unusable reports of biomedical research. The Lancet, 383(9913), 267–276. https://doi.org/10.1016/S0140-6736(13)62228-X
- Gopalakrishna, G., ter Riet, G., Vink, G., et al. (2022). Prevalence of questionable research practices, research misconduct and their potential explanatory factors. PLOS ONE, 17(2), e0263023. https://doi.org/10.1371/journal.pone.0263023
- Klein, R. A., Vianello, M., Hasselman, F., et al. (2018). Many Labs 2: Investigating variation in replicability across samples and settings. Advances in Methods and Practices in Psychological Science, 1(4), 443–490. https://doi.org/10.1177/2515245918810225
- Munafò, M. R., Nosek, B. A., Bishop, D. V. M., et al. (2017). A manifesto for reproducible science. Nature Human Behaviour, 1, 0021. https://doi.org/10.1038/s41562-016-0021
- Murphy, J., Caldwell, A., Warne, J., et al. (2025a). Estimating the replicability of sports and exercise science research. Sports Medicine. https://doi.org/10.1007/s40279-025-02201-w
- Murphy, J., Caldwell, A., & Warne, J. (2025b). Reflections on conducting a large replication project in sports and exercise science. Sports Medicine. https://doi.org/10.1007/s40279-025-02200-x
- Nosek, B. A., Spies, J. R., & Motyl, M. (2012). Scientific utopia: II. Restructuring incentives and practices to promote truth over publishability. Perspectives on Psychological Science, 7(6), 615–631. https://doi.org/10.1177/1745691612459058
- Open Science Collaboration. (2015). Estimating the reproducibility of psychological science. Science, 349(6251), aac4716. https://doi.org/10.1126/science.aac4716
- Schoenfeld, J. D., & Ioannidis, J. P. A. (2013). Is everything we eat associated with cancer? A systematic cookbook review. American Journal of Clinical Nutrition, 97(1), 127–134. https://doi.org/10.3945/ajcn.112.047142
- Smaldino, P. E., & McElreath, R. (2016). The natural selection of bad science. Royal Society Open Science, 3(9), 160384. https://doi.org/10.1098/rsos.160384
- Spronken-Smith, R., Brodeur, J., Kajaks, T., et al. (2013). Completing the research cycle: A framework for promoting dissemination of undergraduate research and inquiry. Teaching and Learning Inquiry, 1(2), 105–118. https://doi.org/10.20343/teachlearninqu.1.2.105
- Walkington, H., & Jenkins, A. (2008). Embedding undergraduate research publication in the student learning experience: Ten suggested strategies. Brookes eJournal of Learning and Teaching, 2(3).
Last updated: May 2026