Replication Prize
Shape the future of rigorous and replicable science.
The Replication Prize will collect ideas on important research to replicate, and creative and successful strategies to integrate replication into research practice.
Closed on 12/19/25 08:00 PM EST
Total cash prizes: $850,000
Overview
Subject of the Challenge
The Replication Prize at the National Institutes of Health (NIH) seeks to collect ideas and strategies to make important areas of biomedical research more replicable. Submissions can address one or both of two tracks: (1) Replication Ideas: Submit ideas on research questions that can benefit most from replication; (2) Replication Exemplars: Submit a report on a creative and successful way the team has integrated replication into their standard practice. Entrants are permitted to submit one entry per track.
Replication in Biomedical Research
The rigor of biomedical research is critical to the advancement of knowledge. The use of rigorous design and research principles, coupled with transparency in reporting methods and outcomes, enables important areas of biomedical research to be repeated by other researchers (see https://grants.nih.gov/policy-and-compliance/policy-topics/reproducibility/guidance). Replication studies are a core part of the scientific process and critical in assessing the external validity and generalizability, as well as the internal validity, of novel research outcomes, particularly those that form the basis of evidence-based practices to improve public health.
Although different disciplines and institutions define replication in varying ways, for this challenge we will use the following definition from the National Academies of Sciences, Engineering, and Medicine (NASEM):
- Replicability is obtaining consistent results across studies aimed at answering the same scientific question, each of which has obtained its own data.
Replication research has multiple functions. It can confirm or disprove earlier results, or it can expand the scope of the original research study. Replication can reduce sampling error, control for artifacts related to contextual variables that affect the experiment, unearth research misconduct, enable external validation and generalization of research results to a different population or context, and/or verify the hypothesis of the original study using a different method.
For replicability, we will further distinguish between types of replication, using the terms direct replication and conceptual replication.
- Direct replication is the repetition of an experimental procedure. Researchers conducting direct replication may modify aspects of the procedure depending on the purpose of the replication study. Changes should be made deliberately, with most features kept as consistent as possible. Changes to an experimental design may be made to:
- Reduce Sampling Error/Type I Error. To control for sampling error, the replication experiment is performed on a different sample of research participants drawn from the same population and selected using the same procedures as in the original experiment.
- Control for Artifacts. To control for artifacts in the laboratory or experimental context (i.e., threats to internal validity), researchers could change aspects related to the context (e.g., research participant characteristics, cultural and historical context in which the study takes place, physical setting of the research, personnel conducting the research) and/or the procedures used to develop the dependent variable.
- Identify Poorly Described Methods. To control for poorly described methodology, researchers repeat the study procedures as documented, changing only the personnel involved.
- Generalize Results. To generalize to a larger population or different population, researchers make changes to how the research participants are selected and allocated.
- Conceptual replication is verifying the underlying hypothesis of the earlier experiment using a different method and/or experimental setup. Changes may occur in multiple aspects of the research.
Successful replication does not necessarily confirm original results or theoretical findings, but it can provide additional support for the original outcome. Likewise, a failed replication does not confirm that a finding is false; instead, it can identify a need for additional research to establish the reliability of the finding and/or method.
Performing replication research advances scientific knowledge. It increases confidence in research findings, experimental methods and theories. It may also eliminate alternate explanations and identify confounding influences and artifacts that affect results and/or their generalizability.
The Importance of Replication
There is evidence that many peer-reviewed research studies are not replicable. Large-scale replicability studies have revealed many challenges in performing replication studies, including incomplete documentation, lack of transparency in results and protocols, failure to share data, reagents, and other materials, and methodological challenges.
Current research culture favors unique and positive results and actively disincentivizes open research practices. Research teams with innovative results are frequently rewarded with publications, grants, and employment. Not only are null results published less frequently, but journals also publish replication studies less often than original research, rendering this work relatively inaccessible.
An imbalanced emphasis on innovation also can deprioritize efforts to replicate and verify research results. Addressing these challenges requires a culture shift where transparency and rigor are rewarded by normalizing replication efforts across institutions and disciplines.
Opportunity to Improve Health
Unreliable scientific findings can derail the cultivation of collective knowledge built upon previous work and impede progress to enhance health, lengthen life, and reduce illness and disability. When researchers engage in and support replication activities within their teams or institutions, they help normalize a culture change within the scientific enterprise where replication becomes a 'business as usual' activity. Ensuring that studies, methods, and techniques are valid helps ensure that the scientific knowledge produced is both transparent and rigorous. Improving rigor and transparency in science allows us to understand the strengths and limitations of original results and how to extend or modify existing scientific theory, setting the stage for scientific discovery and technological advances that can be applied to improving the health and well-being of the population.
Prize Purpose
The goal of the Replication Prize is to solicit ideas from the public to identify high impact areas for future replication efforts and to reward progress in making important areas of biomedical research more replicable. Ultimately, the goal of this prize is to boost current replication efforts and encourage a culture change in biomedical research where replication activities are normalized as part of conducting research.
Each submission will address at least one of the two tracks included in this competition:
- Track One: Replication Ideas engages the research community to assist in identifying the most significant lines of research for future replication programming. Submissions will identify high impact areas of research that are in need of replication studies.
- Track Two: Replication Exemplars recognizes and rewards pioneering entrants that have creatively integrated replication into their work with measured success. Submissions will highlight the strategies used to integrate replication into ongoing research activities. The winning strategies will be compiled as a publicly available reference.
The use of generative artificial intelligence (AI) is permissible; however, participants are required to cite the tool and identify how it has been used in the submission.
Challenge Glossary
In this Challenge, we use the following definitions to delineate these categories:
- Basic research: A systematic study directed toward greater knowledge or understanding of the fundamental aspects of observable phenomena without specific applications towards processes or products in mind.
- Translational research: Research that bridges the gap between discoveries from basic research and health-related applications.
- Pre-Clinical research: Research studying and testing health-related interventions in non-human subjects.
- Clinical research: Research studying and testing health-related interventions in human research participants.
- Social and Behavioral research: Research to study the impact of individual behavior and social interactions on the health-related outcomes of individuals and groups.
Prizes
Amount of the Prize: The total cash prize purse for this challenge is $850,000.
The Common Fund may award up to $5,000 per winner of Track 1: Replication Ideas, with up to 4 winners in each of the five scientific categories, for a total of up to 20 winners. For Track 2: Replication Exemplars, the Common Fund may award up to $50,000 per winner, with a total of up to 15 winners.
The five scientific categories are: 1) Basic, 2) Translational, 3) Pre-clinical, 4) Clinical, 5) Social and Behavioral (see definitions in Challenge Glossary).
All submissions selected to win a prize will be featured on the Office of Strategic Coordination - Common Fund website. Submissions selected to win a prize may be invited to attend a workshop related to research replication where the winners of this Challenge are announced. Submissions selected as winners of Track 2 will be compiled as a publicly available reference.
Rules
Eligibility requirements
To be eligible to win a prize under this Challenge, a Participant (whether an individual, group of individuals, or entity) —
- Shall have registered to participate in the Challenge under the rules promulgated by the National Institutes of Health (NIH) as published in this announcement;
- Shall have complied with all the requirements set forth in this announcement;
- In the case of a private entity, shall be incorporated in and maintain a primary place of business in the United States, and in the case of an individual, whether participating singly or in a group, shall be a citizen or permanent resident of the United States. However, non-U.S. citizens and non-permanent residents can participate as a member of a team that otherwise satisfies the eligibility criteria. Non-U.S. citizens and non-permanent residents are not eligible to win a monetary prize (in whole or in part). Their participation as part of a winning team, if applicable, may be recognized when the results are announced.
- Shall not be a federal entity or federal employee acting within the scope of their employment;
- Shall not be an employee of the Department of Health and Human Services (HHS), or any component of HHS, acting in their personal capacity;
- Who is employed by a federal agency or entity other than HHS (or any component of HHS), should consult with an agency ethics official to determine whether the federal ethics rules will limit or prohibit the acceptance of a prize under this Challenge;
- Shall not be a judge of the Challenge, or any other party involved with the design, production, execution, or distribution of the Challenge or the immediate family of such a party (i.e., spouse, parent, step-parent, child, or step-child);
- Shall be 18 years of age or older at the time of submission.
Participation Rules
- Participants (whether individuals, groups of individuals, or entities) may not use federal funds from a grant award or cooperative agreement to develop their Challenge submissions or to fund efforts in support of their Challenge submissions unless use of such funds is consistent with the purpose, terms, and conditions of the grant award or cooperative agreement. Participants intending to use federal grant or cooperative agreement funds must register for and participate in the challenge on behalf of the awardee institution, organization, or entity. If a Participant uses federal grant, cooperative agreement, or Other Transaction (OT) funds and wins the Challenge, the prize must be treated as program income for purposes of the original grant or cooperative agreement in accordance with applicable Uniform Administrative Requirements, Cost Principles, and Audit Requirements for Federal Awards (2 CFR § 200).
- Federal contractors may not use federal funds from a contract to develop their Challenge submissions or to fund efforts in support of their Challenge submissions.
- By participating in this Challenge, each Participant (whether an individual, group of individuals, or entity) agrees to assume any and all risks and waive claims against the federal government and its related entities, except in the case of willful misconduct, for any injury, death, damage, or loss of property, revenue, or profits, whether direct, indirect, or consequential, arising from participation in this Challenge, whether the injury, death, damage, or loss arises through negligence or otherwise.
- Based on the subject matter of the Challenge, the type of work that it will possibly require, as well as an analysis of the likelihood of any claims for death, bodily injury, property damage, or loss potentially resulting from Challenge participation, no Participant (whether an individual, group of individuals, or entity) participating in the Challenge is required to obtain liability insurance, or demonstrate financial responsibility, or agree to indemnify the federal government against third party claims for damages arising from or related to Challenge activities in order to participate in this Challenge.
- A Participant (whether an individual, group of individuals, or entity) shall not be deemed ineligible because the Participant used federal facilities or consulted with federal employees during the Challenge if the facilities and employees are made available to all Participants participating in the Challenge on an equitable basis.
- By participating in this Challenge, each Participant (whether an individual, group of individuals, or entity) warrants that they are the sole author or owner of, or have the right to use, any copyrightable works that the submission comprises, that the works are wholly original to the Participant (or are an improved version of an existing work that the Participant has sufficient rights to use and improve), and that the submission does not infringe any copyright or any other rights of any third party of which the Participant is aware.
- By participating in this Challenge, each Participant (whether an individual, group of individuals, or entity) grants to the NIH an irrevocable, paid-up, royalty-free nonexclusive worldwide license to reproduce, publish, post, link to, share, and display publicly the submission on the web or elsewhere, and a nonexclusive, nontransferable, irrevocable, paid-up license to practice, or have practiced for or on its behalf, the solution throughout the world. Each Participant will retain all other intellectual property rights in their submissions, as applicable. To participate in the Challenge, each Participant must warrant that there are no legal obstacles to providing the above-referenced nonexclusive licenses of the Participant’s rights to the federal government. To receive an award, Participants will not be required to transfer their intellectual property rights to NIH, but Participants must grant to the federal government the nonexclusive licenses recited herein.
- Each Participant (whether an individual, group of individuals, or entity) agrees to follow all applicable federal, state, and local laws, regulations, and policies.
- Each Participant (whether an individual, group of individuals, or entity) participating in this Challenge must comply with all terms and conditions of these rules, and participation in this Challenge constitutes each such Participant’s full and unconditional agreement to abide by these rules. Winning is contingent upon fulfilling all requirements herein.
- As a condition for winning a cash prize in this Challenge, each Participant (whether an individual, group of individuals, or entity) that has been selected as a winner must complete and submit all requested winner verification and payment documents to NIH within 14 business days of formal notification. Failure to return all required verification documents by the date specified in the notification may be a basis for disqualification of a cash prize winning submission.
Judging
Basis Upon Which a Winner Will be Selected.
Submissions that are responsive and comply with the entry requirements will be reviewed by a multidisciplinary panel of judges composed of federal employees from across NIH and potentially other federal government agencies. Depending on the volume of submissions received, an expert panel may conduct an initial review of eligible submissions and determine a subset for judge evaluation. The judges will consider the following criteria and make recommendations to the award-approving official based upon their assessments of the criteria. Track 1 is expected to award 20 winners, and Track 2 is expected to award 15 winners.
Submissions will be evaluated using the following three (3) criteria for Track 1: Replication Ideas:
- Understanding of the Research Question (0-10)
- How well does the submission describe the research question? How reasonable/sound is the rationale?
- How well is the explanation supported by the relevant data or evidence (with confidentiality ensured for any unpublished data)?
- Potential Impact (0-10)
- To what extent does answering this question significantly impact public health?
- How will answering the research question impact more than one field of research?
- Clarity (0-5)
- How clear and well-articulated is the written submission?
- How well is the submission organized?
Submissions will be evaluated using the following four (4) criteria for Track 2: Replication Exemplars:
- Significance (0-10)
- To what extent does the demonstrated strategy address an important gap in biomedical research replication?
- How appropriate is the method for the gap being addressed?
- Innovation (0-10)
- How well does the strategy leverage innovative methods, training approaches, or participants?
- How innovative is the strategy to overcome challenges/barriers to performing replication activities?
- Potential Impact (0-15)
- To what extent have these activities effectively increased replication output within the organization/research team?
- How likely is this strategy to be disseminated beyond the organization/team in which it has been implemented? Are there currently plans for broader dissemination?
- To what extent have replication activities been adopted by researchers outside of the research team?
- How has the replication work increased confidence in and reception of the team's research activities?
- How well do the metrics evaluate replication strategy outcomes?
- Feasibility (0-10)
- How readily can these activities be scaled up, repeated by a different team, or carried out in a different context?
- How likely are the activities described to be sustained?
How to enter
Submission Requirements
Entrants to Track One: Replication Ideas
The objective of Track One is to engage the public in collecting ideas on significant areas of research that should be the focus of future replication studies and programming.
Submission materials
Entrants must:
- Pose a scientific research question with the potential to significantly impact public health that could strongly benefit from replication. Topics the research question could address include, but are not limited to: 1) areas of ongoing debate with evidence of disagreement, 2) high-profile research questions, 3) unclear methodology, 4) a need for generalizability and translatability.
Entrants should identify which one (or more) of the following scientific categories the research question applies to: 1) Basic, 2) Translational, 3) Pre-clinical, 4) Clinical, 5) Social and Behavioral (see definitions in Challenge Glossary).
This section should provide a brief description of the research question and the rationale for asking this particular question. Describe the scientific, public health, and economic impact of having a reliable answer to this question. This section will be no more than 1000 words.
Optional: Identify which replication approaches (e.g., reduce sampling error, mitigate research misconduct, control for artifacts, generalize results) could be considered to address the research question (see description in the Replication in Biomedical Research section in the Overview).
- Identify peer-reviewed publications that address the research question and/or provide support for choosing that question.
Entrants must submit a bibliography listing primary research papers that address the research question. These articles cannot come from a single journal. Please submit no more than two papers from the same research team.
Articles should include, but are not limited to, publications from the last five years. Articles must be available in English. Reviews and commentaries are welcome as supplementary information. Grey literature, such as blog posts, social media posts, and editorials that justify the research question, is welcome but should not be submitted as the primary supporting documentation in lieu of peer-reviewed research publications.
The bibliography will be 1 page. Please provide the reference list in NLM format and include the Digital Object Identifier (DOI) where applicable. These articles should be available in PubMed, or otherwise publicly available. If unavailable through those avenues, a PDF attachment is permissible if in alignment with the publication's policy.
The use of generative AI is permissible; however, participants must cite the tool and identify how it has been used in this submission (e.g., ideation, writing). Follow the NLM format guide regarding Software on the Internet.
Submissions should clearly mark which content is confidential or proprietary, if applicable.
Entrants to Track Two: Replication Exemplars
The objective of Track Two is to recognize and reward pioneering entrants who have creatively integrated replication into their research practice and/or their institution with measured success. The winning strategies will be compiled as a publicly available reference.
To be eligible, the replication activities must already be complete. If the activities are ongoing, the entrant must be able to show post-implementation impact.
Submission materials
Entrants must submit a Replication Strategy Report which will include:
- Replication Strategy. Describe the replication activities that were successfully integrated into research practice. Include the methods the team used for evaluating outcomes. Highlight the innovative methods or activities included in the strategy. Given the variability of defining replication, the entrant’s working definition of replication when the strategy was implemented should be included. This section must identify which one (or more) of the following categories these activities fall under: 1) Basic, 2) Translational, 3) Pre-clinical, 4) Clinical, 5) Social and Behavioral (see definitions in Challenge Glossary).
- Environment. Describe the context in which the activities took place: where the activities occurred; the dates and duration; and the expertise, role, and responsibilities of each team member for each activity. Describe how federal, organizational, or institutional funds supported this effort, and when they were granted.
- Impact. Describe the impact of the replication activities, including comparisons of pre- and post-replication metrics. If applicable, describe the long-term impact. Describe how these activities improved research rigor and the value of and trust in research activities. Describe how these activities have been disseminated to and adopted by others, if applicable.
- Response to Barriers. Describe any barriers encountered, and how they were overcome. Barriers and resolutions described can be at any level of organization (e.g. within the team, at location of implementation, in the community).
- Timeline & Budget. Provide a description of the implementation timeline, associated costs, and resources that were required to achieve replication outcomes.
- Collaboration. Describe the partnerships that were necessary to implement replication activities, or those that, in retrospect, should have been considered. Describe how existing partnerships were leveraged and how new partnerships were built to support replication.
- Sustainability/Transferability. Address the sustainability, transferability, and generalizability of the replication activities. Describe current sustainability plans in place or in development. Describe how these activities can be scaled or applied in different contexts. Describe additional replication activities or other events, dissemination or communication strategies, information gathering, networking, training programs, changes to institutional policy, etc. you may wish to integrate in the future.
The Replication Strategy Report will be no more than 5000 words, excluding references.
The use of generative AI is permissible; however, participants must cite the tool and identify how it has been used in this submission (e.g., ideation, writing). Follow the NLM format guide regarding Software on the Internet.
Submissions should clearly mark which content is confidential or proprietary, if applicable.
Additional resources
Contact
If you have questions about this challenge, contact the Replication Prize team at CFReplication@od.nih.gov.
Winners
Winners will be announced following judging of submissions.