Step 8: Report your results
Turning new knowledge into impact
Once your research is funded and under way, it’s time to write up what happened and what you learned through this work.
Publishing your results in a publicly accessible journal (fee-based or open access) facilitates the dissemination of credible research evidence. This dissemination helps bridge the gap between research and practice, ensuring that valuable insights do not remain confined to small academic circles but are shared with the world.
By publishing their findings, implementation scientists provide valuable data on implementation strategies, outcomes, and constructs. This iterative process of sharing and building on research findings is essential for advancing the field and improving implementation outcomes.
Broadly, publishing results increases transparency and accountability in the scientific community, fostering trust among those who rely on published research to make informed decisions about healthcare choices, practices, and policies. By publishing evidence of what works and what doesn’t, scientists can influence the direction of policies and the prioritization of funding decisions.
For published research to be considered rigorous enough to count as “knowledge,” it must undergo peer review and be published in a reputable journal.
The peer review process involves evaluation by independent experts in the same field, which helps ensure the accuracy, validity, and quality of the research. These experts scrutinize the methodology, data analysis, and conclusions drawn by the researchers, providing critical feedback and suggestions for improvement. This rigorous evaluation helps to filter out flawed or unsubstantiated studies, ensuring that only high-quality research is published.
When research is published in a peer-reviewed journal, it signifies that the study has undergone a thorough vetting process and has been deemed worthy by knowledgeable peers. This credibility is crucial for advancing scientific knowledge, as other researchers, practitioners, and policymakers rely on these publications to inform their work and decisions. The peer review process also helps to maintain the integrity of the scientific record by preventing the dissemination of erroneous or misleading information.
Finally, peer-reviewed publications serve as a permanent and citable record of scientific advancements. This archival function is essential for the cumulative nature of scientific progress, allowing researchers to build on previous findings and contribute to the ongoing development of their field.
✪ The Key Principles for Scientific Publishing (International Science Council, 2023)
✪ Principles of Transparency and Best Practice in Scholarly Publishing (Committee on Publication Ethics, 2019)
💻 Best Practice Guidelines on Research Integrity and Publishing Ethics (Wiley)
💻 Publication Ethics (Science Partner Journals)
💻 Publishing ethics (Elsevier)
💻 Research ethics, publication ethics and good practice guidelines (EQUATOR Network)
💻 Coursera: Writing in the Sciences
💻 How to write a great research paper using reporting guidelines (EQUATOR Network)
✪ Enhancing the reporting of implementation research (Implementation Science, 2017)
✪ Promoting Learning from Null or Negative Results in Prevention Science Trials (Prevention Science, 2020)
The Importance of Reporting Guidelines
The use of reporting guidelines is critical for the creation of transparent, accurate, reliable, and replicable scientific research publications. The EQUATOR Network (Enhancing the QUAlity and Transparency Of health Research) is an international effort to encourage the widespread use of reporting guidelines in health research. The EQUATOR Network maintains a searchable online database of reporting guidelines, as well as tutorials and toolkits designed to improve scientific writing.
In a clear signal about the importance of appropriate reporting, in 2017 the journal Implementation Science published ✪ Enhancing the reporting of implementation research to clearly identify for researchers the specific reporting issues that must be addressed to improve the quality and transparency of submitted manuscripts.
For an expert review of why reporting standards are critical, be sure to watch the National Cancer Institute's archived webinar "Reporting Guidelines, Measures and Harmonization".
The selection below is not exhaustive, but it covers the reporting guidelines most often used in implementation science.
Rigorous Standards For Publishing On Racial Health Inequities
Racial health inequities are lethal, and eliminating them requires that researchers, editors, reviewers, and readers examine racism, rather than race itself, as the cause of gaps in health outcomes across racial groups.
In response to the continued practice in health equity research of invoking biology as the cause while dodging systemic racism, Dr. Rhea W. Boyd and colleagues have set a new bar for publishing on race in medicine and health.
If your research includes race as a variable, make sure you report and frame your work in line with their reporting standards:
✓ Define race during the experimental design, and specify the reason for its use in the study
✓ Name racism (its form, mechanisms, and intersectionalities)
✓ Never offer genetic interpretations of race (because such suppositions are not grounded in science)
✓ Solicit patient input
✓ Identify the stakes
✓ Cite the experts, particularly scholars of color
Open Access articles will be marked with ✪
Please note that some journals require a subscription to access a linked article.
Reporting Guidelines to Know
Implementation Strategies – Recommendations for Specifying and Reporting
In 2013, Proctor, Powell, and McMillen published their recommendations for naming, defining, and operationalizing seven dimensions of implementation strategies: the actor, the action, action targets, temporality, dose, implementation outcomes addressed, and theoretical justification. For example, a strategy might be specified as clinic managers (actor) delivering monthly audit-and-feedback reports (action) to frontline providers (action target) over one year (temporality and dose) to improve fidelity (implementation outcome), with the rationale grounded in a named theory (justification). These recommendations responded to the poor description, inconsistent labeling, and lack of theoretical underpinning characteristic of most implementation research.
✪ Implementation Strategies – Recommendations for Specifying and Reporting (Implementation Science, 2013)
Standards for Reporting Implementation Studies (STaRI)
The STaRI initiative developed this 27-item checklist to improve the scientific reporting of implementation strategies. The checklist prompts detailed description of both the implementation strategy used and the effectiveness of the intervention being implemented. The STaRI checklist is not specific to a particular research methodology and can be applied to the wide range of study designs used in implementation science.
✪ Standards for Reporting Implementation Studies (STaRI) Statement (BMJ, 2017)
Consolidated Standards of Reporting Trials (CONSORT)
First published in 1996, the Consolidated Standards of Reporting Trials (CONSORT) statement aims to improve the quality of reporting on randomized controlled trials (RCTs). The CONSORT statement was revised in 2001 and again in 2010 to improve wording and clarify previous versions.
✓ Consolidated Standards of Reporting Trials (CONSORT) Online
✪ CONSORT 2010 Statement: updated guidelines for reporting parallel group randomised trials (BMJ, 2010)
✪ CONSORT 2010 Explanation and Elaboration: updated guidelines for reporting parallel group randomised trials (BMJ, 2010)
✓ CONSORT EQUATOR Network profile
Selected Extensions:
CONSORT Harms: ✪ Better reporting of harms in randomized trials: an extension of the CONSORT Statement (Annals of Internal Medicine, 2004)
CONSORT Cluster: ✪ CONSORT 2010 statement: extension to cluster randomised trials (BMJ, 2012)
CONSORT Pilot & Feasibility: ✪ CONSORT 2010 statement: extension to randomised pilot and feasibility trials (Pilot and Feasibility Studies, 2016)
Standards for Reporting Qualitative Research (SRQR)
The Standards for Reporting Qualitative Research (SRQR) were published in 2014 to improve transparent reporting of the very broad range of qualitative research. This 21-item checklist outlines the minimum criteria that should be present in qualitative research reporting, explicitly addressing differences from quantitative reporting.
✪ Standards for Reporting Qualitative Research: A Synthesis of Recommendations – Published by Academic Medicine, 2014
✓ SRQR EQUATOR Network profile
Enhancing transparency in reporting the synthesis of qualitative research (ENTREQ)
The Enhancing transparency in reporting the synthesis of qualitative research (ENTREQ) statement was published in 2012 to assist researchers in synthesizing qualitative health research. Comprising 21 items, this checklist outlines how to find, assess, and synthesize qualitative health research.
Enhancing transparency in reporting the synthesis of qualitative research (ENTREQ) Statement (BMC Medical Research Methodology, 2012)
Template for Intervention Description and Replication (TIDieR)
An international panel of experts developed this checklist to improve the quality of intervention description in published research. The checklist is intended to guide authors in their reporting of interventions, as well as to aid reviewers and editors in their decisions regarding evidence of necessary reporting. The ultimate goal of the checklist is to ensure that publications contain clear and accurate accounts of interventions.
There are several extensions to existing standards that increase the rigor of health equity reporting:
PRISMA-Equity 2012 Extension: For use with systematic reviews focused on health equity
STROBE-Equity: For use with equity-relevant observational studies
CONSORT-Equity 2017 Extension: Reporting of intervention effects in randomized trials where health equity is relevant
Additional Guidance
✪ Resource of health equity-related data definitions, standards, and stratification practices (US Centers for Medicare and Medicaid Services, 2024)
The Strengthening the Reporting of Observational Studies in Epidemiology (STROBE) Statement: guidelines for reporting observational studies
The STROBE guidelines were developed by an international group of experts to improve the reporting quality and transparency of observational studies. They consist of a 22-item checklist covering key aspects of study design, data collection, analysis, and interpretation, and were simultaneously published in 8 leading biomedical journals in 2007.
✪ The Strengthening the Reporting of Observational Studies in Epidemiology (STROBE) Statement: guidelines for reporting observational studies (PLoS Medicine, 2007)
✓ STROBE EQUATOR Network Profile
Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) Checklist
In 2005, a stakeholder panel was convened to adapt the 1996 QUality Of Reporting Of Meta-analyses (QUOROM) statement; the adaptation focused on addressing suboptimal reporting of meta-analyses, capturing the iterative nature of systematic reviews, and improving the reporting of bias. The resulting instrument was named Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) to reflect the incorporation of both systematic reviews and meta-analyses.
✪ Preferred Reporting Items for Systematic Reviews and Meta-Analyses: The PRISMA Statement (PLOS Medicine, 2009)
✓ PRISMA EQUATOR Network profile
Extensions:
✓ PRISMA Harms: This extension identifies a minimal set of items that should be reported when adverse events are reviewed. See PRISMA harms checklist: improving harms reporting in systematic reviews (BMJ, 2016) for more information.
✓ PRISMA – Equity 2012: This extension seeks to improve transparency and completeness of reporting systematic reviews focused on equity. See ✪ Extending the PRISMA statement to equity-focused systematic reviews (PRISMA-E 2012): Explanation and elaboration (International Journal for Equity in Health, 2015) for more information.
Cochrane Handbook for Systematic Reviews of Interventions
Cochrane reviews are systematic reviews of primary human health research focusing on the effects of interventions. The stringent methodological guidelines followed in a Cochrane review are internationally recognized as the gold standard, and the reviews themselves are peer reviewed and regularly updated.
✪ Cochrane Handbook for Systematic Reviews of Interventions Version 5.1.0 – Published by Cochrane.org, 2011
✓ Cochrane Systematic Reviews EQUATOR Network profile
💻 Cochrane Interactive Learning Module 1: Introduction to conducting systematic reviews
Extensions:
✓ Methodological Expectations for Cochrane Intervention Reviews (MECIR)
WIDER Recommendations for Reporting of Behavior Change Interventions
In response to Michie et al.'s paper ✪ Specifying and reporting complex behaviour change interventions: the need for a scientific method, Albrecht et al. used the 2007 ✪ Workgroup for Intervention Development and Evaluation Research (WIDER) Recommendations to develop a checklist operationalizing each of the recommendations.
✪ Workgroup for Intervention Development and Evaluation Research (WIDER) Recommendations (Implementation Science, 2013)
Best Practices for Mixed Methods Research in the Health Sciences
In 2010, the Office of Behavioral and Social Sciences Research (OBSSR) of the National Institutes of Health (NIH) commissioned a working group to evaluate and make recommendations on best practices for mixed methods research. The resulting report provides best practice recommendations for grant applicants, reviewers, and stakeholders engaging in health services mixed methods research.
✓ Best Practices for Mixed Methods Research in the Health Sciences (The Office of Behavioral and Social Sciences Research, 2010)
Guidance for Reporting Involvement of Patients and Public 2 (GRIPP2)
In 2017, the first international guidance on reporting patient and public involvement in health research was published. The original GRIPP checklist was revised using the EQUATOR method of developing reporting guidelines and is available in both short-form and long-form versions.
✪ GRIPP2 reporting checklists: tools to improve reporting of patient and public involvement in research (BMJ, 2017)
Consolidated Health Economic Evaluation Reporting Standards (CHEERS)
Economic evaluation data are increasingly used to inform allocation decisions, but a lack of transparency can make those data difficult for decision makers to find and use. Published by the International Society for Pharmacoeconomics and Outcomes Research in 2013 to consolidate and clarify existing economic evaluation guidelines (and updated in 2022), the Consolidated Health Economic Evaluation Reporting Standards (CHEERS 2022) is a 24-item checklist intended for researchers, editors, and peer reviewers.
“The new CHEERS 2022 statement replaces previous CHEERS reporting guidance. It reflects the need for guidance that can be more easily applied to all types of health economic evaluation, new methods and developments in the field, as well as the increased role of stakeholder involvement including patients and the public. It is also broadly applicable to any form of intervention intended to improve the health of individuals or the population, whether simple or complex, and without regard to context (such as health care, public health, education, social care, etc.)” (Husereau et al., 2022).
✪ Consolidated Health Economic Evaluation Reporting Standards (CHEERS) Statement (BioMed Central, 2013)
The IS Research Pathway
Open Access Journals to Consider
Implementation Science
Implementation Research and Practice
Implementation Science Communications
Frontiers in Health Services
BMC Health Services Research
BMJ Open
PLoS ONE
See ✪ Relevant Journals for Identifying Implementation Science Articles: Results of an International Implementation Science Expert Survey (Mielke et al, 2021) to learn more.
🎥 Videos from our friends
Reporting, Guidelines, Measures, & Harmonization
Approaches to Publishing Implementation Science
Inclusive language in scientific publishing: Race, ethnicity, and ancestry
Inclusive language in scientific publishing: Health and disability
Find Examples
Browse our Library of UW community co-authored publications to see examples of published research.
PAUSE AND REFLECT
❯ Who is the intended audience for the research findings, and how can the results be made accessible to diverse groups, including those who may not have academic or technical backgrounds?
❯ Are there barriers (e.g., language, technical jargon) that might prevent certain groups from understanding the research findings?
❯ Are the voices and perspectives of marginalized or underrepresented groups highlighted in the findings?
❯ How might the research findings impact different communities, particularly those that are historically or currently marginalized? Are there potential negative consequences of the findings, and how can these be mitigated?
❯ Are the methods, data, and analysis processes clearly and transparently reported to allow for scrutiny and replication?
❯ How will the research team be accountable to the communities involved in or affected by the research?
❯ What strategies will be used to disseminate the findings to ensure they reach and benefit the relevant communities? How will the research team engage with the community to discuss the findings and their implications?
❯ Are there ethical considerations related to privacy, consent, and the use of data that need to be addressed in the reporting? How will the research team ensure that the findings are used ethically and responsibly?