Authors:
Samuel Kalibala, Godfrey B Woelk, Stephen Gloyd, Nrupa Jani, Lynnette Kay, Avina Sarna, Jerry Okal, Charity Ndwiga, Nicole Haberland, & Irit Sinai
Published: July 2016
Read the full text open access in the Journal of the International AIDS Society
Abstract
Introduction
According to UNAIDS, the world now has an adequate collection of proven HIV prevention, treatment and diagnostic tools which, if scaled up, can lay the foundation for ending the AIDS epidemic. HIV operations research (OR) tests and promotes interventions that can increase both the demand for and the supply of these tools. However, current OR publications focus mainly on outcomes, leaving gaps in the reporting of intervention characteristics that must be addressed if OR findings are to be used. This has prompted WHO and other international public health agencies to issue reporting requirements for OR studies. The objective of this commentary is to review experiences in HIV OR intervention design, implementation, process data collection and publication in order to identify gaps, contribute to the body of knowledge and propose a way forward for improving the focus on “implementation” in implementation research.
Discussion
Interventions in OR, like ordinary service delivery programmes, are subject to the programme cycle, which continually uses insights from implementation and the local context to modify service delivery modalities. Because some of these modifications to the intervention may influence study outcomes, documenting process data is vital in OR. A key challenge, however, is that study resources tend to be skewed towards the documentation and reporting of study outcomes to the detriment of process data, even though process data are essential for understanding the factors that influence those outcomes.
Conclusions
Interventions in OR should be viewed through the lens of programme evaluation, which comprises formative assessment (to determine concept and design), followed by process evaluation (to monitor inputs and outputs) and effectiveness evaluation (to assess outcomes). Study resources should be allocated equitably between process evaluation and outcome measurement so that publications can include data on fidelity and dose, enabling explanation of the relationship between dose and study outcomes for purposes of scale-up and further refinement through research.