What Are the Strengths of Recommendations and Methodologic Reporting Practices in Health Economic Studies in Orthopaedic Surgery?
Cost-effectiveness research is an increasingly used tool for evaluating treatments in orthopaedic surgery. Without high-quality primary-source data, the results of a cost-effectiveness study are either unreliable or heavily dependent on sensitivity analyses of the findings from the source studies. However, to our knowledge, the strength of the recommendations provided by such studies in orthopaedics has not been examined.
We asked: (1) What are the strengths of recommendations in recent orthopaedic cost-effectiveness studies? (2) What are the reasons authors cite for weak recommendations? (3) What are the methodologic reporting practices used by these studies?
The titles of all articles published in six orthopaedic journals from January 1, 2004, through April 1, 2014, were screened for original health economic studies comparing two different treatments or interventions. The full texts of included studies were reviewed, and the strength of each study's recommendations was determined subjectively by our study team: studies providing equivocal conclusions owing to a lack of, or uncertainty surrounding, key primary data were classified as weak, whereas studies with definitive conclusions supported by high-quality primary data were classified as strong. The reasons underlying each weak designation were noted, and the methodologic practices reported in each study were examined using a validated instrument. A total of 79 articles met our prespecified inclusion criteria and were evaluated in depth.
Of the included articles, 50 (63%) provided strong recommendations, whereas 29 (37%) provided weak recommendations. Among the 29 weak studies, clinical outcomes data were cited in 26 articles as insufficient to support definitive conclusions, whereas cost and utility data were cited in 13 and seven articles, respectively. Methodologic reporting practices varied greatly, with mixed adherence across framing, cost, and results reporting. Framing variables included a clearly defined intervention, an adequate description of the comparator, a clearly stated study perspective, and a reported discount rate for future costs and quality-adjusted life years (QALYs). Cost-reporting variables included whether economic data were collected alongside a clinical trial or another primary source and whether the year of the monetary units was clearly stated. Results-reporting variables included whether a sensitivity analysis was performed.
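For readers less familiar with these reporting items, the two calculations they govern, discounting of future costs and QALYs and the incremental cost-effectiveness ratio (ICER), can be illustrated with a minimal sketch. The discount rate and the cost and QALY figures below are hypothetical examples, not values drawn from the reviewed studies:

```python
def discount(values, rate=0.03):
    """Present value of a yearly stream of costs or QALYs.

    values[0] accrues in year 0 (undiscounted); `rate` is the annual
    discount rate (3% is a commonly used convention)."""
    return sum(v / (1 + rate) ** t for t, v in enumerate(values))

def icer(cost_new, cost_comparator, qaly_new, qaly_comparator):
    """Incremental cost-effectiveness ratio: extra cost per QALY gained."""
    return (cost_new - cost_comparator) / (qaly_new - qaly_comparator)

# Hypothetical example: a new intervention costs $12,000 and yields
# 1.2 QALYs versus $10,000 and 1.0 QALYs for the comparator.
extra_cost_per_qaly = icer(12000, 10000, 1.2, 1.0)  # $10,000 per QALY

# Discounting 1 QALY per year over 5 years at 3% yields less than 5 QALYs.
pv_qalys = discount([1.0] * 5)
```

A reported discount rate and a clearly stated monetary-unit year matter precisely because both inputs to the ICER are sensitive to them.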
Given that a considerable proportion of orthopaedic cost-effectiveness studies provide weak recommendations, and that methodologic reporting practices varied greatly among both strong and weak studies, we believe clinicians should exercise great caution when considering the conclusions of cost-effectiveness studies. Future research could assess the effect of cost-effectiveness studies on clinical practice, and whether the strength of a study's recommendations has any effect on practice patterns.
Given the increasing use of cost-effectiveness studies in orthopaedic surgery, understanding the quality of these studies and the reasons that limit the ability of studies to provide more definitive recommendations is critical. Highlighting the heterogeneity of methodologic reporting practices will aid clinicians in interpreting the conclusions of cost-effectiveness studies and improve future research efforts.