Please use this identifier to cite or link to this item:
https://hdl.handle.net/2440/80826
Type: Journal article
Title: Model performance evaluation (validation and calibration) in model-based studies of therapeutic interventions for cardiovascular diseases: A review and suggested reporting framework
Author: Hajiali Afzali, H.; Gray, J.; Karnon, J.
Citation: Applied Health Economics and Health Policy, 2013; 11(2):85-93
Publisher: Adis International Ltd.
Issue Date: 2013
ISSN: 1175-5652; 1179-1896
Statement of Responsibility: Hossein Haji Ali Afzali, Jodi Gray, Jonathan Karnon
Abstract: Decision analytic models play an increasingly important role in the economic evaluation of health technologies. Given uncertainties around the assumptions used to develop such models, several guidelines have been published to identify and assess ‘best practice’ in the model development process, including general modelling approach (e.g., time horizon), model structure, input data and model performance evaluation. This paper focuses on model performance evaluation. In the absence of a sufficient level of detail around model performance evaluation, concerns regarding the accuracy of model outputs, and hence the credibility of such models, are frequently raised. Following presentation of its components, a review of the application and reporting of model performance evaluation is presented. Taking cardiovascular disease as an illustrative example, the review investigates the use of face validity, internal validity, external validity, and cross-model validity. As a part of the performance evaluation process, model calibration is also discussed and its use in applied studies investigated. The review found that the application and reporting of model performance evaluation across 81 studies of treatment for cardiovascular disease was variable. Cross-model validation was reported in 55% of the reviewed studies, though the level of detail provided varied considerably. We found that very few studies documented other types of validity, and only 6% of the reviewed articles reported a calibration process. Considering the above findings, we propose a comprehensive model performance evaluation framework (checklist), informed by a review of best-practice guidelines. This framework provides a basis for more accurate and consistent documentation of model performance evaluation. This will improve the peer review process and the comparability of modelling studies. Recognising the fundamental role of decision analytic models in informing public funding decisions, the proposed framework should usefully inform guidelines for preparing submissions to reimbursement bodies.
Keywords: Humans; Cardiovascular Diseases; Calibration; Decision Support Techniques; Models, Organizational; Benchmarking; Employee Performance Appraisal; Practice Guidelines as Topic
Rights: © Springer International Publishing Switzerland 2013
DOI: 10.1007/s40258-013-0012-6
Published version: http://dx.doi.org/10.1007/s40258-013-0012-6
Appears in Collections: Aurora harvest; Public Health publications
Files in This Item:
There are no files associated with this item.
Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.