Qualitative Evaluation Checklist
This 13-page checklist outlines a 10-step approach designed to guide evaluators in determining when qualitative methods are appropriate for their program and inquiry. Though intended for evaluators, the guide can also be used to walk program staff through the rationale and uses of qualitative evaluation. It begins with three overarching factors to consider: selecting qualitative approaches that suit an evaluation’s uses and can adequately answer the questions it poses; collecting high-quality, credible qualitative data; and analyzing and reporting findings. It then guides the user through 10 thought-provoking questions that aid in generating an evaluation design, and even an action plan, for qualitative data collection. The questions begin with the evaluation type (process or outcome) and purpose (to add faces or stories to numbers that may exist, or to reveal the meaning or uncommon characteristics of a program), and proceed to sampling decisions (how to determine a purposeful sample, unit of analysis, and sample size). The questions then address approaches to fieldwork (such as determining the role of the evaluator versus staff, the duration of field study, and the measures to use) and interviewing (structured or open, and how to document interviews). They move next to ethics (confidentiality, ownership of information, legal considerations, and other areas) and data analysis (anticipating analysis needs while data is being gathered, and the criteria by which analysis will be conducted and used). The questions end with reporting, including how to focus and summarize data and facilitate its use.

Patton is a foremost expert in this field. Though the checklist is detailed and at times necessarily dense, it serves as a useful and relevant guide for thinking through some of the hard questions associated with evaluation design and data collection when qualitative data is desired.
The manner in which it is written lends itself to discussion among stakeholders, and the 10 questions could even serve as agendas for evaluation meetings. Because the field has lacked both an understanding of and a structure for using qualitative data in a thoughtful, systematic way, it has long struggled to reconcile the demand for numbers in evaluating a program with the value and insight that qualitative information provides. Used as intended, this checklist could help counter the claim that qualitative data is merely “anecdotal” or biased. It could educate the user about both the framework and the complexities of selecting a qualitative approach, and leave the user with a sense of the larger issues to consider, as well as the strengths of and barriers to qualitative data. The checklist may be educative for the novice, but it could also be an effective tool for those with some background in research methods. Some of the terminology assumes familiarity with evaluation principles, and some is specific to the author (such as the term “sensitizing concept”).