By J Wilberg
Cold case detectives aren’t just on TV. Some of them are called evaluators: experts brought in to help a project complete an outcome evaluation after a program has been designed and implemented. In the worst situation, a cold case evaluator is called in to complete an evaluation with no data or bad data. Frequently, time is short, a funding source is demanding a final evaluation report, and program staff are uninterested in, and maybe even antagonistic about, having an evaluator look at their outcomes.
As a consultant who has been in this situation more than once, I have this to say: you would be amazed at what passes for data collection in many programs – hand-signed attendance sheets, ginned-up pre- and post-tests, and anecdotes galore. Interesting material, often, but not the stuff of a decent evaluation.
What do you do when you’re asked to evaluate a program that is nearing the end of its funding period and has had no solid evaluation system put in place? Here are some ideas gleaned from my own experience as a cold case evaluator.
#1: Enlist program staff in your cause.
A quick way to guarantee that you will never get any data with which to evaluate the program is to alienate the program staff. If they feel you are judging them, or taking a superior attitude because you hold the evaluator position, they will make your job harder. Instead of tsk-tsking your way around, make program staff your partners in telling the program’s story in the most accurate way possible.
#2: Use what you have.
Is there any program data? Separate the wheat from the chaff and use it. Are program participants still engaged? Develop a retrospective survey instrument to gather their insights about program impact. Is there a staff person who has been involved with the program from the beginning? Ask her or him a thousand questions. You may find there’s more data lying around than anyone knew; they didn’t tell you about it because they didn’t think it was important. Moreover, an evaluation encumbered by a lack of decent data can be greatly enhanced by attention to good process evaluation. In that case, telling the program’s story through the views of informed observers can also give insight into why an outcome evaluation was difficult to establish.
#3: Create a beautiful product.
Present whatever data you have in a clear, readable format. Use graphs and charts whenever you can. Compare the program’s results to those of other similar programs. Bulk up the content with the insights of program staff and vignettes about representative participants. Include a carefully crafted and objectively stated list of ‘areas to consider for further development.’ In this list, be sure to include the need to design the outcome evaluation when the program itself is designed and to establish good data collection protocols from the beginning. Frame this as a going-forward recommendation, not as a criticism. By now, program staff know they missed the boat on designing an outcome evaluation; there’s no need to rub it in. Last, make sure the evaluation report looks good. I work with a professional graphic designer on all my products; it’s money well spent.
There are important things to be learned from every program’s implementation. Sometimes we can’t measure all of them, but often we can know more than we think if we are patient, professional, and persistent – just like a good cold case detective.