Step 5: Justify Conclusions (Worksheet 5)

Whether your evaluation is conducted to show program effectiveness, help improve the program, or demonstrate accountability, you will need to analyze and interpret the evidence gathered in Step 4.
Step 5 encompasses analyzing the evidence, making claims about the program based on the analysis, and justifying the claims by comparing the evidence against stakeholder values.
As central as data analysis is to evaluation, evaluators know that the evidence gathered for an evaluation does not necessarily speak for itself. Justification of conclusions is fundamental to utilization-focused evaluation. When agencies, communities, and other stakeholders agree that the conclusions are justified, they will be more inclined to use the evaluation results for program improvement.
The complicating factor, of course, is that different stakeholders may bring different and even contradictory standards and values to the table. Differences in values and standards will have been identified during stakeholder engagement in Step 1.
Those stakeholder perspectives will also have been reflected in the program description and evaluation focus.
Analyzing and Synthesizing the Findings

Data analysis is the process of organizing and classifying the information you have collected, tabulating it, summarizing it, comparing the results with other appropriate information, and presenting the results in an easily understandable manner.
The steps in data analysis and synthesis are straightforward. First, enter the data into a database and check for errors. If you are collecting data with your own instrument, you will need to select the computer program you will use to enter and analyze the data, and determine who will enter, check, tabulate, and analyze it.
Some basic calculations include determining:
- The number of participants
- The number of participants achieving the desired outcome
- The percentage of participants achieving the desired outcome
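As a minimal sketch, the basic calculations above take only a few lines of Python. The participant records and the `achieved_outcome` field are invented for illustration, not drawn from this manual.

```python
# Hypothetical participant records; in practice these would come from
# the database built in the data-entry step.
participants = [
    {"id": 1, "achieved_outcome": True},
    {"id": 2, "achieved_outcome": False},
    {"id": 3, "achieved_outcome": True},
    {"id": 4, "achieved_outcome": True},
]

n_total = len(participants)                                    # number of participants
n_achieved = sum(p["achieved_outcome"] for p in participants)  # number achieving the outcome
pct_achieved = 100.0 * n_achieved / n_total                    # percentage achieving the outcome

print(n_total, n_achieved, round(pct_achieved, 1))  # 4 3 75.0
```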
When examination of your program includes research as well as evaluation studies, use statistical tests to show differences between comparison and intervention groups, between geographic areas, or between the pre-intervention and post-intervention status of the target population.
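One common choice for comparing outcome rates between an intervention and a comparison group is a two-proportion z-test. The sketch below implements it with only the standard library; the group sizes and outcome counts are hypothetical, and for real studies a statistical package and a statistician's review are advisable.

```python
from math import erf, sqrt

def two_proportion_z(success_a, n_a, success_b, n_b):
    """Two-sided two-proportion z-test using the normal approximation.

    Returns (z, p_value); a small p-value suggests the difference
    between the two groups' outcome rates is unlikely to be chance.
    """
    p_a, p_b = success_a / n_a, success_b / n_b
    p_pool = (success_a + success_b) / (n_a + n_b)          # pooled proportion
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))  # pooled standard error
    z = (p_a - p_b) / se
    # Two-tailed p-value from the standard normal CDF, via erf.
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Hypothetical counts: 60/100 in the intervention group achieved the
# outcome versus 45/100 in the comparison group.
z, p = two_proportion_z(60, 100, 45, 100)
print(round(z, 2), round(p, 3))
```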
Present your data in a clear and understandable form. Data can be presented in tables, bar charts, pie charts, line graphs, and maps.
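For quick, shareable summaries, even a plain-text bar chart can make percentages easier to scan. The site names and values below are illustrative only.

```python
# Hypothetical outcome percentages by site.
results = {"Site A": 75, "Site B": 50, "Site C": 90}

for site, pct in results.items():
    bar = "#" * (pct // 10)  # one mark per 10 percentage points
    print(f"{site:8s} {bar} {pct}%")
```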
In evaluations that use multiple methods, evidence patterns are detected by isolating important findings (analysis) and by combining different sources of information to reach a larger understanding (synthesis).
Setting Program Standards for Performance

Program standards, not to be confused with the four evaluation standards discussed throughout this document, are the benchmarks used to judge program performance.
Possible standards that might be used in determining these benchmarks are:
- Needs of participants
- Community values, expectations, and norms
- Program mission and objectives
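One way to make the comparison against a program standard explicit is a simple benchmark check. The 70% benchmark and the observed rates below are hypothetical values chosen for illustration, not figures from this manual.

```python
# Hypothetical program standard: at least 70% of participants
# achieve the desired outcome.
BENCHMARK_PCT = 70.0

def meets_standard(observed_pct, benchmark_pct=BENCHMARK_PCT):
    """Judge observed performance against the agreed benchmark."""
    return observed_pct >= benchmark_pct

print(meets_standard(75.0))  # True
print(meets_standard(62.5))  # False
```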
This document is a “how to” guide for planning and implementing evaluation activities. The manual, based on CDC’s Framework for Program Evaluation in Public Health, is intended to assist managers and staff of public, private, and community public health programs in planning, designing, and implementing program evaluations.