Evaluation Café 2023-24
Do any of these describe you?
- Are you a program evaluator, researcher, data analyst, or someone else who regularly shares data and evidence with others?
- Do you care enough about your work to actually share it with others?
- Do you want to create better reports that are more accessible and reach more people?
- Do you feel like your reports are either too long or too short, with absolutely no in between?
- Does the idea of creating more reports just sound like tons more work?
- Not sure whether you should create infographics, dashboards, slidedocs, or something else entirely?
Join Chris Lysy of freshspectrum.com, author of The Reporting Revolution, as he discusses why we need to think beyond the PDF and walks you through building your own modern reporting strategy. As a bonus: everyone who attends will also get access to a free modern reporting strategy template you can use to start crafting your own process.
Evaluative criteria represent values about what a high-quality or successful intervention "looks like". These implicit or explicit criteria direct evaluators' lines of inquiry, including which evaluation questions are asked, which data are collected and analyzed, and which conclusions are reached and reported. Community members, program participants, staff, leaders, funders, and evaluators often hold varying values. Thus, evaluators are charged with the complex tasks of identifying relevant values, specifying appropriate criteria, and applying those criteria to direct inquiry. This presentation will introduce an empirically supported model of evaluative criteria developed to guide evaluation practice. Discussion will highlight how the framework can be used to support criteria specification, make criteria more explicit, broaden the range of values that shape evaluative inquiry, and clarify evaluation design and reporting. The presentation will also explore current research on evaluation to refine the model and deepen understanding of practice.