Our Approach

A systematic approach

Evaluation Theories & Strategy

EEC uses a systematic approach to evaluation that incorporates best-practice evaluation theory and strategies. As outlined below, these elements yield the formative and summative data that clients need. EEC works collaboratively with program leaders to clarify their project's purpose, design evaluations, and collect the information necessary for informed, data-based decisions.

Theoretical Approach to Evaluation

Most clients value a holistic, comprehensive approach to evaluation that supports ongoing data collection, reflection, and use of findings for service planning, midcourse corrections, and decision-making. EEC therefore approaches program implementation and evaluation using the principles of Implementation Science, recognizing that developing and measuring evidence-based practices requires thoughtful, systematic implementation processes for effective outcomes to be realized (Fixsen et al., 2005). EEC's approach to evaluation starts with the client: using a Participatory Evaluation approach, EEC seeks input from stakeholders at multiple levels to deeply understand program services and effects, identify relevant evaluation questions and process issues, and accurately interpret evaluation findings (Preskill & Catsambas, 2008). EEC also draws on Developmental Evaluation and Case Study approaches. Specific evaluation strategies include: process/formative evaluation (quality of the content, design, and delivery/implementation); outcome/summative evaluation (value of outcomes); lessons learned (barriers/enablers, nuances, surprises, causal explanations); overarching questions about the value/worth of the program; and forward/outward-focused evaluation questions (e.g., replication, sustainability, threats, opportunities) (Davidson, 2009).

Logic Model Development

EEC evaluators recognize logic models as critical tools that help programs identify their goals, plan their services, and specify the outcomes they intend to achieve through inputs and program processes. Essentially, a logic model is a visual representation of how a project will work, illustrating the program's impact theory and assumptions (Donaldson, 2007), the links between project objectives and the individuals and organizations working to achieve them (Frechtling, 2007), the external factors that may influence results, and the relationships among resources, activities, and outcomes. EEC works with clients to develop detailed, conceptually complete logic models that provide the foundation for moving a program forward according to its goals and objectives. We approach logic model development, review, and refinement holistically, recognizing that logic models have value not only for program planning but also for implementation, evaluation, and program performance: in essence, throughout the life of a program (Kellogg Foundation, 2004).

Evaluation Plan Development

Once the logic model is complete, EEC develops the evaluation plan. Evaluation plans typically include overarching questions that guide the measurement and data collection strategies for assessing critical outputs as well as short-term, intermediate, and long-term outcomes. They also specify data sources, methods, and timelines for collection, analysis, and reporting. The client then reviews the plan for accuracy and coherence with program goals, objectives, and activities. The final plan serves as the roadmap for the evaluation; it is reviewed at least annually and revised as needed.

Methodology

EEC uses a mixed-methods approach for most evaluation projects, incorporating both quantitative and qualitative strategies to collect, analyze, and report data. Typical evaluations include the development of surveys, individual and focus group interviews, observation protocols, and records/document review forms. Fidelity of implementation across sites or providers is also frequently measured. End-user data are analyzed and incorporated into formative and summative reports to gauge satisfaction as well as project progress, value, and effectiveness. All instruments and procedures are developed, tested, and implemented in accordance with standard evaluation protocols (Fowler, 2008; Dillman, 2008; Krueger & Casey, 2000; Ruhe & Zumbo, 2009; Wholey et al., 2010). Instruments are created collaboratively, allowing EEC evaluators to engage with project staff and benefit from their content expertise. EEC strives to conduct all program evaluations in a non-intrusive manner, adapting as necessary to client needs and schedules. Data collection strategies include in-person and phone interviews (individual and focus group), online/web-based surveys, and data extraction from program databases and documents. EEC evaluators pursue continuing education to stay abreast of current issues such as increasing survey response rates and data visualization.

Communication & Collaboration

EEC is committed to an ongoing dialogue with clients. During project start-up, we expect several face-to-face meetings with the client to develop a solid working relationship, establish communication channels and monthly evaluation work group calls, and identify relevant management and staff meetings in which EEC evaluators may participate. Ongoing participation in key meetings provides valuable context for evaluation activities as well as the opportunity to inform evaluation methodologies, data collection strategies, and data analysis. Furthermore, because EEC employs a cyclical evaluation approach that simultaneously plans for and reflects on data collection, these meetings give EEC, project leaders, and staff a platform to review evaluation findings at strategic intervals, use formative and summative reports with key stakeholders to evaluate progress, and modify plans as appropriate. After the initial meetings, EEC often participates via video conference or phone, attending in-person meetings at the client's request or for data collection.

Tasks, Project Management & Reporting

Once the evaluation plan is finalized and agreed upon by project staff and funders, EEC develops a Data Collection Schedule to manage the evaluation. Typical schedules include timelines for client meetings, evaluation activities (e.g., instrument development, data collection, analysis), and interim and summative reporting. This process keeps projects on track, aligned with goals and projected outcomes, and in accordance with the client's and funding agency's expectations. Interim and summative reports are tailored to the stakeholder audience(s) and delivered according to project timelines. Reports usually include quantitative and qualitative data that provide periodic performance feedback and examine the effectiveness of project activities as well as the achievement of outcomes. In addition to written reports, EEC can develop brief infographics and PowerPoint presentations. EEC evaluators are familiar with federal reporting requirements and assist clients with annual and final submissions of their Continuation Reports (e.g., 524B). Sample reports are available upon request.

Evaluation Ethics

EEC's work is aligned with the American Evaluation Association (AEA) Guiding Principles for Evaluators and the Program Evaluation Standards of Utility, Feasibility, Propriety, and Accuracy. All EEC project personnel conduct their work in accordance with these principles and standards.
