About the Evaluation
Outlier’s approach to evaluation is grounded in our commitment to generating useful, practical information for clients that furthers program development, leads to program improvements, and moves programs toward their desired outcomes.
Outlier's Role
Outlier evaluators take on a range of roles during the evaluation process. One of Outlier’s primary roles is that of "critical friend." In this role, Outlier evaluators become part of the project team while still maintaining an outsider point of view. Through regular communication about project activities and findings, Outlier evaluators embrace a collaborative approach to program improvement.
Evaluation Questions, Methodology and Sample
Outlier Research and Evaluation at CEMSE, University of Chicago conducted an evaluation of TECH CORPS’ Techie Clubs and Camps in Ohio. The evaluation examined the TECH CORPS mission and strategies as a whole, as well as the specific strategies and experiences unique to Clubs and Camps. The evaluation had two strands. The first strand focused on working with TECH CORPS leaders to clearly articulate the TECH CORPS mission and model as a whole, and to identify Techie Club– and Camp–specific theories of action. The second strand focused on data collection and analysis of the Ohio Techie Clubs and Camps during the 2013-2014 school year and summer to examine implementation and outcomes.
The evaluation contained both formative and summative components. It provided timely feedback to help inform the design and implementation process, determined the impact of both Techie Clubs and Camps, and produced recommendations for future improvements.
Evaluation Questions
- What is TECH CORPS’ intended model for both Techie Clubs and Camps?
  - What are the specific goals for both Techie Clubs and Camps?
  - How are instructional and program strategies aligned with anticipated youth and instructor outcomes?
- What is the status of the Techie Club and Camp program implementation?
  - Are Camps and Clubs being implemented as intended?
  - What variation in implementation exists across Club and Camp sites, and why?
- To what extent do Techie Clubs and Camps appear to influence students’ perceived knowledge and skills?
  - To what extent do participants report greater knowledge of the fundamentals of Robotics, Programming, Android App or Web Development after participation?
- To what extent do Techie Club and Camp experiences appear to influence youth’s attitudes, self-efficacy, and career aspirations pertaining to computer science and engineering?
- What parts of the TECH CORPS program model (for Clubs and Camps) appear to be essential? Which parts may be further developed or altered?
- To what extent is TECH CORPS positioned to spread Techie Clubs and Camps to other sites?
The evaluation employed a combination of qualitative (e.g., interviews, focus groups) and quantitative (e.g., surveys) data collection methods to answer the six evaluation questions. Data sources included pre- and post-program youth questionnaires, activity observations, volunteer interviews, school leader interviews, coordinator interviews, youth interviews, youth focus groups, and meetings and interviews with TECH CORPS leadership. The table below illustrates the data sources used to answer each evaluation question.
Evaluation Questions and Data Collection
Outlier Philosophy and Principles
Outlier Research and Evaluation is the new name of the Research and Evaluation group at the Center for Elementary Mathematics and Science Education (CEMSE) at the University of Chicago.
Outlier’s mission is to empower those who seek to advance and improve education with the knowledge, tools, and support to realize change.
Outlier evaluators take on a variety of roles during the evaluation process. Evaluators not only provide reflective information to TECH CORPS but also make specific recommendations for improvement.
Two primary principles guide our evaluation work:
- Evaluations must be grounded in a commitment to conducting rigorous studies of significant questions; and
- Evaluation designs must appropriately address the context and conditions of the project and be sensitive to the needs and interests of its stakeholders.