Our work is guided by:
- Collaboration - We believe we can do better together
- Passion - We care deeply about improving education
- Innovation - We seek new and different approaches
Implementation measurement has two facets. The first is “innovation implementation”: the extent to which an intervention or program itself is enacted. Some refer to this as “implementation fidelity” or “treatment integrity.” The second facet is the “implementation process”: the contextual factors that may support and/or inhibit innovation implementation.
We have developed a suite of instruments for measuring implementation of instructional resources. We can customize them to fit your program, or, if you use Everyday Mathematics, FOSS, STC, or BSCS Science Tracks, we already have instruments for those programs. The instruments (below) allow for a variety of data sources and may be used in different combinations according to your questions and needs. Once we learn more about your needs and interests, we can recommend a customized plan. To learn more, please contact Amy Cassata.
To provide customized instruments when needed, we have developed an instrument customization process. This involves identifying your program/intervention “components” from curriculum materials, designs, resources, agendas, and other relevant documents. We will work with you to determine which of these components are essential. We will then place the components in our conceptual framework and use that framework to guide the instrument customization.
This includes a review of items within existing instruments to determine which will be retained and which are in need of revision. We will also work with you to identify components that are not measured in the existing instruments but that are present in your program, and create additional items to measure them. During this process, if you are interested in fidelity of implementation, we will ask you to provide an implementation fidelity benchmark (the intended level of enactment) associated with each item.
We can make any of the instruments available for use with or without our assistance.
There is a $250 annual use fee for the questionnaires. There is no fee for the interview protocols.
| Instrument | Innovation Implementation | Implementation Process |
|---|---|---|
| Teacher Instructional Questionnaire | ✔ | |
| Teacher Factor Questionnaire | | ✔ |
| Teacher Instructional Log | ✔ | |
| Classroom Observation Protocol | ✔ | |
| School Leader Questionnaire | ✔ | ✔ |
| Teacher Interview Protocol | ✔ | |
| School Leader Interview Protocol | | ✔ |
The Questionnaires, the Classroom Observation Protocol, and the Teacher Instructional Log are designed to be administered online. We have online interfaces for each instrument that can be included in a customized implementation measurement plan. All data are housed on a secure, password-protected server.
Costs vary depending on the number of instruments used.
New users of the Observation Protocol are required to participate in a 3-day in-person or online training session led by Outlier staff. The training helps users develop a shared understanding of the coding protocol, rating rules, and illustrative examples needed to use the protocol reliably. Training activities include watching and practice-coding selected videos of sample lessons. Two inter-rater reliability analyses (one at the midpoint and one at the end of training) are conducted as part of the training. Training costs vary depending on the size of your team, your location, the type of training (in person or online), and the data housing method.
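Inter-rater reliability analyses like those described above are commonly summarized with an agreement statistic such as Cohen's kappa, which corrects raw percent agreement for agreement expected by chance. The sketch below is a generic illustration of that statistic, not a description of Outlier's actual analysis, and the category labels are invented for the example.

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa for two raters assigning categorical codes.

    kappa = (p_observed - p_expected) / (1 - p_expected), where
    p_expected is chance agreement from each rater's marginal rates.
    Undefined (division by zero) if both raters use a single category.
    """
    assert len(rater_a) == len(rater_b) and rater_a
    n = len(rater_a)
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    freq_a, freq_b = Counter(rater_a), Counter(rater_b)
    expected = sum(
        (freq_a[c] / n) * (freq_b[c] / n) for c in set(freq_a) | set(freq_b)
    )
    return (observed - expected) / (1 - expected)

# Two coders rating four lesson segments (hypothetical codes).
coder_1 = ["high", "high", "low", "low"]
coder_2 = ["high", "low", "low", "low"]
kappa = cohens_kappa(coder_1, coder_2)  # 0.5: moderate agreement
```

In practice a training team would compute this per code category and retrain coders whose agreement falls below a chosen threshold.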
If requested, Outlier is available to prepare the data for analysis. This preparation involves data cleaning (e.g., omitting duplicate responses, accounting for incomplete and/or missing data), exporting the data into an MS Excel spreadsheet or analysis software program, and creating a codebook to specify items and responses. Costs vary depending on the size and type of data set.
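The preparation steps described above (omitting duplicate responses, excluding incomplete records, exporting to a spreadsheet-readable format) can be sketched generically. The field names and cleaning rules below are illustrative assumptions for the example, not Outlier's actual pipeline, which would depend on the instruments used.

```python
import csv
import io

def clean_responses(rows, required=("respondent_id", "q1")):
    """Drop exact duplicate responses and rows missing required fields."""
    seen = set()
    cleaned = []
    for row in rows:
        key = tuple(sorted(row.items()))
        if key in seen:
            continue  # omit duplicate responses
        if any(not row.get(field) for field in required):
            continue  # exclude incomplete/missing data
        seen.add(key)
        cleaned.append(row)
    return cleaned

# Hypothetical questionnaire responses.
rows = [
    {"respondent_id": "T01", "q1": "3"},
    {"respondent_id": "T01", "q1": "3"},  # duplicate submission
    {"respondent_id": "T02", "q1": ""},   # incomplete response
    {"respondent_id": "T03", "q1": "5"},
]
cleaned = clean_responses(rows)

# Export to CSV, which spreadsheet and analysis software can open.
buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=["respondent_id", "q1"])
writer.writeheader()
writer.writerows(cleaned)
```

A codebook would accompany such an export, listing each item, its question text, and the meaning of each response code.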
To learn more about the ways we can customize these services for you, contact:
State, district, and school leaders are establishing growing numbers of STEM schools. To support these efforts, we are making the processes and tools we developed in our research available to others.
We have worked with over 25 schools to carefully and precisely articulate each school's model. Our model articulation process includes in-depth interviews with key personnel. We synthesize this information and, through an iterative process, develop a clear description of the school model. We present the models visually, in a way that helps schools communicate about themselves to funders, partners, teachers, students, and the greater STEM community.
A key part of our STEM school work is focused on understanding how STEM schools and programs implement the essential components of their models. We can work with you to measure implementation of your STEM school or program using questionnaires, observations, interviews, and focus groups. We can present the results of our measurement in a variety of ways to accommodate the unique needs and differences of each school that we work with.
To learn more about how we can support STEM schools, contact:
Evaluations can provide valuable information about a range of issues including your program’s implementation, on-going growth, and effectiveness.
You may be looking for an evaluator because your funder has requested a third-party evaluation; you want to improve your program; you want to share information about program success; you need help describing and understanding your program; or one of many other reasons.
We can help you decide on the best type of evaluation depending on your needs.
Some questions we will explore with you:
We can help clarify your program model and theory of action. This will enable you to speak clearly about your strategy and enable the evaluation to examine which parts of your program are moving you toward your goals and which need improvement.
Our team is experienced with a range of evaluation designs. Given your needs and circumstances, we may suggest a pre-post approach, a time series, a comparison group design, case study, or a combination of these and others.
We have expertise with the development and administration of a range of data collection instruments including questionnaires, interviews, focus groups, and observations as well as using a range of data sources including documents, participant work product, artifacts, and secondary data. We will work with you to identify sources that are robust and feasible.
The evaluation design and data sources we select with your input will necessarily be tied to appropriate analyses. We are experienced with both qualitative and quantitative analysis techniques including regression, time series analysis, comparison studies and propensity score matching, inductive qualitative analysis, data reduction and psychometric techniques including confirmatory and exploratory factor analysis, secondary (archival) data analysis, and concept mapping.
You may want to keep evaluation findings entirely confidential, or you may want to publicize them widely. In either case, we want to ensure that our reporting meets your needs. We can create data visualizations, online reports, presentations, single page summaries, and full print reports.
See all of our evaluation projects.
From the Boston Schoolyard Initiative Evaluation Report
See more data visualizations.
To learn more about how we can customize an evaluation plan for you, contact: