At the STEM Program at WestEd, we are in the third year of an evaluation of an innovative, hands-on STEM curriculum. Learning by Making is a two-year high school STEM course that integrates computer programming and engineering design practices with topics in earth/environmental science and biology. Experts in physics, biology, environmental science, and computer engineering at Sonoma State University (SSU) developed the curriculum by integrating computer software with custom-designed experiment setups and electronics to create inquiry-based lessons. Throughout this project-based course, students apply mathematics, computational thinking, and the Next Generation Science Standards (NGSS) Science and Engineering Practices to ask questions about the world around them and seek the answers. Learning by Making is currently being implemented in rural California schools, with a deliberate effort to enroll girls and students from minority backgrounds, who are currently underrepresented in STEM fields. You can listen to students and teachers discussing the Learning by Making curriculum here.

Using a Logic Model to Drive Evaluation Design

We derived our evaluation design from the project’s logic model. A logic model is a structured description of how a specific program achieves its intended learning outcomes; its purpose is to describe precisely the mechanisms behind the program’s effects. Our approach to the Learning by Making logic model is a variant of the five-column format that describes a program’s inputs, activities, outputs, outcomes, and impacts (W.K. Kellogg Foundation, 2014).

Learning by Making Logic Model

Logic models are read as a series of conditionals. If the inputs exist, then the activities can occur. If the activities do occur, then the outputs should occur, and so on. Our evaluation of the Learning by Making curriculum centers on the orange arrows connecting outputs to outcomes in the logic model above. These connections fall into two primary areas for evaluation: 1) teacher professional development, and 2) classroom implementation of Learning by Making. The questions that correspond to the orange arrows can be summarized as:

  • Are the professional development (PD) opportunities and resources for the teachers increasing teacher competence in delivering a computational thinking-based STEM curriculum? Does Learning by Making PD increase teachers’ use of computational thinking and project-based instruction in the classroom?
  • Does the classroom implementation of Learning by Making increase teachers’ use of computational thinking and project-based instruction in the classroom? Does classroom implementation promote computational thinking and project-based learning? Do students show an increased interest in STEM subjects?

Without effective teacher PD or classroom implementation, the logic model “breaks,” making it unlikely that the desired outcomes will be observed. To answer our questions about outcomes related to teacher PD, we used comprehensive teacher surveys, observations, bi-monthly teacher logs, and focus groups. To answer our questions about outcomes related to classroom implementation, we used student surveys and assessments, classroom observations, teacher interviews, and student focus groups. SSU used our findings to revise both the teacher PD resources and the curriculum itself, better positioning these two components to produce the intended outcomes. By deriving our evaluation design from a clear and targeted logic model, we were able to provide actionable feedback to SSU aimed at keeping Learning by Making on track to achieve its goals.
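Read this way, a logic model behaves like a short chain of conditionals, and that reading can be made concrete with a small sketch. The Python example below is purely illustrative (the stage labels and the pass/fail findings are hypothetical, not data from this evaluation): it shows how a failure at one stage, such as weak classroom implementation at the outputs stage, means the downstream outcomes can no longer be attributed to the program, i.e., the chain “breaks.”

```python
# A minimal sketch, not part of the Learning by Making evaluation itself, showing how a
# logic model can be read as a chain of conditionals: each stage is only expected to
# hold if the stage before it held. Stage names and findings are illustrative placeholders.

from collections import OrderedDict

def evaluate_chain(stages):
    """Walk the stages in order; once one stage fails, every downstream
    stage is flagged as 'not expected', i.e., the chain has 'broken'."""
    upstream_ok = True
    results = {}
    for name, observed in stages.items():
        if not upstream_ok:
            results[name] = "not expected (chain broke upstream)"
        elif observed:
            results[name] = "supported"
        else:
            results[name] = "not supported: chain breaks here"
            upstream_ok = False
    return results

# Hypothetical findings for one classroom: inputs and activities occurred,
# but classroom implementation (an output) fell short of fidelity.
stages = OrderedDict([
    ("inputs",     True),   # e.g., curriculum materials and PD resources exist
    ("activities", True),   # e.g., teachers attend PD
    ("outputs",    False),  # e.g., classroom implementation falls short
    ("outcomes",   True),   # once outputs fail, observed outcomes are "not expected"
])

for stage, verdict in evaluate_chain(stages).items():
    print(f"{stage}: {verdict}")
```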

About the Authors

Linlin Li

Senior Research Associate, WestEd

Dr. Linlin Li is a Senior Research Associate for the Science, Technology, Engineering, and Mathematics (STEM) program at WestEd. Linlin is a principal investigator or co-principal investigator for several projects, including 1) a Department of Education Investing in Innovation (i3)-funded evaluation of a science-driven, computational thinking-integrated STEM intervention designed to improve mathematics and science proficiency for high-needs rural students, 2) an Institute of Education Sciences (IES) Goal 3 efficacy study of a vocabulary intervention in elementary schools, and 3) evaluation and technical assistance for an elementary school district’s Race to the Top-District (RTTT-D) data systems and blended learning project. Linlin is also the senior methodologist and statistician on several federally funded projects. She applies modern statistical techniques, including psychometric analysis, multilevel analysis, and longitudinal modeling, in practical evaluation and research settings.

Rachel Tripathy

Research Associate, WestEd

Rachel Tripathy is a Research Associate for the Science, Technology, Engineering, and Mathematics (STEM) program at WestEd. She holds an M.A. in International Comparative Education and has extensive experience working on education projects in developing regions, with a focus on randomized trials and quantitative research methods. Her research interests include early grade literacy development, environmental education, and hands-on/discovery learning. Rachel has worked with Bay Area educators and researchers to design and implement studies examining the efficacy of hands-on environmental science and outdoor education programs, and she continues to participate in research on interactive math and science curricula in her work at WestEd. Rachel also manages WestEd’s evaluation of the i3-funded Learning by Making curriculum, in partnership with Sonoma State University, where she designs and implements fidelity and efficacy measures and contributes to reporting and dissemination.
