A good prompt to start thinking about how to approach the evaluation of an Advanced Technological Education (ATE) professional development (PD) project is the ATE program solicitation. Regarding PD grants, the solicitation states that “projects should be designed to enhance the educators’ disciplinary capabilities, teaching skills, and understanding of current technologies and practices, and employability skills.” It further recommends the “evaluation should demonstrate use in the classrooms and sustainable changes in practice of participating faculty and teachers leading to more qualified technicians for the industry. Changes in student learning outcomes as well as students’ perceptions of technical careers should be assessed” (National Science Foundation, p. 5).

ATE grants span multiple years, but sustainable, lasting systemic change is the long-term goal. It is important to consider the potential for systemic change as the project begins and to build in realistic indicators that project activities are influencing the system. The following are some tips to consider when evaluating PD projects.

  1. Evaluate the design and process of PD interventions, as well as the outcomes. This is especially helpful for formative evaluation, which provides feedback for improving interventions while they’re underway. It’s also critical for illuminating the strengths and weaknesses of a PD effort to aid in understanding why certain outcomes were or were not achieved. Learning Forward’s Standards for Professional Learning and the Southern Regional Education Board’s Standards for Online Professional Development are good sources of information about what high-quality PD looks like. Fellow instructors or program deans with content knowledge can be helpful collaborators and internal evaluators, providing feedback on the quality of the content, instruction, and materials.
  2. Don’t reinvent the wheel with your evaluation design. PD is one of a relatively few areas where there are well-established frameworks for evaluation. Donald Kirkpatrick was the guru of PD evaluation and the originator of the “Four Levels” approach. Thomas Guskey adapted the Kirkpatrick model specifically for education contexts and defined five levels of professional learning evaluation. Jack and Patti Phillips bring a return-on-investment perspective to this work. Check out their materials for great ideas for framing your PD evaluation and for guidance in determining which data and data sources to employ. Joellen Killion brings these models together in her book Assessing Impact, which offers six levels to consider: reaction, learning, organizational support, application, impact on students, and return on investment.
  3. Once you embrace the “levels” approach to PD evaluation, project stakeholders can work collaboratively to define the intended outcomes for each level, along with the data collection methods and sources for evaluating them. One way to focus this work is to recall the National Science Foundation’s interest in impacting (1) educators’ disciplinary capabilities, teaching skills, and understanding of current technologies and practices, and employability skills, and (2) students’ learning outcomes and perceptions of technical careers.
  4. If a professional learning community (e.g., community of practice, virtual learning community) is involved, pay special attention to capturing the nature of the interactions and associated learning among participants. In this type of PD initiative, assessing process is crucial. To learn more about evaluating professional communities, see Etienne and Beverly Wenger-Trayner’s overview of communities of practice.

Online PD presents its own set of evaluation challenges, but tools and frameworks are available to evaluate it successfully. Back-end analytics are available through various online platforms, and because those records are kept automatically, the technology may actually make evaluation easier.

ADDITIONAL RESOURCES

The Evaluation Exchange’s special issue on professional development (see especially the article by Spicer et al. about online professional development).

Example professional development follow-up survey developed by the ATE project, Destination Problem-Based Learning.

The Student Assessment of Their Learning Gains Instrument for use by college instructors to “gather learning-focused feedback from students.”

* This blog is based on a handout from an EvaluATE workshop at the 2011 ATE Principal Investigators Conference.

About the Authors

Terryll Bailey


Founder and President, The Allison Group

Terryll Bailey is founder and president of The Allison Group in Seattle, Washington. The consulting firm specializes in workforce development research and evaluation. With over 20 years of experience, Bailey is the external evaluator for many National Science Foundation projects and centers across multiple programs—including Advanced Technological Education, S-STEM, Improving Undergraduate STEM Education, and Hispanic-Serving Institutions—as well as not-for-profit foundations. The Allison Group has a collaborative approach to evaluation, with the goal of integrating evaluative thinking and activities into the project to focus on evidence of impact on individuals and organizations.

Lori Wingate


Executive Director, The Evaluation Center at Western Michigan University

Lori has a Ph.D. in evaluation and more than 20 years of experience in the field of program evaluation. She is co-principal investigator of EvaluATE and leads a variety of evaluation projects at WMU focused on STEM education, health, and higher education initiatives. Dr. Wingate has led numerous webinars and workshops on evaluation in a variety of contexts, including CDC University and the American Evaluation Association Summer Evaluation Institute. She is an associate member of the graduate faculty at WMU.

Creative Commons

Except where noted, all content on this website is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License.


EvaluATE is supported by the National Science Foundation under grant numbers 0802245, 1204683, 1600992, and 1841783. Any opinions, findings, and conclusions or recommendations expressed on this site are those of the authors and do not necessarily reflect the views of the National Science Foundation.