Evaluations are most useful when evaluators make relevant findings available to project partners at key decision-making moments. One way to increase the utility of evaluation findings is to collect real-time data and provide immediate feedback at crucial moments, supporting progress monitoring during service delivery. Through our experience evaluating multiple five-day professional learning institutes for an ATE project, we discovered the benefits of providing real-time evaluation feedback and the vital elements that contributed to the success of this approach.

What did we do?

With project partners we co-developed online daily surveys that aligned with the learning objectives for each day’s training session. Daily surveys measured the effectiveness and appropriateness of each session’s instructional delivery, exercises and hands-on activities, materials and resources, content delivery format, and session length. Participants also rated their level of understanding of the session content and preparedness to use the information. They could submit questions, offer suggestions for improvement, and share what they liked most and least. Based on the survey data that evaluators provided to project partners after each session, partners could monitor what was and wasn’t working and identify where participants needed reinforcement, clarification, or re-teaching. Project partners could make immediate changes and modifications to the remaining training sessions to address any identified issues or shortcomings before participants completed the training.
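To give a sense of how a same-day summary might be produced from an online survey export, the following is a minimal Python sketch. It is illustrative only, not the tooling we used: the file name, column names, 1-to-5 rating scale, and flag threshold are assumptions, and any real workflow would depend on the survey platform and the learning objectives being measured.

# Minimal sketch (illustrative, not the authors' actual tooling): summarize one
# day's survey export so project partners can review results before the next session.
import pandas as pd

def summarize_daily_survey(csv_path: str, flag_threshold: float = 3.0) -> None:
    """Print mean ratings per item and flag items that may need re-teaching."""
    responses = pd.read_csv(csv_path)

    # Hypothetical 1-5 rating columns aligned to the day's learning objectives.
    rating_items = [
        "instructional_delivery",
        "hands_on_activities",
        "materials_resources",
        "understanding_of_content",
        "preparedness_to_use",
    ]

    means = responses[rating_items].mean().round(2)
    print(f"Responses received: {len(responses)}")
    print("Mean ratings (1-5):")
    print(means.to_string())

    # Items below the threshold suggest where clarification or re-teaching
    # might help in the remaining sessions.
    low_items = means[means < flag_threshold]
    if not low_items.empty:
        print("Items to revisit with project partners:")
        print(low_items.to_string())

    # Open-ended comments passed along verbatim for partner review.
    if "suggestions_for_improvement" in responses.columns:
        print("Suggestions for improvement:")
        for comment in responses["suggestions_for_improvement"].dropna():
            print(f"- {comment}")

# Example: run right after a session closes so feedback reaches partners the same day.
# summarize_daily_survey("day1_survey_export.csv")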

Why was it successful?

Through this process, we identified several elements that made the daily surveys useful for immediately improving the professional learning sessions:

  • Invested partners: The project partners recognized the value of the immediate feedback and its potential to greatly improve the trainings. Thus, they made a concentrated effort to use the information to make mid-training modifications.
  • Evaluator availability: Evaluators had to be available to pull the data after hours from the online survey software program and deliver it to project partners immediately.
  • Survey length and consistency: The daily surveys took less than 10 minutes to complete. While tailored to the content of each day, the surveys had a consistent question format that made them easier to complete.
  • Online format: The online format allowed for a streamlined and user-friendly survey. Additionally, it made retrieving a usable data summary much easier and timelier for the evaluators.
  • Time for administration: Time was carved out of the training sessions to allow for the surveys to be administered. This resulted in higher response rates and more predictable timing of data collection.

If real-time evaluation data can inform improvements to, or decisions about, professional learning trainings, it is worthwhile to seek out the resources and opportunities needed to collect and report those data in a timely manner.

About the Authors

Elizabeth Peery

Researcher and Evaluator, Magnolia Consulting, LLC

Beth Peery, Lead Research Assistant with Magnolia Consulting, provides support for a variety of studies through database development and management, data collection, survey management, data analysis, and report writing. At Magnolia Consulting, she assisted with an NSF-funded Advanced Technological Education (ATE) project, which set out to improve the academic-to-workforce pathways of geospatial technologies at several Virginia community colleges. Her educational experience includes in-depth training in quantitative and qualitative data collection and research methodologies.

Stephanie Wilkerson

President, Magnolia Consulting, LLC

Stephanie B. Wilkerson, President of Magnolia Consulting, brings over 20 years of experience working on research and evaluation studies at national, state, and local levels. She has served as the principal evaluator for the NSF-funded Advanced Technological Education (ATE) project with the Virginia Space Grant Consortium and several Virginia community colleges that aims to enhance geospatial technician education to meet workforce demands. Stephanie has directed 32 randomized controlled trials and quasi-experimental evaluations in the areas of STEM, reading, and English language proficiency using mixed-method designs with student assessment data, student interest surveys, instructional practices surveys, and implementation fidelity measures.

EvaluATE is supported by the National Science Foundation under grant numbers 0802245, 1204683, 1600992, and 1841783. Any opinions, findings, and conclusions or recommendations expressed on this site are those of the authors and do not necessarily reflect the views of the National Science Foundation.