The Formative Assessment Systems for ATE project (FAS4ATE) focuses on assessment practices that serve the ongoing evaluation needs of projects and centers. Determining these information needs and organizing data collection activities are complex and demanding tasks, and we’ve used logic models as a way to map them out. Over the next five weeks, we offer a series of blog posts with examples and suggestions for how you can make formative assessment part of your ATE efforts. – Arlen Gullickson, PI, FAS4ATE
Week 4 – Why making changes based on evidence is important
At the Mentor-Connect: Leadership Development and Outreach for ATE project (www.Mentor-Connect.org), formative feedback guides the activities we provide and the resources we develop. It is the compass that keeps us heading in the direction of greatest impact. I’ll share three examples of how feedback at different stages of the project’s life cycle helped us adapt the project. The first came from an outside source; the other two grew out of our internal feedback processes.
The initial Mentor-Connect technical assistance workshop for each cohort focuses on developing grant writing skills for the NSF ATE program. The workshop was originally designed to serve teams of two STEM faculty members from participating colleges; however, grant writers from those colleges approached us because they also wanted to attend. We welcomed these additional participants on a self-pay basis. Post-workshop surveys and conversations with grant writers at the event indicated that we should offer a breakout session during the workshop just for grant writers, so that issues specific to their role in grant development and submission could be addressed. This breakout session was added and is now integral to our annual workshop.
Second, feedback from our mentors about our activities led us to change the frequency of our face-to-face workshops. Mentors reported that the nine-month time lag between the project’s January face-to-face workshop with mentors and the college team’s submission of a proposal the following October made it hard to maintain momentum. Mentors yearned for more face-to-face time with their mentees and vice versa. As a result, a second face-to-face workshop was added the following July. Evaluation feedback from this second gathering of mentors and mentees was resoundingly positive, and the workshop is now a permanent part of Mentor-Connect’s annual programming.
Finally, one of our project outputs helps keep our work on track. We use a brief reporting form that indicates a team’s progress along a grant development timeline. Mentors and their mentees complete and submit the same form independently. When both responses indicate “ahead of schedule,” “on time,” or even “behind schedule,” that consensus is an indicator of good communication between the mentor and his or her college team: they are on the same page. A disconnect between the mentee’s and mentor’s progress reports, on the other hand, is an early alert to the Mentor-Connect team that an intervention with that mentee/mentor team may be needed. Most interventions prompted by this feedback process have been effective in getting the proposal back on track.
PIs of NSF ATE projects have the latitude, and indeed are expected, to make adjustments that improve project outcomes. After all, it is a grant, not a contract, and NSF expects you to behave like a scientist and adjust based on evidence. So don’t be glued to your original plan! Change can be a good thing. The key is to listen to those who provide feedback, study your evaluation data, and adjust accordingly.