How can ATE project staff, and STEM educators in general, tell whether the strategies they are implementing to increase diversity are actually reaching the targeted students, and whether those students find those strategies helpful?
I’m very passionate about using evaluation and data to support the National Science Foundation’s (NSF’s) goal of broadening participation in STEM education. In IWITTS’ CalWomenTech Project, we provided technical assistance to seven community colleges in California between 2006 and 2011 to help them recruit and retain female students in technology programs where they were underrepresented. Six of the seven CalWomenTech colleges increased female enrollment in targeted introductory technology courses, and four colleges substantially increased both female and male completion rates (six colleges increased male retention). So how could the CalWomenTech colleges tell, during the project, whether the strategies they were implementing were helping female technology students?
The short answer is that the CalWomenTech colleges knew because 1) the project measured increases in female (and male) enrollment and completion numbers in as close to real time as possible; and 2) it asked the female students in the targeted classes whether they had experienced project strategies, found those strategies helpful, and wanted to experience strategies they had not yet encountered.
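To make that first point concrete, here is a minimal sketch of how a college might track those numbers each term. This is purely illustrative, not the project’s actual tooling, and it assumes a hypothetical roster export named roster.csv with course, term, gender, and completed columns:

```python
# Minimal sketch of tracking enrollment and completion rates by gender.
# Assumes a hypothetical CSV export (roster.csv) with columns:
#   course, term, gender, completed (1 if the student completed, else 0)
import pandas as pd

roster = pd.read_csv("roster.csv")

summary = (
    roster.groupby(["course", "term", "gender"])
          .agg(enrolled=("completed", "size"),
               completed=("completed", "sum"))
)
summary["completion_rate"] = summary["completed"] / summary["enrolled"]

print(summary)
```

Reviewing a table like this every term, rather than only at the end of the grant, is what makes it possible to adjust strategies while the targeted students are still enrolled.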
What I want to focus on here is how the CalWomenTech Project was able to use the findings from those qualitative surveys. The project’s external evaluators developed an anonymous “Survey of Female Technology Course Students” that was distributed across the colleges. The survey asked about the classroom retention strategies instructors had been trained on as part of the project, about recruitment strategies, and about student demographics. The first time we administered the survey, 60 female students responded (out of 121 surveyed) across the seven CalWomenTech colleges. Each college was also provided with the female survey data filtered to its own students.
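To give a feel for what “filtered to its own students” means in practice, here is another minimal, purely illustrative sketch. It assumes a hypothetical survey_responses.csv where each row is one respondent, with a college column and 0/1 strategy_* columns for each strategy the survey asked about:

```python
# Minimal sketch of summarizing anonymous survey responses per college.
# Assumes a hypothetical CSV (survey_responses.csv) with a "college" column
# and 0/1 columns named strategy_* for each strategy in the survey.
import pandas as pd

responses = pd.read_csv("survey_responses.csv")
strategy_cols = [c for c in responses.columns if c.startswith("strategy_")]

for college, group in responses.groupby("college"):
    print(f"\n{college}: {len(group)} respondents")
    # Share of this college's respondents reporting exposure to each strategy
    print(group[strategy_cols].mean().round(2))
```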
Fifty percent or more of the 60 survey respondents reported exposure to over half the retention strategies listed in the survey. One of the most important outcomes was that the CalWomenTech colleges were able to use the results to choose which strategies to focus on. Instructors who saw the results during a site visit or monthly conference call came up with ways to start incorporating the strategies female students requested into their classrooms. For example, after seeing how many female students wanted to try out a leadership role in class, one STEM instructor planned to assign leadership roles in group projects randomly so that men would not take the leadership role more often than women.
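That random-assignment idea is easy to put into practice. Here is a minimal sketch, not the instructor’s actual procedure and with made-up group rosters, of picking group leaders at random:

```python
# Minimal sketch: choose each project group's leader at random so leadership
# does not default to the same students. Group rosters here are made up.
import random

def assign_leaders(groups):
    """Map each group name to a randomly chosen leader."""
    return {name: random.choice(members) for name, members in groups.items()}

groups = {
    "Group A": ["Ana", "Ben", "Chris"],
    "Group B": ["Dee", "Eli", "Fran"],
}
print(assign_leaders(groups))
```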
To hear about more evaluation lessons learned, watch the webinar “How well are we serving our female students in STEM?” or read more about the CalWomenTech survey of female technology students here.
Human Subjects Alert: If you administer a survey like this to a specific group of students and there are only a few such students in the program, the responses are not truly anonymous. Be very careful about how the responses are shared and with whom, since this kind of survey collects confidential information that could harm respondents if disclosed.