I am wrestling with a wicked evaluation problem: How do I balance evaluation, research, and technical assistance work when the three are so interconnected? Below, I discuss strategies for managing these overlapping roles and the implications of evaluating something that you are simultaneously trying to change.


Background

In 2017, the National Science Foundation solicited proposals calling for researchers and practitioners to partner in conducting research that directly informs problems of practice through the research-practice partnership (RPP) model. I work on one project funded under this solicitation: Using a Researcher-Practitioner Partnership Approach to Develop a Shared Evaluation and Research Agenda for Computer Science for All (RPPforCS). RPPforCS aims to learn how projects supported under this funding are conducting research and improving practice. It also brings together a community of researchers and evaluators from across the funded partnerships for collective capacity building.


The Challenge

The RPPforCS work requires a dynamic approach to evaluation, and it challenges conventional boundaries among research, evaluation, and technical assistance. I am both part of the evaluation team for individual projects and part of a program-wide research project that aims to understand how projects are using an RPP model to meet their computer science and equity goals. Given the novelty of the program and research approach, the RPPforCS team also supports these projects with targeted technical assistance to improve their ability to use an RPP model; the ideas for that assistance typically come out of what we are learning across projects.


Examples in Practice

The RPPforCS team examines changes through a review of project proposals and annual reports, yearly interviews with a member of each project, and an annual community survey. Through these data collection mechanisms, we ask about the impact of the technical assistance on each project's functioning. Rigorously documenting how the technical assistance aspect of our research project influences projects' work allows us to track change effected by the RPPforCS team separately from change stemming from within the individual project.

We use the technical assistance (e.g., tools, community meetings, webinars) both to help projects further their goals and as research and evaluation data collection opportunities for understanding partnership dynamics. The technical assistance tools are all shared through G Suite, allowing us to see how the teams engage with them. Teams are also able to use these tools to improve their partnership practice (e.g., using our Health Assessment Tool to establish shared goals with partners). Structured table discussions at our community meetings help us understand more about the specific elements of partnership demonstrated within a given project. We share all of our findings with the community on a frequent basis to foreground the research effort, while still providing necessary support to individual projects.


Hot Tips

  • Rigorous documentation: The best way I have found to account for our external impact is rigorous documentation. This may sound like a basic approach to evaluation, but it is the easiest way to track change over time and to distinguish change you have introduced from organic change coming from within the project (see the sketch after this list).
  • Multi-use activities: Turn your technical assistance into a data collection opportunity. It both builds capacity within a project and gives you access to information for your own evaluation and research goals.
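To make the documentation tip concrete, here is a minimal sketch of what source-tagged change documentation could look like. It is purely illustrative and not part of the RPPforCS toolkit: the project name, fields, and records are hypothetical, and the same structure works equally well as columns in a spreadsheet.

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class ChangeRecord:
    """One documented change in a project's partnership practice."""
    project: str      # hypothetical project identifier
    observed: date    # when the change was documented
    description: str  # what changed
    source: str       # "technical_assistance" (introduced by us) or "organic"
    evidence: str     # where it was documented (interview, report, survey)

# Hypothetical log entries illustrating the two kinds of change
log = [
    ChangeRecord("Project A", date(2019, 4, 2),
                 "Adopted shared goal-setting with district partners",
                 "technical_assistance", "Health Assessment Tool debrief"),
    ChangeRecord("Project A", date(2019, 9, 15),
                 "Added a practitioner co-lead to the research team",
                 "organic", "annual report"),
]

# Tagging the source of each change at intake lets us separate change
# we effected from change arising within the project itself.
ta_driven = [r for r in log if r.source == "technical_assistance"]
organic = [r for r in log if r.source == "organic"]
```

The key design choice is recording attribution at the moment a change is documented, rather than trying to reconstruct after the fact which changes traced back to the technical assistance.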

About the Authors

Stacey Sexton


Evaluator, SageFox Consulting Group

Stacey Sexton is an evaluator with SageFox Consulting Group. For the past five years, they have worked on educational assessment, with a particular focus on student learning outcomes assessment. Stacey is particularly interested in projects that examine the student transition from K-12 to postsecondary education and projects that have the potential to positively impact the most marginalized student populations. Stacey approaches their work with a firm commitment to the principles of equity, inclusion, and social justice, and it is their belief that well-done evaluations are critical to moving educational practice forward to reflect these principles. Outside of work, you are likely to find Stacey doing some form of political organizing. Stacey also enjoys gardening, hiking, and taking pictures of their dog.



EvaluATE is supported by the National Science Foundation under grant numbers 0802245, 1204683, 1600992, and 1841783. Any opinions, findings, and conclusions or recommendations expressed on this site are those of the authors and do not necessarily reflect the views of the National Science Foundation.