As a project PI, have you ever glanced at an evaluation report and wished it had been presented in a different, more useful format?

As an evaluator, have you ever spent hours working on an evaluation report only to find that your client skimmed it or didn’t read it?

In this second part of the conversation, the principal investigator (PI)/client interviews the independent evaluator to unearth key points within our professional relationship that have led to clarity and increased evaluation use. This is a real conversation that took place between the two of us as we brainstormed ideas to contribute to the EvaluATE blog. We believe these key points (understanding of evaluation, evaluation reporting, and “aha” moments) will be useful to other STEM evaluators and clients. Key takeaways are suggested for evaluation clients after each exchange (see our prior post, in which the tables are turned).

Understanding of Evaluation

PI (Manu): What were your initial thoughts about evaluation before we began working together?

Evaluator (Ayesha): “I thought evaluation was this amazing field where you had the ability to positively impact programs. I assumed that everyone else, including my clients, would believe evaluation was just as exciting and awesome as I did.”

Key takeaway: Many evaluators are passionate about their work and ultimately want to provide valid and useful feedback to clients.

Evaluation Reports

PI: What were your initial thoughts when you submitted the evaluation reports to me and the rest of the leadership team?

Evaluator: “I thought you (stakeholders) were all going to rush to read them. I had spent a lot of time writing them.”

PI: Then you found out I wasn’t reading them.

Evaluator: “Yes! Initially I was frustrated, but I realized that maybe, because you hadn’t been exposed to evaluation, I should set up a meeting to sit down and go over the reports with you. I also decided to write brief evaluation memos that had just the highlights.”

Key takeaway: As a client, you may need to explicitly ask for the type of evaluation reporting that will be useful to you. You may need to let the evaluator know that it is not always feasible for you to read and digest long evaluation reports.

Aha Moments

PI: When did you have your “Aha! I know how to make this evaluation useful” moment?

Evaluator: “I had two. The first was when I began to go over the qualitative formative feedback with you. You seemed really excited and interested in the data and recommendations.

“The second was when I began comparing your program to other similar programs I was evaluating. I saw that it was incredibly useful for you to see what their pitfalls and successful strategies were.”

Key takeaway: As a client, you should check in with the evaluator and explicitly state the type of data you find most useful. Don’t assume that the evaluator will know. Additionally, ask whether the evaluator has evaluated similar programs and can share some of the strengths and challenges those programs faced.

About the Authors

Ayesha Boyce

Assistant Professor, Department of Educational Research Methodology, University of North Carolina at Greensboro

Dr. Ayesha Boyce received her Ph.D. in Educational Psychology with a specialization in Evaluation from the University of Illinois Urbana-Champaign. She is an assistant professor at the University of North Carolina at Greensboro. Her research interests focus on addressing issues related to diversity, equity, access, climate, and cultural responsiveness while judging the quality of implementation, effectiveness, impact, and institutionalization of educational programs, especially those that are multi-site and/or STEM. Dr. Boyce has evaluated many programs funded by the National Science Foundation, National Institutes of Health, Title VI, and others. She is the Chair of the American Evaluation Association STEM TIG.

Manu Platt

Associate Professor, Department of Biomedical Engineering, Georgia Institute of Technology and Emory University

Dr. Manu O. Platt earned his B.S. in Biology from Morehouse College and Ph.D. in Biomedical Engineering from Georgia Institute of Technology and Emory University. After postdoctoral work at MIT, he returned to Georgia Tech/Emory where he was recently promoted and tenured. The Platt Lab studies strokes in children with sickle cell disease, HIV-mediated cardiovascular disease, and predictive medicine in cancer. He is also Diversity Director for the NSF Center on Emergent Behaviors of Integrated Cellular Systems (EBICS). He co-founded and co-directs Project ENGAGES, a biotech and engineering research program for African-American high school students in Georgia Tech laboratories. Website: Platt Lab


EvaluATE is supported by the National Science Foundation under grant numbers 0802245, 1204683, 1600992, and 1841783. Any opinions, findings, and conclusions or recommendations expressed on this site are those of the authors and do not necessarily reflect the views of the National Science Foundation.