Evaluation Management

Managing an evaluation involves allocating and using resources—especially people, money, and time—effectively to carry out an evaluation. Plans for resource use are communicated in formal documents such as budgets, work plans, and contracts or memoranda of agreement. 

People: Professional evaluators have credentials and experience that prepare them for a variety of technical, analytic, and interpersonal activities. Evaluators often involve staff from the projects they are evaluating in planning or conducting the evaluation. Good communication between evaluators and project staff is key to a successful evaluation. 

Money: The cost of an evaluation depends mostly on its scope, because that determines how much personnel time is required. Travel, materials, and overhead costs also affect the overall evaluation budget.     

Time: Decisions about how much time is needed for an evaluation and how to use that time depend on the project’s duration and schedule. The evaluation’s scope and when information is needed for decision making must also be considered.  


Stakeholder Engagement

Evaluation stakeholders are people who are involved in or affected by the project being evaluated or the evaluation itself. When stakeholders are involved, they are more likely to care about the evaluation and use the results. 

Involving stakeholders helps increase the relevance and usefulness of an evaluation. Stakeholders may also be able to minimize roadblocks that evaluators face when conducting an evaluation.  

Stakeholders can be engaged in a variety of ways throughout an evaluation. The nature and level of that involvement should match their interest and availability to participate.  

Typical stakeholders in ATE evaluation: 

    • Project principal investigators and other staff 
    • Faculty, staff, and students involved with the project 
    • Administrators at the host institution 
    • Industry partners 
    • National Science Foundation program officers 


Evaluation Contextualization

Evaluations are planned for specific projects in specific settings.

Logic models are useful for tailoring evaluations to specific projects. Logic models highlight project inputs, activities, outputs, and outcomes. A project logic model is a useful reference point for evaluation planning and helps ensure a shared understanding of what is being evaluated.   

Talking with the people who are in a position to use information from the evaluation is critical for making sure the evaluation will address their needs. What they need to know and how they plan to use the information are primary considerations when planning an evaluation. It’s also helpful to know what kinds of evidence they value. 

All projects occur in unique settings. Decisions about a project’s evaluation take context into account to make sure the evaluation is feasible and relevant. Culture is present in all evaluations. Socio-economic, geographic, environmental, and social factors that influence projects may also need to be considered when carrying out an evaluation. 


Evaluation Design

Designing an evaluation involves making a series of decisions about which aspects of the project the evaluation will focus on and how to structure the inquiry. 

This process begins with developing evaluation questions. Evaluation questions identify the aspects of a project that will be evaluated; these aspects might include a project’s impact on students, its effectiveness in meeting workforce needs, or the number and characteristics of students and faculty who benefitted from the project. Evaluation questions reflect what the project is designed to do and what the evaluation will measure, as in these examples: 

“To what extent did the program influence the teaching practices of participating faculty?”  

“What is the program’s impact on students’ employability skills?”   

Planning how to answer evaluation questions involves deciding how to collect data and from what sources. Some questions are best addressed by setting up comparison or control groups. Other questions can be answered with data available from institutional records. Surveys, interviews, and focus groups are common data collection methods in evaluation. Which methods you choose depends on the questions driving the evaluation. Most evaluation questions are best addressed by using both qualitative and quantitative data.  

Decisions about evaluation design also have to take into account what’s feasible, ethical, and culturally appropriate.  

Featured Resources

Designing a Purposeful Mixed Methods Evaluation



Data Collection & Analysis

Information for an evaluation is obtained through systematic procedures. The collected information is then analyzed. Analysis is the process of organizing and transforming raw data into evaluation findings. 

Evaluators use instruments—such as questionnaires, interview questions, and observation protocols—to collect information in a structured manner. Sometimes evaluators can find existing instruments that are appropriate. More often than not, evaluators must create new ones tailored to the specific project and evaluation questions.  

The raw data collected for an evaluation (usually in the form of numbers or words) then have to be transformed into usable information through analysis. This process generates findings that serve as the evidence evaluators will use to answer the evaluation questions. 

Featured Resources

Effective Communication Strategies for Interviews and Focus Groups



Interpretation

Interpretation is the process of making sense of analyzed data to answer evaluation questions.  

Interpretation involves comparing findings about project performance to targets or benchmarks in order to reach conclusions. 

Another aspect of interpretation is developing recommendations based on evaluation findings and conclusions. Sometimes evaluators develop recommendations to be considered by project staff. Other times, project staff and evaluators work together to recommend actions to take based on the evaluation.  

Featured Resources

Strategies and Sources for Interpreting Evaluation Findings to Reach Conclusions



Communication & Use of Results

Formal communication about evaluation includes describing the evaluation process and results in a way that stakeholders can understand and use.  

Formal evaluation reports typically describe the project that was evaluated; the evaluation process; and the evaluation findings, conclusions, and recommendations. But communicating results can take many other forms, from one-on-one interactions to “data parties” to peer-reviewed articles. The format and content of these communications depend on the audience’s interest level and how they will use the information. 

Reports—whatever form they take—are the vehicle for conveying evaluation information to the people who can use it. Evaluations get “used” when the information leads to a change in the project, its host organization, or the people involved. Using evaluative information to identify opportunities to improve projects is one of the most important purposes of evaluation. 


Quality Assurance

Quality assurance involves taking steps to ensure that evaluation plans, activities, and products are sufficiently useful, practical, ethical, and accurate.   

Evaluators can get feedback on their evaluation work from colleagues as well as project stakeholders and advisors. Or a project may engage another evaluator to conduct a formal evaluation of the evaluation (metaevaluation). 

The Joint Committee on Standards for Educational Evaluation has developed standards for educational program evaluation. These 30 standards address five domains: 

    • Utility 
    • Feasibility 
    • Propriety 
    • Accuracy 
    • Accountability 

Featured Resources

Checklist of The Program Evaluation Standards Statements


EvaluATE is supported by the National Science Foundation under grant numbers 0802245, 1204683, 1600992, and 1841783. Any opinions, findings, and conclusions or recommendations expressed on this site are those of the authors and do not necessarily reflect the views of the National Science Foundation.