Myth Busting Evaluation

Numerous myths surround evaluation in general, and outcome-based evaluation more specifically. In fact, evaluation can evoke fear and skepticism among stakeholders across the community-based sector. If the descriptions below sound familiar, you may have fallen prey to one of these common myths.

The need to get it right versus try and learn

Myth-talk:

“Outcome evaluation is challenging and requires skills that we do not have.  Funders have high expectations and we cannot afford to get it wrong. We must get it right if we are going to do it at all.”

Myth-busting!

While the notion of getting it right has some merit, in that doing evaluation well certainly has advantages and can make people's lives easier, this perspective tends to inhibit a willingness to try. The Getting it Right attitude perpetuates an all-or-nothing approach, which is not ideal for staff development and capacity building. In reality, the field of evaluation is filled with diverse perspectives, and its body of literature and evidence base is developing all the time. There are many approaches to evaluation, and Getting it Right should be replaced with a Try and Learn attitude. It is important to acknowledge that staff deliver programs and services, work on the front lines, and often observe signs of change; this effectively makes key staff experts in identifying and documenting intended results.

We already know versus challenging our assumptions

Myth-talk:

“Our organization and staff already know exactly what our clients need.  We are listening all the time and we strive on a daily basis to meet their needs. We do not need evaluation to tell us what people need; it is a waste of time because we already know.”

Myth-busting!

Organizations cannot know what they don't know; key insights from evaluation efforts may take us by surprise and help shift our thinking and approach. Often the key to success is staying in frequent contact with clients and other community stakeholders and creating opportunities for anonymous feedback. Outcome-based evaluation ensures that your knowledge of client needs remains current and evidence-based, and it supports continuous improvement of program interventions and approaches to service delivery. In fact, data collected throughout evaluation projects can inform innovations and service add-ons while challenging our assumptions.

Just get it over with versus internal utility

Myth-talk:

“We must do evaluation because the funders are forcing us to, so we should just get it over with and send them a report. Our funding relies on pleasing the funders and we need to justify the dollars invested by providing them with results.”

Myth-busting!

This myth paints evaluation as a superficial and meaningless endeavor. A Just Get It Over With mentality can compromise the quality of evaluation work because the process becomes essentially reactive. Instead of serving to please an external audience, evaluation efforts should be valued for their internal utility. Taking the time to create an evaluation plan and then setting up processes for outcome tracking will ensure that evaluation has ongoing value to the organization and its staff.

No time to learn versus we don’t have time not to learn

Myth-talk:

“Our time is at a premium and evaluation is time consuming; we have no time to take on evaluation. Evaluation is for academics, not implementers. The time it would take us to become experts in evaluation is far beyond our organizational capacity.”

Myth-busting!

Evaluation and reporting are part of good management. Senior management needs to know whether the organization is effective, and assessing performance necessarily requires evaluation practices. Getting started on evaluation work requires some basic skills, but they are not difficult to learn. Every organization can make time for some training, and can dedicate time to tracking and monitoring even on a moderate budget. The key is to ensure that learning and applying evaluation skills produces a return to the organization, so that the time invested is not wasted. Detailed evaluation reports have often resulted in funders awarding additional funds and resources, so taking the time to learn literally pays off.

Do it at the end versus ongoing reflection and learning

Myth-talk:

“After our programs are finished, we can evaluate them. We cannot know the results until the end, so we will perform our evaluation once the program has been completed.”

Myth-busting!

Meaningful evaluation cannot be done only at the end of a project or initiative: most tracking and monitoring must happen on an ongoing basis throughout program implementation for a substantial evaluation report to be written at the end. An Ongoing Reflection and Learning approach supports consistent process improvement and ensures that gaps are addressed as they emerge.

But…no one reads it anyway versus evidence-informed practice

Myth-talk:

“We can produce a big report, but no one is going to read it anyway, so why bother? It is pointless and a waste of time if no one actually reads the report.”

Myth-busting!

People often do read evaluation reports, and they are useful to a wide range of stakeholders beyond the funders of social programs. Keeping final reports concise yet comprehensive, or using new forms of media to present information in more engaging and user-friendly ways, will improve the uptake of the information. For example, some organizations present evaluation findings through alternative methods such as webinars, videos, infographics, and speakers' bureaus. Consider holding an internal meeting to share the methodology and main findings arising from the initiative; this will also demonstrate your organization's commitment to reflection, learning, and evidence-informed practice.