Bruner Foundation Effectiveness Initiatives
  • Sustaining Evaluative Capacity
    Evaluation Support Project, Anchoring Evaluative Capacity



    It is possible to provide evaluation training that is concise, relevant, and useful for refreshing or teaching basic skills to key staff in organizations with a demonstrated commitment to, and prior experience with, building evaluative capacity.

    Providing hands-on experience in data analysis and planning during training is beneficial.

    Despite effective training delivery, participant learning, and project completion, fully extending evaluative thinking throughout an organization remains an area that needs continued, and even additional, attention. Leaders in both projects and their philanthropic stakeholders all indicated that sustaining evaluative capacity would be an ongoing process.

    Issues for Further Consideration

    1. Logistics are challenging.
      The identification of trainees and the use of agency-specific examples worked well in both projects. There is now a definite curriculum available for use. It may still be challenging, however, to determine the best timing and spacing between sessions and to weave the training into the existing schedules of other organizations that want to participate.

    2. It is unclear whether this support can extend to less familiar organizations.
      It remains unclear whether this type of support can be provided to organizations with whom there is less familiarity, even if they can demonstrate on paper or in an interview that they are “ready” to participate. Since the materials are now being made available via the Bruner Foundation website, it will be very useful if organizations that elect to use the tools also track and share how their participants assess the usefulness of the training. (To that end, the Bruner Foundation is posting a follow-up survey instrument on its website for organizations that use the materials and complete all three sessions. It is also providing some basic support for organizations to collect data about their use of the anchoring strategy.)

      Organizations that finish the Evaluation Essentials for Program Managers series are asked to have their trainees complete the survey.

    3. Finding other evaluation consultants seems necessary but difficult to accomplish.
      Efforts to track evaluator use of the Anchoring materials, as well as to provide specific training to other evaluation professionals regarding evaluation training for nonprofit provider organizations, should be pursued.

    4. The durability/longevity of Anchoring outcomes is uncertain.
      Even by the time of the final assessment (about three months after the last training), staff struggled to remember some specifics of the training. They did, however, describe specific examples of use, suggesting that the level of effort expended for the Anchoring and Evaluation Support projects, followed by organizational leaders' expectation of use, can promote the desired utilization and application.