
Evaluation: Where We’ve Been and
Where We’re Going
2014 AmeriCorps
State and National Symposium
Presenters
• Carla Ganiel, Senior Program & Project
Specialist, AmeriCorps State and National
• Emily Steinberg, Associate Director,
AmeriCorps*Texas, OneStar Foundation
• Stephen Plank, Director of Research and
Evaluation, CNCS
• Bethanne Barnes, Program Examiner, Office of
Management and Budget
AmeriCorps State and National: Lessons Learned
Carla Ganiel
Senior Program & Project Specialist
cganiel@cns.gov
Strategies for Building Evidence
• Include evidence in assessment criteria for
competitive applicants
• Ensure that grantees/subgrantees are on track
with evaluation plans
• Build evaluation capacity through training and
technical assistance for national grantees, state
commissions and state subgrantees
Grant Application Review Process (GARP)
• Levels of evidence outlined in NOFO
• Points awarded based on strength of evidence
– Strong evidence = 8 points
– No evidence = 0 points
• Evaluation reports were reviewed concurrently with staff review and informed the level-of-evidence assessment (a scoring sketch follows)
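Read as pseudocode, the GARP rubric is a lookup from assessed evidence level to review points. A minimal sketch follows; only the endpoints (strong = 8, no evidence = 0) come from the NOFO as described above, so the intermediate values are illustrative assumptions, not the actual scale.

```python
# Illustrative sketch of the GARP evidence-level scoring described above.
# Only the endpoints (strong = 8, no evidence = 0) are stated in the slide;
# the intermediate point values are placeholder assumptions, NOT the NOFO scale.
EVIDENCE_POINTS = {
    "strong": 8,           # stated above
    "moderate": 6,         # assumption, for illustration only
    "preliminary": 4,      # assumption, for illustration only
    "pre-preliminary": 2,  # assumption, for illustration only
    "none": 0,             # stated above
}

def evidence_points(level: str) -> int:
    """Return review points for an applicant's assessed level of evidence."""
    return EVIDENCE_POINTS[level.lower()]

print(evidence_points("Strong"))  # -> 8
```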
Competitive Grantees by Level of Evidence
Level             Number of New/Recompete Grantees   Percent of New/Recompete Grantees
Strong            7                                  7%
Moderate          21                                 22%
Preliminary       26                                 28%
Pre-Preliminary   39                                 41%
No Data           1                                  1%
Evaluation Reports
• Reviewed by CNCS Office of Research &
Evaluation
• Staff did not review reports from applicants that
were not required to submit an evaluation
• Evaluations were not scored
• Evaluation results were considered in
determining an applicant’s level of evidence
Evaluation Review
Reviewers assessed:
• Alignment of Models
• Methodological Quality
• Strength of Findings
• Recency
• Compliance with ASN evaluation requirements
Large Applicants ($500,000+)

Large Applicants                                             Number
Total evaluations reviewed                                   10
Met evaluation requirements (type of study)                  3
Rated satisfactory quality                                   4
Met evaluation requirements AND rated satisfactory quality   2
Small Applicants (Less than $500,000)
Small Applicants                                             Number
Total evaluations reviewed                                   31
Met evaluation requirements (type of study)                  31
Rated satisfactory quality                                   12
Met evaluation requirements AND rated satisfactory quality   12
Evaluation Reports Submitted
Type of Study            # Large Applicants   # Small Applicants   Total
Experimental             3                    4                    7
Quasi-Experimental       0                    4                    4
Non-Experimental         5                    6                    11
Process/Implementation   2                    17                   19
Total                    10                   31                   41
High Quality Evaluations
Quality Evaluations by Type:
• Experimental: 43%
• Quasi-Experimental: 21%
• Non-Experimental: 7%
• Process/Implementation: 29%
Lessons Learned
• Less than a third of the portfolio has moderate or
strong evidence
• Few evaluations met both ASN evaluation
requirements and quality standards
• Some small applicants are conducting impact
evaluations
• More in-depth review of evidence by external
experts, focused on quality rather than quantity of
studies
Evaluation Technical Assistance
• One-on-one technical assistance provided to
large ($500,000+) national grantees and state
subgrantees
• TA focused on developing a plan to conduct an
evaluation that would meet CNCS requirements
– Quasi-Experimental Design (Comparison Group)
– Randomized Control Trial (Control Group; see the sketch below)
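Since the TA centers on these two designs, here is a minimal sketch of what distinguishes an RCT: random assignment to treatment and control groups. The participant list is hypothetical; real studies also involve consent, power analysis, and often stratified or clustered assignment.

```python
# Minimal sketch of random assignment, the defining feature of the
# Randomized Control Trial named above. Participant names are hypothetical.
import random

participants = [f"participant_{i}" for i in range(1, 101)]
random.seed(42)          # fixed seed so the assignment is reproducible
random.shuffle(participants)

midpoint = len(participants) // 2
treatment = participants[:midpoint]  # served by the program
control = participants[midpoint:]    # not served; outcomes compared later

print(len(treatment), len(control))  # -> 50 50
```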
Lessons Learned
• Lack of clarity about the difference between
performance measurement and evaluation
• Many grantees/subgrantees did not understand
CNCS evaluation requirements
• Some grantees did not take full advantage of TA
offered
Lessons Learned
• Grantees/subgrantees need:
– Clear timeline for completing an evaluation within a three-year cycle
– Guidance on budgeting for evaluation
– Guidance on how to hire an external evaluator
Lessons Learned
• At the time of grant award, most grantees have
considerably more planning to do before they can
meet CNCS evaluation requirements
• Certain program designs face challenges when
using a quasi-experimental or experimental design
• Alternative evaluation approaches may be
acceptable for some grantees (Pilot process)
• Conversation must shift from “evaluation as
compliance” to “evaluation to strengthen program
and build evidence.”
Evaluation Core Curriculum
• Completed courses:
– AmeriCorps Evaluation Requirements
– Evaluation 101
– Basic Steps in Conducting an Evaluation
– Overview of Evaluation Designs
– How to Develop a Program Logic Model
– How to Write an Evaluation Plan
– How to Budget for an Evaluation
https://www.nationalserviceresources.gov/evaluationamericorps
Next Steps
• Alternative Evaluation Approach Pilot Process
• Feedback to all small 2014 competitive grantees who were required to submit evaluation plans
• One-on-one TA for large 2014 competitive grantees who were required to submit evaluation plans
• Dialogue with commissions on how to build
capacity to work with subgrantees
• Develop additional core curriculum courses
• Identify needs of tribal grantees
CNCS Grantee Symposium
September 17, 2014
Emily Steinberg, Director of National Service Programs
• Why evaluation matters
– Accountability
– Evaluative Learning
• Our evaluative evolution
– Then & Now
– Capacity Building Evaluations (Other State/Federal Projects)
– Direct Service Evaluations (AmeriCorps)
– Lessons Learned
• Current philosophy and approach
DUAL ROLE
Funder (Texas State Service Commission):
• $8-13 million in AmeriCorps grants/year
• Responsibility to evaluate grantees and portfolio
• Federal reporting requirements
Nonprofit (Governor's Office of Faith-Based and Community Initiatives):
• Think Tank/Incubator
• Responsibility to evaluate self and initiatives
• State/private/internal reporting requirements
What is evaluation?
… a method for determining what change occurred, and how, why, and for whom it happened.
EVALUATION IS…
Important for:
• Accountability/Stewardship
• Learning about Impact – Driving Program
Design
• Communicating Impact – to staff, board,
volunteers, funders and community
stakeholders
• Multiplying your Impact – increasing the
knowledge-base of the field
ACCOUNTABILITY VS. LEARNING
Accountability
• Did you do what you said you'd do?
• Was the program implemented as designed?
• Cost-benefit analysis and R.O.I. (see the sketch after this slide)
Evaluative Learning
• Game film vs. scoreboard
• What outcomes were achieved and for whom? Did
outcomes differ across sub-groups?
• What made the difference? Which specific program
aspects achieved the most/least impact?
Peter York, TCC Group
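As a concrete illustration of the cost-benefit/R.O.I. bullet above, a minimal sketch follows; the dollar figures are hypothetical, not drawn from any OneStar program.

```python
def roi(total_benefit: float, total_cost: float) -> float:
    """Simple return on investment: net benefit as a fraction of cost."""
    return (total_benefit - total_cost) / total_cost

# Hypothetical example: a program costing $100,000 that produces an
# estimated $135,000 in community benefit has an ROI of 35%.
print(f"ROI: {roi(135_000, 100_000):.0%}")  # -> ROI: 35%
```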
LEARNING AND LEADERSHIP
• Learning is key to leadership, and good leaders are good learners:*
– Only one in four nonprofit orgs are “well led”
– Only one in four nonprofits are effective “learners”
*York, P. (2010). “Evaluative Learning.” TCC Group: Philadelphia, PA.
HISTORY
Our Evaluation Trajectory
• 2003-04: OneStar became a 501(c)(3) commission
• 2004-2008: Blissful ignorance / empty enforcement
– Began implementing National PMs
– Focused more on capacity building initiatives & evaluation
– Did not have a framework for reviewing AmeriCorps
subgrantees’ evaluation plans and final reports (honor system)
HISTORY
Our Evaluation Trajectory…
• 2006-2007: Compassion Capital Fund (federal)
– FBCI: Focused on grants to small FBCOs ($25K)
– Results: Some orgs performed better than others – WHY?
• 2007-2008: Rural Texas Demonstration Project (state)
– Applied learning to small FBCOs in rural Texas ($10-12K)
– Results: Developed indicators of success and critical milestones:
readiness to change, trust/rapport, minimum capacity
HISTORY
Our Evaluation Trajectory
• 2008: FBCI Capacity Building Project (State)
– Statewide grantees, honed evaluation and success indicators
– Studied Org Assessment Tools:
• Selected Core Capacity Assessment Tool (CCAT) by TCC-Group
• 2009-2012: AmeriCorps Statewide Evaluation (State)
STATEWIDE EVALUATION OVERVIEW
• Multi-year (2009-2012)
• Competitive RFP
– Selected external evaluator: UT-Austin RGK Center
• Based on previous research
– Longitudinal AmeriCorps study by Abt Associates
– Organizational capacity assessments – TCC and
RGK
• Instruments
– Program director surveys
– AmeriCorps member surveys
– Organizational capacity surveys
EVALUATION DESIGN
PROPOSAL:
• Analyze the value added of AmeriCorps to TX
• Assess the organizational/management structures most commonly associated with impactful, value-adding programs
• Assess the different kinds of impact that AmeriCorps programs have on communities
EVALUATION MODEL
[Diagram linking three components: Organizational Structure and Program Management Characteristics; Member Assessment of Community Impact and Value; Value Added of AmeriCorps in Communities]
EVALUATION COMPONENTS
Surveys
• AmeriCorps Program Managers
• AmeriCorps Members
Database: Organizational & Financial Indicators
• Organizational Capacity Surveys
• Grantee Budgets
• 990 Forms
• Organizational Annual Budgets/Financial Reports
Case Studies
• Conducted four (4) site visits
FINDING #1: Overall, AmeriCorps Service Has
a Positive Impact
Percent of survey respondents choosing the top response

                                                      Program Managers       Members
                                                      2009-10    2010-11
Program/service is "very effective"                   79%        83%        63%
Members "make an important contribution"              96%        91%        75%
"A lot of change" is observed in the clients served   67%        70%        51%
FINDING #2: Most Programs Achieve Goals
[Chart: Percent Meeting Reported Goals by Organization Type, showing the share of programs that met SOME, MOST, or ALL reported goals across State Agencies, Local Education Agencies, Other Local Gov't Agencies, Community-based Orgs, 4-Yr Colleges/Univs, and Other]
• 83% of 2009-10 and 91% of 2010-11 respondents report meeting or exceeding most of the goals reported to OneStar
• Statistically significant differences (p=0.045) exist between organization types (see the sketch below)
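The slide does not say which test produced p = 0.045; a chi-square test of independence is one standard way to compare goal attainment across organization types. A minimal sketch, with wholly hypothetical cell counts (the slide reports only percentages):

```python
# Sketch of testing whether goal attainment differs by organization type.
# The counts below are hypothetical; the slide reports only percentages,
# and the actual test behind p = 0.045 is not identified.
from scipy.stats import chi2_contingency

# Rows: organization types; columns: met SOME / MOST / ALL reported goals.
observed = [
    [4, 6, 2],   # hypothetical counts for one org type
    [2, 10, 8],  # hypothetical counts for another
    [6, 5, 1],   # hypothetical counts for a third
]

chi2, p, dof, expected = chi2_contingency(observed)
print(f"chi2 = {chi2:.2f}, p = {p:.3f}")  # p < 0.05 suggests real differences
```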
FINDING #3: Most Members Believe in
Long-Term Impact of Their Service
What Did You Leave Behind as a Result of Your Service Experience?
(Percent of members surveyed choosing answer)
• No trace: 2%
• A drop in the bucket: 15%
• Part of a real solution: 30%
• The start of something important: 53%
FINDING #4: Three Program Characteristics
Are Tied to Member Assessment of Service
Impact
Program Characteristics:
• Service Clarity
• Orientation Quality
• Communication Quality
[Chart: probability that members thought their service to the community was "very helpful" (question 39), rising from Inadequate to Adequate to Excellent ratings of each of the three characteristics]
Financial Impact on Most
Organizations…
Financial Impact on Organizations by Utilizing AmeriCorps Members
(% of program managers surveyed)
• Saves Money: 57%
• Breaks Even: 15%
• Loses Money: 2%
• Don't Know: 26%
…and a Positive Economic Impact on
Communities
[Chart: dollars per member service hour]
• Program manager assessment of value*: $13.40
• Average hourly cost: $6.82
• Net value: $6.58
Total net value per year: over $10 million
* Program manager assessment of value has been regionally adjusted. Numbers represent averages for the 2009-10 and 2010-11 terms. (See the arithmetic sketch below.)
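The chart reduces to simple arithmetic: net value per hour is the assessed value minus the average hourly cost, and the portfolio total scales that by member hours served. The hours figure below is inferred from the slide's numbers for illustration, not reported directly.

```python
# Net value arithmetic behind the chart above (2009-10 and 2010-11 averages).
assessed_value_per_hour = 13.40  # program manager assessment, regionally adjusted
avg_cost_per_hour = 6.82         # average hourly cost of a member

net_value_per_hour = assessed_value_per_hour - avg_cost_per_hour
print(f"Net value per hour: ${net_value_per_hour:.2f}")  # -> $6.58

# The slide reports total net value over $10 million/year, which implies
# roughly 1.5 million member service hours (an inference, not a slide figure).
implied_hours = 10_000_000 / net_value_per_hour
print(f"Implied member hours: {implied_hours:,.0f}")
```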
LESSONS LEARNED
• Evaluating more than one AmeriCorps program at a time is like comparing apples and oranges.
• You may pay a lot of $$$ to learn things that seem somewhat
predictable or ordinary.
• Be ready to edit the final product. Heavily.
• Not all evaluative data is created equal. (‘Perceived’ vs.
‘Actual’ Value)
• Case Studies are difficult to do well.
CURRENT PHILOSOPHY + APPROACH
• Back to basics: you have to get 'down in the weeds' at some point. We have to understand each grantee's program design well enough to empower them to make smart evaluation decisions and to ask the right questions.
• Evaluation begins with accountability, but it should ultimately drive decision-making, both for funders AND grantees.
• Good leadership is what drives a culture of evaluative
learning—make it part of everything we do from the top down.
• ‘Practice what you preach’ to grantees + ‘There’s always room
to grow’ (Annual grantee survey, training event surveys,
member experience and inclusion survey)
CURRENT TOOLS + RESOURCES
GRANTEE RESOURCE LIBRARY:
http://onestarfoundation.org/americorpstexas/grantees/grantee-resources/
Contact Information
• Emily Steinberg
– Director, National Service Programs
emily@onestarfoundation.org
www.onestarfoundation.org
CNCS Office of Research & Evaluation:
Recent Path and Coming Steps
Stephen Plank, Ph.D.
Director of Research & Evaluation
splank@cns.gov
Overview
• Who we are
• What we do
• What we strive to do in support of AmeriCorps
State and National grantees
• Recent accomplishments
• Next priorities & activities
About Our Team
• 8 full-time members, plus talented interns, an onsite consultant, & contracted partners
• Strong working relationships with AmeriCorps
State and National program leadership
• Expertise in program evaluation, measurement,
experimental & quasi-experimental design,
community psychology, sociology, economics,
public policy
Our Mission
• To build the evidence base for national service
programs
• To facilitate the use of evidence within CNCS &
among its grantees and subgrantees
– Our vision of what evaluation & evidence can be at
their best…
– Our view of what evaluation should not be about…
What We Strive to Do
in Support of AmeriCorps Grantees
• Modeling & supporting best practices in
evidence & evaluation
• Participating in dialog & problem-solving
• Participating in the grant-making & evaluation-planning process, as appropriate
• Communicating findings & lessons learned to
multiple audiences
– Clarity & practicality as watchwords
Our Portfolio
• R&E about, and directed toward, the national
service field
• Grantee support
– Technical assistance, "evidence exchange," capacity-building tools
• Helping CNCS, as an agency, use evidence &
evaluation more effectively
– GARP, organizational accountability, learning
strategies
Highlights from the Past Year - R&E with
AmeriCorps State and National
• Minnesota Reading Corps evaluations
• School Turnaround AmeriCorps evaluation
– Design Phase
• AmeriCorps Exit Survey
– revisions, focus on response rates and utilization
• Grantee Evaluation and Training and Technical
Assistance
• Evaluation Bundling Project
– Pilot Phase
For the Coming Year and More - R&E with
AmeriCorps State and National
• School Turnaround AmeriCorps evaluation
• justice AmeriCorps evaluation
• R & E in support of other national service
partnerships
• Grantee evaluation training & technical
assistance
• Evaluation bundling project
• Additional projects
– building on past lessons learned
Bethanne Barnes
Program Examiner
Office of Management and Budget
Q&A