Building Evaluation Capacity (BEC)
Building Evaluation Capacity (BEC)
Beatriz Chu Clewell, Urban Institute
Patricia B. Campbell, Campbell-Kibler Associates, Inc.

BEC: The Background
Evaluation Capacity Building (ECB) is a system for enabling organizations and agencies to develop the mechanisms and structure to facilitate evaluation to meet accountability requirements.

ECB differs from mainstream evaluation by being continuous and sustained rather than episodic. It is context-dependent; it operates on multiple levels; it is flexible in responding to multiple purposes, requiring continuous adjustments and refinements; and it requires a wide variety of evaluation approaches and methodologies (Stockdale et al., 2002).
BEC: The Project

The goal of BEC was to develop a model to build evaluation capacity in three different organizations: the National Science Foundation (NSF), the National Institutes of Health (NIH), and the GE Foundation.

More specifically, the project's intent was to test the feasibility of developing models to facilitate the collection of cross-project evaluation data for programs within these organizations that focus on increasing the diversity of the STEM workforce.
BEC Guide I: Designing a Cross-Project Evaluation

• Evaluation design & identification of program goals
• Construction of logic models
• The evaluation approach, including:
  – Generation of evaluation questions
  – Setting of indicators
• Integration of evaluation questions and indicators
• Measurement strategies, including:
  – Selection of appropriate measures
  – The role of demographic variables
Highlights from Guide I: Constructing a Logic Model

The basic components of a simplified logic model are:
1. Inputs (resources invested)
2. Outputs (activities implemented using the resources)
3. Outcomes/impact (results)
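The three-part structure above can be sketched as a small data type. This is only an illustration of the inputs → outputs → outcomes chain; the field names and example entries are invented here, not drawn from the BEC guides.

```python
from dataclasses import dataclass, field

@dataclass
class LogicModel:
    """A simplified logic model: inputs -> outputs -> outcomes."""
    inputs: list[str] = field(default_factory=list)    # resources invested
    outputs: list[str] = field(default_factory=list)   # activities implemented using the resources
    outcomes: list[str] = field(default_factory=list)  # results / impact

# Hypothetical entries for a STEM-diversity program.
model = LogicModel(
    inputs=["program funding", "staff time"],
    outputs=["mentoring sessions", "summer research placements"],
    outcomes=["increased STEM retention among under-represented students"],
)

for component in (model.inputs, model.outputs, model.outcomes):
    print(component)
```

Keeping the three components as separate lists makes it easy to check that every outcome can be traced back to at least one activity when the model is reviewed.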
Highlights from Guide I: Constructing a Logic Model (continued)

Highlights from Guide I: Questions & Indicators

BEC Guide II: Collecting and Using Cross-Project Evaluation Data
• The strengths and weaknesses of various formats that can be used in data collection
• Data collection scheduling
• Data quality and methods of ensuring it
• Data unique to individual projects
• Confidentiality and the protection of human subjects in data collection
• Ways of building data collection capacity among projects
• Rationales, sources, and measures of comparison data
• Issues inherent in the reporting and displaying of data
• The uses to which data might be put
Highlights From Guide II: Data Collection Formats

Highlights From Guide II: Available Information on Comparison Databases

• URL
• Availability: Public Access / Restricted Use (fees/permission needed)
• Data Format: Web Download / Other Electronic
• Student Demographic Variables: Race/Ethnicity, Sex, Disability, Citizenship
• Data Level: National, State, Institution, Student
• Student Population: Pre-College, College, Graduate School, Employment
• Survey Population: First Year; Most Recent Year Available
• Other Variables: Attitudes, Course-taking, Degrees, Employment, etc.
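The catalog fields listed above can be captured as a simple record type, which makes a set of comparison databases easy to filter programmatically. The class and the sample entry below are illustrative sketches; the values do not describe any real database.

```python
from dataclasses import dataclass

@dataclass
class ComparisonDatabase:
    """One entry in a catalog of candidate comparison databases."""
    name: str
    url: str
    availability: str                       # "public access" or "restricted use"
    data_format: str                        # "web download" or "other electronic"
    demographic_variables: tuple[str, ...]  # e.g. race/ethnicity, sex, disability, citizenship
    data_level: str                         # national, state, institution, or student
    student_population: str                 # pre-college, college, graduate school, employment
    first_survey_year: int
    most_recent_year: int
    other_variables: tuple[str, ...]        # attitudes, course-taking, degrees, etc.

# Hypothetical catalog entry.
entry = ComparisonDatabase(
    name="Example Enrollment Survey",
    url="http://example.org/data",
    availability="public access",
    data_format="web download",
    demographic_variables=("race/ethnicity", "sex", "disability", "citizenship"),
    data_level="institution",
    student_population="college",
    first_survey_year=1990,
    most_recent_year=2004,
    other_variables=("course-taking", "degrees"),
)

# Filtering the catalog for freely usable, institution-level sources:
catalog = [entry]
usable = [d for d in catalog
          if d.availability == "public access" and d.data_level == "institution"]
print([d.name for d in usable])
```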
[Figure: Percent of Under-Represented STEM Students in 17 Project Colleges, 2000–2004; y-axis 0%–20%]
[Figure: Percent of Under-Represented STEM Students in 17 Project and 17 Comparison Colleges, 2000–2004; y-axis 0%–20%; series: Project Colleges, Comparison Colleges]
Highlights From Guide II: Making Comparisons
Other Sources of Comparison Data

The WebCASPAR database (http://caspar.nsf.gov) provides free access to institution-level data on students from surveys such as the Integrated Postsecondary Education Data System (IPEDS) and the Survey of Earned Doctorates.

The Engineering Workforce Commission (http://www.ewc-online.org/) provides institution-level data (for members) on bachelor's, master's, and doctoral enrollees and recipients by sex and race/ethnicity for US students, and by sex for foreign students.

Comparison institutions can be selected from the Carnegie Foundation for the Advancement of Teaching's website (http://www.carnegiefoundation.org/classifications/) based on Carnegie Classification, location, private/public designation, size, and profit/nonprofit status.
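The project-versus-comparison charts shown earlier reduce to one simple calculation per group per year. The sketch below illustrates it with invented enrollment counts; in practice the comparison figures would come from a source such as WebCASPAR.

```python
def percent_under_represented(urm_students: int, total_students: int) -> float:
    """Percent of STEM enrollment drawn from under-represented groups."""
    return 100.0 * urm_students / total_students

# Hypothetical aggregate counts (urm, total) across 17 colleges, by year.
project = {2000: (900, 10_000), 2004: (1_600, 10_000)}
comparison = {2000: (880, 10_000), 2004: (1_000, 10_000)}

for year in (2000, 2004):
    p = percent_under_represented(*project[year])
    c = percent_under_represented(*comparison[year])
    print(f"{year}: project {p:.1f}% vs comparison {c:.1f}%")
```

Plotting both series on the same axes, as in the figures above, is what lets a reader judge whether the project colleges' trend departs from the comparison group's.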
Some Web-based Sources of Resources

OERL, the Online Evaluation Resource Library: http://oerl.sri.com/home.html

User Friendly Guide to Program Evaluation: http://www.nsf.gov/pubs/2002/nsf02057/start.htm

AGEP Collecting, Analyzing and Displaying Data: http://www.nsfagep.org/CollectingAnalyzingDisplayingData.pdf

American Evaluation Association: http://www.eval.org/resources.asp
Download the Guides

http://www.urban.org/publications/411651.html

or Google "Building Evaluation Capacity Campbell"