Enhancing and Measuring Organizational Capacity: Assessing the Results of the U.S. Department of Justice Rural Pilot Program Evaluation

Mitchell Brown is associate professor in the Department of Political Science at Auburn University and research director for the Institute for Community Peace in Washington, D.C. Her broader research agenda focuses on the empowerment efforts of marginalized communities, particularly those enacted through organizations. From 2006 to 2009, she served as co–principal investigator of the evaluation of the Rural Domestic Violence and Child Victimization Enforcement Grant Program Special Initiative: Faith-Based and Community Organization Pilot Program. E-mail: [email protected]

Public Administration Review, Vol. xx, Iss. xx, pp. xx–xx. © 2012 by The American Society for Public Administration. DOI: 10.1111/j.1540-6210.2012.02528.x.

Mitchell Brown, Auburn University and Institute for Community Peace

A significant goal of public administrators in this era of shrinking public funds has been to find ways to enhance and measure organizational capacity and sustainability with minimal outlays of resources. One attempt to address this goal was the Rural Pilot Program, funded by the Office of Violence Against Women in the U.S. Department of Justice. Based on the evaluation of the program, this article (1) describes how capacity was measured, (2) discusses the validation and utility of a self-administered instrument, and (3) examines whether and to what extent organizational capacity is enhanced by an intermediary funding model. Modest positive changes were found in two areas—organizational staffing and information technology—but no changes were found in other areas. The article concludes with recommendations for designing future programs to enhance capacity and sustainability and for public administrators and grant makers in utilizing self-administered capacity instruments.

One of the goals of public administrators in this era of devolution and shrinking public funds, particularly for human service delivery, has been to find ways to enhance organizational capacity and sustainability with minimal outlays of public resources. This goal is particularly critical in rural areas, where human service delivery faces additional challenges not seen in urban and suburban environments, such as fewer resources, transportation difficulties, minimal infrastructure, and human resource drains as residents leave for job opportunities elsewhere. One attempt to address these goals was the Rural Pilot Program (RPP), funded by the Office of Violence Against Women (OVW) at the U.S. Department of Justice. Based on the evaluation of the RPP,¹ which was funded by the National Institute of Justice, I examine whether and to what extent organizational capacity was enhanced by the funding model utilized in this program.

This article begins with a discussion of the different ways in which organizational capacity (or organizational functions), service delivery capability, and longevity (or sustainability) can be enhanced. It then provides an overview of the RPP and its evaluation, the ways in which capacity was measured, and the results from validating the self-administered assessment instruments. Findings show that modest positive changes were made and sustained by the organizations funded by the RPP, at least in the short term, in two areas: organizational staffing and information technology. However, changes were made in no other areas subsumed under organizational capacity, including other measures of organizational management and operations, board of directors and governance, key allies, resources, program planning and implementation, and evaluation. The article concludes with a discussion and recommendations for ways to design future programs to both measure and enhance the sustainability of capacity-building efforts.

Organizational Capacity and Sustainability

A variety of factors are subsumed under organizational capacity, but, generally speaking, they fall into two categories: functional capacity and the capacity to provide services. These areas can be further refined into six components: management and operations, board of directors and governance, key allies, resources, program planning and implementation, and evaluation. Beyond the basic desire for programs to work, a key concern that drives interest in organizational capacity for public administrators is determining how to best ensure the sustainability of programs and organizations after government funding ceases. A variety of factors are important to sustainability, especially for community-based organizations. These factors work together to develop and enhance sustainability.

A first step in laying the groundwork for establishing organizational capacity is the ability to develop a vision or legacy (Center for Mental Health in Schools 2001; Cutler 2002; Hayes and Bryant 2002; Kaufman 2002; Wolff n.d.). A vision provides a structure around which organizations are able to plan and make decisions about the future. Without this focus, organizations often flounder, responding to crises and opportunities that may divert them from their initially intended purpose.

Another significant component of ensuring organizational sustainability is appropriate planning, evaluation, and commitment of key allies (Cutler 2002; Hayes and Bryant 2002; Kaufman 2002; Kubisch et al. 2002; Schorr, Sylvester, and Dunkle 1999; Wolff n.d.). Together, these factors provide organizations the tools they need to develop responsive programs with supports that will augment their ability to continue providing services in the face of economic hardships and unexpected challenges. Performance measurement provides organizations two added benefits: the ability to make midcourse corrections when problems arise, and evidence of program utility that enhances the ability to garner resources.

Organizational sustainability is also enhanced by the capacity to create and strengthen collaborations and partnerships (Hale 2011; Kubisch et al. 2002; Wolff n.d.). While not directly related to sustainability, such exposure increases contacts and helps organizations expand their allies and champions. Such relationships undergird an organization's ability to weather unexpected dips in financial support and strains on organizational resources.

Perhaps the most often thought of component of sustainability is sufficient support for programs and staff (Hayes and Bryant 2002; Schorr, Sylvester, and Dunkle 1999; Staggenborg 1988). This includes funding and other forms of resources, such as in-kind donations and volunteers. This component of sustainability is often thought of as sufficient in and of itself; however, without the other components, the potential for attracting and retaining such resources is significantly impaired.

The ability to influence policy and change norms is important to sustainability when the concept is broadly understood (Hayes and Bryant 2002; Schorr, Sylvester, and Dunkle 1999; Staggenborg 1988). That is, when prevailing policy and cultural norms work in opposition to the goals of a particular program or organization, the potential for sustainability is limited. When change occurs, that potential is increased. While efforts at policy and norm change may have minimal direct impact on the organizations themselves, the long-term and indirect impacts enhance the sustainability of all similarly focused organizations and movements.

Finally, sustainability is enhanced by the ability of organizations to adapt to changing contextual factors (Hayes and Bryant 2002; Imig 1992; Minkoff 1999; Schorr, Sylvester, and Dunkle 1999). Environmental adaptability separates those organizations that fade while clinging to old models of success and understandings of problems from those that survive in the long term and thrive. Such adaptability is also purposive and forward thinking and includes preparation for leadership changes both internal and external to the organization.

Many challenges to sustainability have also been identified, and, in addition to building on areas that enhance capacity and sustainability, they must combat the challenges to sustainability as well. The geography of an organization can enhance the pressures on it, as well as limit the resources available to it (Wolpert and Reiner 1984). Low levels of community social capital limit the potential of organizations, particularly in developing support and attracting allies (Gomez and Santor 2001). Organizational size, age, and transitions influence structure, resources, and decision making, and organizations must take care to avoid the pitfalls associated with certain characteristics (Minkoff 1993). Finally, one of the most significant challenges comes from tensions in collaborations around funding and planning (Banaszak-Holl et al. 1998; Brown and Hale 2011). Each of these challenges must also be attended to if sustainability is to be enhanced.

The Rural Pilot Program and Evaluation

The Rural Pilot Program was designed to fill a gap in the provision of domestic violence services in rural areas. To address these needs, the RPP provided small grants ($10,000 to $100,000) to faith-based and community-based organizations in rural areas to provide domestic violence services over a one-year period (OVW 2005). Figure 1 provides a summary of the RPP logic model.

Like many other faith-based initiatives, an intermediary model was used to implement the RPP. An intermediary model is one in which a federal agency selects one or more organizations to fund, and those organizations, in turn, issue their own requests for proposals, receive and review funding applications, oversee subawardee work, provide technical assistance, coordinate reporting, and receive and respond to reimbursement requests.

As federal functions are devolved to the state and local levels, intermediary organizations are expected to provide important support to faith-based and community-based organizations, particularly those that are inexperienced. However, this model, particularly with respect to technical assistance delivery to subawardee faith-based and community-based organizations, has been limited because of geography and funding levels (Hall 2004; Klein and Wilson 2005; Miller 2004).

Three intermediary organizations were selected for the RPP and limited to the use of 20 percent of their overall budget for both project administration and technical assistance provision. One intermediary provided subawards initially in a five-county area²; this organization is referred to as the regional intermediary. A second intermediary provided subawards throughout its state; this organization is referred to as the state intermediary. A third intermediary provided subgrants to rural communities across the nation, except in those areas covered by the other two intermediaries; this organization is referred to as the national intermediary. Each of the intermediaries provided different forms of technical assistance and capacity support to their subgrantees. Only one organization, the state intermediary, required formal partnerships between faith-based and community-based organizations to qualify for the grant.

The three intermediaries funded 54 subgrantees out of 176 applicants, or 31 percent of the applicants. These organizations


were spread across the country, and all but one were located in rural jurisdictions.³ Of the applicants, two-thirds were community-based organizations and one-third were faith-based organizations.

The evaluation of the RPP included a process evaluation, a study of the value added by the faith component, and a capacity study. The results from the evaluation of capacity building are used in this article. The capacity study was conducted using a mixed-method design. Capacity components were examined in six areas: management and operations, board of directors and governance, key allies, resources, program planning and implementation, and evaluation (see table 1). These areas are the most frequently discussed and used aspects of organizational capacity in capacity studies. The capacity surveys and protocols, including pre-assessment and post-assessment questionnaires, focus group protocols, site visit protocols, and monthly telephone interview protocols, were developed in part through an adaptation of the McKinsey & Company (2001) capacity assessment report and tool, as well as through the instruments, trainings, and experiences of the Institute for Community Peace (ICP), a national violence prevention organization established in 1994 as a public–private partnership that works with grassroots community-based organizations to prevent an array of forms of violence.

Case study sites were chosen to reflect the different issues faced by faith-based as opposed to community-based grantees and newer as opposed to established organizations. Only organizations that received funding were included in the case studies. During the course of the grant, six of the eight case study sites were visited (two of the sites twice), and monthly telephone calls were made to each site. The one- to two-day site visits were used to assess the organizations' strengths and needs, as well as their capacity to provide domestic violence services. For each organization, the project director and other organization staff were interviewed. The site visits were also used for document collection. The monthly calls covered the activities of the organizations, the different capacity-building supports that the subawardees received, and their successes and challenges.

An online, self-administered capacity assessment was sent to each of the applicants, whether funded or not, at three points: at the start and at the end of the subgrant year and 6–12 months after funding ended. Organizations that could not fill out the online survey because of technical limitations were given the option of filling out and returning a paper version or being interviewed by telephone by an ICP staff member who then filled out the online survey for the respondent. The response rate for the pre-assessment survey was 90.2 percent for funded organizations and 48.8 percent for nonfunded organizations; for the post-assessment survey, 98 percent for funded organizations and 28.8 percent for nonfunded organizations; and for the second post-assessment survey, 33.3 percent for funded organizations and 23 percent for nonfunded organizations.⁴

The evaluation team convened a national meeting in Denver, Colorado, in September 2006, during which focus groups were held with the subawardees to discuss their capacity to carry out their work, areas of strengths and needs, the utility of the capacity support

Figure 1 RPP Logic Model

[Figure 1 is a flow diagram. Intermediary Activity #1, Solicitations: solicitations issued in rural areas to small faith- and community-based organizations; subgrants awarded to organizations selected from the pool of applicants in cooperation with OVW, with a focus on organizations that can work with migrant workers, geographically isolated victims, the elderly, individuals with disabilities, and cultural, linguistic, or ethnic minority groups. Intermediary Activity #2, Technical Assistance: individualized assessments of faith- and community-based organizational capacity, strengths, and needs; tailored technical assistance covering mission, vision, and goals; treatment, counseling, and assistance, including immigration; sustainability; best practices; timely and accurate reporting; outreach, recruitment, and management of volunteers and nongovernmental support; legal assistance, such as incorporation or obtaining tax-exempt status; organization and business management policies and practices, including accounting control and human resource management; incorporation of appropriate and effective community education and prevention strategies; and education about activities that may compromise a victim's safety. Possible Benchmarks: individualized technical assistance plans created; increased knowledge on training/technical topics; greater funding, volunteers, and in-kind support; diversification of funding streams; attainment of nonprofit status; increased and improved management control and functioning; greater public exposure. Anticipated/Intermediate Outcomes: increased ability of faith- and community-based organizations to respond to technical assistance and to provide services to domestic violence victims; value added: increased ability to reach and serve women and families in need in rural areas. Ultimate Outcome: faith- and community-based organizations provide effective assistance to domestic violence victims through proven and sustainable practices.]


provided, and suggestions for program improvements. These data were used to augment findings from the case studies and capacity reports.

In-depth conversations with the intermediaries were held at the start of the grant period, in addition to monthly telephone conversations throughout the grant period. The first conversation focused on capacity-building plans, while the actual services provided and the intermediaries' perspectives on what did and did not work well were discussed during the monthly conference calls. Finally, on-site visits to the offices of two of the intermediaries at the start of their grant were used to discuss their programs; for the third intermediary, this process occurred at one of their regional technical assistance meetings instead of at their offices.

Self-Assessment of Capacity

Our challenge was to develop an instrument that could be self-administered either online or on paper, could be repeated over time, and could be used by program administrators with a variety of levels of experience, educational background, and training. Our compromise was a largely close-ended survey with some open-ended questions, in which capacity and sustainability were pared down to six components with 58 elements between them (see table 1 for an overview of these components and elements). The assessment took about 30 minutes to complete, though response times varied. The questions stayed basically the same across the three iterations of the survey; however, some questions that were repetitive were stricken after the pre-test, and a set of reflective questions was added to the version of the instrument that was delivered 6–12 months after funding ended. Our next task was to determine whether the simplified instrument actually measured what would be captured with a longer interview protocol. (See the appendix for a list of the questions captured in the different iterations of the self-administered capacity assessment.)

Arguably, all of the capacity measures taken in the evaluation of the RPP are components of an organization's ability to sustain positive changes. However, an analysis of each measure individually was unwieldy. To simplify analysis, I compressed the capacity instrument into measures for each capacity component using a scoring grid and then turned each of these descriptive categories into numeric scores ranging from 1 (low) to 3 (high) (see table 2).

Graduate students who were not involved in the development of the instruments or in the data collection process were trained to score each of the organizations that filled out the capacity assessment, as well as each set of notes and documents from the case study organizations (again, these were based on site visit notes and monthly calls). These scores were used to determine whether the capacity assessment self-study instrument could provide as accurate a measure for each indicator as the case studies themselves. The correlations between the assessment and case study scores generally were high, ranging from 1.000 at the highest (p < .000 for key allies) to .476 at the lowest (for evaluation) (see table 3).

The most notable failure in the instrument was in the category of evaluation, which was intended to capture organizations' use of needs assessments, asset mapping, theories of change/logic models, service data collection, and process and outcome evaluations. This problem was also picked up in making pre-assessment/post-assessment comparisons between the organizations that were funded and those that were not. The funded organizations appear in the analyses to backslide on their capacity to self-evaluate. Specifically, the analyses showed statistically significant drops in the use of needs assessments, asset mapping, and theories of change/logic models (in difference of proportion tests, changes for each were negative and significant at p < .05). However, findings from the case studies show

Table 1 Capacity Components and Elements

Management and operations: Accounting policies; accounting structures; human resource policies; operating policies; operating oversight; management information systems; nonprofit status; reporting; work with/in budget; managing transitions; physical infrastructure; technology; staff recruitment; staff training/development; plan for sustainability

Board of directors and governance: Composition; leadership; clear role; board involvement/support; accountability

Key allies: Volunteer recruitment; volunteer training; volunteer management; institutional partners; community partners; governmental partners; faith partners (within); faith partners (outside)

Resources: Diversified base; fund-raising plan; grant writing; grant management; in-kind resources

Program planning and implementation: Mission/vision; strategic plan; program match/goals; needs assessment; asset map; long-term focus; reflection; practice theory development; community education; prevention; responsive services; effective services; expanded services; evidence-based practice; substantive expertise; training/qualifications

Evaluation: Theory of change; program reflects theory of change; outcomes identified; benchmarks identified; data collection and management; midcourse corrections; process evaluation; output evaluation; outcome evaluation
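The six components in table 1 can be tallied as a quick consistency check against the 58 elements the text attributes to the instrument. A minimal sketch, with counts transcribed from the table:

```python
# Number of elements each capacity component contributes to the
# self-assessment instrument, transcribed from table 1.
ELEMENT_COUNTS = {
    "Management and operations": 15,
    "Board of directors and governance": 5,
    "Key allies": 8,
    "Resources": 5,
    "Program planning and implementation": 16,
    "Evaluation": 9,
}

total_elements = sum(ELEMENT_COUNTS.values())
print(total_elements)  # 58, matching the instrument described in the text
```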


that, in fact, when respondents answered the pre-assessment survey questions, they did not understand the full extent of the questions being asked.

Through exposure to materials in these areas given by the intermediaries and, in a couple of cases, the evaluators answering direct questions about some items, the organizations increased their knowledge but not their practice. So, they did not backslide on these items through the course of the grant period, but instead answered more accurately in the post-assessment survey than in the pre-assessment survey. In other words, they learned, for example, that simply counting victims coming through their door for services did not constitute an adequate evaluation of their program. A disconnect between the capacity self-assessment results and the site visit data collected by the evaluators occurred because the evaluators were able to capture the organizations' dearth of knowledge and practice through the site visits and interviews. The respondents were not able to reflect this in the self-studies.

Capacity Changes

Changes in all of the areas detailed in table 1 were examined, although changes were seen in only a few areas. Positive changes in organizational capacity were experienced by many of the funded organizations compared to those that were not funded, although the changes were small. This should be expected, given the short time frame of the grant period and the small amounts of the awards.

Of the funded organizations that completed the second post-assessment survey,⁵ 52.38 percent reported receiving funding after the grant that they most likely would not have obtained had they not first received the RPP funds. The plurality of these grant monies were more than $25,000 (31.3 percent). The rest were for smaller amounts (25 percent received between $1 and $4,999, 18.8 percent received between $5,000 and $9,999, and 25 percent received between $10,000 and $14,999). While these amounts appear small, given the budget size of most of these organizations (the budget for domestic violence–related activities at the time that they received their awards ranged from $0 to $125,000 per year), these additional monies constitute a significant addition to their resources. While it was not possible to track differences in funding over time between the organizations that were funded and those that were not, as the OVW would not release data on the unfunded applicants, it was possible to examine funding differences between the time of application and the second post-assessment for the funded organizations only. For the funded organizations, the average annual budget at the time of the pre-assessment survey was $117,929; by the time of the second post-assessment survey, the average budget had increased
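The difference-of-proportion tests used in the validation discussion can be illustrated with a standard two-proportion z-test; the article does not specify its exact test variant, and the counts below are invented for illustration (it reports only that the drops were negative and significant at p < .05):

```python
# Sketch of a pooled two-proportion z-test of the kind referred to in the
# text as a "difference of proportion test": compare the share of funded
# organizations reporting use of needs assessments at pre-assessment versus
# post-assessment. All counts are hypothetical.
import math

def two_proportion_z(success1, n1, success2, n2):
    """Pooled two-proportion z statistic and two-sided normal p-value."""
    p1, p2 = success1 / n1, success2 / n2
    pooled = (success1 + success2) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    # Two-sided p-value from the standard normal CDF, via the error function.
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Hypothetical: 40 of 49 organizations reported conducting needs assessments
# at pre-assessment, but only 28 of 49 did at post-assessment.
z, p = two_proportion_z(40, 49, 28, 49)
print(round(z, 2), p < 0.05)  # prints: 2.63 True
```

A drop of this size would register as statistically significant, which is the pattern the case studies later reinterpreted as more accurate reporting rather than genuine backsliding.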

Table 2 Component Scoring

High:
- Management and operations: Has accounting policies and structures; human resource policies; both e-mail and Web site; formal processes for staff recruitment, training, and development; operating policies; MIS and reporting system; sustainability plan; and 501(c)(3) status
- Board of directors and governance: Has a board of directors with a clear role and active support from board members, with accountability policies in place for board members
- Key allies: Has formal channels for volunteer recruitment, training, and oversight; diversified community and institutional partners, including government, community, business, nonprofit, and faith-based groups
- Resources: Has a fund-raising plan and a diversified funding base with at least four of the following: foundation, government, business, in-kind, donation, and fee-for-service support
- Program planning and implementation: Has a clear mission and/or vision statement and program activities that match; has conducted a needs assessment along with asset mapping; activities include some form of prevention work, along with community education; can demonstrate that services are responsive and effective
- Evaluation: Program has a written (text or logic model) theory of change; program has clearly identified benchmarks and a formal process for data collection and management; program has conducted an outcome evaluation to document effectiveness

Med:
- Management and operations: Has accounting policies or accounting structures; Web site or e-mail; some form of staff training and development; capable of and responsive to reporting requests from funders
- Board of directors and governance: Has a board of directors with a documented role but uneven participation and minimal accountability
- Key allies: Volunteers receive limited training or have minimal oversight; has some community partners but not much diversification
- Resources: Does not have a fund-raising plan or does not have a diversified fund-raising base
- Program planning and implementation: Has a mission and/or vision statement and program activities that match; has conducted a needs assessment; activities include some form of prevention work or community education
- Evaluation: Program has clearly identified benchmarks and a formal process for data collection and management; program has conducted either a process evaluation or some form of output assessment

Low:
- Management and operations: Limited formal management and operating policies and procedures; no e-mail or Web site
- Board of directors and governance: No functioning board of directors
- Key allies: Volunteer training and oversight minimal; few or no outside partner groups or organizations
- Resources: Has no fund-raising plan and no diversified funding base
- Program planning and implementation: If there is a vision and/or mission statement, programmatic activities do not match it; no systematic needs assessment conducted
- Evaluation: To the extent that the program collects data, no systematic assessment or analysis is conducted to document work

Table 3 Correlations on Capacity Components between Self-Assessment and Evaluator Site Visit Assessment

Pairwise correlation:
Overall: .927***
Key allies: 1.000***
Program planning and implementation: .834***
Management and operations: .739***
Resources: .742***
Board of directors and governance: .577*
Evaluation: .476

Positive changes in organizational capacity were experienced by many of the funded organizations compared to those that were not funded, although the changes were small.

6 Public Administration Review • xxxx | xxxx 2012

Funded organizations saw a statistically significant increase in staff size and technological capacity (e-mail and Web sites) over the life of the grant (mean change for staff = .630, p < .000, and technology capacity = .07, p < .05). Each of these increases can be directly linked to having greater flexibility to spend cash resources on these items because of the OVW grant. That is, when organizational budgets are tight, especially for organizations providing social services, priority will be given to necessities. Only when funding expands in a significant way will the organizations experience some freedom to increase spending on second-level priorities. Staffing and information technology improvements count as second-level priorities after direct services to victims, as they potentially enhance the organization's ability to attend to first-level priorities. These organizations also were more likely to require board financial support after funding, and they saw significant increases in the number of community members and experts on their boards. Beyond these changes, the organizations that were funded did not realize statistically significant changes in any other aspects of their administrative or fiscal controls. Again, this follows: though the budgets increased, they did not increase to levels that would allow the organizations to attend to less immediate concerns.

Sustainability
The simplest test of sustainability is whether the funded organizations were able to maintain the changes they made during the grant period six months to one year later. The most significant changes that the RPP subawardee organizations made were in staff size and technological capacity. Not only were the funded organizations that responded to the second post-assessment survey able to sustain changes in these areas, they also continued to grow. For example, at the time of the pre-assessment survey, staff size for funded organizations averaged a little over two and a half people (2.567). By the end of the grant period, this had increased to a little over three (3.147). Not only did the subawardees sustain this growth, but they also continued to grow 6 to 12 months out, to an average of almost three and a half staff members (3.476).

Unfortunately, the second post-assessment survey was conducted prior to the economic downturn, and therefore it is not possible to know whether these organizational changes were able to endure the economic crisis that began in 2008. Two indicators of longer-term organizational sustainability are organizational adaptability and diversified funding streams. Based on data collected from the case studies, our expectation is that, while funding bases diversified during the grant period, reserves would not have been adequately developed to sustain these staffing increases. However, because of the enhanced adaptability of these organizations, combined with the fact that many of these staffing increases came from turning volunteer positions into paid ones, their capacity to continue providing services should not be diminished despite the economic challenges; even if these positions must revert to volunteer positions, the commitment of the individuals involved has already been established.

Organizational technological capacity also realized positive and sustained changes. The proportion of organizations with e-mail capacity at the time of the pre-assessment survey was .911. It had risen to .957 by the time of the post-assessment survey. The organizations maintained this change by the time of the second post-assessment survey (the proportion was .952, but the difference was not statistically significant). For organizational Web sites, the proportion at the pre-assessment survey was .477. This increased to .522 by the time of the post-assessment survey, and further increased to .619 by the time of the second post-assessment survey.

To the extent that cash resources are an indicator of sustainability, the capacity of these organizations to sustain change increased over the grant period.6 This represents a statistically significant and positive change for these organizations. The case studies also suggest that state and local funders became more engaged with the organizations after they received their OVW grants. In particular, respondents felt that the grant gave them credibility that they did not have previously.

With respect to volunteers, at the time of the first post-assessment survey, there appeared to be no change in volunteer recruitment and, in some cases, a decrease. Judging from the experiences of the case study organizations, the grant funds allowed these organizations to bring on volunteers in a paid capacity. Thus, the finding of no change or a slight drop in volunteers is misleading because, by comparison, the organizations that were not funded experienced significant drops during the grant period, and several of the organizations that were not funded by the RPP stopped using volunteers altogether.

Of the organizations that were funded and responded to the second post-assessment survey, 42.9 percent reported recruiting from a greater number of sources of volunteers than they would have been able to reach had it not been for the grant. One-third of those organizations also reported having more volunteers than they would have had if it were not for the grant. The two most common reasons given for the increase in volunteers were greater notice of the organizations from attention in the media and increased inroads into the faith community. Other reasons given were increased collaborations and relationships with other organizations, opening up new places to recruit.

One of the ways that faith-based community organizations can increase their outreach, capacity to serve people, and resources is by gaining positive attention from other grassroots organizations, the media, and endorsements from public officials. Of the funded organizations that responded to the second post-assessment survey, 28.6 percent responded that they had received greater exposure from grassroots groups because of the grant. This exposure came from churches, other domestic violence organizations, e-mail and Listservs, informational gatherings, justice centers, funders, and support groups. Two-thirds responded that they had received greater media exposure because of the grant. The most frequently cited source was articles in local newspapers, followed by radio interviews and ads. Other sources included Web sites, free ads in newspapers, local television, and mailings and newsletters. One-third of the organizations also reported that they had received endorsements from public officials that they would not have received had it not been for the receipt of this grant. The main sources of endorsements were community leaders and other organizations. Other sources included Catholic Charities, the Salvation Army, other churches, and law enforcement.
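The pre/post comparison underlying a reported mean change (such as the staff-size change of .630) can be illustrated with a paired t statistic on hypothetical data; the counts below are invented for the sketch and are not the study's data.

```python
from statistics import mean, stdev
from math import sqrt

# Hypothetical pre- and post-assessment FTE staff counts for ten funded
# organizations (the evaluation reported a mean staff change of .630).
pre = [2, 3, 2, 1, 4, 2, 3, 2, 2, 3]
post = [3, 3, 3, 2, 4, 3, 4, 2, 3, 4]

# Paired design: analyze the within-organization differences.
diffs = [b - a for a, b in zip(pre, post)]
n = len(diffs)

# Paired t statistic, df = n - 1; compare against a t table for the p-value.
t = mean(diffs) / (stdev(diffs) / sqrt(n))

print(f"mean change = {mean(diffs):.3f}, t = {t:.3f} (df = {n - 1})")
```

With a larger t (relative to the critical value at df = n - 1), the observed mean change is less likely to be a chance fluctuation, which is the logic behind the significance levels reported for staff and technology changes.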

Enhancing and Measuring Organizational Capacity 7

For the funded organizations, the average annual budget at the time of the pre-assessment survey was $117,929; by the time of the second post-assessment survey, it had increased to $224,309, and it did not diminish after the OVW funding ended. Again, this time period is 6 to 12 months after the RPP funding ended, indicating that these organizations were able to garner new and expanded resources to support their work.

Sustainability, however, is more complicated and multifaceted than this. Arguably, all of the capacity measures are components of an organization’s ability to sustain positive changes. To simplify analysis, the capacity instrument was compressed into measures for each capacity component using a scoring grid. Th ese descriptive categories were then turned into numeric scores ranging from 1 (low) to 3 (high). For the organizations that responded to all three of the capacity assessments, there were minor shifts in organiza-tional capacity. Th e overall capacity scores moved from 2.007 at pre- assessment to 2.052 at the fi rst post-assessment survey, but then dropped again at the time of the second post-assessment survey to 1.955. Th e rise between the pre-assessment and fi rst post-assessment can be attributed primarily to improvements in management and operations, board of directors and governances, and resources. Between the pre-assessment and second post-assessment surveys, there were continued increases in management and operations and in board of directors and governance. However, there were drops in each of the other areas. Th e largest drops between the pre- assessment and second post-assessment surveys were in program planning and implementation and key allies.

Conclusions
The RPP is an example of the positive, though in this case modest, effects of policy innovation and diffusion, with some caveats. The most obvious measure of sustainability is longevity and actual change maintained or expanded even after funding ends. The changes that were made as a result of the RPP, at least in the short run, were directly attributable to the infusion of monies and not to any other types of support given to the subawardees by the intermediaries. This should not be surprising. Given the work that has been done on sustainability for OVW grantees (see, e.g., Bowen, Gwiasda, and Brown 2004), the Rural Pilot Program design was not likely to induce significant, sustained changes far beyond the grant period. The initiative was hampered by (at least) three basic structural problems: the length of the funding period for the pilot study, the length and structure of the RFP process for subawardees, and the type of provision of technical assistance.

While the grants themselves were small in the RPP, the time frame allowed to spend the monies was 6 to 12 months, depending on the intermediary organization. Both capacity building and enhancing the sustainability of efforts are difficult tasks, not likely to be achieved in such a short time period. Further, the structure of the program did not allow the intermediaries extra time after the grant period ended. This meant that those organizations that were unable to spend their funds in that time period had to return the monies or ask for no-cost extensions. In turn, because of the intermediary model, this meant that the intermediaries had to either (1) plan in advance for this possibility and ask for no-cost extensions to oversee the no-cost extensions of the subawardees, or (2) provide this oversight for subawardees on a volunteer basis. Each of these occurred in the RPP. Agencies must commit to longer-term funding if they intend for the programs they support to show improvements in outputs and outcomes, and if they hope for these programs to be sustainable in the long term.

Finally, there is tremendous potential and utility in using a simplified self-study to measure organizational capacity compared with more involved instruments and site visits, which require, by their nature, greater outlays of resources. Once self-study data are collected, they have a variety of uses. Such a simplified self-study instrument may increase the availability of data for administrators of grant programs. These data would allow for more information about program impacts, which is useful for justifying the continued existence or expansion of a program, as well as for making midcourse corrections or changes to future programs. Information such as that contained in our self-study obtained early in the life cycle of a funding program can also help identify grantee needs that can be addressed through technical assistance provision. There are, of course, limitations to such an approach and situations in which a longer instrument and/or case studies should be used. Where subject knowledge is limited, a simplified capacity instrument will not capture what is actually happening on the ground. This problem can be remedied by the simple addition of links with definitions and follow-up information, one-on-one interviewing for just these sections, or, if the topic is not critical to the purpose at hand, omitting the section altogether.

Acknowledgments
This project was supported by Grant no. 2005-IJ-CX-0050 awarded by the National Institute of Justice, U.S. Department of Justice. Points of view in this document are those of the author and do not necessarily represent the official position or policies of the U.S. Department of Justice.

Notes
1. The evaluation team included Andrew Klein, PhD, Advocates for Human Potential, co–principal investigator; Mitchell Brown, PhD, Auburn University and Institute for Community Peace, co–principal investigator; Mark Small, JD, PhD, Clemson University, co–principal investigator; Rob Fischer, PhD, Case Western Reserve University; and Debby Tucker, MA, National Center on Domestic and Sexual Violence.

2. This area was later expanded, but the results of this expansion had no bearing on the evaluation results.

3. This organization was located in New Orleans and was funded by rural monies as part of the administration's focus on supporting efforts on the Gulf Coast after Hurricane Katrina.

4. These percentages are based on 51 funded organizations and 160 unfunded organizations. There was no indication of systematic capacity differences in the organizations that completed the second post-assessment survey compared to the first post-assessment survey.

5. In all, 21 funded organizations completed the second post-assessment survey, compared with 22 unfunded organizations.



6. The median figures are $103,631 and $183,932, and the ranges are $34,000 to $224,988 and $65,000 to $524,000, respectively.

References
Banaszak-Holl, Jane, Susan Allen, Vincent Mor, and Thomas Schott. 1998. Organizational Characteristics Associated with Agency Position in Community Care Networks. Journal of Health and Social Behavior 39(4): 368–85.

Bowen, Linda K., Vicky Gwiasda, and M. Mitchell Brown. 2004. Engaging Community Residents to Prevent Violence. Journal of Interpersonal Violence 19(3): 356–67.

Brown, Mitchell, and Kathleen Hale. 2011. State-wide Assessment of Alabama Women 65+: Organizations, Practices and Participant Perspectives. Final Report to the Alabama Women's Commission. http://www.alwomenscommission.com/Final_Grant_Report.pdf [accessed February 21, 2012].

Center for Mental Health in Schools. 2001. New Initiatives: Considerations Related to Planning, Implementing, Sustaining, and Going-to-Scale. Los Angeles: University of California, Los Angeles.

Cutler, Ira. 2002. End Games: The Challenge of Sustainability. Baltimore: Annie E. Casey Foundation.

Gomez, Rafael, and Eric Santor. 2001. Membership Has Its Privileges: The Effect of Social Capital and Neighborhood Characteristics on the Earnings of Microfinance Borrowers. Canadian Journal of Economics/Revue Canadienne d'Economique 34(4): 943–66.

Hale, Kathleen. 2011. How Information Matters: Networks and Public Policy Innovation. Washington, DC: Georgetown University Press.

Hall, John Stuart. 2004. Faith, Hope and Charitable Choice in Arizona: 2003. Albany, NY: Rockefeller Institute of Government.

Hayes, Cheryl D., and Erika Bryant. 2002. Sustaining Comprehensive Community Initiatives: Key Elements for Success. New York: The Finance Project.

Imig, Douglas Rowley. 1992. Resource Mobilization and Survival Tactics of Poverty Advocacy Groups. Western Political Quarterly 45(2): 501–20.

Kaufman, Martha. 2002. Building Sustainability in Demonstration Projects for Children, Youth, and Families. Washington, DC: Institute for Educational Leadership.

Klein, Andrew, and Douglas Wilson. 2005. Domestic Violence Shelter and Advocacy Services. Report prepared by Advocates for Human Potential/BOTEC Analysis Corporation. Providence: Rhode Island Justice Commission.

Kubisch, Anne C., Patricia Auspos, Prudence Brown, Robert Chaskin, Karen Fulbright-Anderson, and Ralph Hamilton. 2002. Voices from the Field II: Reflections on Comprehensive Community Change. Washington, DC: Aspen Institute.

McKinsey & Company. 2001. Effective Capacity Building in Nonprofit Organizations. Washington, DC: Venture Philanthropy Partners.

Miller, Paul. 2004. Rural Challenges to the Faith-Based Initiative: The Public Sector in Montana. In Proceedings from the Rural Sociological Society Meeting 2004: Rural Challenges to the Faith-Based Initiatives. Columbia, MO: Rural Sociological Society.

Minkoff, Debra C. 1993. The Organization of Survival: Women's and Racial-Ethnic Voluntarism and Activist Organizations, 1955–1985. Social Forces 71(4): 887–908.

———. 1999. Bending with the Wind: Strategic Change and Adaptation by Women's and Racial Minority Organizations. American Journal of Sociology 104(6): 1666–1703.

Office of Violence Against Women (OVW). 2005. RFP, Rural Domestic Violence and Child Victimization Enforcement Grant Program Special Initiative, Faith-Based and Community Organization Pilot Program. Application #2005-93556-MA-IJ. National Institute of Justice.

Schorr, Lisbeth, Kathleen Sylvester, and Margaret Dunkle. 1999. Strategies to Achieve a Common Purpose: Tools for Turning Good Ideas into Good Policies. Washington, DC: Institute for Educational Leadership.

Staggenborg, Suzanne. 1988. The Consequences of Professionalization and Formalization in the Pro-Choice Movement. American Sociological Review 53(4): 585–605.

Wolff, Tom. n.d. Coalition Building Tip Sheets. http://www.tomwolff.com/healthy-communities-tools-and-resources.html#pubs [accessed February 3, 2012].

Wolpert, Julian, and Thomas A. Reiner. 1984. Service Provision by the Not-for-Profit Sector: A Comparative Study. Economic Geography 60(1): 28–37.


Appendix Capacity Self-Assessment Items

Question — Phase I / Phase II / Phase III

Does your organization have a 501(c)(3) status? √ √ √
Does the organization have formal policies and procedures for accounting? √ √ √
Before applying for this grant, have you ever applied for federal grants? √
Before applying for this grant, have you ever applied for local government grants? √
Before applying for this grant, have you ever applied for state grants? √
Have you conducted asset mapping? √ √ √
Have you been able to obtain funding beyond this grant that you most likely would not have otherwise obtained had you not first received these funds?
If you were able to obtain funding beyond this grant, how much? √
Does your organization have a board of directors? √ √ √
What are the backgrounds of the people who serve on the board: community members? √ √ √
What are the backgrounds of the people who serve on the board: volunteers? √ √ √
What are the backgrounds of the people who serve on the board: community leaders? √ √ √
What are the backgrounds of the people who serve on the board: field experts? √ √ √
What are the backgrounds of the people who serve on the board: lawyers? √ √ √
What are the backgrounds of the people who serve on the board: funders? √ √ √
What are the backgrounds of the people who serve on the board: clergy, ministers, etc.? √ √ √
What are the backgrounds of the people who serve on the board: staff? √ √ √
What are the backgrounds of the people who serve on the board: other? √ √ √
If other, please specify. √ √ √
How many people are on the board? √ √ √
Describe the role and responsibilities of the board. √ √ √
Are board members required to financially support the organization as part of their board responsibilities? √ √ √
If board members are required to financially support the organization, what are the specific requirements? √ √ √
Is this year's budget greater than, equal to, or less than your last fiscal year? √
What is your overall budget for the current fiscal year? √ √
Organization's annual budget at time of applying for RPP funding. √
As of today, are you providing domestic violence services? √
Does your organization have e-mail access for employees? √ √ √
Since the onset of this grant, have you received an increase in endorsements from other organizations or community leaders than you would have if you had not obtained the grant?
If you received an increase in endorsements, what kind? √
Since the onset of this grant, have you received an increase in exposure to grassroots groups more than you would have if you had not obtained the grant?
If you received an increase in exposure to grassroots groups, what kind? √
Since the onset of this grant, have you received an increase in exposure in the media than you would have if you had not obtained the grant? √
If you received an increase in media exposure, what kind? √
Is this a faith-based organization? √
From what faith tradition? √
If you applied for federal funds before applying for this grant, did you receive them or were you denied? √
Other than this grant, does this organization receive funding from: businesses? √ √ √
If the organization receives funding from businesses, what type? √ √ √
Other than this grant, does this organization receive funding from: fee-for-service/products? √ √ √
If the organization receives funding from fee-for-service/products, what type? √ √ √
Other than this grant, does this organization receive funding from: foundations? √ √ √
If the organization receives funding from foundations, what type? √ √ √
Other than this grant, does this organization receive funding from: government? √ √ √
If the organization receives funding from government, what type? √ √ √
Other than this grant, does this organization receive funding from: other sources? √ √ √
If the organization receives funding from other sources, what type? √ √ √
Do you have a fundraising plan? √ √ √
Does the organization have formal policies and procedures for human resources and/or personnel? √ √ √
Which intermediary organization did the organization apply for funding through? √
Have you continued to receive technical assistance from your intermediary organization after the end of your grant period? √
Does the organization have formal policies and procedures for information management? √ √ √
Since the onset of the grant, have you received more sources of in-kind support than you would have if you had not obtained the grant? √
If you applied for local government funds before applying for this grant, did you receive them or were you denied? √
How are your programs and strategies developed? Using a local model? √ √ √
How are your programs and strategies developed? Using a national model? √ √ √
For national models, how have you tailored the program to local needs and strengths? √ √ √
If you have attracted more in-kind resources, what kind? √
Have you conducted a needs assessment? √ √ √
Since the onset of the grant, have you recruited from more sources of volunteers than you would have if you had not received the grant? √
Do you have an operating budget? √ √ √
What year did your organization begin? √
What other outside partners do you have to support your work? If business, describe what they do to help support your organization. √ √ √
What other outside partners do you have to support your work? If community groups, describe what they do to help support your organization. √ √ √


Appendix Capacity Self-Assessment Items (Continued)

Question — Phase I / Phase II / Phase III

What other outside partners do you have to support your work? If faith organizations, describe what they do to help support your organization. √ √ √
What other outside partners do you have to support your work? If government agencies, describe what they do to help support your organization. √ √ √
What other outside partners do you have to support your work? If non-profit organizations, describe what they do to help support your organization. √ √ √
Does your organization do any kind of prevention work? √ √ √
Does the organization have formal policies and procedures for recruitment and hiring? √ √ √
Do you keep data on the services you provide, your clients, and/or outcomes? √ √ √
If you keep data on the services you provide, your clients, and/or outcomes, describe what kind of data you keep. √ √ √
How many paid full-time equivalent (FTE) staff members does your organization currently employ? √ √ √
Does your organization have formal policies and procedures for staff training and development? √ √ √
Are your strategies long term (over 5 years)? √ √ √
Are your strategies medium term (1–5 years)? √ √ √
Do you have a strategic plan? √ √ √
Are your strategies short term (less than a year)? √ √ √
If you applied for state funds before applying for this grant, did you receive them or were you denied? √
Do you have a sustainability plan? √ √ √
What method of receiving technical assistance did you prefer most? Why? √
If you have continued to receive technical assistance after the discontinuation of funds, how often, and what type of assistance? √
Do you have a theory of change? √ √ √
Please describe your theory of change or e-mail formal document to . . . √ √ √
Has your theory of change been captured in a formal organizational document? √ √ √
Describe the activities that volunteers perform. √ √ √
Describe the content of the training. √ √ √
How many hours do volunteers receive training? √ √ √
How many volunteers has your organization had in the past 12 months? √ √ √
How are your volunteers recruited? Through word of mouth? √ √ √
How are your volunteers recruited? Through church/faith activities? √ √ √
How are your volunteers recruited? Through newspaper? √ √ √
How are your volunteers recruited? Through radio? √ √ √
How are your volunteers recruited? Through television? √ √ √
How are your volunteers recruited? Through the Internet? √ √ √
How are your volunteers recruited? Through other? √ √ √
Describe the other ways volunteers are recruited. √ √ √
Do you have volunteers? √ √ √
Who supervises the volunteers? √ √ √
Do volunteers receive training? √ √ √
Does your organization have a Web site? √ √ √