
Learning Teams and Virtual Communities of Practice: Managing Evidence and Expertise Beyond the Stable State

Yekutiel Sabah1 and Patricia Cook-Craig2

Abstract

In the past decade, the Israeli Ministry of Social Affairs has been engaged in an ongoing effort to change the capacity of social service organizations and social workers across the country to use and create knowledge in order to achieve the best outcomes for the people they serve. Although there is an ever-growing mandate in Israel to demonstrate outcomes and use effective strategies, social workers have historically experienced unique challenges in accessing and assessing available evidence-based practices when they are available. The first step to addressing these challenges, the intra-organizational phase, was to design, implement, and test a model of organizational learning designed to teach social workers how to use learning to change practice. The second step, the inter-organizational phase, was the introduction of virtual communities of practice as a tool to support workers in the acquisition and dissemination of new knowledge. This paper presents a case study of this effort, including a description of the development and implementation of the two phases and an agenda for future research.

Keywords

organizational learning, virtual communities of practice

The professional commitment of social workers is to the needs

of their clients. As such, they are called upon to acquire and

possess knowledge about how to address these needs effec-

tively in the context of their society and in the organizations

within which they work. Yet, human needs evolve and social

policies change. Therefore, practitioners must frequently learn

new strategies for overcoming the challenges they face in a

changing environment (Klein & Bloom, 1995; Roberts &

Greene, 2002). This is particularly the case in Israel where the

state has been undergoing constant change (Kalekin-Fishman,

2006) and where social services have evolved dramatically

since the 1950s (Ronen, 2004; Weiss, Spiro, Sherer, &

Korin-Langer, 2004). Furthermore, Israeli social services are

facing a growing demand for transparency. Like other coun-

tries, they are increasingly asked to demonstrate that their inter-

ventions are informed by effectiveness research (Thyer and

Kazi, 2004). Consequently, the capacity of Israeli practitioners

to constantly acquire the evidence-based knowledge they need

in order to bring about constructive changes in the people they

serve is becoming essential. In this paper, we will argue that, in

the Israeli context, efforts to base practice on existing evidence

meet particular obstacles. We will then present a case study of a

decade long effort to address this challenge by developing an

organizational learning (OL) methodology for social services

and then putting it into practice, first at the agency level and

subsequently, at the interorganizational level, through Virtual

Communities of Practice (VCoP’s). The case study utilizes

existing empirical research and analysis of qualitative and

quantitative data that supports the model. The paper concludes

with an outline of a future research agenda designed to increase

the support for the use of this methodology.

A Growing Demand for a Transparent Body of Knowledge

The formal beginning of the social work profession in Israel is

often associated with the establishment of a Social Work

Department by the Jewish community of Palestine in 1931.

Since those early days, social work in Israel has undergone pro-

found changes (Weiss et al., 2004). Many of those changes are

associated with a growing demand for transparency and effec-

tiveness; that is, social services are increasingly asked to reveal

the distinctive body of professional knowledge they are using

and to demonstrate its soundness. It is, in part, the result of the

rapid legalization of the social work practice in Israel, a process

that culminated with the enactment of the Social Workers Law

1 Israeli Ministry of Social Affairs and Hebrew University
2 University of Kentucky

Corresponding Author:

Yekutiel Sabah, Israeli Ministry of Social Affairs and Hebrew University,

2 Kaplan St., Jerusalem 91008

Email: [email protected]

Research on Social Work Practice 20(4) 435-446
© The Author(s) 2010
Reprints and permission: sagepub.com/journalsPermissions.nav
DOI: 10.1177/1049731509339031
http://rswp.sagepub.com


in 1996. The law recognizes the clients’ right to information

about the care given to them and grants them the capacity to

monitor the soundness of social workers' professional decisions.

Two years later, the Israeli Parliament adopted a Freedom of

Information Law that further reinforced the right of the

public to examine the knowledge used by practitioners (Rabin

& Peled, 2005). Furthermore, although Israel has been experi-

encing a long period of economic growth, the government

expenditures per capita are declining and a tight lid is kept

on social spending (Swirski & Konor-Attias, 2007). This implies

that in order to guarantee the funding of their activities, social

services are increasingly asked to provide evidence of their

capacity to deal efficiently with social problems. The introduc-

tion of Information and Communication Technologies (ICTs)

to social work practice has also contributed to the demand

for transparency. Although this process has faced many barriers

and while the full application of ICTs in social work has yet to

come (Parkin, 2000; Tregeagle & Darcy, 2007), the Internet

already facilitates the access to professional knowledge by

practitioners and by nonprofessionals and it is changing clients’

expectations in terms of transparency. This is the case in Israel

wherein over 50% of the population is using the Internet (Inter-

net World Stats, 2006). Welfare recipients, funding organiza-

tions, and the public at large are all challenging the validity

of experts' know-how as the determining knowledge (Domi-

nelli, 2004) and they require this knowledge to be explicit and

accessible to all.

It seems therefore that Israeli social workers, like their col-

leagues around the world, need a strong, updated, and certified

body of knowledge to base their practice upon. They have to

adopt an evidence-based practice (EBP) orientation that is the

‘‘conscientious, explicit and judicious use of current best evi-

dence in making decisions about the care of individuals’’

(Sackett, Straus, Richardson, Rosenberg, & Haynes, 1997,

p. 2). More specifically, it seems they need to follow scholars,

such as Straus, Richardson, Glasziou, and Haynes (2005), who

suggest that EBP is a process that includes tracking down the

best evidence with which to answer clients' needs, appraising

its validity and integrating that critical appraisal with the prac-

titioner’s clinical expertise and with the client’s characteristics

and circumstances.

Applying EBP in Israel

It is not in the scope of this paper to join the fierce discussion

between the supporters of the EBP movement and their oppo-

nents (see the September 2007 issue of Research on Social

Work Practice as well as the editorial of the fall issue of the

main Israeli journal of social work of the same year [Slonim-

Nevo, 2007]). It seems that supporters and opponents agree that

operationalizing EBP is a complex issue and that ‘‘there has

been limited success in moving from academic discussion to

engaging social workers in the process of implementing EBP

in practice’’ (Regehr, Stern, & Shlonsky, 2007, p. 408). We

do state however that, although struggling against factors that

inhibit social work’s ability to use knowledge in practice is not

particular to Israel (see Cnaan and Dichter, 2008), Israeli social

services face additional obstacles.

First, there is a lack of evidence-based knowledge due partly

to the ‘‘unvarying characteristic of social services in Israel . . . its ever-changing, dynamic nature’’ (Ronen, 2004, p. 113). As

stated earlier, the Israeli society and its social services have

undergone profound and rapid changes since the state’s inde-

pendence (Loewenberg, 1998). Although Israel became a wel-

fare state similar to western industrialized nations, it had

undergone exceptional mutations. Since 1948, the Israeli pop-

ulation grew almost tenfold; 40% of this growth as a result of

immigration from more than 100 countries. The GNP per capita

is now six times higher than in 1950 (Israel Central Bureau of

Statistics, 2008). In addition, Israel has been involved in

numerous armed conflicts with its neighbors. Those changes

imply that Israeli practitioners are confronted with professional

problems associated with issues that are specific to the society

within which they work such as, the rehabilitation of holocaust

survivors, the continuous integration of millions of immigrants

and the treatment of civilians under continuous terrorist attacks

(see Ronen, 2004). It entails that an indigenous empirical base

for professional practice is vital even though the pace of change

is often more rapid than the rate of the creation of relevant

knowledge. Moreover, the knowledge base of Israeli social

work and education is imported, almost exclusively from North

America and requires substantial revisions and adaptations

(Spiro, Sherer, Korin-Langer, & Weiss, 1998). The paucity of

the research relevant to the Israeli context is aggravated by a

chronic shortage of research funds in Israel as well as limited

time resources for practitioners to carry out research activities

(Auslander, 2000). Furthermore, access to foreign knowledge,

such as books, journals, and Internet sites, is limited because

many of the practitioners are not comfortable with the English

language (Auslander, 2000). This is perhaps why Rosen (1994),

who studied the knowledge sources used by Israeli practi-

tioners, found that ‘‘almost no use was made of research-

based knowledge’’ (p. 569). Twelve years later, Eagelstein,

Teitelbaum, and Shor (2007) found a similar pattern among

Jerusalem municipality’s social workers. It appears that the

pace of social and professional change, the idiosyncrasy of the

problems, the insufficiency of local research, and the language

barriers imply that evidence is not easily ‘‘transferable’’ and

often simply not available, making EBP harder to implement.

These challenges fueled the Israeli Ministry of Social

Affairs to lead a decade long effort intended to (a) facilitate and

encourage the use of evidence whenever available and relevant;

and (b) promote the invention of actionable knowledge

(Argyris, 1993), whenever research-based evidence is lacking.

Actionable knowledge as defined by Argyris (1993) is knowl-

edge that is systematically gathered, guides behavior (in this

case professional behavior), and that is deemed to have validity

to solving the problem for which it is created. This endeavor

began in 1998 as an attempt to develop an intra-OL methodol-

ogy for human services and to implement it among staff in

social agencies. Seven years later, in Phase 2 of that enterprise,

the Ministry started to establish VCoP's in an effort to promote


learning and knowledge creation across agencies. The metho-

dology we used to analyze this case and the two-stage effort

is described and evaluated in the following sections.

Methodology

The impact of the two subsequent phases on the development

of learning teams and the use of knowledge and evidence will

be explored using a case-study approach. This approach allows

for the in-depth examination of a project (Yin, 2003). The case

study approach reflects a strategy in which quantitative or qua-

litative methods can be applied to the analysis of data on a par-

ticular chosen ‘‘case,’’ which can be either an individual

instance of a phenomenon or a population of cases representing

that phenomenon (Stake, 2000). The data collected and

used to analyze this case were subject to and received institu-

tional review board (IRB) approval.

The case selected here is the Ministry-led effort to engage in

the two-stage effort to promote both the use of evidence and the

creation of actionable knowledge through the use of OL and

VCoP’s. In the first phase of this effort, the case is based on

a pilot project that tested the OL model in eight sites in Israel

(four experimental and four control). The sites were Israeli

after-school programs that were recruited to engage in develop-

ing a learning team and implementing the model. Thirty-four

staff members participated in the evaluation of the model (see

Orthner, Cook, Sabah, & Rosenfeld, 2006).

Data in this phase were analyzed by reviewing findings from

an empirical study of the pilot project and from analysis of qua-

litative data from semistructured interviews conducted with

participants from after-school program sites. The review of

empirical findings includes findings from previous studies on

the impact of OL on the eight Israeli sites (and comparisons

with eight U.S. sites) and new analysis of secondary data from

the Israeli sites. t tests were used to compare longitudinal mea-

sures of implementation of OL, staff satisfaction, and program

outcomes. The analysis of qualitative data resulted in themes

that emerged from implementation of this first stage. Interview

responses that are representative of the experience of imple-

menting steps of the OL model are included in italicized sec-

tions below.
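To make the quantitative part of this strategy concrete, the following is a minimal sketch of the kind of paired longitudinal comparison described above (baseline vs. 18-month scores on an OL measure). It is an illustration only, not the authors' analysis code; the file name and column names (ol_staff_scores.csv, ol_baseline, ol_18m) are hypothetical.

```python
# Minimal sketch of a paired (longitudinal) t test such as the ones reported below.
# Hypothetical file and column names; not the authors' actual data or code.
import pandas as pd
from scipy import stats

scores = pd.read_csv("ol_staff_scores.csv")  # one row per staff member

# Keep staff measured at both waves, mirroring a repeated-measures design.
paired = scores.dropna(subset=["ol_baseline", "ol_18m"])

t_stat, p_value = stats.ttest_rel(paired["ol_baseline"], paired["ol_18m"])
print(f"t = {t_stat:.2f}, df = {len(paired) - 1}, p = {p_value:.3f}")
```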

In the second phase of this effort, the case is based on sec-

ondary usage data from 15 VCoP’s. VCoP’s were established

to support intra-OL on topics of interest to Israeli social work-

ers. Social workers were recruited to join the VCoP’s by using

a variety of strategies, including encouragement by their

organizations and word of mouth through networking with

other social workers. In its first year, 2,833 workers joined one

or more of the VCoP’s.

The case was analyzed using secondary data that are tracked

as part of the evaluation of VCoP’s. The secondary data were

used to explore patterns of usage among individuals who

accessed the VCoP’s in their initial year of operation. Usage

patterns include activities such as VCoP hits and initiating or

reacting to a discussion forum. A description of Phase 1 of the

Ministry project follows.

Phase 1

The Ministry effort to develop an OL methodology for social

services began in the framework of the ‘‘National Plan for Chil-

dren at Risk’’ launched by the Israeli government in the late

1990s. The premise of that program was that in order to address

effectively the needs of Israeli children at risk, a ‘‘knowledge

infrastructure has to be built’’ (Ministry of Social Affairs,

1998, p. 96). Because of the paucity of relevant research and

the language barriers to its use mentioned earlier, the plan was

to generate knowledge for action through an OL process. The

assumption was that a structured OL methodology would facili-

tate the hybridization of knowledge (Gredig & Sommerfeld,

2008), that is, the combination of knowledge from different

sources. The initiators were well aware that OL was initially

developed in the corporate sector to improve the responsive-

ness of business organizations to innovations (Argyris &

Schon, 1978; Senge, 1990) and their primary goal was there-

fore to develop an OL methodology for social services (Sabah

& Rosenfeld, 1999).

OL in Social Services. OL has been the subject of research for

more than three decades (Arthur & Aiman-Smith, 2001) and

although several attempts were made to summarize the findings

(Fiol & Lyles, 1985; Huber, 1991; Levitt & March, 1988), there

is yet no accepted definition of OL (Berthoin-Antal, Dierkes,

Child, & Nonaka, 2001; Garvin, 2000). In our context, we

found Crossan, Lane, and White's (1999) view of OL, as

a process that involves assimilating new learning and using

what has been learned, especially pertinent. There are clearly

two separate foci in the literature (Chaskin, 2008; Easterby-

Smith & Araujo, 1999). The first emphasizes the structural and

technical aspect of OL that is mostly the learning mechanisms

that allow practitioners to exchange information and to reflect

on behalf of the organization (Argyris & Schon, 1996). In the

social work profession, the periodic meeting between a

caseworker and his supervisor in the agency is a good example

of such a mechanism (Brashears, 1995; Kadushin & Harkness,

2002). The other focus refers to the cultural and social aspect of

OL, which are the norms and values that support learning, and

to their linguistic, ritual, narrative, and symbolic reflections

(Cook & Yanow, 1993). Argyris and Schon’s (1974) ‘‘collec-

tive theories of action’’ and Senge’s (1990) ‘‘mental models’’

and ‘‘shared vision’’ are good examples of that aspect of the

early literature.

More recently, Orthner et al. (2006) and Orthner, Akos,

Cooley, and Charles (2007) further operationalized the struc-

tural and cultural dimensions of OL by identifying four cultural

dimensions and four structural ones. The cultural facets are (a)

innovation: beliefs that support getting, sharing, and using new

ideas to promote organizational work, (b) safety: beliefs that

promote freedom of discussion and the ability to test ideas that

may or may not work out, (c) goal-centered: beliefs that

encourage developing goals and setting objectives to achieve

them, and (d) leadership: an administrative philosophy that

encourages new ideas. The structural aspects refer to (a)


collaboration: staff regularly meet together to learn from each

other, (b) planfullness: staff set measurable outcomes to be

achieved and make sure plans and activities link to outcomes,

(c) diffusion: staff actively share their program successes, and

(d) infrastructure: organizational resources and time are set

aside to promote learning (Sabah & Orthner, 2007, p. 243).

Although the concept of OL was thoroughly developed and

put into practice in the corporate sector (Argyris & Schon,

1974; Senge, 1990), it has been slow to penetrate the nonprofit

sector (Finger & Brand, 1999; Smith & Taylor, 2000). Several

characteristics of the public sector make the translation of OL

to social services complex. In many countries, as in Israel,

social services are usually local monopolies that do

not face competition. There is no real economic incentive to

learn to do better as there is in the corporate sector. Moreover,

Israeli social services are usually public sector bureaucracies.

As such, they are often characterized by goal ambiguity, a strict

division of labor, an extensive fragmentation, and a hierarchi-

cal structure (Rainey & Bozeman, 2000; Taub Center for Social

Policy Studies, 2006). Those attributes may obstruct the free

flow of information and knowledge between colleagues and

impede learning. Furthermore, social services typically operate

with heavy caseloads, without sufficient resources and under

constant pressure to solve urgent problems. In this atmosphere,

learning, which is a long-range effort, is harder. The developers

of the OL methodology in Israel realized that those obstacles

imply that its implementation in social services is a complex

process. It entails a gradual and iterative implementation of a

structured methodology in the agency.

The OL Methodology. A first version of an OL methodology for

social services was developed and implemented in eight after-

school program sites across Israel. It was developed by officials

and practitioners who were inspired by the experiential

approach to learning in teams (Kayes, Kayes, & Kolb, 2005)

and by Schon’s (1983) work on reflective practice. It was first

presented at a binational seminar in Berlin in 2000 and then

published in Hebrew (Sabah & Rosenfeld, 2000). The metho-

dology was later applied in more than 60 agencies and govern-

mental units in Israel and has been continuously refined. It has

been adapted and implemented in after-school programs in

North Carolina and Israel, evaluated in the framework of a

research grant from the Bi-National Scientific Foundation

(Orthner et al., 2006) and later adapted to school settings

(Sabah & Orthner, 2007). Below is a detailed description of the

most recent adaptation of the model as it has been applied in

Israeli social services. A few illustrations of the application of the

steps in social service settings in Israel have been included.

Step 1: The assessment of the agency’s OL capacity. An initial

assessment of the capacity of the social agency to implement

an OL methodology is important for two reasons. First, the

assessment measures the extent to which the organization has

in place the structural and the cultural components needed to

successfully engage in learning. This provides guidance as to

which structural or cultural components are weak or missing

and must be developed within the organization. Second, the ini-

tial assessment also provides a first opportunity for staff and

leaders to reflect on the ways in which their agency promotes

or inhibits learning. This is an important preliminary step

toward strengthening the organizations’ orientation toward

learning.

As a means of achieving this first step, an empirical tool was

developed in conjunction with the application of the methodol-

ogy in both the United States and Israel, which can be used to

assess the level of an agency’s readiness for OL. The Organiza-

tional Learning Capacity Assessment is a 24-item, eight-factor

assessment instrument that measures the extent to which an

organization has the necessary culture and structure to support

learning. The instrument includes subscales on each of the four

cultural dimensions and four structural dimensions associated

with OL and described above (see Orthner et al., 2006, 2007,

for a full description of the measure and its psychometric

properties).
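As an illustration of how such an instrument can be scored, the sketch below computes a mean score per dimension from a 24-item response set. The grouping of three items per subscale and the 1-5 response scale are assumptions made for the example; the actual item wording and scoring rules are documented in Orthner et al. (2006, 2007).

```python
# Illustrative scoring of the eight OL dimensions from a 24-item instrument.
# Item-to-subscale mapping and the 1-5 scale are assumptions for this sketch.
from statistics import mean

SUBSCALES = {
    # cultural dimensions
    "innovation": ["q1", "q2", "q3"],
    "safety": ["q4", "q5", "q6"],
    "goal_centered": ["q7", "q8", "q9"],
    "leadership": ["q10", "q11", "q12"],
    # structural dimensions
    "collaboration": ["q13", "q14", "q15"],
    "planfulness": ["q16", "q17", "q18"],
    "diffusion": ["q19", "q20", "q21"],
    "infrastructure": ["q22", "q23", "q24"],
}

def score_respondent(responses):
    """Mean item score per OL dimension for one staff member."""
    return {name: mean(responses[item] for item in items)
            for name, items in SUBSCALES.items()}

# Example: a respondent answering every item with 4 on a 1-5 scale.
print(score_respondent({f"q{i}": 4 for i in range(1, 25)}))
```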

Step 2: The formation of a learning team. The first practical

step is the selection of a learning team. The team members will

be the forerunners in the future implementation of the learning

processes across the agency. It is therefore important that mem-

bers commit themselves to a prolonged process on behalf of the

agency. This commitment must be based on clear expectations

and it is the role of the consultant to explain the methodology

and to clarify the rules.

Step 3: The formulation of a learning question. The next step is

the formulation of a ‘‘learning question.’’ A learning question

is about an issue that imperatively needs to be addressed in

order to improve the practice of the organization, the quality

of the service it offers, and the fulfillment of its mission. It is

about an unresolved issue in the practice of the members of the

learning team involving a professional challenge that cannot be

addressed solely by looking at the existing evidence. It also

needs to address a core professional or managerial issue in such

a way that its resolution will have a strong impact and facilitate

the future dissemination of learning mechanisms and norms

throughout the agency. A ‘‘good’’ learning question does not

allude to a particular solution but rather encourages the learn-

ing team to think of alternative solutions to a defined problem.

It typically has the following format: ‘‘How will we . . . in

order to achieve better results on behalf of our clients?’’

In practice, we have learned that the inclination of practi-

tioners to join a learning team depends greatly on which learn-

ing question is chosen and that Step 2 and Step 3 are

intertwined. Typically, the learning team will include 8 to 15

capable practitioners who directly address the learning ques-

tion in their practice and seek to focus on the professional chal-

lenge collaboratively. We also found that often those

preliminary steps entail long negotiations between colleagues

and between managers and subordinates. Although this dialo-

gue is mostly about learning procedures and questions, it is

in fact an additional preliminary round of the reflection process


that will continue throughout the process. The process of devel-

oping a learning question is illustrated in the next example:

The majority of the residents of this town near Tel-Aviv are

ultra-orthodox Jews. They live in closed communities led by

rabbis and religious leaders and are reluctant to refer to secu-

lar external social services whenever necessary. Typically, they

will apply for professional help at the municipality services for

children at risk when it is ‘‘too late’’, the efforts to solve the

problem ‘‘intra muro’’ failed and there is an immediate danger.

The learning question chosen by the learning team was there-

fore: ‘‘How do we get legitimacy from the ultra-orthodox lead-

ership in order to address effectively the needs of the children at

risk in the different communities?’’

Step 4: The search for actionable knowledge. In order to address

the learning question, the learning team needs first to track

down existing relevant knowledge. Practitioners are encour-

aged to search for that knowledge in a spectrum (Osmond,

2005) of sources including research and evaluation findings,

their own experience and expertise, the tacit knowledge of their

colleagues, the stories of their clients and the clients’ families,

the expertise of internal and external experts, agencies’ regula-

tions, case-studies reports, and ‘‘best practices’’ repositories.

This compilation of knowledge is typically done by subgroups

of the learning teams that do their research separately and then

meet on a regular basis to share their findings.

The collected knowledge is then appraised by the team,

rather than by external researchers and experts. Following a

‘‘learning from success’’ approach (Rosenfeld, Schon, &

Sykes, 1995), the activities and the modi operandi suggested

by the numerous sources are examined. The purpose is to iden-

tify key success factors and to generate ‘‘actionable knowl-

edge’’ (Argyris, 2005), that is practical knowledge relevant to

the learning question. Rather than ranking evidence hierarchi-

cally according to its scientific strength, as is the rule in EBP

(Roberts & Yeager, 2004), this appraisal process underscores the

need of practitioners to identify potential strategies that

have the capability of addressing the learning question. This

process is demonstrated in the following description:

The social workers of the department of social services in this

small town in the Northern part of Israel understood they ‘‘have

a problem’’ when they discovered that one single family in their

community appears in the caseload of no less than seven practi-

tioners simultaneously. Several members of that large family

had applied separately for assistance at different times. Each

of them was referred by the intake social workers to practi-

tioners in the department according to their specific needs. It

was clear that the department needed a new way to intake social

assistance inquirers. The practitioners of the learning team des-

ignated to deal with that learning question started to look for

actionable knowledge. Some of them – in fact those more fluent

in English – reviewed the relevant research literature as well as

‘‘grey’’ reports on case management. Others met with intake

units in social services all over the country and with ‘‘intake

like’’ units in large businesses in town and looked for successful

practices. Staff was asked to fill out questionnaires. Clients were

interviewed about constructive encounters with intake social

workers. The Ministry’s recommendations were revised and

finally agency-wide meetings were convened to present the

knowledge collected by the learning team.

Step 5: The formulation of a tentative model. The appraisal pro-

cess is completed with the formulation of a tentative ‘‘solu-

tion’’ by the learning team practitioners. Typically, it is more

than the sum of the actionable knowledge the practitioners have

methodically gathered. It is a new concept, the result of the

team's collaborative innovativeness. It is given a new name, and

often metaphors emerge during the team meetings. Usually the

model includes a set of practical measures and a workable strat-

egy of implementation. A tentative new model for assessing

children at risk was one example of this step:

The learning team of this mid-size town south of Tel-Aviv

looked for a way to cut down out-of-home placement of children

at risk. They searched for tools that would enable them to evaluate

the magnitude of the risk. This ‘‘knowledge mining’’ took them

to many places including risk assessment tools developed in the

context of domestic violence. After numerous adaptations and

deliberations, they devised a computerized 20-item question-

naire adapted to their needs. They called it ‘‘Sicounometer’’

(‘‘risk-meter’’ in Hebrew).

Step 6: The reflective implementation. The model is then care-

fully implemented by each of the team members. Every mem-

ber is required to cautiously keep a record of his activities, that

is to reflect ‘‘in action’’ (Schon, 1983). It implies that after each

implementation of any element of the model and without any

delay, each member writes down (a) what they have planned

to do and what they hoped to achieve, (b) what they actually

have done, (c) what the results of their action were and to what

extent the results were as expected, and (d) their understanding

of the results. The record must be concise and practitioners are

asked to point out in particular their ‘‘surprise, puzzlement, or

confusion’’ (Schon, 1983, p. 68). Those ‘‘reflection in action’’

records are distributed to the team members as soon as they are

written, usually by e-mail. They are then periodically examined

during ‘‘reflection on action’’ meetings. This is an iterative pro-

cess of reflection, rectification, and refinement. During those

collaborative clinical audit meetings, the team members dis-

cuss the effectiveness of the tentative model in terms of the

results every member achieved while implementing it. They

examine collaboratively whether the model of action they have

designed increases the likelihood of helping their clients attain

the desired outcomes.

The process comes to a provisional end when the learners,

by critically appraising the results of their actions, reach the

conclusion that they have found a ‘‘good enough’’ solution to

the learning question they started with. ‘‘Good enough’’

means here that the group of reflective practitioners, after a

comprehensive compilation of evidence and expertise and an


inclusive reflection on the results of their interventions, is ready

to expose the solution to external scrutiny.
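One way to picture the record described in Step 6 is as a small, fixed template that every team member fills in right after acting. The sketch below is only an illustration of such a template; the field names and example values are ours, not part of the methodology.

```python
# Illustrative template for the four-part "reflection in action" record of Step 6.
# Field names and example content are hypothetical.
from dataclasses import dataclass, field
from datetime import date

@dataclass
class ReflectionRecord:
    author: str
    recorded_on: date
    planned: str         # (a) what was planned and hoped for
    done: str            # (b) what was actually done
    results: str         # (c) the results and whether they matched expectations
    interpretation: str  # (d) the member's understanding of the results
    surprises: list = field(default_factory=list)  # surprise, puzzlement, or confusion

record = ReflectionRecord(
    author="team member A",
    recorded_on=date(2007, 6, 1),
    planned="Pilot one element of the tentative model with a single client",
    done="Applied it in one meeting; a second meeting was postponed",
    results="The meeting ran longer but produced a fuller picture of the situation",
    interpretation="The element helps, but a shorter version is needed for crisis work",
)
print(record.author, record.recorded_on)
```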

Step 7: The redaction of the new actionable knowledge. The

redaction of the actionable knowledge is an integral part of the

learning process, rather than just its conclusion. At that phase,

the learners are soliciting others to rigorously appraise the

actionable knowledge they have invented, rather than pro-

nounce their wisdom.

In the framework of the Israeli project described here, the

actionable knowledge was exposed to a large number of poten-

tial reviewers in several ways including the VCoP’s, which will

be described later, and annual ‘‘Knowledge Fairs.’’ Knowledge

fairs are large events that put on display the outcomes of the

knowledge generation activities and make them accessible to

many. The fairs are open and nonhierarchical in their essence,

mixing up all managerial and professional levels. They are

unique opportunities to interact with colleagues and to see what

others are doing. Almost 3,000 practitioners attended the last fair

in June 2007, more than a third of the professional workforce in

the Israeli social services.

The review of the knowledge by a large number of practi-

tioners cannot guarantee the quality of the knowledge in the

way a critical appraisal does. Nevertheless, since the whole

process of generating knowledge through OL assumes that in

some cases there is not enough solid evidence to base practice

on, the review, at least, exposes the agency's axioms and routines

to external scrutiny. In that sense, it is a practical way to avoid

authority-based practice (ABP; Gambrill, 1999).

Moreover, the methodology requires that the redaction of

the actionable knowledge emphasizes dilemmas rather than

solutions. It is compiled as ‘‘discretion guidelines’’ rather than

‘‘practice guidelines.’’ They are neither statements of empiri-

cally tested knowledge nor concrete tools for practitioners seek-

ing to utilize an evidence-based approach (Rosen & Proctor,

2005). They are reflection-tested statements about the questions

a practitioner has to ask while seeking the most effective

intervention on behalf of his client. This formulation encourages

further auditing of the knowledge that has been generated.

Step 8: The next learning question. The development of an OL

methodology was, as mentioned earlier, twofold. It meant to

encourage the use of evidence whenever available and to pro-

mote the invention of actionable knowledge by practitioners

whenever evidence is lacking. Since knowledge needs to be con-

stantly updated, OL involves continually developing and seek-

ing answers to new questions. It is an ongoing process and the

end of one cycle is just the beginning of the next one. Step 8 is

Step 1 again, with a new learning question, a new query for evi-

dence and expertise, and a new reflection process toward the

formulation of a new ‘‘piece’’ of actionable knowledge.

Furthermore, Step 8 is about proliferation of the methodology

throughout the agency wherein practitioners who experienced

OL carry the message to new learning teams.

The OL method described here should not be viewed as

rigid. For example, an agency may already have developed

or invented a successful program and start the process by learn-

ing what the key factors of that previous accomplishment were.

This can serve as the basis for promoting the culture necessary

to lead a systematic OL process. An agency can also start by

forming the team on the basis of an informal existing ‘‘commu-

nity of practice’’ (Wenger, McDermott, & Snyder, 2002) and

then formulate a learning question. Systematic review of acces-

sible and tacit knowledge, methodical and collaborative reflec-

tion, supportive leadership and learning atmosphere,

formulation of explicit and practical questions and solutions,

all of the above are essential. The concrete implementation is

left to the discretion of each agency and its leadership.

Evaluation of the OL Methodology. Evaluation of the application

of the model has shown promising results. The original appli-

cation of the methodology was tested in a set of 16 after-

school programs in both Israel and in North Carolina. In both

countries, four experimental sites were trained and received

consultation on the application of the model and developed a

learning team, and four sites were delayed-implementation sites

that did not receive the intervention. Differences between

staff and child outcomes as well as changes in the structural and

cultural dimensions related to OL were examined at baseline

and 18 months (see Orthner et al., 2006, for a full description

of the study methodology, measures, and discussion of the

results). In this study, in sites where a learning team was estab-

lished, there were significant, positive changes in both struc-

tural and cultural aspects of the organization and in child

behavioral outcomes. Experimental sites showed significant

differences from baseline to 18 months in four dimensions of

OL including overall culture, safety, goal-centered work, and

diffusion (Orthner, Cook, Sabah, & Rosenfeld, 2004; Orthner

et al., 2006). In addition, a significant, positive association was

found between adoption of OL and job satisfaction (Orthner

et al., 2004).

Conducting secondary data analysis of the

data collected in the Israeli sites also showed promising results.

Data were measured at baseline, 18, and 24 months after imple-

mentation of the methodology. The four experimental learning

teams included 37 total staff members at baseline. There was

some attrition and staff changes during the course of the study,

and at 24 months, there were a total of 29 staff working in the

four Israeli experimental learning teams. The Israeli teams

showed increased mean scores in all OL dimensions during

the study period. In addition, these changes were positively and sig-

nificantly associated with better outcomes for the children they

served. Specifically, changes in the way staff in learning teams

approached their work were significantly associated with posi-

tive behavioral changes in children in the after-school pro-

grams (Orthner et al., 2006).

Findings related to changes in the way that new knowledge

is shared and disseminated were also promising. Results showed

that staff in the after-school programs in Israel were signifi-

cantly more likely to disseminate new knowledge acquired

after having applied the methodology for 18 months (t = -2.14,

df = 30, p = .04). This is an important shift because it highlights


the potential use of the methodology not only to change the capac-

ity of workers to learn but to share that knowledge. Further exam-

ination of the data showed that when specifically asked whether

they shared new knowledge with similar programs outside of their

own after-school program, staff in the experimental sites were sig-

nificantly more likely to report sharing knowledge with other

similar programs after applying the methodology for 18 months

(t = -2.57, df = 30, p = .01). Staff was also more likely to actively

work toward disseminating new knowledge to a wide variety of

new audiences (t = -2.165, df = 30, p = .04).

However, these trends did not hold at 24 months. No signif-

icant differences were detected between the experimental sites in

their efforts to disseminate knowledge (t = .32, df = 31, p = .75). One explanation for this might be found in the qualitative

experiences of staff in agencies in Israel who are trying to apply

the methodology. Lack of access to information and to each

other in order to share knowledge may make it difficult to sustain

efforts to share lessons learned with one another. For this reason,

it became necessary to further explore how to help social work-

ers in Israel to connect to one another and facilitate each other’s

learning. This realization was partly the foundation for the next

phase of work, establishing a method for more effective knowl-

edge creation and dissemination on the interorganizational level.

Phase 2—From Intra- to Inter-OL

In 2005, the Israeli Ministry of Social Affairs reached the con-

clusion that it had to move forward and use ICT to support

ongoing learning and interactions between practitioners across

agencies. At that time, OL had been applied in most of the large

municipalities’ social services departments nationwide (in the

overfragmented Israeli local government, ‘‘large’’ stands for

agencies with 30 caseworkers or more) and the Ministry faced

difficulties in implementing the OL methodology in smaller

ones. In these undersized bureaus, the growing diversity of

Israeli social work (Spiro et al., 1998) had resulted in the for-

mation of specialized microunits within the departments with

as few as one or two members. Intra-OL was not applicable

there and it was clear that OL across agencies was essential.

Furthermore, it seemed that in order to ensure the sustainability

of all OL efforts and to reach audiences that are not accessible

via strong organizational ties (Granovetter, 1973), it was neces-

sary to free practitioners from physical constraints of space and

to move from learning teams to VCoP’s.

VCoP’s in Social Services. VCoP’s vary greatly and have an infi-

nite variety of faces according to different combinations of

structuring characteristics (Dube, Bourhis, & Jacob, 2005).

They sometimes are designed as ‘‘electronic networks of prac-

tice’’ (Teigland, 2003, p. 95) and ‘‘distributed collaborative

learning communities’’ (Alvarez, 2006, p. 13). For the purpose

of this article, a VCoP is defined as a group of distributed prac-

titioners who share a sense of identity and association and a

concern or a passion for a professional issue and want to deepen

their knowledge and expertise through ongoing interaction with

reliance on ICTs. While learning teams rely mainly on face-to-

face meetings and interactions as their primary vehicle for

connecting and reflecting, VCoP’s are face-to-screen,

computer-mediated, mostly asynchronous, text-based commu-

nication (adapted from Amin and Roberts, 2006; Dube et al.,

2005; Wenger et al., 2002).

Recent years have seen a rapid rise in online initiatives

established by professionals interested in developing and

exchanging knowledge. The research reviewed by Amin and

Roberts (2008) and Barker (2006) reveals numerous cultural

and structural factors associated with the generation of knowl-

edge in VCoP’s including: the participants’ commitment

toward the endeavor and their motivation to actively partici-

pate, the clarity of purpose and rules of engagement, the quali-

ties of leadership and intermediation, the possibility of offline

Table 1. VCoP Hits and Members During 2007

VCoP topic | Opening date | Hits, Jan. 2007 | Hits, Dec. 2007 | Members, Jan. 2007 | Members, Dec. 2007
Mental retardation | 12/06 | 89 | 862 | 69 | 452
Adult delinquency | 09/06 | 542 | 943 | 241 | 273
Legal protection of children | 03/07 | 0 | 616 | 0 | 203
Families in court | 09/06 | 196 | 215 | 100 | 159
Blindness | 01/07 | 10 | 581 | 7 | 245
Knowledge management | 09/06 | 203 | 249 | 99 | 304
Juvenile delinquency | 09/06 | 344 | 663 | 240 | 283
Addictions | 06/07 | 0 | 374 | 0 | 154
Seniors | 11/06 | 48 | 107 | 112 | 241
Foster care | 11/06 | 53 | 288 | 51 | 113
Children at risk | 04/07 | 0 | 55 | 0 | 163
Inspection and supervision | 11/06 | 62 | 131 | 33 | 117
Training | 12/06 | 92 | 17 | 19 | 31
Community work | 10/07 | 0 | 3 | 0 | 10
Domestic violence | 11/07 | 0 | 9 | 0 | 15
Management of social services | 11/07 | 0 | 67 | 0 | 70
Total | | 1,639 | 5,180 | 971 | 2,833

Note: VCoP = Virtual Communities of Practice.


meetings, and a sense of community and common purpose. It

seems that VCoP’s, like learning teams, work best when struc-

tural (technical) and cultural (human) factors contribute to cre-

ate a rich texture of social interaction, marked by interpersonal

trust, purposefulness, reciprocity of collaborations, and strong

professional ties (Amin and Roberts, 2008). The Ministry offi-

cials (and the authors) were not aware of any existing research

devoted specifically to virtual communities of social workers.

Therefore, they relied on those general findings from other dis-

ciplines while setting up VCoP’s.

The Process of Setting up VCoP’s. The VCoP’s were housed at the

Ministry Web site (http://molsa.gov.il) as an extension of

the OL methodology development effort to encourage the use

of evidence whenever available and to promote the collabora-

tive creation of actionable knowledge whenever evidence is

insufficient. Therefore, the opening screen was designed to

include two main parts: a repository of knowledge artifacts and

an online forum. As elucidated below, the main effort was

directed to build an infrastructure for communities owned by

practitioners wherein they feel safe to collaborate, to elucidate

new ideas, and to innovate. Therefore, the initiators followed

the subsequent principles:

Ownership and self-regulation. The VCoP’s were established

as communities owned and regulated by their members. Indi-

vidual registration is required and the decision to give or to

refuse entrance to applicants is in the sole hands of moderators

appointed by the members and according to rules they have set,

usually to limit entrance to fellow professionals. The taxonomy

of the content in the repository is not imposed by external

experts. It is based on a ‘‘folksonomy,’’ a buzzword that

describes the practice of VCoP members (‘‘folks’’) col-

laboratively establishing a taxonomy to categorize content.

Diversity. The VCoP’s were designed to involve members

with different profiles including managers, researchers, case-

workers, suppliers of outsourced services from competing

firms, nongovernmental organizations (NGOs), and Ministry

officials. This is a ‘‘weak ties’’ community and in many cases,

the VCoP is a unique opportunity for the members to

collaborate.

Safety and confidentiality. The VCoP’s were established as

closed and safe communities. A user name and a password are

needed to enter the VCoP and to read and write in it. Practi-

tioners must use their names while in the VCoP, and the hier-

archical position of the members is also disclosed. Whenever

members raise professional queries, they know who may read

them.

Practitioner friendliness. The VCoP’s were designed to facili-

tate the use of evidence by practitioners. A budget was allocated

to VCoP’s and they used it to ask renowned scholars to sum-

marize the ‘‘state-of-the-art’’ evidence on professional topics

and to write synopses of seminal books and articles. The

subjects were chosen by the VCoP’s members. The scholars

were asked to favor relevance and usefulness over validity and

comprehensiveness (Weick, 2001). They were also invited to

translate the academic knowledge so as to integrate it into the

VCoP’s pattern by submitting concise actionable knowledge in

a way that encourages ongoing debate over the relevance of

evidence in practice.

Support and guidance. The VCoP’s were established by busy

practitioners with no free time to spend on maintenance, tech-

nical problems, and organizational issues. Therefore, VCoP’s

leaders were granted ‘‘consulting vouchers’’ to be used at their

own discretion. Computer literacy courses were offered to

members with inadequate skills and moderators received tar-

geted training.

Preliminary Data on VCoP Usage. The first VCoP was established

in September 2006. Since then, 15 new communities started

their activities (up to February 2008). The total number of

memberships (it is estimated that 5–10% of members are enrolled

in more than one community) grew from 971 in January 2007

to 2,833 in December of that year (see Table 1). There is no

exact count available of active social workers in Israel, but

senior officials in the Ministry estimate that 6,500–7,500 of

them work in services that are under the Ministry responsibil-

ity. Based on those estimates, it is assumed that more than a

third of social workers working under Ministry responsibility

are currently enrolled in the VCoP’s. In December 2007, about

a third (1,110) of the enrolled members (2,833) entered the

VCoP at least once and altogether they ‘‘hit’’ the VCoP’s

5,180 times. Figure 1 shows the monthly comparison of mem-

bership in 2007.
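The aggregate figures quoted in this paragraph can be recomputed directly from the Table 1 member counts. The short sketch below does so; the only numbers it uses are those in Table 1 and the Ministry workforce estimate of 6,500-7,500 cited above.

```python
# Recomputing the membership totals and the "more than a third" claim from Table 1.
members_jan = {
    "Mental retardation": 69, "Adult delinquency": 241, "Legal protection of children": 0,
    "Families in court": 100, "Blindness": 7, "Knowledge management": 99,
    "Juvenile delinquency": 240, "Addictions": 0, "Seniors": 112, "Foster care": 51,
    "Children at risk": 0, "Inspection and supervision": 33, "Training": 19,
    "Community work": 0, "Domestic violence": 0, "Management of social services": 0,
}
members_dec = {
    "Mental retardation": 452, "Adult delinquency": 273, "Legal protection of children": 203,
    "Families in court": 159, "Blindness": 245, "Knowledge management": 304,
    "Juvenile delinquency": 283, "Addictions": 154, "Seniors": 241, "Foster care": 113,
    "Children at risk": 163, "Inspection and supervision": 117, "Training": 31,
    "Community work": 10, "Domestic violence": 15, "Management of social services": 70,
}

total_jan, total_dec = sum(members_jan.values()), sum(members_dec.values())
print(total_jan, total_dec)  # 971 and 2,833, matching the totals row of Table 1

# Memberships double-count the estimated 5-10% of members enrolled in more than one VCoP.
for workforce in (6500, 7500):
    print(f"share of a {workforce}-worker workforce: {total_dec / workforce:.0%}")
```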

Figure 2 shows the month-to-month activity in the VCoP

forums and indicates a slower pace of increase than member-

ship. The number of new items in the forums grew from 21 per

[Figure 1. Total Member Activity in VCoP's in 2007: monthly hits, members, and members per entrance, January through December.]


month to 38 during 2007 with an average of 32.7 items. The

number of reactions to those items grew from 71 to 285 per

month accordingly. Yet, there were no new items in six com-

munities during December 2007. The average number of reac-

tions during this month was 25.7 (SD = 17.0).

The rapid growth over the first year of the establishment of

the VCoP’s suggests a high level of commitment to using them

as a tool for learning about practice. The next step in evaluating

the VCoP’s is underway and will focus on the extent to which

VCoP usage changes worker practices, the role that VCoP’s

play in connecting workers to ‘‘weak ties,’’ and organizational

factors that support or inhibit inter-OL using VCoP’s.

Conclusion and Outline of Future Research

As this paper highlights, preliminary data and evaluation sug-

gest that adopting an OL model and launching VCoP’s has

promise as a means of encouraging learning, the use of evi-

dence, and the development of practice innovations. However,

the current evaluation data have limitations. First, the current

data are limited in their exploration of differences in OL imple-

mentation from site to site. This limitation has implications for

the fidelity of the model across evaluation sites. This should be

taken into account when considering the review of the literature

and the findings presented on Phase 1 of this project. More

research should be done in this area. In addition, the current

evaluation on both the development and testing of the OL

model and the VCoP’s endeavor does not speak to the costs and

benefits of adopting those methods. While preliminary data

suggest benefits of the model, both phases of this project

required strong commitment on the part of administrators, par-

ticipating organizations, and practitioners. Research specifi-

cally designed to explore the costs and benefits of using this

type of strategy versus others would be helpful in making a

stronger case for the use of OL and VCoP’s.

In her influential paper on ABP, Gambrill (2001) concluded

that she sees ‘‘little hope that the profession will change in a

timely manner from within’’ and that the profession ‘‘will be

forced to ‘fess up’ and clean up’’ (p. 172) and ultimately adopt

evidence as the sole basis for practice. We opened this paper by

mentioning some of the exogenous forces that may indeed

encourage more use of evidence by Israeli practitioners. We

argued, however, that operationalizing EBP is a complex issue

everywhere and that Israeli social services face even more

obstacles in applying it. The development of an OL methodol-

ogy for social services and the establishment of virtual commu-

nities of social workers are cautious attempts to address those

obstacles by setting up learning mechanisms. Those mechan-

isms are intended to promote the use of existing knowledge and

to support the interaction between practitioners willing to cre-

ate the knowledge they need. They rely on the intrinsic desire to

invent. As such, they help practitioners to avoid the pitfalls of

unchecked knowledge and the dangers of ABP whenever

evidence is in shortage.

The intra-OL teams’ model, while limited in scope, has

shown that engaging in such an effort led to significant and pos-

itive changes for the staff and organizations that participated.

Team members, however, are colleagues. They share ‘‘strong

ties’’ and may be reluctant to blow the whistle when ‘‘the way

we do things here’’ does not work. Even a disciplined imple-

mentation of an OL process might not be sustainable enough

to overcome this obstacle in the long run. That is why it was

imperative to move the locus of activity from within the bound-

aries of a single agency to a VCoP nexus of relationships

between a variety of stakeholders, across organizational, hier-

archical, and spatial boundaries. Those relationships cannot

rely on administrative hierarchy since members are individuals

belonging to different autonomous organizations who joined

the VCoP voluntarily because they are looking to develop their

individual expertise. They are based on an informal recognition

of expertise rather than on authority. That is why VCoP's are

perhaps the only way the profession can change from within.

Whenever the pace of change is rapid and evidence scarce,

learning-based practice (LBP), inside the agency (OL) or

across agencies (VCoP’s), may be a vital supplement to EBP,

a necessary step in the continuous effort of the profession to

develop a verified body of knowledge. LBP is not as rigorous

as EBP and it does not evaluate knowledge according to its sci-

entific strength. This weakness is, we believe, compensated by

the features of learning teams and virtual communities: practi-

tioners seem to like them and as we show, they are willing to

join them; they appear to promote continuous collaborative

learning, the examination of an array of knowledge sources,

and the inspection of well-established routines; they require

practitioners, as individuals and as teams, to monitor the effec-

tiveness of their interventions. This, of course, has to be further

appraised.

LBP relies on the intrinsic motivation of practitioners. It

requires a constant and careful nurturing of both ‘‘structure and

[Figure 2. Monthly Total of Times Members Initiated a New Conversation in the Forum or Provided a Reaction in the Forum: members per forum item and members per reaction, January through December.]


spontaneity’’ (Brown & Duguid, 2001). The OL methodology

and the emergent VCoP’s were designed to encompass this

duality, in terms of evidence and practice wisdom, planfullness

and innovation, self-regulation and organizational support,

safety and transparency, intimacy and openness. Do they

embrace this complexity? Does LBP produce reliable action-

able knowledge? Will our clients be served better? The study

of those questions represents a thrilling challenge and an emer-

gent topic that has not been investigated in social work. We

need to continue our efforts toward a more rigorous testing

of the OL model in other settings as well as research on

VCoP’s.

Declaration of Conflicting Interests

The authors declared no conflicts of interest with respect to the authorship and/or publication of this article.

Funding

The authors received no financial support for the research and/or authorship of this article.

References

Alvarez, L. H. (2006). Distributed collaborative learning communities enabled by information communication technology. Unpublished doctoral thesis, Erasmus University Rotterdam, Rotterdam, The Netherlands. Retrieved March 1, 2008, from http://publishing.eur.nl/ir/repub/asset/7830/ESP2006080LIS_9058921123_Alvarez.pdf

Amin, A., & Roberts, J. (2006). Communities of practice? Varieties of situated learning. Retrieved March 1, 2008, from the EU Network of Excellence Dynamics of Institutions and Markets in Europe (DIME) Web site: http://www.dime-eu.org/files/active/0/Amin_Roberts.pdf

Amin, A., & Roberts, J. (2008). Knowing in action: Beyond communities of practice. Research Policy, 37, 353-369.

Argyris, C. (1993). Knowledge for action. San Francisco: Jossey-Bass.

Argyris, C. (2005). Actionable knowledge. In H. Tsoukas & C. Knudsen (Eds.), The Oxford handbook of organization theory: Meta-theoretical perspectives (pp. 432-452). Oxford: Oxford University Press.

Argyris, C., & Schon, D. A. (1974). Organizational learning: A theory of action perspective. Reading, MA: Addison-Wesley.

Argyris, C., & Schon, D. A. (1978). Organizational learning: A theory of action perspective. Reading, MA: Addison-Wesley.

Argyris, C., & Schon, D. A. (1996). Organizational learning II: Theory, methods and practice. Reading, MA: Addison-Wesley.

Arthur, J. B., & Aiman-Smith, L. (2001). Gainsharing and organizational learning: An analysis of employee suggestions over time. The Academy of Management Journal, 44, 737-754.

Auslander, G. (2000). Social work research and evaluation in Israel. Social Work Research and Evaluation: An International Journal, 1, 17-34.

Barker, R. (2006). Homo machinus versus Homo sapiens: A knowledge management perspective of virtual communities in cyberspace. Communication, 32, 226-240.

Berthoin-Antal, A., Dierkes, M., Child, J., & Nonaka, I. (2001). Organizational learning and knowledge: Reflections on the dynamics of the field and challenges of the future. In M. Dierkes, A. Berthoin-Antal, J. J. Child & I. Nonaka (Eds.), Handbook of organizational learning and knowledge (pp. 921-940). New York: Oxford University Press.

Brashears, F. (1995). Supervision as social work practice: A reconceptualization. Social Work, 40, 692-699.

Brown, J. S., & Duguid, P. (2001). Structure and spontaneity: Knowledge and organization. In I. Nonaka & D. Teece (Eds.), Managing industrial knowledge: Creation, transfer, and utilization (pp. 44-67). Thousand Oaks, CA: SAGE.

Chaskin, R. J. (2008). Dissemination and impact: Issues, lessons and future directions, in research for action. In R. J. Chaskin & J. M. Rosenfeld (Eds.), Cross-national perspectives on connecting knowledge, policy, and practice for children (pp. 131-157). Oxford: Oxford University Press.

Cnaan, R. A., & Dichter, M. E. (2008). Thoughts on the use of knowledge in social work practice. Research on Social Work Practice, 18, 278-284.

Cook, S., & Yanow, D. (1993). Culture and organizational learning. Journal of Management Inquiry, 2, 373-390.

Crossan, M., Lane, H., & White, R. (1999). An organizational learning framework: From intuition to institution. Academy of Management Review, 24, 522-538.

Dominelli, L. (2004). Social work: Theory and practice for a changing profession. Cambridge, UK: Polity Press.

Dube, L., Bourhis, A., & Jacob, R. J. (2005). The impact of structuring characteristics on the launching of virtual communities of practice. Journal of Organizational Change Management, 18, 145-166.

Eagelstein, A. S., Teitelbaum, R., & Shor, D. (2007). Knowledge needs of social workers in Jerusalem. Jerusalem: Szold Institute and the Ministry of Social Affairs.

Easterby-Smith, M., & Araujo, L. (1999). Organizational learning: Current debates and opportunities. In M. Easterby-Smith, J. Burgoyne & L. Araujo (Eds.), Organizational learning and the learning organization: Developments in theory and practice (pp. 1-22). Newbury Park, CA: SAGE.

Fiol, C. M., & Lyles, M. A. (1985). Organizational learning. The Academy of Management Review, 10, 803-813.

Finger, M., & Brand, S. B. (1999). The concept of the "learning organization" applied to the transformation of the public sector. In M. Easterby-Smith, L. Araujo & J. Burgoyne (Eds.), Organizational learning and the learning organization (pp. 130-156). London: SAGE.

Gambrill, E. (1999). Evidence-based practice: An alternative to authority-based practice. Families in Society, 80, 341-350.

Gambrill, E. (2001). Social work: An authority-based profession. Research on Social Work Practice, 11, 166-175.

Garvin, D. (2000). Learning in action: A guide to putting the learning organization to work. Boston: Harvard Business School Publishing.

Granovetter, M. (1973). The strength of weak ties. American Journal of Sociology, 78, 1360-1380.

Gredig, D., & Sommerfeld, P. (2008). New proposals for generating and exploiting solution-oriented knowledge. Research on Social Work Practice, 18, 292-300.

Huber, G. P. (1991). Organizational learning: The contributing processes and the literatures. Organizational Science, 2, 88-115.


Internet World Stats. (2006). Internet users distribution. Retrieved March 1, 2008, from http://www.internetworldstats.com/top25.htm

Israel Central Bureau of Statistics. (2008). Retrieved October 24, 2008, from http://216.239.59.104/search?q=cache:_Lh1YxjkArQJ:www.cbs.gov.il/www/hodaot2008n/11_08_082b.doc+cbs+1948-2008+israel&hl=en&ct=clnk&cd=4&gl=il&client=firefox-a (in Hebrew).

Kadushin, A., & Harkness, D. (2002). Supervision in social work. New York: Columbia University Press.

Kalekin-Fishman, D. (2006). Making sense of constant change: Israeli sociology between apologetics and radical critique. Current Sociology, 54, 63-76.

Kayes, A. B., Kayes, D. C., & Kolb, D. A. (2005). Developing teams using the Kolb Team Learning Experience. Simulation & Gaming, 36, 355-363.

Klein, W. C., & Bloom, M. (1995). Practice wisdom. Social Work, 40, 799-807.

Levitt, B., & March, J. G. (1988). Organizational learning. Annual Review of Sociology, 14, 319-338.

Loewenberg, F. M. (1998). Meeting the challenges of a changing society: Fifty years of social work in Israel. Jerusalem: The Magnes Press.

Ministry of Social Affairs. (1998). The national program for children at risk and domestic violence. Jerusalem: Ministry of Social Affairs (Hebrew).

Orthner, D., Akos, P., Cooley, V., & Charles, P. (2007). Measuring organizational learning in schools: Development and validation of a revised organizational learning capacity assessment instrument. Chapel Hill: University of North Carolina School of Social Work.

Orthner, D. K., Cook, P., Sabah, Y., & Rosenfeld, J. (2004). Evaluating the impact of organizational learning and knowledge management on social service effectiveness: Final report. Chapel Hill: Jordan Institute for Families.

Orthner, D. K., Cook, P., Sabah, Y., & Rosenfeld, J. (2006). Organizational learning: A cross-national pilot-test of effectiveness in children's services. Evaluation and Program Planning, 29, 70-78.

Osmond, J. (2005). Knowledge use in social work practice: Examining its functional possibilities. Journal of Social Work, 6, 221-237.

Parkin, A. (2000). Computers in clinical practice: Applying experience from child psychiatry. BMJ, 321, 615-618. Retrieved February 29, 2008, from http://bmj.com/cgi/reprint/321/7261/615.pdf

Rabin, Y., & Peled, R. (2005). Between FOI law and FOI culture: The Israeli experience. Open Government: A Journal on Freedom of Information, 1. Retrieved October 29, 2008, from http://www.opengovjournal.org/article/view/324

Rainey, H. G., & Bozeman, B. (2000). Comparing public and private organizations: Empirical research and the power of the a priori. Journal of Public Administration Research & Theory, 10, 447-471.

Regehr, C., Stern, S., & Shlonsky, A. (2007). Operationalizing evidence-based practice: The development of an institute for evidence-based social work. Research on Social Work Practice, 17, 408-416.

Roberts, A. R., & Greene, G. J. (Eds.). (2002). Social workers' desk reference. New York: Oxford University Press.

Roberts, A. R., & Yeager, K. (2004). Systematic reviews of evidence-based studies and practice based research: How to search for, develop and use them. In A. R. Roberts & K. Yeager (Eds.), Evidence-based practice manual, research and outcome measures in health and human services (pp. 3-14). New York: Oxford University Press.

Ronen, T. (2004). Evidence-based practice in Israel. In B. Thyer & M. Kazi (Eds.), International perspectives on evidence-based practice in social work (pp. 113-132). Birmingham, UK: Venture Press.

Rosen, A. (1994). Knowledge use in direct practice. Social Service Review, 68, 561-577.

Rosen, A., & Proctor, E. K. (Eds.). (2005). Developing practice guidelines for social work intervention: Issues, methods, and research agenda. New York: Columbia University Press.

Rosenfeld, J. M., Schon, D. A., & Sykes, I. J. (1995). Out from under: Lessons from projects for inaptly served children and families. Jerusalem: JDC-Brookdale Institute of Gerontology and Human Development.

Sabah, Y., & Orthner, D. K. (2007). Implementing organizational learning in schools: Assessment and strategy. Children and Schools, 29, 243-246.

Sabah, Y., & Rosenfeld, J. M. (1999). The national program for children at risk: Interim summary of the Project for Organizational Learning and Knowledge Management in Social Service Agencies, October 1998–July 1999. Jerusalem: JDC-Brookdale Institute.

Sabah, Y., & Rosenfeld, J. M. (2000). How to transform social agencies into learning organizations. Mifgash, 15, 143-162 (Hebrew).

Sackett, D. L., Straus, S. E., Richardson, W. S., Rosenberg, W., & Haynes, R. D. (1997). Evidence-based medicine: How to practice & teach EBM. New York: Churchill Livingstone.

Schon, D. (1983). The reflective practitioner: How professionals think in action. London: Temple Smith.

Senge, P. M. (1990). The leader's new work: Building learning organizations. Sloan Management Review, 32, 7-23.

Slonim-Nevo, V. (2007). Editorial. Society and Welfare, 27, 371-374 (Hebrew).

Smith, K. D., & Taylor, W. G. (2000). The learning organization ideal in Civil Service organizations: Deriving a measure. The Learning Organization, 7, 194-197.

Spiro, S. E., Sherer, M., Korin-Langer, N., & Weiss, I. (1998). The professionalization of Israeli social work. In F. M. Loewenberg (Ed.), Meeting the challenges of a changing society: Fifty years of social work in Israel (pp. 29-51). Jerusalem: The Magnes Press.

Stake, R. E. (2000). Case studies. In N. K. Denzin & Y. S. Lincoln (Eds.), Handbook of qualitative research (2nd ed., pp. 435-454). Thousand Oaks, CA: SAGE.

Straus, S. E., Richardson, W. S., Glasziou, P., & Haynes, R. B. (2005). Evidence-based medicine: How to practice and teach EBP (3rd ed.). New York: Churchill Livingstone.

Swirski, S., & Konor-Attias, E. (2007, November). The economy is growing. Paper presented at the Knesset, Adva Center, Jerusalem. Retrieved November 13, 2007, from http://www.adva.org/UserFiles/File/2008%20budget%20in%20English.pdf

Taub Center for Social Policy Studies. (2006). Personal social services and their development. Retrieved September 2008, from http://www.taubcenter.org.il/publications.asp?ID=2007

Teigland, R. (2003). Knowledge networking, structure and performance in networks of practice. Unpublished doctoral thesis, Stockholm School of Economics, Stockholm, Sweden. Retrieved March 1, 2008, from https://www.sse.edu/NR/rdonlyres/4165BDC8-C42C-43CF-8EEF-57DCEB0939BC/0/TeiglandthesisKnowledgeNetworking.pdf

Thyer, B. A., & Kazi, M. A. F. (2004). An overview of evidence-based practice in social work. In B. A. Thyer & M. A. F. Kazi (Eds.), International perspectives on evidence-based practice in social work (pp. 9-28). Birmingham, UK: Venture Press.

Tregeagle, S., & Darcy, M. (2007). Child welfare and information and communication technology: Today's challenge. British Journal of Social Work Advance Access. Retrieved March 1, 2008, from http://bjsw.oxfordjournals.org/cgi/content/abstract/bcm048v1

Weick, K. E. (2001). Gapping the relevance bridge: Fashions meet fundamentals in management research. British Journal of Management, 12(Suppl. 1), S71-S75.

Weiss, I., Spiro, S., Sherer, M., & Korin-Langer, N. (2004). Social work in Israel: Professional characteristics in an international comparative perspective. International Journal of Social Welfare, 13, 287-296.

Wenger, E., McDermott, R., & Snyder, W. M. (2002). Cultivating communities of practice: A guide to managing knowledge. Cambridge: Harvard Business School Press.

Yin, R. K. (2003). Case study research: Design and methods (3rd ed.). Thousand Oaks, CA: SAGE.
