
Monitoring and evaluation of area based tourism initiatives

Nazimudeen Saleem

The 1980s saw the emergence of a number of local and regional economic initiatives. These included a handful of area based tourism initiatives sponsored by the English Tourist Board (ETB). In policy terms, there has been a shift from national objectives to the achievement of regional goals. While some agencies and local partnerships are beginning to celebrate the apparent success of programme development and implementation, the 1990s could see an urgent need to evaluate overall programme performance. This paper outlines a conceptual framework for monitoring and evaluating tourism based local economic initiatives and examines the implications of assessing the ETB's most celebrated Tourism Development Action Programmes (TDAPs) using the framework and the model presented in the article.

In the 1980s a number of local economic initiatives emerged in the UK. They were either a response to rapidly disappearing manufacturing jobs in urban inner city locations or an attempt to remedy the structural unemployment problems of rural and farming areas. In policy terms, during the 1980s in England there was a shift from national objectives to the achievement of local and regional development goals. The localities that were first to experiment with local economic programmes depended very much upon cooperation between governmental agencies such as the Urban Development Corporation (UDC), the Rural Development Commission (RDC), or the English Tourist Board.

The area based tourism initiatives (ABTI) in England have always enjoyed the support of the English Tourist Board (ETB). The ETB, with the help of the regional tourist boards (RTBs), sought to encourage the development of area based tourism activity through public-private partnership initiatives. The Tourism Development Action Programme (TDAP) is one such partnership initiative, pioneered by the ETB in 1984. The TDAP is a comprehensive and integrated approach, designed to address issues such as development, marketing and training.

The author is with the School of Planning, Oxford Polytechnic, Gipsy Lane, Headington, Oxford OX3 OKP, UK. This paper is based on work carried out as part of a PhD thesis on tourism planning and development at Oxford Polytechnic.

While the 1980s saw the development and implementation of an increasing number of local economic initiatives, the 1990s could see the emergence of an urgent need to evaluate these programmes. In England, virtually every major area based tourism initiative (ABTI) is sponsored by the ETB and should undergo some form of evaluation in the very near future. In fact, the ETB has already taken the initiative to monitor its ABTIs.[1]

The aim of this paper is to establish a conceptual framework for the monitoring and evaluation of the ABTI and to construct an assessment model. The paper will also examine the model's potential application to TDAP monitoring and evaluation. The framework presented here recognizes monitoring as part of the evaluation exercise and therefore incorporates it in the model. Further, it distinguishes between the evaluation of the programme implementation process and the overall performance of the programme itself.

ABTI monitoring and evaluation: rationale, aims and constraints

In general, the case for monitoring and evaluating any area based programme rests on a number of grounds:

• to secure the formation of new programmes and the continuation of existing programmes that are currently under way;
• to assure the funding partners that their money has been utilized effectively and efficiently to achieve their policy objectives;
• to help the sponsors assess their local area policies and to make policy oriented judgements;
• to generate valuable information which can be fed into the future planning process of area based economic initiatives.

To initiate monitoring and evaluation (M&E), however, we first need to determine its objectives. M&E is a management function and is therefore targeted at accomplishing certain management objectives.


While monitoring is a process of gathering and recording relevant data and information during programme/project implementation, evaluation (ex post) is carried out after the project or programme has been completed. The evaluation process involves analysis, interpretation and the making of policy oriented judgements based on the data and information generated during the monitoring process. Monitoring is therefore an integral part of the evaluation exercise. According to Maddock[2] the term 'evaluation' means the process of extracting lessons for the future from the experience of current and recent projects, and it should therefore provide guidance for policy making and future project planning. When designing an M&E system it is necessary to set clear aims and specific objectives. For example, one or more of the following can be considered as an appropriate aim:

• to assess and evaluate the programme implementation process: whether it was implemented and, if implemented, to what extent, whether within budget, etc;
• to assess and evaluate the programme performance on the basis of its outcomes and immediate impacts: did the outcomes match the targets, and if not, why not?
• to assess and evaluate the overall programme performance on the basis of its ultimate impacts: were there any impacts, were they efficient and effective, and if not, why not?

However, conducting evaluations of area based programmes is not without problems. No matter how important the need to evaluate area based economic policies and programmes, a rationale for not conducting evaluation is often put forward. Four common complaints are:

• area based economic programmes lack clear, specific and measurable objectives;
• there are a number of overlapping but similar policies/programmes at any given time;
• programmes take a long time to produce any results or impacts, which usually ends in premature evaluation;
• there is a lack of proven and effective techniques of evaluation.

While some seem to support this view, others, like Storey,[3] argue that the bases for such a rationale are groundless. The framework outlined here recognizes these problems and tries to accommodate these deficiencies through a number of techniques.

A conceptual framework and model for ABTI monitoring and evaluation

When developing a general framework for the monitoring and evaluation of ABTIs, particular attention should be paid to the following elements:

• programme characteristics;
• major M&E issues such as lack of clear goals and specific objectives;
• programme environment (both micro and macro);
• existence of more than one M&E goal and associated objectives;
• the need to be evaluated by different authorities (local/national) at different stages/levels;
• use of a number of (or various) M&E criteria relevant to different M&E objectives.

[Figure 1. A framework for monitoring and evaluation of ABTIs. The flowchart shows five steps: 1 prepare (the PDS); 2 carry out the programme evaluability assessment (PEA), covering micro and macro environment assessment; 3 set M&E objectives (eg 1) economic impact: wealth creation, 2) economic impact: job creation); 4 establish criteria (effectiveness for objective 1, efficiency for objective 2) and units of measurement (1) regional per capita income, 2) cost per job created); 5 conduct the exercise and draw conclusions. Notes: WBS = work breakdown schedule; CBS = cost breakdown schedule.]

The framework (Figure 1) presented here essentially consists of five steps, and the above elements are taken into consideration when designing the M&E model. The aims of the five steps are:
1 to prepare the programme descriptive statement (PDS);
2 to carry out a programme evaluability assessment (PEA);
3 to set M&E goals and to formulate M&E objectives;
4 to establish M&E criteria and to determine units of measurement; and
5 to conduct the M&E exercise.
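To make the sequence concrete, the following sketch outlines the five steps as a minimal Python workflow. It is only an illustration of how the framework's outputs might be organized; none of the class or function names is taken from the ETB or TDAP documentation.

from dataclasses import dataclass

# A minimal sketch of the five-step framework as a Python workflow.
# All class and function names are illustrative assumptions only.

@dataclass
class ProgrammeDescriptiveStatement:          # step 1: prepare the PDS
    goals: list                               # key goals ranked by priority
    objectives: dict                          # PDS code -> objective description
    budget: float

@dataclass
class EvaluabilityAssessment:                 # step 2: carry out the PEA
    micro_score: float                        # weighted average environment scores
    macro_score: float
    evaluable_objectives: list                # objectives judged worthy of evaluation

@dataclass
class MEPlan:                                 # steps 3 and 4: one M&E goal at a time
    goal: str
    criteria: dict                            # objective -> criterion (effectiveness, ...)
    units: dict                               # objective -> unit of measurement

def conduct_me_exercise(pds, pea, plan):      # step 5: conduct the M&E exercise
    """Record, analyse and interpret data for each evaluable objective."""
    return {obj: None for obj in plan.criteria if obj in pea.evaluable_objectives}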

Programme descriptive statement

Programmes and initiatives can differ in purpose, characteristics or organizational structure. Programme characteristics are also determined by the size and number of participants in the partnership, the number of goals, the budget and the degree of complexity. Any programme evaluator should be thoroughly familiar with these characteristics before an evaluation can be conducted. As discussed previously, an evaluator can encounter problems such as a lack of clear goals and specific objectives. In order to identify any hidden objectives, and to determine the most probable targets, the evaluator may be required to examine and analyse programme literature (both pre- and post-implementation) and systematically categorize the key goals on the basis of the priority ranked by the programme manager during implementation.

Programme evaluability assessment

Basically, the PEA consists of assessing the programme environment.[4] Successful implementation of programmes rests not only on the micro environment that the programme creates through such factors as adequate resource allocation, quality of planning and the organizational structure, but also on the macro environment in which it is being implemented. The macro environment consists of such factors as the economic, sociocultural, geopolitical and natural resource systems. An attempt to evaluate an ABTI without assessing its micro and macro environments would be a serious mistake.

Through a PEA we can try to determine the fit between the programme and the environment, and also examine any constraints such as resource availability or lack of quality in planning. One major objective of the PEA process is to determine the currently appropriate M&E goals and to identify those objectives that are worthy of evaluation.

Two primary M&E goals of an ABTI (say that of a TDAP) are to assess the effectiveness of tourism development strategies, and of economic and social development strategies. The former would include the assessment of the programme implementation process and its outcomes, and the latter the impacts of tourism initiatives in the area in general. The M&E model has been designed to accommodate one M&E goal at a time. Similarly, M&E objectives can be considered one by one in the most complex situations having several objectives, or the model can be modified to accommodate all the objectives if there are only a few of them.

M&E criteria and units of measurement

Establishing criteria of evaluation is as important as formulating M&E objectives. In many cases, efficiency and effectiveness remain the most used criteria. M&E criteria are the basis on which objectives are evaluated using specific units of measurement. Even if two objectives can be evaluated by using only one criterion, the units of measurement would be different. The measurement units chosen should satisfy both the M&E objectives and the criteria.

M&E process and methods

The actual M&E process includes recording and collecting relevant data, auditing and analysing, and interpreting and making policy oriented judgements. Obviously, for practical purposes, the data collection can be grouped into three categories based on the aims or goals of evaluation. These three categories are the assessment of programme implementation processes; programme outcomes; and programme impacts. Consequently, M&E can be conducted independently, based on individual goals. According to the framework presented here, the actual M&E process, which is the last step in the model, starts with an M&E process sheet. Every M&E goal or subgoal under consideration has a process sheet. At this stage, every evaluable objective or task under a subgoal or goal is listed with its respective M&E criteria and units of measurement.

In theory, every task listed as a strategy/initiative for the accomplishment of its respective objective needs to be implemented to produce outcomes and impacts. Therefore, a typical M&E process sheet can have a page for task implementation with columns to record planned versus actual time of beginning and completion, and budgeted versus actual cost. Similarly, it can also have a page for outcome records with columns for planned or projected versus actual outcomes. An M&E process sheet is not complete without a page for the impacts of the tasks or strategies. Here, by impact, we mean the immediate impacts, which are often monitored and evaluated by local authorities or internal management itself. As far as the long-term impacts are concerned, we cannot relate any particular task or strategy to the final impact, such as the number of tourism related jobs created over a period of time. (By this we mean the overall impact of the entire programme, which can be linked to national or regional policy objectives sometimes implicitly expressed by participating government agencies such as the Enterprise Board or the Urban Development Corporation.)
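One way to picture the process sheet described above is as a simple record per task, with the implementation, outcome and immediate impact 'pages' as groups of fields. The sketch below is a minimal illustration under those assumptions; the field names are invented and do not come from any TDAP worksheet.

from dataclasses import dataclass, field
from datetime import date
from typing import List, Optional

# One possible layout for an M&E process sheet entry: a single task with the
# three 'pages' described above (implementation, outcomes, immediate impacts).
@dataclass
class TaskRecord:
    pds_code: str                         # eg "1.3", business development (Table 1)
    description: str
    criterion: str                        # eg "effectiveness"
    unit: str                             # eg "ratio of budget to actual cost"
    # Implementation page: planned versus actual time and cost
    planned_start: date
    planned_end: date
    budgeted_cost: float
    actual_start: Optional[date] = None
    actual_end: Optional[date] = None
    actual_cost: Optional[float] = None
    # Outcome page: projected versus actual outcomes
    projected_outcome: Optional[float] = None
    actual_outcome: Optional[float] = None
    # Immediate impact page
    actual_impact: Optional[float] = None

@dataclass
class ProcessSheet:
    goal: str                             # one sheet per M&E goal or subgoal
    tasks: List[TaskRecord] = field(default_factory=list)

A direct mail campaign, for example, would record the number of travellers it was planned to reach as its outcome and the number who actually respond as its immediate impact.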

Tourism Development Action Programme (TDAP)

A Tourism Development Action Programme is an innovative partnership initiative between private sector participants and public sector agencies or organizations. It has evolved from a local initiative in Bristol into a major regional policy mechanism nationally. Key participants in the partnership include, but are not limited to, local councils, the Regional Tourist Board and the ETB, local chambers of commerce, leading hospitality/tourism organizations, the Urban Development Corporation or the Rural Development Commission, the local Enterprise Board and the Forestry Commission. Currently there are about 10 or more TDAPs in operation and another 10 have been completed since the idea was first launched in 1984.

Some major activities that are common to all TDAPs are market research, feasibility studies for major projects, training programmes for tourism operators, promotional campaigns, environmental improvement programmes, visitor management programmes, business advice networks and the coordination of strategic resource development initiatives. As far as goals and objectives are concerned, TDAP partners seek to develop tourism activity in a particular area, to the benefit of the local economy and community.

Evaluation of TDAPs in the long run helps to assess not only the success of the key policy objectives of the partnership members but also the ETB's policy of tourist/tourism development through area based public-private partnership initiatives.

Application of M&E model in TDAP assessment

Before a TDAP assessment is attempted, a couple of things need to be said about the evolution of the TDAP planning process and its micro environment in general. In terms of quality of planning, TDAP has evolved from an entrepreneurial initiative to a corporatist model. During TDAP planning, with the help of research and analysis, major issues and problems are identified, strengths/weaknesses and opportunities/threats are sieved out, and objectives that are verifiable, if not quantifiable, are formulated. However, in most cases goals and objectives are still vague and the specific policy objectives of member bodies are not clearly defined. As far as the outcomes/impacts are concerned, no targets or efficiency levels are determined. Although tasks/initiatives are ranked by priority, completion dates are not scheduled, and only rarely is the cost budgeted. Nevertheless, the TDAP micro environment, which is characterized by the quality of the information system and planning process, by the organizational design and administrative structure, and by the commitment of the constituent parties, is quite stable and predictable.

Prepurrrrion of the PDS

The PDS preparation involves the following: a review of TDAP goals/objectives and identification of the key objectives under the existing categories; regrouping of the objectives under the suggested framework of segmentation (Table 1) with PDS code numbers; selection of appropriate tasks/initiatives or programme objectives (as they are called by TDAP planners) for every primary objective; and tabulation of the tasks with PDS objective code numbers, project/task duration, budgeted cost, priority rank and significance level. At this stage, a detailed work and cost breakdown schedule can also be prepared, if necessary, for major tasks.

Table 1. Framework for segmenting and coding TDAP objectives.

1. Strategic resource development
1.1 capital improvement programme; infrastructure and public facilities
1.2 education and training
1.3 business development and promotion, including accommodation
1.4 environmental improvement and protection

2. Product-service development
2.1 research and development and product inventory planning
2.2 existing attraction/product improvement
2.3 product-service planning for new attractions
2.4 development of major attractions or product-service mix

3. Market and customer development
3.1 public relations campaign and image development
3.2 marketing and promotion of tourism-leisure products
3.3 visitor experience enhancement; ambience, signposting, TIC, courtesy, etc
3.4 marketing research, strategy, marketing plan, etc
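As a simple illustration of the tabulation step, the sketch below codes a handful of hypothetical tasks against the Table 1 segmentation. The tasks, durations, costs and rankings are invented for illustration only.

# Hypothetical tabulation of TDAP tasks against the Table 1 codes, with
# duration, budgeted cost, priority rank and significance level.
tasks = [
    # (PDS code, task, duration in weeks, budgeted cost £, priority, significance 1-5)
    ("1.2", "Customer-care course for accommodation operators", 12, 8_000, 1, 4),
    ("1.4", "Town-centre environmental improvement scheme", 40, 55_000, 2, 5),
    ("3.1", "Image development and PR campaign", 26, 15_000, 3, 3),
]

for code, task, weeks, cost, priority, significance in tasks:
    print(f"{code}  {task:<48}  {weeks:>3} wks  £{cost:>7,}  "
          f"priority {priority}  significance {significance}")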

All new TDAPs that are currently being implemented seem to follow the ETB guidelines for programme planning, organizational design, administrative structure and funding strategies. All TDAPs have now appointed programme directors. Therefore, as far as the micro environment is concerned, one can ensure that implementation will proceed as planned. The macro environment consists of two elements: the immediate environment and the remote environment. The immediate environment consists of the local natural resource system, the built environment, the social-cultural system and demographic aspects; the remote environment is made up of the external economic system and the overall tourism market system. During the planning phase of TDAPs a well researched position statement is produced, and this should, in theory, attempt to include an up to date assessment of the immediate macro environment. This assessment and that of the remote macro environment should lead to the analysis of SWOT (strength-weakness-opportunity-threat) factors. This, in fact, is an important aspect of all TDAP planning today. The purpose of the macro and micro assessment, or programme evaluability assessment, is to check whether the goals/objectives are appropriate and significant, whether implementation is well planned and resourced, and therefore whether the programme is worthy of evaluation. Within this framework, the PEA involves two steps. The first is a general assessment of the micro and macro environment using a set of appropriate instruments and weighted average scores obtained (by in-depth interview) from the key people involved in TDAP planning and implementation. The second step is to assess every key programme objective on the basis of cost, duration and significance level. These steps will help decide whether the tasks are worthy of monitoring and evaluation.
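A minimal sketch of the first PEA step is given below: weighted average scores for the micro and macro environment aggregated from in-depth interview ratings. The factors, weights and ratings are assumptions made for illustration; the paper does not prescribe a particular scoring instrument.

# Weighted average environment scores from interview ratings (1 = poor, 5 = good).
# The factors, weights and ratings are invented; they are not the ETB instrument.
def weighted_average(scores_by_factor):
    """scores_by_factor maps factor -> (weight, list of interview ratings)."""
    total_weight = sum(weight for weight, _ in scores_by_factor.values())
    weighted = sum(weight * sum(ratings) / len(ratings)
                   for weight, ratings in scores_by_factor.values())
    return weighted / total_weight

micro = {
    "resource allocation":      (0.4, [3, 4, 4]),
    "quality of planning":      (0.3, [4, 4, 5]),
    "organizational structure": (0.3, [3, 3, 4]),
}
macro = {
    "local economic system":    (0.5, [2, 3, 3]),
    "tourism market system":    (0.5, [4, 4, 3]),
}

print(f"micro environment score: {weighted_average(micro):.2f}")
print(f"macro environment score: {weighted_average(macro):.2f}")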

For any TDAP, obviously, the M&E goals are the assessment of programme implementation; programme outcomes; and programme impacts. We may, indeed, use the framework in Table 1 to formulate M&E objectives for the assessment of implementation and outcomes of any TDAP. The M&E subgoals then become the assessment of strategic resource development strategies; product-service development strategies; and market and customer development strategies. In this case, assessment of the components of the above three segments or primary objectives becomes the M&E objectives for TDAP implementation and outcomes. The model in Figure 2 describes this process.


[Figure 2. Assessment of TDAP implementation and outcomes. The figure breaks the TDAP down into the assessment of strategic resource development (1.1 capital improvement programme; 1.2 education and training initiatives; 1.3 business development strategies; 1.4 environmental protection initiatives), of product-service development (2.1 research and development strategies; 2.2 product/attraction improvement strategies; 2.3 new product planning/development strategies; 2.4 major new product/attraction development) and of market and customer development (3.1 public relations campaign; 3.2 tourism product marketing strategies; 3.3 visitor experience enhancement strategies; 3.4 market research strategies and marketing plan).]

As for the overall assessment of TDAP performance, we may categorize the M&E process into two areas: the assessment of tourism development strategies and of economic/social development strategies. This categorization is based on a synthesis of the partners' overall policy objectives. Figure 3 shows how assessment objectives can be derived from the above primary objectives.

M&E criteria and units of measurement for TDAP assessment

Under programme implementation, two major issues that are often faced are completing tasks on time as planned and keeping cost within budget. Within this framework, we can apply both effectiveness and efficiency criteria. The percentage of the overall achievement of a programme objective within the scheduled completion time and/or within the budgeted cost indicates the effectiveness of task completion. To put it another way, effectiveness can also be measured by the ratio of scheduled time to actual time, or of budget to actual cost. Similarly, a cost-effectiveness ratio of, for example, 3000:4500 would mean that the budgeted cost was £3000 and the actual cost £4500.

These ratios can be added and averaged to arrive at an aggregate ratio of effectiveness for a specific programme objective, such as the business development strategy, or for a goal/subgoal, such as strategic resource development in general. As for efficiency, which is a ratio of actual economic input to actual output, the above concept does not seem to work here. This is because the output of the implementation process itself is task completion, which cannot be quantified in units other than completion time or budgeted cost. The cost-effectiveness ratio discussed earlier therefore also becomes a measure of the efficiency of the programme implementation process.
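The arithmetic involved is simple enough to sketch. In the example below the task names and figures are invented, and the aggregation by plain averaging follows the description above rather than any published TDAP worksheet.

# Effectiveness of task completion measured as scheduled/actual time and
# budgeted/actual cost; the task names and figures are invented.
tasks = [
    # (task, scheduled weeks, actual weeks, budgeted cost £, actual cost £)
    ("Signposting scheme", 10, 12, 3_000, 4_500),
    ("Operator training course", 16, 16, 8_000, 7_200),
]

ratios = []
for name, scheduled, actual, budget, cost in tasks:
    time_effectiveness = scheduled / actual     # 1.0 = completed exactly on schedule
    cost_effectiveness = budget / cost          # eg 3000:4500 gives 0.67
    ratios.append((time_effectiveness + cost_effectiveness) / 2)
    print(f"{name}: time {time_effectiveness:.2f}, cost {cost_effectiveness:.2f}")

# Aggregate effectiveness for the programme objective: a simple average of task ratios
print(f"aggregate effectiveness: {sum(ratios) / len(ratios):.2f}")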

For the assessment of programme outcomes, appropriate M&E criteria are again effectiveness and efficiency. In some cases, the assessment of outcomes can be used confidently to evaluate the implementation process as well, without wasting time and money in monitoring and evaluating the implementation process itself. However, such evaluation may not provide answers to the problems of implementation. In this area of assessment we also include the M&E of immediate impacts. By immediate impacts, we mean the final product or result that we expect to get by implementing a strategy or task. For example, if the objective of a direct mail advertising campaign is to reach 5000 business travellers, then our target outcome is 5000, but the immediate impact will be the number of travellers who respond to the advertisement.

[Figure 3. Overall assessment of TDAP performance. The figure divides the overall assessment into the assessment of tourism development strategies (tourist development, visitor satisfaction, seasonal and sector stability) and the assessment of economic and social development strategies (1 economic impacts; 2 social impacts; 3 residents' perceived attributes).]



Table 2. Assessment of TDAP outcomes/impacts.
(Each entry gives the M&E objective, followed by the criteria, units of measurement and method.)

1. Strategic resource development
1.1 Capital improvement programme: efficiency; NPBI over NPRI; (AR/ST) x 100
1.2 Education and training: efficiency; TTS over TTE; (AR/ST) x 100
1.3 Business development: efficiency; NPRI over TTS; (AR/ST) x 100
1.4 Environmental protection: effectiveness; ARA over PRA; ARA/PRA x 100

2. Product-service development
2.1 Research and development: adequacy; % expenditure; (AR/ST) x 100
2.2 Product improvement: adequacy; % expenditure; (AR/ST) x 100
2.3 New product planning and development: adequacy; % expenditure; (AR/ST) x 100
2.4 Major product development: efficiency; cost to benefit; net present value

3. Market and customer development
3.1 PR campaign: effectiveness; awareness; survey
3.2 Product marketing: cost-effectiveness; cost per visitor; (AR/ST) x 100
3.3 Visitor experience enhancement: effectiveness; satisfaction; survey
3.4 Marketing plan/strategy: adequacy; % expenditure; (AR/ST) x 100

Notes: NPBI = net public investment over three years; NPRI = net private investment over three years; TTS = total tourist spending increase over three years; TTE = total tourism employment created over three years; ARA = actual resource allocation (£ value); PRA = planned resource allocation (£ value); % expenditure = percentage expenditure of the segment total; AR/ST = actual ratio to standard (TDAP average).
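By way of illustration, the short sketch below computes two of the measures listed in Tables 2 and 3: the actual-to-standard score (AR/ST x 100) and the cost per job created. The figures are invented, and in practice the standard would be the TDAP average.

# Illustrative calculation of two measures from Tables 2 and 3; figures invented.
def ar_st_score(actual_ratio, standard_ratio):
    """Actual ratio to standard (eg the TDAP average), as a percentage."""
    return actual_ratio / standard_ratio * 100

# 1.1 Capital improvement: net public over net private investment over three years
leverage = 250_000 / 400_000          # NPBI over NPRI for this TDAP
standard = 0.75                       # assumed TDAP-average standard
print(f"capital improvement score: {ar_st_score(leverage, standard):.0f}")

# Economic impact (Table 3): cost per job created and jobs created against plan
programme_cost, jobs_created, jobs_planned = 1_200_000, 85, 100
print(f"cost per job created: £{programme_cost / jobs_created:,.0f}")
print(f"job creation effectiveness: {jobs_created / jobs_planned * 100:.0f}%")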

The same segmentation as used in strategic resource development may also be employed. However, the outcomes or impacts of, for example, education and training strategies and business development strategies cannot be compared using ratios like completion time in weeks or task cost in pounds sterling. This limits the performance assessment to individual tasks/strategies or programme objectives, and therefore an aggregated effectiveness/efficiency measure for the entire segment cannot be obtained. Nevertheless, a newer technique called data envelopment analysis can be used for such purposes.[5]
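Data envelopment analysis compares units that consume several inputs and produce several incommensurable outputs, which is exactly the difficulty noted above. The sketch below is a minimal input-oriented CCR formulation solved with scipy; the input and output figures are invented, and Silkman[5] should be consulted for the method itself.

import numpy as np
from scipy.optimize import linprog

# Minimal input-oriented CCR DEA sketch: each column is a programme objective
# (decision making unit); the input and output figures are invented.
inputs = np.array([            # rows: inputs, eg cost (£000) and staff weeks
    [40.0, 55.0, 30.0, 80.0],
    [20.0, 35.0, 15.0, 45.0],
])
outputs = np.array([           # rows: outputs, eg trainees and businesses assisted
    [120.0, 150.0, 90.0, 300.0],
    [10.0, 14.0, 6.0, 22.0],
])

n_units = inputs.shape[1]
for o in range(n_units):
    # Variables are [theta, lambda_1 ... lambda_n]; minimize theta.
    c = np.zeros(n_units + 1)
    c[0] = 1.0
    # Input constraints: sum_j lambda_j * x_ij <= theta * x_io
    a_in = np.hstack([-inputs[:, [o]], inputs])
    # Output constraints: sum_j lambda_j * y_rj >= y_ro
    a_out = np.hstack([np.zeros((outputs.shape[0], 1)), -outputs])
    res = linprog(c,
                  A_ub=np.vstack([a_in, a_out]),
                  b_ub=np.concatenate([np.zeros(inputs.shape[0]), -outputs[:, o]]),
                  bounds=[(None, None)] + [(0, None)] * n_units,
                  method="highs")
    print(f"objective {o + 1}: relative efficiency {res.x[0]:.2f}")

An efficiency of 1.00 marks a unit on the best-practice frontier; lower values indicate how far its inputs could, in principle, be scaled back for the same outputs.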

TDAP assessment and M&E process

The TDAP M&E process is no different from the M&E model presented earlier. Obviously, for the TDAP assessment we have identified three major goals: monitoring and evaluation of the implementation process; of the programme outcomes and immediate impacts; and of the overall programme impacts in the long run. Further, Figures 2 and 3 show the breakdown of M&E goals and objectives, and Tables 2 and 3 clearly outline such information as the M&E criteria, the relevant units of measurement and, in some cases, appropriate methods that can be used for the monitoring and evaluation of TDAPs. In fact, the preparation of these tables is in itself part of the M&E process. The tables can also be expanded to form M&E process sheets, or separate sheets can be made with appropriate columns to record and monitor relevant data. In some cases, a few programme objectives may have to be further divided, as in a work breakdown schedule (WBS), in order to assess certain key strategies or tasks that are worthy of special assessment.

In order to complete the entire M&E exercise, we also need to assess the overall TDAP performance based on its ultimate programme impacts. If premature evaluation is to be avoided, such assessment should not be carried out until a reasonable period of time has elapsed after programme completion. Tables 2 and 3 show the most appropriate M&E criteria, units of measurement and methods under the two assessment areas.

Conclusion

Rarely does a local economic initiative (LEI) undergo an assessment (ex ante) before its implementation. If such an assessment is called for, programme planners would be required to forecast the outcomes and impacts in advance. On the other hand, it is not unusual for the programme manager to estimate the cost of implementation and the completion time. Of course, such estimates guide the manager through the implementation process and the post-evaluation exercise. If forecasting of outcomes and impacts is included in the planning process, then it provides standards and targets for programme objectives, which essentially makes the ex post evaluation exercise much easier. Unfortunately, hardly any LEI, or for that matter any ABTI under consideration, falls into such a programme category, with an ex ante evaluation or pre-assessment. Although all these programmes warrant some form of monitoring and evaluation in the end, conducting an evaluation can still be a difficult and complex process.


Table 3. Assessment of the overall programme impacts.
(Each entry gives the M&E objective, followed by the criteria, units of measurement and method.)

1. Tourism development
1.1 Tourist development: effectiveness; physical capacity, % SRD expenditure; (AR/ST) x 100
    efficiency; TTS over TPC; (AR/ST) x 100
1.2 Visitor satisfaction: effectiveness; average length of stay; (AR/ST) x 100
1.3 Seasonal and sector stability: adequacy; % occupancy spread; (AR/ST) x 100

2. Economic/social development
2.1 Economic impacts: effectiveness; wealth creation (AHI); (AR/ST) x 100
    effectiveness; job creation (NJC); AJC/PJC x 100
    efficiency; job creation (CPJ); (AR/ST) x 100
2.2 Social impacts: varies; varies; documentation and survey
2.3 Perceived attributes: varies; Likert scale; resident survey

Notes: % SRD expenditure = percentage strategic resource development expense of the total; AR/ST = actual ratio to standard (TDAP average); TTS = total tourist spending; TPC = total programme cost; AHI = average household income; NJC = net jobs created; AJC = actual jobs created; PJC = planned jobs; CPJ = cost per job; % occupancy spread = average off-season occupancy % or total average (annual) occupancy rate; Likert scale = 1 to 7 scale with 1 being the lowest and 7 the highest.

In this paper we have discussed the nature and characteristics of a special form of LEI, the area based tourism initiative (ABTI). We have also examined the major problems of assessing their performance, and why and how an M&E exercise can be conducted. The conceptual framework presented in the paper recognizes the intricacies and complexities of evaluating an ABTI and tries to accommodate the deficiencies of programme planning and implementation. In conclusion, the paper has also examined the implications of assessing the ETB's Tourism Development Action Programmes (TDAPs) using the framework and the model presented here.

Notes and references
1. English Tourist Board, Monitoring of Local Area Initiatives: A … for Consultants, ETB, London, 1991.
2. Nicholas Maddock, 'On the monitoring and evaluation of rural development projects under decentralization', Third World Planning Review, Vol 12, No 3, 1990.
3. D.J. Storey, 'Evaluation of policies and measures to evaluate local employment', Urban Studies, Vol 27, No 5, 1990, pp 669-684.
4. Rudolf H. Moos, 'Assessing the program environment: implications for program evaluation and design', in Evaluating Program Environments, New Directions for Program Evaluation, No 40, Winter 1988.
5. R.H. Silkman, ed, Measuring Efficiency: An Assessment of Data Envelopment Analysis, an American Evaluation Association publication, Jossey-Bass, San Francisco, 1986.

TOURISM MANAGEMENT March 1992 77