
Administration in Mental Health Vol. 10 No. 3, Spring 1983

A STATEWIDE QUALITY ASSURANCE PROGRAM

Sutherland Miller, Sidney M. Glassman and David A. Winfrey

ABSTRACT: The Colorado Division of Mental Health implemented a quality assurance program in all of its mental health facilities. The system was designed to help managers know how they were doing on operational issues. The program was simple, low cost, and easy to implement. All levels of management found it a useful tool in getting control of important processes.

There has been a growing emphasis in recent years on accountability in the mental health field. Consumerism and tight dollars have provided the impetus for this change, and the responses are seen in new standards, PSROs, Health Systems Agencies, insurance reviews, and an emphasis on effectiveness and efficiency. Compliance with simple rules and regulations is no longer regarded as evidence of adequate accountability to the public or to funding sources.

Colorado's Division of Mental Health (DMH) was searching for a new approach to quality assurance and found it in a program developed by Roland J. Kalb Associates. This program has been tested and proved useful in other states. Within seven months all 23 community mental health centers (CMHCs) and clinics in Colorado had implemented this new quality assurance program. Within a year or two, state hospitals and the central office of the Division of Mental Health also had the system in place. This article describes why this quality assurance program was selected, how it was implemented, and what the Colorado experience has been.

Sutherland Miller, Ph.D., was Director of the Colorado Division of Mental Health and is now a management consultant in the Washington, D.C. area. Sidney M. Glassman, Ph.D., is Director of Staff Development, Ft. Logan Mental Health Center. David A. Winfrey, A.C.S.W., is Quality Assurance Specialist, Colorado Division of Mental Health. Requests for reprints should be addressed to Dr. Sidney M. Glassman at Ft. Logan Mental Health Center, 3520 W. Oxford Ave., Denver, CO 80236.

© 1983 Human Sciences Press


THE SELECTION OF A NEW APPROACH

In the spring of 1978 Colorado's DMH reassessed its procedure for performing annual assessments of mental health centers by means of site visits. This process was oriented toward assessing compliance with various state and federal rules and regulations. A decision was made, however, that the site visit approach was too costly, both in DMH staff hours for the visit, write-up, and follow-up, and in the time centers' staff spent in preparation. Not only were too many resources expended, but the results were either difficult to determine or minimal. Few examples of significant long-term change as a result of the site visits could be found.

"The D M H's primary assumption was that external quality control systems were less desirable than internal systems. "

As with the national activities in program evaluation, the DMH and the agencies with which it contracted had emerging systems dealing with treatment outcomes, peer review, utilization review, needs assessment, and consumer participation. But none of these endeavors told program managers how well their everyday operations were performing. They could not determine the relationship between their efforts and program outcomes.

The DMH's primary assumption was that external quality control systems were less desirable than internal systems. A CMHC's commitment to a usable and meaningful internal quality assurance program would be far more effective in the long run than annual site assessments by the DMH.

To be meaningful, however, the system had to be useful to program managers. It had to be practical, measuring items of everyday importance such as how long it took to answer emergency calls or how long patients sat in the waiting room. Most program evaluation is research oriented, but to look at operational issues, uncomplicated evaluations with instant feedback were required. Obviously, low cost and ease of implementation were additional factors to consider in selecting a system.

The DMH learned of an approach, developed by Kalb Associates, with growing use in hospitals and more limited use in mental health centers (Kalb et al. 1974). As our understanding of the approach grew, other qualities of interest appeared in addition to its management orientation, low cost, and simplicity. Mental health facilities could not be compared with each other because the measures were tailored to each individual organization. This avoided the pitfalls of inappropriate comparisons by the legislature and, as a result, administrators would probably feel safer in using it. Also, using the system took little or no background in mental health. Citizens, board members, students, and staff could all do it easily. Using other managers as evaluators could increase sharing and understanding. Finally, the system lent itself to recognizing noteworthy performance of units and individuals.

After an initial presentation by Kalb, a clinical director of a CMHC and a DMH staff member visited one of the hospitals already using the system. Information collected from their visit substantiated the initial positive impression of this approach, and the decision was made to implement it statewide.

The DMH made it clear that this internal quality assurance approach, combined with a review of compliance with nonclinical regulations, would take the place of the annual site visit. Four CMHCs volunteered to be the first ones trained in the new approach.

OVERVIEW OF THE METHOD

The first step in the implementation of this quality assurance system is the specification of the organization's units, departments, or sections. Next, each team leader, department head, or section chief in the organization identifies 10 to 14 different activities of the unit that are important to its effectiveness. They then proceed to define the level they would like to achieve on each of these activities, or indicators. For example, the head of an outpatient team might decide that the client dropout rate is an important indicator of whether clients feel that the services offered are helpful to them. The team leader, therefore, might want periodic feedback on this indicator and would then set, as the criterion measure, the level of service to be achieved: for example, a dropout rate somewhat lower than the service is presently achieving, yet realistically attainable.

"'The service heads were encouraged to develop criteria in terms of goals that they would like to achieve rather than mere reflections of current levels of petformance. "

After each service head has defined the indicators and criteria for the unit, a monthly evaluation of these is done by the head of a different service in the same organization. The criteria for each service are designed so that they can be evaluated in 30 to 45 minutes. The evaluator changes each month; thus in time, each service head will have the opportunity to visit and evaluate every other service in the organization. The evaluation is tabulated into a score: the percentage of the criteria achieved each month.

The completed monthly evaluation forms are done in duplicate, with one copy left with the evaluated service at the time the evaluation is done to provide immediate feedback. The other copy is given to the Quality Assurance (QA) Coordinator for compilation into a monthly report for that service and into a monthly summary of all the services in the organization.
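To make the scoring arithmetic concrete, the following is a minimal sketch in Python, a modern illustration rather than part of the original program; the data structures and function names are assumptions. It computes a service's monthly score as the percentage of criteria accepted and compiles the coordinator's organization-wide summary.

from dataclasses import dataclass

@dataclass
class Indicator:
    name: str
    accepted: bool  # the evaluator's Accept/Reject judgment for the month

def monthly_score(indicators: list[Indicator]) -> float:
    """A service's score: the percentage of its criteria accepted this month."""
    if not indicators:
        return 0.0
    met = sum(1 for i in indicators if i.accepted)
    return 100.0 * met / len(indicators)

def organization_summary(services: dict[str, list[Indicator]]) -> dict[str, float]:
    """The QA coordinator's monthly summary: one score per service."""
    return {name: monthly_score(inds) for name, inds in services.items()}

A service with 12 indicators of which 5 were accepted would score roughly 42 percent, the kind of figure reported for each service in the monthly summary.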

TRAINING AND DEVELOPMENT

There are four major steps in the implementation of this particular quality assurance program: training, development, testing, and implementation. Because of the number of facilities involved and Colorado's budgetary constraints, an inexpensive method of implementation was developed. Kalb Associates conducted the entire training, including development, testing, and implementation, with the first group of four CMHCs. This allowed the DMH staff to be trained together with the QA coordinators of the centers. Thereafter Kalb Associates gradually reduced their participation, and DMH staff took over.

The quality assurance program was phased in over an 18-month period. Groups of CMHCs went through the first of the four steps together, although the three subsequent steps were always done with each one individually. Four centers volunteered to make up the first group. Subsequent groups, consisting of up to six centers each, followed at roughly three-month intervals. Two centers, one with a very rural catchment area and one with a very urban one, were trained individually to allow tailoring of the system to their unique needs. The central office of the DMH and two state hospitals also received separate instruction.


Training of the CMHCs, hospitals, and DMH was performed by the DMH staff person responsible for quality control, the director of staff development, and the director of the DMH. The Division director participated in every training session to underline the importance of the project.

Each CMHC director appointed a staff member to serve as coordinator. This person had to have good access to the executive director, a broad knowledge of the organization, the respect of key staff and management, and substantial skill in working with others. The coordinator did not necessarily have to be a clinician, but sensitivity toward clinical as well as administrative issues was required. The executive directors were told that the functions of the QA coordinator could take as much as 20 hours per week during the development and testing stages, but probably no more than five to ten hours per week once the system was in place.


When the CMHCs were trained in groups, the executive director and the QA coordinator of each attended the training session and were responsible for conducting a comparable QA training session for each of the service heads. A "service head" was loosely defined as a manager or supervisor of any administrative or clinical team, section, department, or unit. It included, at a minimum, all upper and middle level supervisors and possibly lower level ones as well. The actual definition of who should be a service head was left up to the director of that center. Also, each director decided which work units in the organization would be defined as "services" for the purpose of developing the QA program. A sample list of 35 possible services was provided (Figure 1).

The first QA training session with four centers took about six hours, but subsequent sessions were trimmed to three hours with no apparent loss of effectiveness. Background and orientation material intended to increase the participants' interest in and understanding of the QA program was condensed significantly.

FIGURE 1

SUGGESTED LIST OF SERVICES PARTICIPATING IN THE QUALITY ASSURANCE PROGRAM

1. Emergency
2. Partial/Day Care
3. Inpatient
4. Other 24-Hour Care
5. Outpatient
6. Children Service
7. Geriatric Service
8. Clinical Records
9. Volunteer Services
10. Staff Development/Training
11. Adolescent Service
12. Intake
13. Adult Service
14. Alcohol
15. Drug Abuse
16. Consultation
17. Client Housing
18. Program Evaluation
19. Finance
20. Administration
21. Personnel
22. Building Maintenance
23. Reception
24. Telecommunications
25. Library
26. Mail Service
27. Messenger Service
28. Pharmacy
29. Vocational Services
30. Transportation
31. Public Information
32. Data Processing/MIS
33. Accounting
34. Housekeeping
35. Quality Assurance Program


It should be noted, however, that the DMH required all its contractual agencies to use this system. Without this requirement, motivational problems would likely have been encountered.

After service heads had been through the initial training session, they each defined 10 to 14 indicators with corresponding criteria to assess their own services. Some indicators were as simple as promptness in answering incoming telephone calls or getting certain reports done within a designated time period; others were as complex as increasing the percentage of client admissions from certain target populations or improving treatment effectiveness. In effect, the indicators represented brief descriptions of what aspects or functions of a service were to be evaluated. The criteria were the detailed descriptions of how, and at what level, each indicator would be measured. Examples of indicators and their criteria are as follows:

INDICATOR: Prompt Telephone Answering
CRITERIA: Dial 777-8245 any time of day or night. Accept if the telephone is answered within four rings and the person answering identifies the service and gives his or her own name.

INDICATOR: Timely Discharge Summaries
CRITERIA: Ask the Medical Record Librarian for all closed charts received from Unit X within the past 30 calendar days. Reject if any of the discharge summaries are dated more than five days after the client's discharge.

INDICATOR: Treatment Outcome
CRITERIA: Ask the ward secretary for the team's Goal Attainment Scores for the prior calendar month. Accept if the average Goal Attainment Score for all clients terminated that month is 4.0 or above.

Some indicators, as in the first example above, included multiple criteria. In such instances, all of them had to be met in order for the indicator to be accepted. If the service chief did not want this all-or-nothing scoring, then each of the subcriteria needed to be written as a discrete indicator, such as prompt answering, identification of the service, and identification of the answering person.
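The all-or-nothing rule for multi-criterion indicators can be sketched as follows. This is again an illustrative reconstruction, not the article's actual forms; the check names and their True/False values are hypothetical.

from typing import Callable

Criterion = Callable[[], bool]  # returns True when the check passes

def evaluate_indicator(criteria: list[Criterion]) -> bool:
    """A multi-criterion indicator is accepted only if every criterion is met."""
    return all(check() for check in criteria)

# "Prompt Telephone Answering" modeled as three sub-checks (hypothetical results):
answered_within_four_rings: Criterion = lambda: True
identified_service: Criterion = lambda: True
gave_own_name: Criterion = lambda: False

accepted = evaluate_indicator(
    [answered_within_four_rings, identified_service, gave_own_name]
)  # False: one sub-check failed, so the whole indicator is rejected

Had the three sub-checks been written as discrete indicators instead, the same observations would have yielded two Accepts and one Reject, which is precisely the choice left to the service chief.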

The service heads were encouraged to develop criteria in terms of goals that they would like to achieve, rather than mere reflections of current levels of performance. Therefore, they were encouraged to develop criteria that would cause no more than 30 to 50 percent of the total group of indicators to be accepted in the initial evaluation. This strictness allowed the work units to see progress as their scores improved each month. When scores for any work unit started to reach 80 to 90 percent, the service head was expected to revise the criteria to make them more stringent in order to drop the monthly percentage score back to a lower level and start the process again. Any criterion that had been met for four successive months would be a candidate for revision. If it did not seem sensible to make a particular criterion more stringent (like answering the phone within one ring), that indicator might be dropped and a new one substituted in its place.
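This calibration heuristic lends itself to a simple sketch, under stated assumptions: the function and parameter names are hypothetical, while the 80 percent threshold and the four-month run follow the figures given above.

def needs_tightening(monthly_scores: list[float], threshold: float = 80.0) -> bool:
    """True once a unit's latest monthly score climbs into the revision range."""
    return bool(monthly_scores) and monthly_scores[-1] >= threshold

def revision_candidates(history: dict[str, list[bool]], run: int = 4) -> list[str]:
    """Criteria met for `run` successive months become candidates for revision."""
    return [name for name, results in history.items()
            if len(results) >= run and all(results[-run:])]

# Example: a phone-answering criterion met five months running is flagged,
# and is then either made more stringent or replaced with a new indicator.
# revision_candidates({"answered within four rings": [True] * 5})
# -> ["answered within four rings"]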

TESTING AND IMPLEMENTATION

Following the development of the indicators and criteria, each CMHC was ready to test its embryonic system. This step was completed in the third stage through a "dry run." The dry run was performed with all the service heads, including the agency director and the QA coordinator.

First, the QA consultants instructed the participants in how to conduct the QA monthly evaluation and how to complete the necessary forms. Then each service head proceeded to an assigned service (not his or her own), did the evaluation, and returned for a final meeting with the consultants. In this meeting they reviewed any questions, discussed problems encountered, and looked at needed changes in the indicators or the criteria. The scores of all the services were placed on a chalkboard, and any service achieving an exceptionally high percentage of "Accepts" was encouraged to review its criteria for the sake of eliminating any that would routinely elevate its score.

With the completion of the "dry run," the quality assurance program was in place. Once the quality assurance material was finalized for each service, service heads were encouraged not to modify any criteria for at least four months so that emerging trends could be noted.

This dry run served the dual purpose of training the evaluators and testing the indicators and criteria for any further refinements needed. Within a week or two after the dry run, each organization proceeded with the implementation of monthly evaluations, usually with only telephone consultation from the QA consultants.

EARLY LEARNINGS

In watching the implementation of the Quality Assurance Program in Colorado, it was discovered that the system not only provided the benefits anticipated but also some unexpected gains. For example, the work of creating and using evaluation criteria helped develop skilled clinicians into better administrators and managers. It provided clinicians with more of an administrative perspective in dealing with many work issues and problems. The system also gave experienced managers a mechanism for bringing about desired procedural and organizational changes to improve their services. Many managers reported that they had known for some time what changes were needed but never had the tools for accomplishing them. The QA system gave them the leverage and the vehicle they needed to create change.

In using this quality assurance system, its limitations also became apparent:

1. The system did not evaluate the quality of clinical practice directly. Consequently, it had to be supplemented by some kind of peer review system.

2. The system required real emotional and time commitments from top management to insure its successful implementation.

3. Although Colorado saw it as advantageous that the system did not allow valid inter-service or inter-agency comparisons, some might see this as a disadvantage.

In the long run, any quality assurance program stands or falls on its usefulness to managers. Thus the true test came when the DMH made use of the system voluntary for all mental health centers that had used it for at least one year. At that time, a formal survey was done to determine the perspectives of the CMHC directors and QA coordinators.

THE ONE-YEAR SURVEY

The centers were asked whether or not they planned to continue using the Quality Assurance Program once its use was voluntary. Almost all of them indicated that they intended to continue with some kind of quality assurance program, and most planned to continue with this system. This high percentage indicates that the management of the mental health centers did indeed find the program useful.

A comparison of the centers intending to continue the program with those intending to drop it revealed a number of interesting differences. The centers continuing it tended to be larger ones, where the system kept management in closer touch with a broader range of services than would otherwise be possible. Smaller centers commented that the procedures were too formal and unnecessary given their size. There was also a trend for newer centers, or centers experiencing rapid growth, to continue the program. The utility of the program in these instances was to help develop new policies and procedures on an assigned timetable and to monitor them long enough to insure that they were disseminated and operational throughout the organization. It was also a mechanism for insuring that new staff were knowledgeable about policies and procedures.

In response to the question of how the centers might change the system once they were conducting it independently, many of them planned to change the frequency of evaluations. They indicated that while the monthly evaluations were useful initially, after one year of operation they felt that every second or third month might be sufficient. Some centers identified services that needed to be evaluated only annually or quarterly, and others, such as emergency or fiscal services, that would benefit from monthly or more frequent evaluations.

" G i v e n the g r o w i n g accountabi l i ty . . , new evaluation systems are very much needed. '"

pressures for and monitoring

SUMMARY

The Colorado DMH adopted a system of quality assurance that was implemented in all the mental health centers and state hospitals in the state. The system has elements of management by objectives and participatory management while providing a uniform format and evaluation procedure. Given the growing pressures for accountability and the additional responsibilities of state mental health authorities, new evaluation and monitoring systems are very much needed. The system described here, because of its short implementation time and relatively inexpensive maintenance costs, warrants consideration by other states.

REFERENCE

Kalb, R.J., Silverman, F., Tanenbaum, et al. Quality assurance program monitors all services. Hospitals, 48, September 1974.