
DOCUMENT RESUME

ED 206 362    JC 810 519

AUTHOR       Alkin, Marvin C.; Stecher, Brian M.
TITLE        Evaluating EOPS: A User Oriented Procedure.
INSTITUTION  Educational Evaluation Associates, Northridge, Calif.
SPONS AGENCY California Community Colleges, Sacramento. Office of the Chancellor.
PUB DATE     81
NOTE         193p.

EDRS PRICE   MF01/PC08 Plus Postage.
DESCRIPTORS  Community Colleges; *Employment Programs; *Evaluation Criteria; Evaluation Methods; *Information Needs; *Information Sources; *Program Evaluation; Research Design; Research Needs; State Surveys; *Statewide Planning; Two Year Colleges
IDENTIFIERS  California; *Extended Opportunity Programs and Services

ABSTRACT
This seven-part report presents a recommended plan for state-wide evaluation of the Extended Opportunity Programs and Services (EOPS) program in the California Community College System. Chapter I describes the procedures used by Educational Evaluation Associates in devising the plan and discusses the role played by evaluation study groups representing legislative/executive and college constituencies. Chapter II presents an assessment of existing sources of evaluation data, including the EOPS evaluation form, fiscal and budget reporting forms, student data cards, the Unified Statewide Reporting System, proposed on-site operational program reviews, fiscal audits, and EOPS year-end reports. Chapter III describes the information needs of the legislative/executive and college groups for determining EOPS program success. Chapter IV compares the EOPS program data sources with the EOPS information needs as indicated by the two study groups. Chapter V reviews several recent research studies examining different facets of the EOPS program. Chapters VI and VII present the recommendations for the evaluation system. These recommendations include suggestions for changes in several of the existing data sources; the addition of several new measures, including surveys of EOPS student goals and long-term success; and the implementation of an evaluation report schedule. Several appendices provide supportive information. (AYC)

***********************************************************************
*  Reproductions supplied by EDRS are the best that can be made from  *
*  the original document.                                             *
***********************************************************************


EVALUATING EOPS: A USER ORIENTED PROCEDURE

"PERMISSION TO REPRODUCE THIS MATERIAL HAS BEEN GRANTED BY

Marvin C. Alkin

TO THE EDUCATIONAL RESOURCES INFORMATION CENTER (ERIC)."

U.S. DEPARTMENT OF EDUCATION
NATIONAL INSTITUTE OF EDUCATION
EDUCATIONAL RESOURCES INFORMATION CENTER (ERIC)

This document has been reproduced as received from the person or organization originating it.
Minor changes have been made to improve reproduction quality.
Points of view or opinions stated in this document do not necessarily represent official NIE position or policy.

EDUCATIONAL EVALUATION ASSOCIATES

A Report to the Chancellor's Office, California Community Colleges


EVALUATING EOPS: A USER ORIENTED PROCEDURE

(Presented to the Chancellor's Office, California Community Colleges)

by

Marvin C. Alkin
Brian M. Stecher

With assistance from:

Frank Barrera (University of California)
James Browne (Senate Aide)
William Chavez (Assembly Consultant)
Ron Dyste (Chancellor's Office, CCC)
Norma Fimbres (California State University, Northridge)
Gretchen Fletcher (University of California, Los Angeles)
Jennifer Franz (Franz Associates)
Ernest Gregoire (Mt. San Antonio College)
George Hall (Chancellor's Office, CCC)
Bruce Hamlett (California Post-Secondary Education Commission)
Karla Henderson (Los Angeles Pierce College)
Christine Johnson (College of Alameda)
Melinda Matsuda (Chabot College)
Alfred Mendoza (Los Angeles Southwest College)
Vincent Montane (Senate Aide)
George Normington (Department of Finance)
Paul Preising (Evergreen Valley College)
Buster Sano (Chancellor's Office, CCC)
Dale Shimasaki (Legislative Analyst's Office)
Floyd Weintraub (University of California)

EDUCATIONAL EVALUATION ASSOCIATES
9230 Jellico, Northridge, California 91324, (213) 993-8070


EXECUTIVE SUMMARY

This report describes a study of EOPS done by Educational Evaluation Associates (EEA). The purpose of the study was to develop a plan for evaluating the EOPS program. In developing this plan, EEA worked closely with two groups. One group represented community colleges throughout the state. The other included representatives of the legislature and various other agencies in Sacramento.

In this "user-oriented approach," we based the evaluation on the kinds of questions expressed by these two user groups. We compared these questions with the information that was already being collected. In some cases the data matched the questions. In other cases there was a mismatch. We determined how much useful information could come from the existing sources and then constructed new measures to provide the missing data.

The key features of this report are:

1. We suggest some changes in several of the existing data sources:
   -- certain USRS data elements should be revised, and a new annual USRS survey of all students should be conducted;
   -- an automated plan for analyzing the final fiscal claim forms (Form A-1) should be developed;
   -- the Operational Program Review (OPR) should be expanded in scope;
   -- the efforts of the California Post-Secondary Education Commission (CPEC) to add an EOPS identification to student data should be supported.

2. We suggest several new measures:
   -- a Supplemental Survey of EOPS student plans and career goals should be undertaken;
   -- a Longitudinal Study of EOPS students' long-term success should be begun;
   -- a study of the differences between program plans (Form 10) and program expenditures (Form A-1) should be done. This will tell us which data can be used to answer certain questions.

3. We suggest an evaluation report schedule. (It is not enough to simply gather information. Information must be communicated in a clear and timely manner.)
   -- six evaluation reports for each year's program are recommended;
   -- the focus of each report is described along with the data required to complete each.


TABLE OF CONTENTS

                                                                        Page

Executive Summary                                                       ES-1
Chapter 1 - Procedures                                                  1-6
Chapter 2 - Existing Evaluation Systems                                 7-35
Chapter 3 - The Need for Information                                    37-53
Chapter 4 - Statewide Evaluation of EOPS                                55-72
Chapter 5 - Insights from Other Studies                                 73-90
Chapter 6 - EOPS Evaluation System Recommendations: Data Collection     91-110
Chapter 7 - EOPS Evaluation System Recommendations: Reporting
            and Analysis                                                111-118
Appendix A - Activity Types                                             A1
Appendix B - Reflections on Evaluation of EOPS, by James W. Trent       B1-B19
Appendix C - Revisions and Additions to USRS Data Elements              C1-C17
Appendix D - Student Oriented Data Bases Maintained by CPEC             D1-D3
Appendix E - What Type of Student Are You?                              E1-E2
Appendix F - Comparison of EOPS Applications and Final Claims,
             by Jennifer Franz and Marvin C. Alkin                      F1-F17


Chapter 1

PROCEDURES

This report presents a recommended plan for state-wide evaluation of the EOPS program in the California Community College System. The plan, developed by Educational Evaluation Associates (EEA), has been devised over a period of eight months using a set of systematic procedures. The first step was the development of a complete description of data sources currently available for evaluation of EOPS. Concurrently, we sought to understand the information needs of various user constituencies. The primary mechanism for attaining this "user feedback" was the establishment of two "evaluation study groups"--one representing legislative/executive users and one representing college users.

In the next step, an analysis of the matches between available information and user needs, along with insights from other research studies, provided the bases for detailed recommendations. In these recommendations we maintained existing data sources wherever possible, made suggestions for modifications of some, deleted others, and suggested new data sources to be added to the evaluation system. Equally as important as the analysis of the data sources are the recommendations for evaluation reporting and the analytic procedures for producing those reports. The best evaluation data is of little worth unless it finds its way into a report or display that helps users to improve programs or make other important program decisions.


Existing Data Sources

As the initial step in the effort to devise a set of evaluation procedures for EOPS, we examined the existing data sources. We focused on six sources of information currently in use: the application form, the fiscal and budget reporting form, student data, the on-site operational program review, the fiscal audit, and year-end reports. Information was gathered on each of these from three sources: examination of all available materials; interviews with individuals who were developing or modifying existing procedures; and comments and perceptions of EOPS groups about the usefulness of each source. We described each data source, indicated existing procedures for data storage and reporting, and commented on the reliability and validity of the data.

The Evaluation Study Groups

This effort was guided by the feedback of two study groups representing the various user constituencies. The project initially conceived of a combined study group representing legislators, key state decision makers at the executive level, local community college decision makers (including Deans and EOPS directors) and staff of the Special Studies Unit of the Chancellor's Office of the California Community Colleges. On further examination it seemed infeasible for the consultant to meet simultaneously with all of these individuals. Thus, EEA, along with the Director of Special Studies (Ron Dyste), modified the procedure and formed two evaluation study groups.

The legislative/executive study group included: James Browne (Aide to Senator Alan Sieroty), William Chavez (Assembly Consultant on Post-Secondary Education), Bruce Hamlett (Specialist, California Post-Secondary Education Commission), Vincent Montane (Aide to Senator Alfred Alquist), George Normington (Principal Program Budget Analyst, Department of Finance), Dale Shimasaki (Legislative Analyst's Office) and Floyd Weintraub (and subsequently Frank Barrera) (Admissions and Outreach Services, University of California). A number of meetings were scheduled during the course of the project in order to gain insights into the information they required for decision making. Each of the members of the legislative/executive study group attended at least one meeting; in addition, briefing papers, draft chapters and various summary tables were mailed to all group members for comment.

The second group, "the colleges group," consisted of the following individuals: Norma Fimbres (Special Programs, California State University, Northridge), Ernest Gregoire (EOPS Director, Mt. San Antonio College), Karla Henderson (EOPS Director, Los Angeles Pierce College), Christine Johnson (Dean of Student Services, College of Alameda), Melinda Matsuda (EOPS Director, Chabot College), Alfred Mendoza (Instructor, Los Angeles Southwest College), Paul Preising (Instructor, Evergreen Valley College), and Buster Sano (Specialist, Chancellor's Office, CCC). The EEA team met with the "colleges group" on a number of occasions during the contract period in order to determine the unique information needs at the local college level that might be met by a state-wide evaluation system.

In addition, members of the two evaluation study groups not present at a meeting were contacted subsequently by EEA staff to obtain their inputs into the process. Typically, these contacts involved mailing the materials that had been distributed at the meetings and soliciting comments, but occasionally phone interviews took place. Some members of the study group who were present at the meetings were also contacted by phone to obtain further clarification of their comments.

Several other individuals participated in each of the study groups in an ex-officio capacity. Ron Dyste, Director of the Special Services Division of the Chancellor's Office, participated on a number of occasions and provided feedback on various reports. George Hall, a member of his staff, was the contract monitor, made all arrangements for meetings and also participated fully. Jennifer Franz, working on an allied project, also participated. In addition, the EEA Consultant held meetings with the staff of the Office of Special Studies, Chancellor's Office, CCC. The purpose of these meetings was to obtain added insights into various evaluation data sources currently available.

As a part of the process of seeking the most broadly based consensus possible and being responsive to user needs, a presentation and discussion also took place at a state-wide meeting of EOPS directors in Long Beach.

Other Research Reports

In addition to the study groups' input, we examined several prominent research reports that might provide appropriate insights for devising a state-wide EOPS evaluation system. While there were many related reports that might have been examined, four seemed particularly relevant. They were: The Statewide Longitudinal Study of Community College Students (Sheldon and Hunter); What Happened to the EOPS Students of Fall 1973? (Preising); College Going Rates in California (Knoell); and The Study of Extended Opportunity Programs and Services in California Community Colleges (Farland, Rose, Nyer & Trent; Evaluation and Training Institute). In each instance we obtained a copy of the research report(s) and reviewed it thoroughly. Some reports were single volumes (such as ETI's); in other instances the work of the project constituted a number of separate but smaller studies (such as The Statewide Longitudinal Study). Occasionally, we benefited from a phone interview with the author of the report (e.g., Steve Sheldon) or a more casual conversation (e.g., Paul Preising); and in one instance a co-author was commissioned to write a paper specifically directed to the issue of development of the state-wide EOPS evaluation based on his report (James Trent - ETI report). With respect to the Statewide Longitudinal Study, we benefited from the recently completed analysis of that study by Jennifer Franz (1981). In summary, an analysis of these various studies provided an additional element for consideration in devising the recommended EOPS evaluation system.

Devising a Plan

The remainder of the report is quite straightforward. There is a certain logical consistency from chapter to chapter, which follows the reasoning process exemplified by the following set of questions:

(1) What are the existing sources of data, and are they usable in their present form?

(2) What are the information needs of the various user audiences?

(3) To what extent do usable data sources meet the existing information needs?

(4) What insights can be gained about state-wide evaluation of EOPS from other research studies?

(5) Should each existing data source be retained, modified or deleted?

(6) What additional data sources are appropriate for meeting specified and unmet user information needs?

(7) What is the timetable for data collection?

(8) What evaluation reports should be produced to serve the various information needs? What data sources are to be a part of each report?

(9) What is the reporting time schedule?

(10) What analysis procedures are required for producing the specified evaluation reports?


Chapter II

EXISTING EVALUATION SYSTEMS

Our work on this project includes, as one of its components, a review of existing evaluation systems used for state-wide assessment of community college EOPS programs. While no concerted effort has yet been made to develop a state-wide evaluation system, there are nonetheless elements present that might be thought of as an "existing evaluation system." Data are collected from many sources. Supplemental analysis of this data is conducted, as well as other project activities which have evaluation implications. Finally, there are personnel throughout the system who, in some way, engage in evaluation. Perhaps it is an exaggeration to call this aggregate of data sources, procedures and people a "system," but their presence must be understood, accounted for, and improved in the subsequent development of a well integrated state-wide evaluation plan. We will consider each of these areas separately.

Data Sources

A discussion of the current EOPS program monitoring and evaluation procedures of the Chancellor's Office of the California Community Colleges reveals six sources of evaluation data about the program. These six sources are:

1. The application form (EOPS Form #10)
2. Fiscal and budget reporting (Form A-1)
3. Student data (Student Data Cards and "USRS")
4. On-site operational program review (proposed)
5. Fiscal audit
6. Year-end reports (EOPS Form #12)

In the following six sections we will describe each data source, indicate where and in what form the information is stored, and assess its reliability and validity.

Application Form

EOPS Form #10: Description. An application for EOPS funding must be submitted annually to the Chancellor's Office by each community college. This application (EOPS Form #10) describes the proposed program, including the fiscal resources required for its operation. The form includes a listing of personnel to be assigned to the project, of services to be provided to students, of materials to be purchased, of the student population to be served, etc. In short, a rich data base is available describing the intentions of the EOPS programs throughout the state.

The program description begins with a statement of project goals and an abstract of community-based needs. It includes a statement of need, a description of planned activities, and a list of output objectives reported for each component. The application form requires that the projected organizational structure be broken down into nine program-related categories, including, for example, Management Services (100), Outreach Services (200), Instructional Development and Services (300), Direct Aid to Students (900), etc. This information fills about half of the pages in the application.

The rest of the application is concerned with reporting budget data in considerable detail. Each of the nine components is broken down into between three and five activity types, and all projected staff salaries are budgeted by activity type. Thus, for example, the Instructional Development and Services component (300) is subdivided into four activity areas:

Curriculum and Course Development (310)
Instructional Services (320)
Tutoring (330)
Other (340)

Personnel costs for each employee are reported by component, and, in addition, the percentage of time each individual will spend on each activity is indicated. (For a complete listing of all 39 activity types, see Appendix A.)
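To make the budgeting arithmetic concrete, the following is a minimal sketch (in modern Python, purely illustrative) of splitting one employee's salary across activity types by percentage of time. The activity codes are the ones listed above; the salary, percentages and function name are hypothetical.

```python
# Sketch: budgeting one employee's salary by activity type.
# Activity codes 310-340 are from the report; figures are hypothetical.
ACTIVITY_NAMES = {
    310: "Curriculum and Course Development",
    320: "Instructional Services",
    330: "Tutoring",
    340: "Other",
}

def allocate_salary(salary, pct_time_by_activity):
    """Return the dollars budgeted to each activity code.

    pct_time_by_activity maps activity code -> percent of time (0-100);
    one employee's percentages should sum to 100.
    """
    assert abs(sum(pct_time_by_activity.values()) - 100) < 1e-6
    return {code: salary * pct / 100.0
            for code, pct in pct_time_by_activity.items()}

# A hypothetical staff member splitting time 60/40 between tutoring
# and instructional services on an $18,000 salary:
for code, dollars in allocate_salary(18_000, {330: 60, 320: 40}).items():
    print(f"{ACTIVITY_NAMES[code]} ({code}): ${dollars:,.0f}")
# Tutoring (330): $10,800
# Instructional Services (320): $7,200
```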

Non-personnel costs are also reported by component. Standard four-digit accounting codes are used in reporting these costs (materials, supplies, equipment, etc.). Both those items funded from EOPS money and those paid for by the district's own funds must be included in the application.

Also reported are the number of student contacts anticipated for each component, and the total estimated unduplicated number of students served.

As this brief overview clearly indicates, the EOPS program application is an extremely detailed document, containing a wealth of information about the anticipated program. One caution is in order, however: it is only a proposal. As such, it may or may not represent an accurate description of actual program activities and expenditures. It will be important to determine the differences between the proposed budgets reported on Form #10 and actual program expenditure levels. An estimate of these differences may be available from other fiscal reports, and this will be investigated in a subsequent section.

EOPS Form #10: Data Storage and Reporting. The Chancellor's Office of the California Community Colleges contracted with Survey Data Associates of Los Angeles to develop software for computer coding and analysis of the data from EOPS Form #10. The software that was developed can analyze all the quantitative data in the program applications (e.g., budget, activity types, numbers of students, etc.), but the program descriptions (need, activities, etc.) are not being coded or analyzed. Thus a great deal of the information contained in the application forms will not be accessible from the computer files, and no reports or analysis are planned.

However, analyzing only the numerical data is still a substantial task, and there is much information to be found there. Survey Data Associates has also been asked to conduct some data analysis of the 1980-81 program applications, and they have already produced summary tabulations of the first three data categories. In each instance, the analyzed data have been reported by individual college and then aggregated by district, geographic region, size of school, school type (urban/suburban/rural) and, finally, a statewide summary. The completed reports describe:

a. Program Activity Types (Item 1.9 of EOPS Form #10). This report tabulates which of the 39 activity types is included in each school's program application. The amount of activity is not reported, but only its presence or absence as a part of the school's program.

b. Other Funding Sources (Item 2.8 of EOPS Form #10). This report summarizes the amount of money available to each college from all other sources which have potential or actual use in serving disadvantaged students. The report also indicates, in each case, whether the use of these funds is coordinated with the EOPS program.

c. Direct Student Aid (Component 900 of EOPS Form #10). This report describes all grants, scholarships and work study funds paid to students (both EOPS students and regular students) from all funding sources. This includes federal, state, institutional and private monies.
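The multi-level reporting just described (college figures rolled up to district, region, size-of-school, school-type and statewide totals) is one group-and-sum pass repeated with different keys. A minimal sketch, with hypothetical column names and data:

```python
# Sketch: rolling college-level totals up to higher levels, as in the
# SDA summary tabulations. All names and figures are hypothetical.
from collections import defaultdict

colleges = [
    {"college": "A", "district": "D1", "region": "North", "direct_aid": 120_000},
    {"college": "B", "district": "D1", "region": "North", "direct_aid": 95_000},
    {"college": "C", "district": "D2", "region": "South", "direct_aid": 80_000},
]

def aggregate(rows, key, value="direct_aid"):
    """Sum `value` over `rows`, grouped by the column named `key`."""
    totals = defaultdict(float)
    for row in rows:
        totals[row[key]] += row[value]
    return dict(totals)

print(aggregate(colleges, "district"))         # {'D1': 215000.0, 'D2': 80000.0}
print(aggregate(colleges, "region"))           # {'North': 215000.0, 'South': 80000.0}
print(sum(r["direct_aid"] for r in colleges))  # statewide total: 295000
```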

The final report, to be prepared this Spring and submitted by the end of the contract year (July, 1981), is the most complex. It will describe the complete breakdown of all personnel costs by component and by activity type (Item 3.5 of Form #10).

At the conclusion of the analysis, SDA will give the raw data tapes and all reports to the Chancellor's Office, where they will be available for further analysis if desired. The Chancellor's Office can then conduct similar analyses of Form #10 data in the future.

Some small refinements in the system for next year are still being considered. In particular, Survey Data Associates has suggested some changes in Form #10 to simplify future data coding. Final decisions on these changes should be reached by the Summer of 1981.

EOPS Form #10: Reliability and Validity. The data themselves appear to be quite accurate. The chief analyst at Survey Data Associates estimates an error rate of less than 2%. This should not be too surprising, since there are built-in cross checks against arithmetic and clerical errors in most budget reports.

There was one small exception to this high accuracy estimate. Apparently, directions were unclear and confusion developed about the procedures for calculating the unduplicated student total on the summary page. The numerous errors in this item appear to be the fault of overly complex instructions. The problem is not too serious, however, because the correct calculation can be performed using the data reported on the individual component pages.
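Put differently, the correct unduplicated total is the size of the union of the per-component student rosters, not the sum of the component counts. A minimal sketch with hypothetical component names and IDs:

```python
# Sketch: an unduplicated student total is a set union across the
# component pages. Component names and IDs are hypothetical.
components = {
    "outreach":   {"s01", "s02", "s03"},
    "tutoring":   {"s02", "s03", "s04"},
    "direct_aid": {"s01", "s04", "s05"},
}

duplicated_total = sum(len(ids) for ids in components.values())
unduplicated_total = len(set().union(*components.values()))
print(duplicated_total, unduplicated_total)   # 9 5
```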

The data also appear to fairly represent the entire system. There are 106 community colleges in the system, and Survey Data Associates had initially been provided with applications from 83 of these schools for its analysis. The reason for the missing data was not clear; for some reason, data for only 78% of the colleges were initially forwarded to SDA. However, that problem has now been remedied and the full data set is available for analysis.

Overall, our analysis shows that accurate and representative data on proposed EOPS program budgets and activities do exist. The Chancellor's Office has the capability to continue generating similar information in future years. It should be reiterated, however, that these are only the projected quantitative data from EOPS Form #10. Program data on project objectives, needs and anticipated outcomes are not available on the computer or in any other aggregated form.

Fiscal Reporting Procedures

The total funding level on which the EOPS program applications are based does not necessarily equal the total allocation the school will eventually receive for its EOPS project. As a result, the projected budgets will probably differ from the actual expenditures during the year. Therefore, it becomes important to investigate the procedures that are employed to monitor and report on actual program expenses.

We found that actual expenditures are reported at the same level of detail as in the projected budgets in terms of standard accounting codes; however, the expenditure reports show somewhat less detail in program categories than the budgets. This will be described more fully in the following sections.

Final Project Allocations. Final project allocations are not determined until June or July, after the new state budget is signed into law. A mathematical formula determines what share of the total statewide allocation is given to each college. The formula assigns weights to different elements of the EOPS program proposal--number of students served, demonstrated need, etc.--and calculates each college's portion of the total allocation based on a weighted sum of elements of their proposal.
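The report does not give the formula's actual elements or weights, so the following is only a sketch of the mechanism it describes: each proposal is scored as a weighted sum of its elements, and the statewide total is divided in proportion to the scores. All weights, names and figures are hypothetical.

```python
# Sketch: a weighted-sum allocation of the kind described above.
# The weights and proposal figures are invented for illustration.
WEIGHTS = {"students_served": 0.7, "demonstrated_need": 0.3}

proposals = {
    "College A": {"students_served": 900, "demonstrated_need": 400},
    "College B": {"students_served": 600, "demonstrated_need": 800},
}

def allocate(statewide_total, props):
    """Divide the statewide total in proportion to weighted-sum scores."""
    scores = {c: sum(WEIGHTS[k] * v for k, v in p.items())
              for c, p in props.items()}
    grand = sum(scores.values())
    return {c: statewide_total * s / grand for c, s in scores.items()}

print(allocate(1_000_000, proposals))
# College A scores 750, College B scores 660, so the shares are
# roughly {'College A': 531915, 'College B': 468085}.
```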

Before colleges can begin their EOPS program activities, they must first obtain approval for a revised budget which reflects the actual allocation level, rather than the level of funding they projected in their initial application. Revisions are begun in the spring after tentative figures have been provided, but final determination awaits the signing of the official state budget.

Fiscal Reporting Requirements. When schools receive the formal notification of their final project allocations for the year, they must submit a revised budget and a request for budget approval using the EOPS Budget and Accounting Form (EOPS Form A-1). This form requires that expenditures be broken down by object code, using the same standard four-digit codes that were used in the program application. In addition, each expenditure must be coded according to the major program category which it supports. For fiscal reporting purposes, the school's program has three major program categories rather than the 39 program subcomponents used on the program application:

o Category A - Program Development and Maintenance;
o Category B - Student Services;
o Category C - Direct Payment to Students.

In addition to the proposed expenditure levels, colleges report the district's financial contribution to EOPS-related services and the estimated number of students served by program activities in Categories B and C. Once this form has been prepared, it is submitted to the Chancellor's Office for approval.

During the course of the year, a variety of fiscal procedures are in force. Mid-year invoices, claims and transfers are each reported on a separate EOPS form. Invoices and Claims (EOPS Forms A-2 and A-3) must be submitted before any funds can be transferred to the college's account. Such requests are usually filed quarterly, but can be submitted as often as desired. These forms have only limited interest from the point of view of program evaluation.

Of potentially greater interest for evaluation are the EOPS Transfer Requests (EOPS Form A-4). Colleges must obtain approval for any transfers of funds between major program categories. A brief explanation of the reason for the transfer must accompany any such request. Because such transfers reflect potentially important changes in the school's program, the accompanying documentation may be useful in program monitoring and review.

In practice, very few major transfers of the type described above are actually made. More often, colleges make minor adjustments within program categories. These minor changes do not need advance approval. However, as a matter of course, most colleges do notify the Chancellor's Office of such changes.


At the end of the year, a final claim must be submitted by filling out in its entirety the same form (EOPS Form A-1) that was used for the budget approval. All expenditures must be reported by object code and by major program category. In addition, all transfers (both between categories and within categories) must be shown. The district's financial contribution must be described, and the total number of unduplicated students served in Categories B and C must be reported. These reports are sent to the Chancellor's Office where they are reviewed for accuracy and errors are corrected.

Data Availability and Reporting. All fiscal reports we have described are filed with the Specially Funded Program unit in Sacramento, where they are maintained throughout the year. At the present time, all auditing and review is based on the actual documents that are submitted. Nothing is coded for computer analysis or stored on magnetic tape.

Generally speaking, few summary reports are compiled. One end-of-the-year compilation is made of the overall state-wide aggregation of expenditures by object code and major program category. Unless there are special requests for other information, no further analysis or aggregation is usually carried out.

Reliability and Validity. The fiscal reporting system is quite accurate. Expenditures are closely monitored throughout the year to insure that colleges do not exceed their allocated funding level. End-of-the-year reports must balance, of course, and they are carefully checked and revised if there are any errors. Thus, the fiscal data that is available is completely accurate.
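Both properties noted here (the claim must balance internally, and expenditures must stay within the allocation) are mechanical checks. A minimal sketch, with hypothetical field names and figures:

```python
# Sketch: checking that a final claim balances and stays within its
# allocation. Field names and figures are hypothetical.
claim = {
    "allocation": 250_000,
    "by_object_code": {"1000": 140_000, "4000": 60_000, "7000": 50_000},
    "by_category":   {"A": 90_000, "B": 110_000, "C": 50_000},
}

def check_claim(c):
    """Return a list of problems found in one final claim."""
    object_total = sum(c["by_object_code"].values())
    category_total = sum(c["by_category"].values())
    errors = []
    if object_total != category_total:
        errors.append("object-code and category totals disagree")
    if object_total > c["allocation"]:
        errors.append("expenditures exceed the allocation")
    return errors

print(check_claim(claim) or "claim balances")   # claim balances
```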


The only non-dollar figures reported on the fiscal forms are the number of students served. While it is acknowledged that this figure may not be as accurate as the expenditure figures, it nonetheless merits attention as one measure of program impact and level of service.

Student Data

The community college system has had a mechanism for collecting data on student demography and academic progress for a number of years. For the past five or six years, supplemental data on EOPS students have been collected using the EOPS Student Data Cards. The system is now changing, and the collection of data on EOPS students soon will be integrated into the Uniform Statewide Reporting System (USRS). This system is being pilot tested in the Spring of 1981, and will be fully implemented in the Fall of 1981, when the data from the academic year 1980-81 are to be reported by the colleges. We will describe both the existing Student Data Card system and the new USRS approach.

EOPS Student Data Cards--Description. Each college has been required by the Chancellor's Office to submit basic and supplemental data on each EOPS student. Three forms were used: one provided certain school-wide summary data and the other two reported on characteristics of each EOPS student. The college was responsible for coding the data and submitting it on punched cards or magnetic tape to the Chancellor's Office (some smaller colleges still provide documentary input).

Form #1 requested various demographic data about each EOPS student, including sex, age, ethnic background, highest grade attended and marital status. Also included were a number of items related to students' financial status and eligibility for the program. A number of items on Form #1 were optional; the directions indicated that these items "may be included or excluded as you desire... these optional items will not be processed by the Chancellor's Office at this time." As one might expect, very few of these items were completed for most students. Optional items included questions which might be considered confidential (e.g., student's name, address, living arrangements, income, family status, initial contact with the school and initial contact with EOPS).

EOPS Form #2 - Student Data was designed to provide data about financial grants to students, EOPS services provided, and the student's academic progress and accomplishments. The form requested data on the financial need of each student (e.g., dollar amount of EOPS grant, scholarship amount). In addition, data were to be provided on the services received (e.g., hours of recruitment time, hours of tutorial time, hours of counseling). In the area of academic progress, each college was asked to report the students' college major, the number of units attempted, the number completed, and the students' cumulative GPA at the end of each term. Finally, the form required some measure of students' "accomplishments" (e.g., whether they received an AA certificate, pursued a four-year degree or found employment).

With the exception of the student's name, all of the information on Form #2 was required. (Students had the option of providing an I.D. number other than their Social Security number, so long as they used the same I.D. number on all forms.)
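Taken together, the Form #2 elements amount to one record per student per term. The following sketch shows such a record and one figure derivable from it (the ratio of units completed to units attempted); the field names are paraphrases of the elements listed above, not the form's actual item codes.

```python
# Sketch: a Form #2-style record. Field names paraphrase the elements
# described in the text; they are not the form's actual item codes.
from dataclasses import dataclass

@dataclass
class EOPSStudentTerm:
    student_id: str
    eops_grant: float       # dollar amount of EOPS grant
    tutoring_hours: float   # EOPS services received
    counseling_hours: float
    major: str
    units_attempted: float
    units_completed: float
    cumulative_gpa: float

    def completion_ratio(self):
        """Units completed as a fraction of units attempted."""
        if self.units_attempted == 0:
            return 0.0
        return self.units_completed / self.units_attempted

rec = EOPSStudentTerm("s01", 450.0, 12.0, 3.0, "Nursing", 12.0, 9.0, 2.6)
print(f"{rec.completion_ratio():.0%}")   # 75%
```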

The final form (Form #3) provided a summary of institutional data. This form was designed to obtain statistics on college-wide indicators, such as the withdrawal rate and the overall GPA at the college. These indicators might be used for comparisons between colleges. In addition, Form #3 provided data about the amount of EOPS funds expended in seven specific program categories.

expended in

EOPS -Student Data Cards -- Data Storage and Reporting., Each

. .-..

college was relluired to submit the EOPS Student Data Cai-d information,, s

,,1to the. Chancellor's Office 'by the en ` of October. These recora were. .,

compiled on magnetic tapes and stored at the Teal Data Center in-, , it ,Sacramenfo... .....)

*., ,f

Limited uses were made of data from the EOPS Student Data Cards. A single-page summary containing descriptive statistics was usually prepared for each school. The most important items from the forms were reported, including units attempted and units completed. A state-wide summary was also prepared, and it proved useful as a data base for describing the EOPS program in legislative budget hearings. The report provided evidence about the special needs of EOPS students and the level of participation in the program. Apart from these two documents, little was done with the data once it had been collected.

There are a number of reasons for the seeming underutilization of the data. The data collection system was established with good intentions, under the assumption that the information would be useful. However, data analyses do not appear to be directed by the need for answers to specific predetermined questions. In EEA's experience, data collected to be part of an "information base," rather than supplying specific information needs, are seldom useful. Furthermore, storing the student data tapes in Sacramento at the Chancellor's Office and the computer tapes for the program application (Form #10) at another location increases the difficulty of merging these two sources of data. There are apparently plans to remedy this problem.
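Once the two files are physically together, the merge itself is a join on a shared college identifier. A minimal sketch, with hypothetical keys and figures:

```python
# Sketch: joining per-college student summaries with Form #10
# application data on a shared college code. Data are hypothetical.
student_summary = {"050": {"eops_students": 410, "mean_gpa": 2.4}}
applications    = {"050": {"eops_budget": 310_000, "activity_types": 14}}

# Keep only colleges present in both files, merging their fields.
merged = {
    code: {**student_summary[code], **applications[code]}
    for code in student_summary.keys() & applications.keys()
}
print(merged["050"])
# {'eops_students': 410, 'mean_gpa': 2.4,
#  'eops_budget': 310000, 'activity_types': 14}
```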

EOPS Student Data Forms--Reliability and Validity. These data appear to be somewhat less representative and less accurate than the program application data (Form #10) described above. The percentage of EOPS students on whom data is actually reported has ranged from a low of 35% of the total EOPS population to a high of 86% over the past five years. The overall trend has been toward more complete reporting; two years ago, data were received on approximately 52,000 EOPS students out of a total of 60,000. No attempt has been made to determine whether the characteristics of the students who did not report differed in any significant manner from those who did.

In addition to our concern about the percentage of forms received, we must also address the question of the accuracy of the data that are reported. Depending on the particular item in question, this accuracy varies considerably. Data range from "totally unusable to quite accurate," as one person at the Analytic Studies unit suggested.

For example, data on student enrollment--number of units attempted, number of units completed, grade point average--are probably accurate. Such information is central to the functioning of the college: it plays a key role in fiscal decisions and is highly relevant to academic administration. As a result, it is recorded with greater care and maintained up-to-date throughout the year.

On the other hand, many of the items reported on the Student Data Cards are not the subject of such careful attention. For example, reports of EOPS contacts--the recruitment, tutoring, counseling and other EOPS service hours received by each student--are probably not accurate. In fact, only two of these items have been retained in the new USRS system. It was felt that only specific tutoring and counseling hours could be compiled reasonably and accurately over the course of a year.

The feeling among people who have worked with the Student Data Cards over the years is that many of the data items are almost worthless. In a discussion of the directions that should be taken in developing a new evaluation system, one member of the evaluation study group reflected back on the Student Data Cards:

"Data cards? The data have not been used because they knew it was not good data. Whatever is put together in an evaluation system must go forward and not look back, because the previous data are not good."

Several factors account for the inaccuracy of much of the information. Different problems arose with different information categories. We will mention four problems, noting that various of these issues were relevant to different data items.

1) Students failed to report some items: their own income level, for example, was frequently not reported.

2) Students lacked knowledge of some items, such as their parents' income level.

3) There were flaws in the data reporting system. For example, there were only four digits allowed for zip code, but it was not clear whether the first four digits or the last four digits were to be reported.

4) The data reporting forms were not self-contained, and it was sometimes necessary to refer to a second document to obtain an explanation of an item. This was not always done.

In summary, the EOPS Student Data Cards were designed to collect a great deal of relevant information on EOPS students. Over time, the number of students on whom data are collected has grown considerably. A reasonably high return rate has been achieved in the last two years. However, the accuracy of the reported data varies considerably from element to element.


The information has not been used extensively, as far as we could determine. As one college member of the evaluation study group commented: "Those of us in the field have never understood the use of the Student Data Cards, or why we need to go through this exercise in futility every year."

Starting with the October 1981 reporting cycle, a new system will be used for collecting this information, and it is to that system that we now turn our attention.

USRS--Description. As of October 1981, all community colleges will begin reporting EOPS student data as part of the Uniform Statewide Reporting System (USRS), and the use of the EOPS Student Data Cards will be discontinued. USRS is a comprehensive reporting system developed to standardize and simplify information gathering throughout the community college system. It was pilot tested in Spring, 1981, and will be fully implemented by Fall, 1981.

The USRS system will gather data on a variety of different topics, principally characteristics of the total student population and measures of instructional activity. In addition, special sections deal with other topics such as enrollment in non-credit courses, the EOPS program, handicapped students and the vocational education program.

Information of interest to evaluators of the EOPS program will be found in two places. Background data on all students will be found among the common student elements reported each quarter for everyone enrolled in the system; supplementary information relating specifically to the EOPS students will be found among the special EOPS student data elements collected annually.


Most of the information from Form #1 of the old Student Data Cards will be found among the common student elements. This broad category includes information about citizenship, high school education, residence, personal characteristics (age, sex, race) and student academic progress. Of particular interest will be units of regular credit courses attempted and positive attendance course enrollment (enrollment in short-term or non-credit courses for which daily attendance records are maintained). Total classroom hours is another data element which is being collected. This item takes on importance because not all courses offering the same number of credits meet for the same number of hours each week. Vocational education courses, for example, usually require a much greater number of classroom hours for each unit of credit earned.

Much of the information that was contained in Form #2 of the Student Data Cards has been incorporated into the special EOPS data elements. This includes the student's academic major, number of units completed, financial aid awarded, GPA for the term, cumulative GPA, and the number of hours of EOPS counseling and tutoring services received.

Some of the items from Form #2 have been dropped. Colleges are no longer required to report "accomplishments" (i.e., certificate, degree, job placement), information which was poorly reported in the past. (In many respects, the loss of this potentially important outcome data is quite unfortunate.) The section on EOPS services received has been simplified, requiring only tutoring and counseling hours to be reported, and omitting recruitment, re-recruitment, and other service hours. Academic major is reported as it was in the past, though each school uses its own codes to identify major fields. It would be possible, though somewhat more difficult, to decode and aggregate this piece of information.

Overall, there are only minor differences between the data that was available on EOPS students under the old system and that which will be available under USRS.

USRS--Data Storage and Reporting. As was the case with data from the previous Student Data Cards, all USRS data will be compiled by the individual colleges and transmitted on cards or magnetic tape to the Chancellor's Office. Systemwide data tapes will continue to be maintained at the Teale Data Center.

It has not yet been determined what analyses will be done or what reports will be written. Certainly some reporting back to the schools and some statewide aggregation will continue as in the past. Beyond that, however, there have been no decisions about what information will be tabulated and in what form it will be presented. As in the past, this will be determined by the information needs of the various offices and agencies and by the priorities that are established between now and the time the analysis begins.

USRS--Reliability and Validity. Careful consultation has gone into the development of USRS, and every attempt has been made to insure that the resulting data are accurate and representative. At this point, discussion of the reliability and validity of data not yet collected can be only conjecture. However, the plans for data handling suggest that reliability and validity will both be improved.

For example, the first step will be to conduct preliminary validation checks of the reported data. Internal checks of the coding itself will detect errors. Summary statistics and reports of erroneous data will then be returned to each college for further verification and correction before the data are finally analyzed.
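The "internal checks of the coding itself" could be as simple as field-level code and range checks applied to each record before anything is aggregated. A minimal sketch, with hypothetical field names, codes and limits:

```python
# Sketch: field-level validation of the sort USRS is said to apply.
# Field names, codes and limits are hypothetical.
VALID_SEX_CODES = {"M", "F"}

def validate(record):
    """Return a list of problems found in one student record."""
    errors = []
    if record.get("sex") not in VALID_SEX_CODES:
        errors.append("bad sex code")
    if not 0.0 <= record.get("cumulative_gpa", -1.0) <= 4.0:
        errors.append("GPA out of range")
    if record.get("units_completed", 0) > record.get("units_attempted", 0):
        errors.append("completed units exceed attempted units")
    return errors

print(validate({"sex": "F", "cumulative_gpa": 2.9,
                "units_attempted": 12, "units_completed": 15}))
# ['completed units exceed attempted units']
```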

It is anticipated that more attention will be paid to some items in the validation process than others. For example, there will probably be no attempt to validate how accurately students' academic grades are reported. On the other hand, sample census data on actual course enrollments will be returned to each college for careful cross checking. Similarly, there may or may not be any attempt to validate the accuracy of the counseling and tutoring hours reported for EOPS students. In this respect, the attitude seems similar to the one taken toward the old EOPS Student Data Cards; some information will probably be deemed less important than the rest (based on a variety of criteria, e.g., perceived low validity, low priority, cost).

The overall impression is that the USRS system Will be more reli-

ab)e than the previous data collection effort. There are other reasons

for this observation in addition to the careful organization that has

9 gone .into developing the system. For one thing., the data reporting

will now be coordinated by the data pr.ocessing 'offices at the colleges

'rather than by the EOPS directors. The individuals working in ,the

data processing departments are more attuned to the data collection

concerns than those individuals who most often had the responsibility

for the -Student Data--64rds. ..(O the other hand, they are further,

from the EOPS ptgram and may not have some data available.)a

Secondly, some of the more ambigUous or unclear items wer/e deleted

from the reporting formats. A third reason for thinking that the data

reported in USRS will be more accurate is that some of the. EOPS

-program functions (e.g. tutoring and counseling) have begun to receive


increased funding from the federal government, and this funding carries with it requirements for more detailed and complete record keeping. These improved record-keeping systems should increase the reliability of the data reported to USRS.

It is too early to tell how complete the reporting will be under USRS. The reason for developing a unified system was to standardize reporting requirements and thus make the process easier. More complete and accurate results are anticipated, but we have no basis for judging whether the percentage of student data reported under USRS will increase or decline, or whether the data that are reported will be more or less accurate than before.

Operational Program Review Process

Beginning in the fall of 1981, the Specially Funded Program unit of the Chancellor's Office hopes to implement an on-site monitoring and review program called the Operational Program Review process. Planning for this effort has been underway for some time, but the final decisions on the size and scope of the program will not be made for another two or three months. The following reports the current planning for these activities.

During Phase I of the planning for this activity, the initial monitoring and review model developed by the Specially Funded Program unit was reviewed by field personnel from three special programs--EOPS, handicapped, and financial aid. These three task forces critiqued the initial model, made recommendations for modifications, and suggested procedures for implementation.

During Phase II, which is currently underway, the in-house staff of the Chancellor's Office is reviewing the recommendations from the


task forces and preparing a revised program model and implementation strategy. In Phase III, the final model will be submitted to the executive staff of the Chancellor's Office for approval and submission to the Board of Governors.


Description. We will present a brief description of the Operational Program Review process model as it currently exists, with the full knowledge that it may yet undergo modifications. What follows should be understood as a tentative picture which provides only a general outline of the OPR process.

As currently envisioned, the OPR process will be a two- or three-day on-site team review of colleges' Specially Funded programs--EOPS, handicapped, and financial aid. Time and resource constraints will probably limit the number of colleges that may be visited each year. One staff person in the Specially Funded Program unit "guesstimated" that between 20% and 30% of the community colleges might be reviewed annually. Thus, all colleges would be visited within a 4-year cycle. At the present time, the Chancellor's Office staff hopes to begin the OPR process in 1981-82 with a sample of up to 20 community colleges.
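The cycle arithmetic behind that estimate is simple; the sketch below makes it explicit, using 25% as a hypothetical midpoint of the 20%-30% range quoted above.

    import math

    annual_review_rate = 0.25                        # hypothetical midpoint of 20%-30%
    cycle_years = math.ceil(1 / annual_review_rate)  # years until every college is visited
    print(cycle_years)                               # 4, i.e., the 4-year cycle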

The purpose of the OPR process is to assist colleges in preparing for state and federal program audits, reviews, evaluations, and/or accreditations. Within this general framework, two sets of documents have been prepared to focus the review efforts. They are referred to as the Compliance Documents and the Parameters Documents.

The "Compliance Documents" contain a complete listing of all state and federal requirements for record keeping for each of the three programs. Using these as guidelines, the team members will review with the college's program staff all the documentation they should have to comply with existing regulations.


The "Parameters Documents" for each specially funded program contain a comprehensive list of all activities that one might envision in an ideal program. Comparing these to actual programs, the team members would be able to identify where the college is making exceptional effort and where additional attention might prove worthwhile.

The previous two paragraphs might suggest that the OPR process will focus on examination of documents and ignore the observation of actual program activities. This is not the case. The OPR process is designed to be much more than merely a "paper chase." Both in the area of compliance and in the area of program activities, the team members will observe project activities, discuss issues of concern with relevant staff and participants, and check to see the extent to which actual program activities match what has been specified on paper.

At the conclusion of the OPR process, the team will conduct an exit interview with the staff at the college and prepare a written report to the college and the Chancellor's Office soon thereafter.

Operational Program Review in the Evaluation System. It is, of course, too early to tell exactly how useful the written documentation from the OPR might be in a broad-based EOPS evaluation system. As envisioned, the OPR process will consider both matters of compliance and program effectiveness. Thus, it might well offer valuable data on program implementation, types of services offered, and effectiveness of program activities. The Parameters Document, in particular, may provide a useful measure of program implementation. Our experience with such "checklists" in other contexts cautions us that its use may evolve into a simple cross-checking form which identifies neither the level nor the quality of services rendered in each category. Barring this, it has great potential as an element in a larger evaluation system.


Fiscal Audit Procedures

The authority for auditing EOPS programs stems from a 1951 statute stating that "Audits were to be performed to determine the validity of the allocation of state funds for school/program purposes and to establish proper fiscal accountability" (Chapter 1259). The legislature has reconfirmed the intent of this statute with the enactment of Senate Bill 787.

In the spirit of Senate Bill 787, the Department of Finance and the Chancellor's staff jointly carried out approximately 12 audits of EOPS programs in 1979-80. Since that time, the procedure has been temporarily discontinued in favor of a new model--an EOPS self-audit, which is only in its pilot phase at the present time. The new self-audit will be described below, but first we will review the strengths and weaknesses of the existing audit process. This will also familiarize the reader with the important content areas to be addressed in any fiscal audit--be it the current model or the projected self-audit format.

The Existing Audit Process--Description. The audit procedure itself is directed from the Chancellor's Office. Its intent is to review the documentation of all expenditures claimed against state EOPS funds. In performing the audit, a team of fiscal experts spends approximately 10 days to two weeks at an individual EOPS program site, examining all records pertaining to EOPS expenditures, including both financial records and documents. At the conclusion of the visit, a report is prepared detailing the team's findings, and recommendations are presented to the EOPS program director.

The present audit process has two major parts: a funding review and a program review. These functions are conducted by the on-site


audit team and should be reflected in the final report developed by the team.

The basic question to be answered by the audit team in the funding review is: "Are the expenditures reported on the EOPS final claim form correct and substantially in compliance with EOPS regulations?" The audit team carries out two procedures in responding to this question: a fiscal audit test and a Title V audit test.

The fiscal audit test reviews the budget accounts to determine that the EOPS funds are being accounted for separately, and reviews the expenditure documentation to determine the congruence between the amount claimed and the documentable expenditures.

The Title V compliance test looks at several factors related to the expenditure of funds: eligibility of students to be receiving EOPS monies, eligibility of programs for funding under EOPS guidelines, payroll expenditures, percentage of personnel time expended on EOPS-related activities, eligibility of salaried positions, and costs excluded under EOPS guidelines.
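At its core, the fiscal audit test described above is a reconciliation: the amounts claimed must be congruent with the documented expenditures. The sketch below illustrates that check in miniature; the category names, dollar amounts, and zero tolerance are hypothetical, and an actual audit works from source documents rather than summary figures.

    # Illustrative reconciliation of claimed vs. documented EOPS expenditures.
    # Categories and amounts are invented examples.
    claimed = {"direct_aid": 50000.00, "counseling": 12000.00, "tutoring": 8000.00}
    documented = {"direct_aid": 50000.00, "counseling": 11400.00, "tutoring": 8000.00}

    def reconcile(claimed, documented, tolerance=0.0):
        """Return the categories where the claim exceeds its documentation."""
        exceptions = []
        for category, amount_claimed in claimed.items():
            shortfall = amount_claimed - documented.get(category, 0.0)
            if shortfall > tolerance:
                exceptions.append((category, shortfall))
        return exceptions

    print(reconcile(claimed, documented))
    # [('counseling', 600.0)] -- an audit exception to be explained or corrected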

The basic question to be answered by the audit team in the program review section of the report is: "Are the programs and services reported in the approved project in operation and adequately meeting the goals and objectives of the college?"

In responding to this question, the audit team looks at whether the personnel exist and the program is actually in operation as planned. They document whether there is an EOPS administrator directing the program, and whether there is an active and properly constituted advisory committee. They review the level of staffing to determine if there are an adequate number of personnel to properly identify EOPS-


eligible students and track their progress. They determine whether there are adequate procedures for maintaining audit trails and for determining student eligibility. Finally, they examine whether there are adequate support services to meet the needs of the EOPS students and to carry out the goals and objectives of the college's EOPS program.

Results of Previous Audits. Some type of EOPS program audit has been carried out for over a decade, and during this time, the audit results have been mixed. The main audit exceptions in the funding reviews have been: awards of aid to ineligible students, improper accounting of EOPS funds, and a lack of understanding regarding regulations. While such examples of audit exceptions can be found, it is fair to say that, by and large, the audits conducted to date have shown no major discrepancies. The major need is for a procedure to continue the same level of fiscal scrutiny at a lower level of staff effort.

The guidelines from the Chancellor's Office to the audit team for the program review section of the report include an appendix defining each of the terms used. This is to insure consistency and clarity in the reports systemwide.

An examination of several sample audit reports shows that the actual activities carried out by the audit team did not fully encompass each of these guidelines. Rather, the audit team appears to have concentrated on providing detailed examinations of each line item on the budget to judge whether the item complied with guidelines and legal requirements. By following a line-item-by-line-item format, the team did not respond to the question of whether the programs and services were adequately meeting the goals and objectives of the college.


This program review section of the report has typically turned up a number of audit exceptions at each college reviewed. Examples of the kinds of things noted are: services to ineligible students, questionable advisory committee functions, and poor record-keeping processes.

The problem of services to ineligible students seems to be a fairly common one: "Students enrolled in this program were, in many instances, high school students and not eligible to apply for federal financial aid, whose procedures for determining eligibility are also the basis for determining eligibility for the EOPS program"..."Students enrolled in this program were, in many instances, found not to have completed a FAF form, which is utilized as the basis of determining need and thereby eligibility for participation in the EOPS program".

Poor record-keeping processes were found to be responsible for several audit problems: "...the documentation made available to the audit team was inadequate to demonstrate (these claims)"..."in addition, the purchase of (item) ... was inappropriately recorded under this budget category"..."expenditures for student aides should not be recorded here, but should be recorded more appropriately in Parts B or C".

The EOPS program needs to maintain its credibility by being able to withstand rigorous scrutiny from the outside. Since its inception, however, EOPS has not had a well-defined and strictly-adhered-to means by which it could be fully audited for fiscal and regulatory compliance. In order to begin standardizing the fiscal procedures and to reduce the number of audit exceptions across all EOPS programs, a special project has been funded to provide a mechanism for addressing this common area of need.


Self-Audit Guide. The Chancellor's Office of the California Community Colleges has funded a special project at Sierra Community College to develop an audit guide. It is intended that this Guide be used by those in the field--directors, business office directors and personnel, district auditors, independent auditors, and staff of the California Community Colleges Chancellor's Student Services Section--in the auditing of their EOPS programs. The Guide will be a "working tool for lay individuals"--a "well-defined means" of auditing for fiscal and regulatory compliance. Those developing the Guide intend to adhere to audit standards prescribed by the American Institute of Certified Public Accountants, the Department of Finance, and the Chancellor's Office.

The Guide itself will have three parts: (a) financial audit of budget activities, expenditures, and claims; (b) illustrative reports as a result of fiscal activities; and (c) regulatory compliance reviews. It will also include the following reference checklists: compliance; student services/direct grant eligibility; regulatory reference material.

It was projected that the finished Guide would be available and delivered to the Chancellor's staff for review and distribution to the field by August 1981, with the development team staff available subsequently for on-site, in-service training. A sample of 20 schools will be identified for field review of the audit procedures in 1981-82. Current plans call for a detailed review of 3-5 of these sites the following year to make final adjustments in the Guide before systemwide implementation is begun in 1983-84.

The Self-Audit Guide should reduce the incidence of audit exceptions while providing a common means of measurement among the


various participating colleges. It will also provide a common point of reference for all EOPS program personnel. The Guide does not, apparently, address the issue mentioned in the Chancellor's Audit Procedures for EOPS of whether "the programs and services reported in the approved project are in operation, and are adequately meeting the goals and objectives of the college".

It is impossible to determine, at this early phase in the development of the Self-Audit Guide, whether it might be useful in an evaluation system. There is no doubt that it addresses important issues. However, its form and accompanying procedures are far from final, and it would be premature to do anything beyond making note of its existence for future scrutiny.

Year-End Report EOPS Form #12--Description.

The EOPS Program Unit at the Chancellor's Office of the California Community Colleges requires EOPS programs to present a "Year-End Report." This report is intended to provide data on the extent to which the various program activities have been implemented and the intended future status of each activity. The report does not provide any indication of whether or not program objectives have been attained. Thus, in evaluation terminology, the intent is simply to evaluate the extent of implementation of the instructional process, rather than the educational outcomes produced by that process.

The year-end report is really quite a simple affair. Each college is asked to duplicate the program application description pages from the project application. For each program description page (typically an objective), the college fills in a Form EOPS #12.


Form #12 asks for two things: (1) a description of how each activity listed in the program proposal was accomplished and the progress toward accomplishing each objective listed; and (2) the status of the activity. EEA's review of a sampling of year-end reports reveals that colleges put forth very little effort in describing progress toward accomplishing each objective listed. On the other hand, there is some self-evaluative data on each of the activities. Thus, colleges might describe the way in which the activity was conducted, the number of planning meetings held, the extent to which the activity was undertaken, or whether there was variance in implementation, etc. This is potentially interesting information. However, it is not clear what use is made of these data. On the one hand, the anecdotal and highly idiosyncratic nature of the data makes them incredibly difficult to interpret. On the other hand, the detail available about the extent of purported program implementation throughout the state provides a rich base for further understanding about EOPS programs.

The final column of Form EOPS #12 asks for an indication of the future status of each activity. Community college EOPS directors provide their own estimate of whether that particular activity ought to be Maintained (M), Reduced (R), Expanded (E), or Discontinued (D). Not surprisingly, a sampling of year-end reports reveals that EOPS directors feel that most activities listed in project plans ought to be either Maintained or Expanded. There are very few instances of a request for reduction. Typically, requests for discontinuance of an activity follow the inability of the project to implement it; thus, in instances where personnel could not be hired or where the activity was simply unattainable, one occasionally finds a request for discontinuance.
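Were these future-status codes ever aggregated across colleges, the tabulation itself would be trivial, as the sketch below shows; the sample codes are invented for illustration.

    from collections import Counter

    # Hypothetical future-status codes drawn from a stack of Form #12 pages:
    # M = Maintain, R = Reduce, E = Expand, D = Discontinue
    statuses = ["M", "M", "E", "M", "E", "D", "M", "E"]

    print(Counter(statuses))   # Counter({'M': 4, 'E': 3, 'D': 1})
    # The skew toward M and E mirrors the pattern found in EEA's sampling.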


It seems to us that the future status category provides little in the way of usable evaluative data. It may be helpful for the EOPS program staff at the Chancellor's Office in reviewing next year's proposals, but it does not provide much indication about current year programs' success. In part, we make this assertion because of the lack of guidelines presented to program directors in determining future status. In short, a program director might have any of a wide variety of motivations for making particular judgments about the future status of each activity.

EOPS Form #12--Data Availability and Reporting. No information is currently available from this data source, because its use has been temporarily suspended for the current school year--pending development of a complete evaluation system. In the prior school year ('80-'81), only a sampling of colleges had "tried out" the procedure.


Chapter III

THE NEED FOR INFORMATION


All parties (the legislative/executive group and the colleges group) have the same major concern. They want to be able to interpret the data convincingly enough to indicate whether or not the EOPS program has been "successful." There exist, however, many differing perceptions of the meaning of the term "success," and these perceptions or definitions influence the types of questions asked and the corresponding data-gathering requirements. In this section of the report we will describe what the two groups--legislative/executive and the colleges--perceive as appropriate evidence of success. We have derived these views from comments made by each group in earlier meetings, as well as from follow-up interviews where appropriate.

Legislative/Executive Branches

From the evaluation study group meeting and from subsequent phone conversations with several members not present, some points of view have emerged about appropriate information for legislative/executive needs. Generally, participants from the legislative/executive branches of government primarily identify "success" as the rate of student retention and the subsequent transition to four-year institutions. Transition, itself, is seen as the logical and desirable outcome of the educational process which begins with recruitment, continues with the formulation of student goals after counseling, and concludes with the completion of the courses attempted in order to gain acceptance into a four-year institution.


This theme of retention/transition recurs throughout the course of the discussions. For example:

"I'm interested in numbers with regard to recruiting, retention--there are measures of retention, some standard measures, class dropout rates, that sort of thing--another sort of measure is the length of time it takes a student to complete their objective versus other students having similar objectives--and also what their success is when they leave the institution, which we call transition. Those are outcomes that I have an interest in, and I think the legislature would...."

Another comment further clarifies this point:

"There's a big source of transfer students in community colleges, and it really hasn't been transferring...and somehow, we've got to get EOP directors to see that the students get in and out as soon as possible and go on to a four-year institution."

As a consequence of adopting this view of "success," the data-gathering requirements focus on several categories of data: student data, program data, and financial/budgetary data.

Student Data. Participants in the discussion recommended the gathering of a number of types of student success data. Ideally, the measurement of "success" would include the following:

a) Longitudinal data from recruitment to retention to transition

b) Numbers of students who go on to four-year institutions

c) Follow-up studies on students after transition

d) Minority representation in two-year vs. four-year programs

e) Comparable records for both EOPS and regular students in terms of progress through the system, grades, units attempted vs. units completed, graduation, and placement

Aside from these measures, a number of other comments were made by study group members about data problems or information needs relating to students.


Legislative/executive study group members generally felt that more emphasis should be placed on collecting additional anecdotal/descriptive data than has been the case in the past, in order to better understand how students make decisions and the contributing factors to retention and transition. As one participant stated:

"My sense is that we get a better response in legislative hearings when we have students who are warm bodies there, giving stories about their experiences with EOPS, indicating that they're now at law school here or there or president of their own business now. Demographic data does not really hold great interest for the legislature."

Another participant contributed:

"I remember one time I went with three students just to visit. We had no intention of testifying. It was the Senate fiscal committee....the Chancellor's staff was in a state of surprise because it didn't look like they were even going to go along with the governor's budget request--they were going to cut from there. So I was hustled up, and they wanted us to provide some testimony in support of augmentation. And what happened is that my students literally got into arguments with (names of legislators), so that was proof of the pudding..."

Follow-up studies of students after transition (to a higher educational institution) are seen as further appropriate documentation of the effectiveness of the program outcome:

"We've been asking for that (data on follow-up after transition) for three years. And it seems like, given the fact that it pops up on every set of budget hearings, it seems worthy of being incorporated into whatever ongoing evaluation we have....That's one of the things that strikes me the strongest about future budget augmentation--would be to find that EOPS students just proportionately outnumber non-EOPS in going to state colleges. That's very important."

Study group members also pointed to the importance of reporting student data on progress through the college (units attempted versus units completed) and retention within the college. Furthermore, a


representative from the University of California noted the desire to

have evaluation data that would allow them "...to track how U.C.


Outreach students do in the Community Colleges...would like to be able to track and work with these students."

It should be noted, however, that EOPS students may have goals very different from uniformly preparing to transfer to a four-year

institution. Although the participants seemed to agree that the legis-

lature, itself, would not put priorities on serving one type of student (those who were planning to transfer to a four-year institution) as

opposed to another, they also reaffirmed that the particular issue of

retention/transition is the one which recurs over time. For example, in

response to a question about possible legislative pressure in one

direction or another, one participant responded that:

"Well, let's put it this way: to the extent that you can anticipate the policy directions of the legislature, which is very difficult, but the legislature has already brought this issue up for a couple of years (retention/transfer)--the extent to which you can come up with solutions and directions, the legislature is going to be much more comfortable with giving you money to do it."

Another participant continued,

"I don't think they're going to say you should have that as a priority, but they're sure going to say 'how are you doing? What percentage of your EOPS students are transferring, etc.?' We have a problem which has been identified by everyone now."

By emphasizing retention and transfer as the recurrent criterion for "success," much less attention is being paid to the achievement of other types of goals. Thus, the power of the data purporting to measure total "success" in other terms is correspondingly weakened.

Some further concerns and comments were expressed by legisla-

tive/executive study group members about the topic of student goals.

Most of these comments related to the goal data as an inadequate source of information. The first type of problem revolved around the formulation of the goal question on the student data cards:

"That's one of the surprises I got in looking at the statewide data that's requested...they were saying that the categories for goal did not express the kinds of things their students pursued at their campuses. So what they did is they made that particular question optional. And very few people responded, I think it is much less than 50%. It's not mandated....But the other is that among the ones who responded, there were a lot who said 'none.' Maybe that says the question is asked poorly, or it means that students are not getting much in the way of counseling..."

The second type of problem stemmed from a lack of expressed goals on the part of the student entering the college. The difficulty was summarized as follows:

"I can see people coming in and putting down an undeclared major. They might say, don't put a particular major down, even if you know you're going to declare a particular major, just basically lay low your first year, because you might want to change your mind. It's better to shift from undeclared than from another major. Maybe there is that sort of, 'well, you don't have to come up with a goal at this point, let's see where it takes us.' But the point is that you're going to use the dollars better if people have some idea of what they plan on using their time in school for. I don't think it's unreasonable to have a goal in mind, an AA, BA, or a certificate in a particular area...I'm not into the idea of a revolving door. We've got a lot of students going in and a lot going out before they've completed any kind of degree, and it's unclear--I mean I'm sure it has some bearing on their life goals. Well-defined goals seem like a real important starting point."

Many students simply do not know what their goals are and view

college as an opportunity to further refine and understand their own capabilities and opportunities. This "searching" part of a college education is not confined to EOPS; indeed, "undeclared" is the most popular major at four-year institutions throughout the United States.

However, one study group member felt that such comparisons are

inappropriate because the legislative/executive branches demand greater

evidence of goal direction among college students who are receiving

special services in state-funded programs.

A final question arises regarding the "meaning" of student data

that is collected, within the context of an "open access" policy at

community colleges. Many different types of students now enroll under the EOPS program. These students are not all starting from the same point in their development and/or academic achievement level, yet little data has been gathered on the rates of individual student differences.

As one participant stated:

"...'let's put a minimum GPA requirement'--I think that kind of attitude still prevails. And I think a lot of the legislature doesn't understand the meaning of open access, the open door policy of the community colleges, and what it is all about...The other part relates to the youngster who has grown up (in a disadvantaged home) and then gets the notion to go on to get a BA and a Ph.D. You need some data on what kind of student you're getting in the first place... Clearly, the students are poor. And, of course, that can have a lot of bearing on other things, too."

Since the factor of prior preparation and readiness for college is seen as having an effect on the eventual "success" of the student in terms of retention and transition, the consensus is that more emphasis should be placed on gathering this type of background student data.

Program and Process Data. Several legislative/executive study

group members identified the importance of careful documentation of services provided with EOPS funds in any evaluation report.

"The argument we make for EOPS is that it is a categorical program designed to target disadvantaged students, and what are we getting for this? You guys (Chancellor's Office EOPS staff) will get up and testify that we get financial aid, we get counseling services, all those sorts of things. Well, we ought to know to what extent those services are being provided and to what extent we're getting any benefit out of it."

The point is that the EOPS program, along with most other programs, simply does not do a good enough job of describing fully and completely the services that are provided. Legislators need to have full and complete (and understandable) information on what specific services were provided to what specific students in what time frame. (They also want to know "with what results"--but are satisfied to a great extent if the evaluation of services provided is well done).


This is illustrated by the following example: A legislative representative on the study group identified the annual report of the California Conservation Corps as a model of a good evaluation report. This relatively brief report, analyzed by EEA staff, documents the achievements of the CCC workers in performing public service tasks/public service program objectives. While the report fails to offer any evidence of the implementation of student improvement program objectives and student achievement outcomes, it does an excellent job of documenting services provided to CCC workers and to communities.

This had a strong impact on the person who saw the report. The

report also highlights the importance of well-packaged anecdotal data,

as was mentioned previously.

A further elaboration of the program services theme is provided by those who feel that an important element of relevant evaluation data is the description of program type and program organization. The interest is not only in the description of the program characteristics, but in an analysis of the extent to which those characteristics match the college's EOPS goals and are sensible in light of that mission. One evaluation study group participant stated,

"In other words, I'm also interested in how the program is organized to achieve what it is supposed to do. And are there any patterns in terms of the number of components funded and any of the results that seemingly are achieved? I'm also interested in what type of personnel are providing the service..."

Financial and Budgetary Data. Another area acknowledged to be important was fiscal and budgetary data. Study group participants noted that auditing the proper expenditure of funds was important, but emphasized that fiscal concerns went beyond merely auditing that no irregularities had occurred, that regulations had been followed, etc.


The types of information generally conceded to be useful in relating evaluation to financial/budgetary issues would include data to help answer questions such as the following: (1) What is the relationship between the resources used and the number and type of students served? (2) What is obtained for the EOPS money in terms of program components, implementation of programs, benefits, improvements? (3) What is the relationship between EOPS financial aid per student and retention/transition rates? (4) What are the hidden costs of the program?
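Questions (1) and (3) reduce to simple ratios once the underlying counts and dollar amounts exist. The sketch below computes them for one hypothetical college; every figure and name is invented for illustration, since the report's point is precisely that such data are not yet systematically collected.

    # Hypothetical per-college figures; none of these numbers come from the report.
    eops_funds_expended = 180000.00   # state EOPS dollars spent
    students_served = 300             # documented eligible students served
    direct_aid_total = 120000.00      # portion of funds paid as direct aid
    students_retained = 240           # still enrolled at year's end

    cost_per_student = eops_funds_expended / students_served   # question (1)
    aid_per_student = direct_aid_total / students_served       # input to question (3)
    retention_rate = students_retained / students_served

    print(cost_per_student, aid_per_student, retention_rate)   # 600.0 400.0 0.8

Relating aid_per_student to retention_rate across colleges is what question (3) asks; hidden costs such as college ADA revenues (question 4) are not captured in these figures at all.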

Participants affirmed the importance of accurately describing the financial aid provided to students in the program. "After all," noted one study group participant, "I've never heard one of those ads (recruitment) that didn't mention EOPS financial aid."

The cost-effectiveness of EOPS services was a continually recurring theme. The concern is reflected by the following example of counseling procedures at different colleges. (Counseling was only an example; the cost-effectiveness issue extends to all services.)

"The best illustration that comes to my mind is the difference between City College (A) and City College (B) in the area of counseling. (B) spends a lot of its money in hiring professional counselors, so they have a very high cost for counselors. (A) spends most of its money on student counselors who are trained to provide more minimal functions. So what I'm interested in doing is looking at what are the expected outcomes. If you're going to pay five times as much money for professional counseling vs. a peer, what do you expect is going to be the result on the student in this difference?....That puts us in the middle of lots of (controversies) but at the moment we don't really have an idea as to what they are doing."

While it has been very attractive to talk about the cost per EOPS student and the average costs of different kinds of services provided, these numbers are deceiving--there are hidden costs. One of the participants provided a caveat:

"One of the selling points of the program over the years has been that it's looked like a bargain. But there's been a hidden cost...and that is that those figures don't include the additional money being put in by the college ADA revenues. There is also the question of students actually served by EOPS: How many students of those eligible for EOPS (who didn't get funded) didn't continue in college?"

Thus, it may not be right to consider cost-effectiveness within the narrow limits of EOPS program activities. A wider view of these issues is likely to be required. This is illustrated in another way by a potential "data problem" related to the economic self-interests of colleges, particularly with regard to transfer students. As one participant noted:

"In fact, there may be fiscal disincentives (to encouraging transition). If you have problems getting students this year, a college might suggest that a student come back next year--that students be encouraged to stick around. To summarize, I think the other part of it is (other than the fact that maybe they're going to help some human beings get a degree and all that) what is the economic interest of the community colleges?"

As this person suggests, the incentive structure at the administrative level may actually work in opposition to the goal of increasing transition rates to four-year colleges.

As was suggested at the outset, the legislative/executive study group had very basic concerns in three areas. The preceding discussion illustrates their focus on student data (particularly measures of retention and transition), program data (with a focus on documentation of services), and fiscal and budget data. The next section will describe the colleges study group's point of view of the types of information needed in an effective EOPS evaluation system.

Colleges

Several types of information needs emerged from the discussions with participants representing the colleges' point of view. Many were similar to those expressed by the legislative/executive branch participants. One obvious difference, however, was that there were many (and broader) perceptions of what could be considered "success" by the colleges group.

Participants agreed that their basic focus in measuring program outcomes accurately is to justify the programs themselves and to help both the local campus and the statewide system improve program planning:

"...there's two things that have been surfacing. We definitely want information for explaining or justifying the programs, statewide or locally--and also, we want more information to help the local campus or statewide improve planning. And around these two factors...planning and justification...we need information that's going to meet both of those needs."

i> : ,,.

, e two issues elaborated no require both dela about students, . 1 , I , , - k -.'1..,

V, t , :and a 'abdut progr-am organization a d resource usage. Althugh

, ....;,...., . ., (---.

4"ilvs., Ord is' del4 amount of overlap,,,we will- organize our "digcusiion of.,

,. ..,., , , . ,.- =,

lithe'. iii_fetrination needs in terms of ttlesePmtwoicategories.,Asa - 1-7 (

Student Data. One dominant theme that sounded throughout the various conversations with the college-level evaluation study group was the colleges' need for more information than is presently available. College study group members also expressed concern about the quality of data that was available and about the procedures for gaining access to state-collected data. The study group members provided some detail as to the type of information they desired. The information types can be classified as follows:

a) background information on the student's starting point before he/she enters the EOPS program;

b) reliable information on student goals;


c) comparative data between EOPS students and regular students in terms of number of units attempted/completed, goals, and retention rates; and

d) demographic information.

The colleges need more background information on the starting point of each student because of difficulties in interpreting retention/transition rates. As one participant commented,

"It may be that different colleges are serving different student groups... I'm just simply saying that we need to think of some way to be able to get better understanding of the meaning of the transition data, and for that reason we've got to have some better understanding of differences between student groups at different colleges."

Another participant questioned whether:

"...in terms of looking at the units attempted and completed, we should adjust to the type of student that's actually being retained. Thus, the starting point of students is important in understanding retention."

One problem with the existing information on student goals arises

from poor question formulation on the student data cards.

"...the student information system now asks for student goals as an optional item. Very few people respond to it. Among those who do, over half say 'unknown' or 'don't know.' Superficially, that looks odd that so many students would not know what they're there for, but what it more typically means is that they wanted to respond 'none of the above.' Our tools are not sensitive enough to pick those up. So we're just missing it."

Members of the colleges group concur that retention and transfer are appropriate dimensions for the measurement of success. But they

caution that it is naive to consider these data without an understanding

of student goals and student educational and home background factors. As one participant stated,

"See, we don't have any information where we could say, now wait a minute--this particular group of students has been characterized as having the need to be in remediation for up to a year and a half before they enter into the regular programs--that sometimes happened. So you just can't take a straight-out comparison of how many with given goals start out in year one and are done by year three, or year two. The community college students, EOPS students included, are not homogeneous. Although in EOPS, there's a tendency for people to think of these students as a homogeneous group, they're not. And we don't really have any indication of what those differences are."

Another participant pointed out that,

"...there is another question of success related to the types of things we've been discussing--retention data. What happens when they leave? What are the transfer rates of the EOPS students? And so forth (looking at these kinds of questions), that strikes me as being a real dangerous practice if it were done unqualified (i.e., without knowledge of student background and goals). We have some programs that serve any student who meets the income criteria. And we have other programs that deliberately set a priority on students who are exceptionally high risk...And then you might look at retention, or the length of time it takes for the student to progress through school (which is another measure of retention that is often used) and again come to a false conclusion. Or, where are they three years after they entered? We might wind up with many false conclusions that one program is not doing as good a job as another if you just compare outcomes."

Another difficulty with measuring "success" by retention and transition rates relates to the problem of changes in student goals. For example,

"A student comes in and says that his goal is to be here for one year to improve his skills and get a job, and they're there and they improve their job--is that success? If a student says that his goal is to transfer to a four-year institution, but then at the end of one year doesn't transfer to a four-year institution and drops out and gets a job--is that success? Or is that a function of whether the student's initial goal was realistic? If a student says that his goal is to be here and get a two-year credential, but then transfers to a four-year college--is that super success?"

Program and Resource Data. Participants pointed out that the colleges have a multiplicity of missions. As a result, it is difficult to measure program success.

"The community colleges serve multiple purposes compared to just transfer and vocational education. There are community services, and a lot of student service programs, certificate programs, etc... Something has happened to the community colleges and it's difficult for legislators to get a focus on what its mission is, because it seems that there's not a lack of mission, there's an abundance of missions."


College representatives also pointed to another dimension of the problem: making other parties cognizant of this expanded mission. As one participant stated,

"This is certainly a big problem, because as you know, the community colleges are under attack right now for not transferring as many as they did 20 years ago, 15 years ago. And up here the perception is that we have a problem; we don't know how big, with the legislature, which tends to see the community colleges in terms of the master plan that's 20 years old, rather than in terms of what the institutions have become in the intervening time. The community colleges serve multiple purposes..."

Another issue brought out by the group is the problem of whether

programs are serving the "appropriate" participants. Appropriateness

is partially a function of the mission (or purpose) selected. In addition, no clear-cut guidelines detail how selections should take place. As one of the participants defined the problem, "You might be helping 1,000, but is it, in fact, the group you should be helping?" The question of which group should be helped is, of course, a nebulous one and open to substantial debate. The current guidelines for the program provide indication that participants with highest priority are "first-time students with the greatest need." The question, of course, of what is "greatest need" is still open to debate--and to individual interpretation by each college. As one study group member noted:

"The need of the community is what defines the highest priority student needs at your college."

Aside from accurate description of the population served and the appropriateness of programs, colleges also want to be able to make claims about success in terms of program outcomes. Participants pointed with frustration to the inability, thus far, to differentiate EOPS program effects:

"....you resort back to general outcome measures. You wind up making an assumption that if there are differences, and if EOPS is a major service area that other students are not benefiting from, that you can claim that some of it (some claim all of it) is due to the program."

Again, study group members felt the necessity for certain kinds of evaluation data to describe program effectiveness. Transition was mentioned as convincing evidence of program impact. As one participant pointed out,

"I think the transition of our students is very important. If we can find out, especially on a state-wide level,...but I think it'd be very interesting to know how our students do as they transfer on, and what percentage of our students transfer. I think that would be valuable information for EOPS to have, not only as ammunition, if we could show that 15, 20, 30 percent of EOPS do make the transition from community colleges to a four-year college, because we know that somewhere from 3 to 7 percent of the regular students make that transition."

Another participant continued along the same line of thought:

"But my big concern right now is the transition--the movement from the community college to the four-year scene....I think what we don't know is the definite impact statewide that the program is having, and on the other hand, the transition--how many of our students are actually persisting and getting into a four-year school."

Although we have alluded to this in other sections of this paper, it is important to note that members of the colleges study group also pointed to the need for basic program demographic data. The following

data illustrate the kinds of information that would be useful:

Number of students who applied for admission

Number attending class

Number persisted beyond first census week

Total dropouts

Financial aid

Registration: day/evening/occupational

Etc.

Also of great importance is a description of the way that programs are organized. It is important to document the characteristics of programs. One participant noted that it is important to provide "a picture of the way various programs have organized themselves in terms of the functions they're addressing and the kinds of activities they're engaged in." One participant cautioned, however, that there is a danger in this procedure. There can be "an implicit message that this is the kind of thing you ought to do." Such an assessment could inhibit a campus from developing different types of programs to serve the particular needs of its students in favor of installing an existing program which might not necessarily be the most effective.

A better understanding of the utilization of program services was often mentioned by the panel. Which elements of the EOPS program are selected by which students? All participants agreed that they needed a clearer idea of which programs students chose to take, which program components were successful, and why. In addition, they saw great benefit from having comparative student/program data systemwide. However, they felt that the present system of data collection and analysis was not providing this information. There is a very clear indication from the colleges study group that data of this type would be welcomed for local information purposes and would be important for program legitimation.

The topic of fiscal resources was prominent among the issues mentioned by college participants. An accurate summary of program costs is invaluable to the schools and is an important item in describing EOPS to the legislature and others. At the most basic level, data ought to be available on each college's allocation and the number of documented


eligible students. More general fiscal descriptions of the funds allocated to different program components are also seen as valuable information by the colleges. Comparative fiscal data can serve as a useful guide to what other colleges are doing. One participant noted:

"I would like to have comparison information beyond what I might generate at the local level. In particular, I would like to see statewide data on amount of money for direct aid versus amount for services, then within services the amounts for counseling, tutoring, basic skills, etc."

Several of the participants pointed out that data concerning the relationship of fiscal resources to categories of services would be helpful in determining how the EOPS monies could best be utilized. As one participant stated,

"...all of this is part of the current era...where people are going to have to make better use of the monies they've got because it's not going to be coming in large increases as it did in the last decade."

Most of the participants also saw another dimension to the fiscal issue. There is a need for better understanding of the relationship between student background and fiscal resource requirements. Study group participants feel that if they "had information that would indicate that there are important differences in the needs of students...where they must be served longer if they're going to succeed, it puts us in a stronger position to argue for dollars." One participant further analyzed this problem as follows:

1'

JA

We have gone through three years of an agreement which- wasstruck betWeen the Governor and the Chancelloes Office about the" .level of funding- the Governor. would ,proir.ide- for the EOPS prd-.gram. The agreement"was that he iwould allow so many additionalstudents to be served, and a forenUla- was deviied to be able to.provide4'money to students who had entered the program. in theprevious year4, and those who werecontibUing in the second year,for a periOcr of /three years; that's called_ the ripple, formula- -well, the agwreement!s up, it's over. So we're 'now positionwhere we're _going to have to, in effect, tenedotiate ,a process, 6a

Page 57: DOCUMENT RESUME ED 206 362 JC 810 519 · 2014-02-18 · DOCUMENT RESUME: t ED 206 362 JC 810 519. AUTHOR. Alkin, Marvin C.: Stecher, Brian ti. TITLE Evaluating EOPS: A User Oriented

5.3

mechanism, for °generating' new resources from the state pie.. Now,it's _my feeling...that if we are facing the situation where students

. need to continue in. the program longer, or there is a need formore students...well, that implies a completely different mechanismfor requesting augmentations."

Comparative data are seen as a desirable additional piece of information which has not always been collected in the past. Two types of comparisons are desired: (1) statewide comparisons of different EOPS programs; and (2) comparisons of EOPS with regular college programs. What is being discussed here is not so much the nature of the data to be collected as the manner in which it might be analyzed.

Some of the participants felt that statewide comparisons of EOPS programs could be a politically sensitive issue: "There's a comparative issue here, whether (or not) we're going to use this data as a basis for making comparisons of EOPS with the general student populations--now that gets into shaky political grounds." A further comment was made on this point:

"...There is a fair amount of data generally available that can be used for comparing EOPS to itself (colleges with other colleges). We could do that now. What we can't do now is compare EOPS to the student body, because the general student information system does not include, for instance, data on units completed. We only collect that information statewide on EOPS students."

The colleges study group echoed many of the concerns of the legislative/executive group, but they saw a number of additional information needs. Notwithstanding the difficulties, the consensus was that it would be helpful to be able to obtain statewide data concerning: the overall pattern of program/student goals; program descriptions; EOPS vs. general student data; and transition/retention rates. Also seen as useful information would be a clear description of what the colleges' recruitment plans are, and a general understanding of past historical trends.


Chapter IV

STATEWIDE EVALUATION OF EOPS

This section of the report compares the EOPS program data sources (Chapter II) with the EOPS information needs as envisioned by the two study groups (Chapter III). The correspondence between information needs and existing data will elucidate the questions of interest which can currently be answered. Furthermore, the comparison should point to the major data shortfalls of the current EOPS evaluation system. As will be noted, while much excellent and valuable data exist, there are some major discrepancies between needed data and data that are currently available.

Existing Evaluative Information

The existing information sources listed in Chapter II are summarized in Table I. The data have been categorized into five areas: student background characteristics and demographics, student academic information, EOPS program characteristics, EOPS service levels, and financial aid to students.

At first glance, Table I appears to suggest that almost everything anyone might want to know is collected in one fashion or another, but this impression is misleading. The footnotes at the bottom of the table point out some of the limitations of the existing data. Much of the data are either unreliable, not coded for computer analysis, or collected on only a small number of colleges each year.

There are two important exceptions: the fiscal and the USRS reports are both reasonably accurate and readily accessible. The most accurate and useable data is that obtained from USRS, relating primarily to student academic success and the level of individual student financial aid.

Page 59: DOCUMENT RESUME ED 206 362 JC 810 519 · 2014-02-18 · DOCUMENT RESUME: t ED 206 362 JC 810 519. AUTHOR. Alkin, Marvin C.: Stecher, Brian ti. TITLE Evaluating EOPS: A User Oriented

56

DATA ITEMS

TABLE1

Existing Evaluation Information

DATA SOURCES

Student Data Application Fiscal.Cardsa USRSb Form 10c Form A-1d

Student Background Char-acteristics and DemographicsSex x )<Age/BirthdateRacial, ethriic groupMartial statusFinancial needResidency

EOPS Student Academic Information

Highest grade completed *-xDeclared major .

x -xCollege of last attendance xHigh school diploma? xH.S:- of last attendance xAcademic standing x , xUnits attempted x xUnits completed x xAcademic status at

beginning of year' kYear GPA . x - xA,--Cumulative GPA x xAcademic goal (OPTIONAL) xAccomplishments (OPTIONAL) xOverall college with-

drawal rate x

EOPS Program Characteristics

Number of components xTypes of sub-activities (27 types) xNeeds . 5, xComponent goals statement xActivity descriptions xEstimated number of students served x x .

.. Actual number.. of students served x

59

or

4

Page 60: DOCUMENT RESUME ED 206 362 JC 810 519 · 2014-02-18 · DOCUMENT RESUME: t ED 206 362 JC 810 519. AUTHOR. Alkin, Marvin C.: Stecher, Brian ti. TITLE Evaluating EOPS: A User Oriented

DATA ITEMS

. .

57

TABLE 1 (ContinUed)

DATA SOURCES

, Student Data Application Fiscal

Cards a 'USRSb Form 10c form A-1d

EOPS Service Levels

Resources per component(by object class code -

-9 components) xResources per major category (by

object class code - 3 categories , xResources per Activity (by

object class code - 37 activities) xPersonnel per component xPersonnel per category xPersonnel per activity . x.Estimated number of students

served by component \ xEstimated number of students'served by category x

Actual number of students. served by categbrl 7 ' x

Districts' financial contri-bution (actual) x

Distrists' financial contri-bution (estimated) . . x

4 Counseling hours (EOPS at__ f

non-EOPS combined) ., .. xTutoring hours (EOPS & -,

.

non-EOPS combined) . x . .,Recruitment hours . xEOPS. counseling hours- XEOPS tutoring hours x

Financial. Aid to StudentsStudent need x xFinancial aid from

. each-sourcet

Note: Refer to previous section for a more complete discussiona Student data cards are no longer being used, and have low reliability.b USRS will first be fully implemented in 1981-82.c Application forms reflect projections,. not actual expeditures; only fiscal

elements are coded for computer access._ not program characteristics.d Fiscal forms are not coded for computer access.

60

As we will see later, these are the two areas in which a comprehensive evaluation system is best supported by current data collection efforts.

Necessary Information for Evaluation Questions

In Table II we have categorized the concerns expressed by the study groups into six major questions of interest. In addition, the kinds of information appropriate to answer each question have been specified. The six major questions are:

1) What are the background characteristics of the EOPS population, and how do these compare with non-EOPS students?

2) What are the elements of the EOPS program, and how do these compare with services offered outside EOPS?

3) What level of services is provided in EOPS programs?

4) How are EOPS resources expended?

5) How successful are EOPS students during their tenure in the college, and how do they compare with non-EOPS students?

6) How successful are EOPS students when they leave the community college, and how does this compare with non-EOPS students?

A cursory glance at Table II may give the misleading impression that most of the information necessary to answer these questions will be found among the data that are currently collected on EOPS programs and students; indeed, the information categories have a striking similarity to those in the first table. The apparent "data match" partly results from the use of similar terms in both tables.

TABLE II

Desired Information on EOPS(a)

1. What are the background characteristics of the EOPS population?
   Relevant information: Prior school experience (high school diploma/certificate, high school GPA, college GPA); Age; Sex; Race; Parents' education level; Parents' income level; Student's income level

   a. How do these compare with non-EOPS students?
      Relevant information: Same

2. What are the elements of the EOPS Program?
   Relevant information: Community needs; Program components; Activities; Services offered; Number and type of personnel; Organizational structure

   a. How do these compare with services offered outside EOPS?
      Relevant information: Same

3. What level of services is provided in EOPS programs?
   Relevant information: Recruitment activities; Instructional support; Tutoring hours; Counseling; Administrative activities; Other activities; Financial aid to students

4. How are resources expended?
   Relevant information: Cost per category; Cost per component; Cost per activity

5. How successful are EOPS students during their tenure in the college?
   In terms of their own goals: Student educational goals
   In other academic terms: Units attempted/units completed; Academic standing; Cumulative GPA; Term GPA; Drop-out rate/retention rate; A.A. or A.S. certificate attained?

   a. How do they compare with non-EOPS students?
      Relevant information: Same

6. How successful are EOPS students when they leave the community college?
   In terms of their own goals: Student goal statements
   In other terms: Vocational skills; Job placements; Transfer/transition to four-year institutions; Success after transition

   a. How does this compare with non-EOPS students?
      Relevant information: Same

(a) Desired information could be provided by comparisons between programs and comparisons with statewide data.

The tabular format further suggests that each question has equal importance. In reality this is not the case: the questions for which the greatest amount of information is available are not the ones stressed most strongly by the study groups.

This picture will be clearer if we consider each of the six questions, comparing the information that would be necessary to answer the questions with the available data. It is interesting to examine which of the six evaluation questions presented in Table II can be answered with information currently being collected. This will provide a more accurate appraisal of the gaps in the current data as the basis of an EOPS evaluation system.

The usefulness of existing data for answering the questions of interest is shown in Table III, Availability of Desired Information. The broad evaluation questions are presented in column one of the table, and the types of information that would be needed to answer these questions are shown in column two. All existing information sources that provide such measures are cited in column three, and those sources that are both reliable and computer accessible are indicated in column four. Specifically, column four provides an answer to the data useability question. Only those data sources for which the answer is "yes" in column four properly contribute to an evaluation system.

Availability of Desired Information. The first question deals with the background characteristics of EOPS students. Members of the study groups felt it was important to document more accurately the entry characteristics of the EOPS population. These data might be relevant to an analysis of the appropriateness of EOPS services, the expected level of EOPS outcomes, etc.

TABLE III

Availability of Desired Information(a)

(Each entry lists: relevant information -- current availability -- useability in current form)

1. What are the background characteristics of the EOPS population?
   High school diploma/certificate -- USRS -- Yes
   High school GPA -- not collected
   (Prior) college GPA -- not collected
   Age -- USRS -- Yes
   Sex -- USRS -- Yes
   Race -- USRS -- Yes
   Parents' income level -- not collected
   Student income level -- not collected
   Student financial need -- USRS -- Yes

   a. How do these compare with non-EOPS students?
      Same (except for financial need) -- Same -- Same

2. What are the elements of the EOPS Program?
   Community needs -- Form 10 (program) -- 2
   Activity goals -- Form 10 (program) -- 2
   Program components -- Form 10 (program) -- 2
   Program subcomponents -- Form 10 -- 2
   List of activities offered -- Form 10 -- 2
   Number and type of personnel -- Form 10 (program) -- 2
   Organizational structure -- Form 10 (program) -- 2

   a. How do these compare with services offered outside EOPS?
      Same -- not collected

3. What level of services is provided in EOPS programs?
   Actual/estimated number of students served -- Form A-1, Form 10 -- 1, 2
   Recruitment activities -- Form 10 (program) -- 2
   Instructional support -- Form 10 (program), USRS -- 2
   Tutoring hours -- Form 10 (program), USRS -- Yes
   Counseling hours -- Form 10 (program), USRS -- Yes
   Administrative activities -- Form 10 (program), Form A-1 -- 1, 2
   Other activities -- Form 10 (program) -- 2
   Financial aid to students -- Form 10, USRS, Form A-1 -- Yes
   Work-study grants to students -- USRS -- Yes

4. How are resources expended?
   Expenditure per category (A, C) -- Form A-1 -- 1
   Projected cost per component (100-900) -- Form 10 -- 2
   Projected cost per activity (37 levels) -- Form 10 -- 2
   District contribution -- Form A-1 -- 1

5. How successful are EOPS students during their tenure at the community college?
   In terms of their own goals: student academic goals -- USRS (optional) -- 3
   Units attempted/units completed -- USRS -- Yes
   Weekly student contact hours -- USRS -- Yes
   Positive attendance enrollment -- USRS -- Yes
   Student level -- USRS -- Yes
   Academic standing, beginning/end of term -- USRS -- Yes
   Cumulative GPA -- USRS -- Yes
   Term GPA -- USRS -- Yes
   Drop-out rate/retention rate -- not collected
   A.A. or A.S. certificate attained? -- not collected

   a. How do these compare with non-EOPS students?
      Same -- USRS (only academic goals, units attempted, and weekly student contact hours) -- Same

6. How successful are EOPS students when they leave the community college?
   In terms of their own goals: student goal statements -- not collected
   Job placements -- not collected
   Transfer/transition to four-year institutions -- not collected
   Success after transition -- not collected

   a. How does this compare with non-EOPS students?
      In terms of their own goals: student goal statements; other appropriate measures -- not collected

(a) Desired information could be provided by comparisons between EOPS and non-EOPS programs and comparisons with statewide data.

NOTE: 1. Form A-1 -- reliable but not computer coded.
      2. Form 10 -- projected levels at time of application.
      3. Optional element in USRS -- low response rate.

Both the research literature and common sense suggest that it would probably be most important to consider educational and socio-economic variables. The current system does provide some information on racial-ethnic background, age and sex of each EOPS student. It also provides a gross measure of prior academic performance by indicating if the student has a high school diploma and if he or she has previously attended a college. But more meaningful measures of academic performance, such as high school GPA or college GPA, are not collected. Similarly, no information is collected on socioeconomic indicators (such as parents' income level, parents' education, etc.). As a result, only the most cursory description of the background characteristics of EOPS students can be drawn from existing information.

Data on race and prior academic experience are part of the common USRS data elements gathered from all community college students and thus provide a basis for examining potential differences between EOPS and non-EOPS populations. However, data on academic accomplishment and financial aid are unique to the EOPS data forms. USRS does not gather comparable data on non-EOPS students. While such comparative information might exist in some other form in some other location, to our knowledge it does not. This is unfortunate, because potential comparisons between EOPS and non-EOPS students are important to many members of the study groups.

The second question is directed toward a description of the EOPS program components. What are the elements of the EOPS Program, how do they meet the needs of the students, and how are they organized? The most extensive and most detailed description is found in the program application (Form 10). The drawbacks of the programmatic data reported in Form 10 are twofold: (1) they are not presently coded for computer storage and access; and (2) there are no systematic revisions of the descriptive sections of the application in light of actual budget allocations. Analysis would need to be based on the "proposed" program description only. Thus, many of the more interesting questions on program organization could only be answered after a great deal of coding and analysis work--and then only "approximately."

The only other source that maintains any programmatic distinctions is the fiscal forms (Form A-1). However, they actually contain very little about the organization of program activities. Form A-1 distinguishes only three "major components," rather than the 37 activity sub-areas described in the program application. Thus, only the unsatisfactory Form 10 data are really relevant to questions of EOPS program development and organization. The comparative question is equally difficult to answer; we know little about the program characteristics of special services provided for non-EOPS students.

Two other questions with a slightly narrower focus can be answered more successfully using information from the existing data collection systems. The first of these relates to the levels of EOPS services provided to each individual student; the second to the fiscal resources allocated to each of the services within the colleges.

First, we will consider the direct measures of service hours--hours of tutoring and hours of counseling provided to each EOPS student. These two pieces of data are reported on USRS. (Recruitment hours were estimated on the old Student Data Cards, but the estimates proved to be quite inaccurate and the item was not retained in the switch to USRS.) Even here there is a problem, however, because tutoring and counseling hours are reported as a combined total for EOPS and non-EOPS services, and it is impossible to differentiate between the two. Thus, for each student there is some measure of EOPS services received, but not a perfect one.

As an alternative, one can measure EOPS services aggregated at the college level in terms of the dollars expended in different areas, using the application form (Form 10) and the fiscal forms (Form A-1). It is possible to answer the question of how EOPS resources are expended collegewide with a fair amount of accuracy. The budget sections of Form 10 are computer coded (unlike the program descriptions), and there is easy access to these data. Though not computer coded, Form A-1 might be useable to check changes in expenditure levels and adjust Form 10 projections. This would require some extra work, but not as much as analyzing the descriptive sections. Thus, describing the expenditure of EOPS funds by activity (with fine tuning only at the aggregated level of three major program categories) is one task that can be performed within the current data collection framework.
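Because the Form 10 budget sections are computer coded by category, this collegewide summary reduces to simple aggregation and adjustment. The following is a minimal sketch of the procedure in Python; the record layouts, college name, and dollar figures are hypothetical stand-ins for whatever the actual coded Form 10 and A-1 extracts would contain.

    from collections import defaultdict

    # Hypothetical Form 10 budget lines: (college, activity, category, projected dollars)
    form_10_projections = [
        ("College A", "tutoring", "services", 40000),
        ("College A", "counseling", "services", 55000),
        ("College A", "grants", "direct aid", 120000),
    ]

    # Hypothetical Form A-1 actuals, available only at the category level
    form_a1_actuals = {
        ("College A", "services"): 90000,
        ("College A", "direct aid"): 115000,
    }

    def summarize(projections, actuals):
        """Roll activity-level projections up to categories, then swap in A-1 actuals."""
        projected = defaultdict(int)
        for college, _activity, category, dollars in projections:
            projected[(college, category)] += dollars
        # Where an A-1 actual exists for a category, it supersedes the projection
        adjusted = {key: actuals.get(key, total) for key, total in projected.items()}
        return dict(projected), adjusted

    projected, adjusted = summarize(form_10_projections, form_a1_actuals)
    for key in sorted(projected):
        print(key, "projected:", projected[key], "adjusted:", adjusted[key])

The point of the sketch is simply that activity-level projections roll up to the three major categories, which is the only level at which the A-1 actuals can correct them.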

When we turn to the question of EOPS students' academic performance, more extensive information is available. USRS provides a number of useful indicators of success. The numbers of units attempted and completed are the most direct measures. These, combined with cumulative GPA and term GPA, allow us to develop a fairly accurate picture of the short-run academic success of EOPS students.

Members of the study groups suggested that comparisons of the educational success of EOPS and non-EOPS students might be an important measure of program success. The common data elements of USRS do not include measures of units completed nor of GPA. Thus, unless these figures can be obtained from some other source, no direct measure of these data on non-EOPS students is available for such a comparison.

Another academic item that was suggested is a comparison of drop-out rates among EOPS and non-EOPS students. We surmise that it would be possible to compile figures on EOPS drop-outs by matching student identification numbers from one term to the next (though it is not clear how one differentiates between drop-outs and graduates). The other element in the comparison--the overall college withdrawal rate--was one of the pieces of information collected on Form #3 of the old Student Data Cards, but it is not collected under USRS. It is possible that these data could be acquired for non-EOPS students in a manner similar to that described for EOPS students. If this is the case, comparison of withdrawal rates among EOPS and non-EOPS students would be possible.
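The mechanics of such identification-number matching are straightforward. The minimal sketch below, in Python, uses hypothetical rosters of identification numbers; the separate graduation list makes explicit the drop-out/graduate ambiguity noted above.

    # Hypothetical enrollment rosters keyed by student identification number
    fall_ids = {"1001", "1002", "1003", "1004"}   # students enrolled in term 1
    spring_ids = {"1001", "1003"}                  # students enrolled in term 2
    graduate_ids = {"1004"}                        # completers, from degree records

    def withdrawal_rate(term1, term2, graduates):
        """Share of term-1 students who neither re-enrolled nor completed."""
        # Without separate graduation records, students absent in term 2
        # would be indistinguishable from graduates -- the ambiguity the
        # text points out.
        dropped = term1 - term2 - graduates
        return len(dropped) / len(term1)

    print(withdrawal_rate(fall_ids, spring_ids, graduate_ids))  # prints 0.25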

As a final note, we find it hard to believe that better measures of non-EOPS student academic performance are not available somewhere. While such information does not appear to be part of the six information sources we reviewed, we suspect that such data are compiled in some form. As a consequence, we will leave the question of comparable academic performance data on non-EOPS students open for additional investigation.

The sixth question suggested by the study groups may be the most important one. Certainly it was the issue addressed most often in our discussions of the EOPS evaluation system. The question is, what is the success of EOPS students when they leave the program? Do they achieve their personal educational goals--an associate degree, vocational training, job placement, etc.? What is the overall EOPS program success in preparing students for transition to four-year colleges? How do EOPS students compare with non-EOPS students in achieving such goals?

Very little information is available about students' personal goals. USRS has one item regarding students' academic/vocational goals. However, it is an optional item. If past experience with the Student Data Cards is any measure, we should anticipate a very low response rate on this question. As a result, we will have no dependable measure of the students' own goals, and will be unable to answer the first sub-question.

We are similarly handicapped in our attempts to answer the question about transition to four-year colleges. No data are collected on EOPS or non-EOPS students' post-community college activities. This is unfortunate because transition appears to be a very important issue. It may be possible to compare students' identification numbers from one year to the next and perhaps obtain a measure of transition rates to the university and state college systems. The critical question is whether procedures exist to follow individual students throughout their careers in the different California college systems. Even so, gathering the data will probably be difficult and time consuming. Barring this complicated, conceivably impossible cross-referencing process, nothing in the existing data sheds any light on the issue of transition. To our knowledge there is no regular, annual, comprehensive information of this type on non-EOPS students, either.
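If the cross-referencing were feasible, the computation itself would be simple. The sketch below illustrates the idea under the unverified assumption--the open question the text identifies--that each system can supply a yearly roster of student identifiers; the rosters shown are hypothetical.

    # Hypothetical rosters of student identification numbers
    cc_cohort_year1 = {"2001", "2002", "2003", "2004"}  # community college cohort
    uc_year2 = {"2002"}                                  # later UC enrollment roster
    csuc_year2 = {"2003"}                                # later CSUC enrollment roster

    def transition_rate(cohort, four_year_rosters):
        """Share of the cohort found in any four-year roster in a later year."""
        transferred = set()
        for roster in four_year_rosters:
            transferred |= cohort & roster   # identifiers appearing in both systems
        return len(transferred) / len(cohort)

    print(transition_rate(cc_cohort_year1, [uc_year2, csuc_year2]))  # prints 0.5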

As an additional note, we should mention one further analytic task that will be possible in the future--year-to-year comparisons and review of historical trends. USRS will be a valuable source of year-to-year comparative data. Any new elements that are developed to supplement the evaluation system should be responsive to this ability to make comparisons over time.

In summary, we found that many of the questions that would be important in an evaluation system for the EOPS program cannot be answered with the existing data sources. Reviewing the six broad evaluation questions in light of Table III, we found that the questions regarding background characteristics, service levels and student academic success can be answered to a certain extent with the existing data sources. On the other hand, the questions relating to program characteristics, resources and expenditures, and student success after EOPS cannot. Similarly, some of the comparisons between EOPS students and non-EOPS students can be made, but not all. In the next section we will consider both scaling down the questions of interest regarding EOPS and developing procedures for adding critical items to the data base.

Before we turn our attention to proposals for developing an evaluation system based on this analysis of current needs and information sources, it is interesting to examine the data currently collected which do not appear to have any relevance to evaluation of the EOPS program. In Table IV we have collected a list of some of the extraneous information which is now collected on EOPS programs and students.

These data are not useful in their current form for answering any of the questions posed about EOPS by the members of either study group. For example, the elaborate statements of needs, component goals and descriptions of activities included in Form 10 are not usable for large-scale analysis. They are too bulky, cumbersome and syncretic.

TABLE IV

A Sample of Extraneous Information
(Not Useful for Evaluation System in Present Form)

Student Background Characteristics
   Citizenship -- USRS
   Veteran Aid Status -- USRS

Academic Information
   Student Declared Major -- USRS
   H.S. of Last Attendance -- USRS

EOPS Program Characteristics
   Needs -- Form 10
   Component Goals -- Form 10
   Activity Description -- Form 10
   How Accomplished -- Year-End Report
   Future Status -- Year-End Report

EOPS Service Levels
   Projected Budget by Object Class Code by Activity -- Form 10
   Projected Budget by Object Class Code by Component -- Form 10
   Projected Personnel by Activity -- Form 10
   Projected Personnel by Component -- Form 10
   Combined EOPS & Non-EOPS Counseling Hours -- USRS
   Combined EOPS & Non-EOPS Tutoring Hours -- USRS

The budget projections by object class code and by program activity code are too detailed and change too much when final allocations are actually made. Other information--such as citizenship, high school of last attendance, etc.--is simply irrelevant.

This is not to suggest that these data are entirely meaningless to other agencies and for other purposes. But the paperwork burden that many administrators complain about is real. If changes are to be made in current data collection efforts, one would do well to try to achieve some balance--deleting unnecessary reporting requirements at the same time that new supplemental information requests are being added.

With this novel recommendation in mind, we now turn our attention to the task of integrating existing information into a usable evaluation system.

Chapter V

INSIGHTS FROM OTHER STUDIES

Introduction

In the past few years a number of studies have been conducted to examine different facets of the EOPS program and the community college system. While none of these had as its direct focus the development of an evaluation system for EOPS, much can be learned from the analyses. In the following sections, we will discuss several of these studies, describing briefly their research focus and drawing what implications we can for a statewide EOPS evaluation system.

Statewide Longitudinal Study of Community College Students

Description. M. Steven Sheldon and Russell Hunter of Los Angeles Pierce College conducted a three-year longitudinal study on community college students in the state of California, beginning with academic year 1978-79.[1] A fifteen-campus sample of students who entered the community college system that year served as the basis of their research. Using telephone interviews and review of transcripts, they followed these students for three years in an effort to obtain a more accurate description of the type of students who enrolled in the community colleges and the various ways that they progressed through the system. Data were collected on a number of potentially relevant variables, including the vocational status of the students, the changing nature of the students' academic goals, their reasons for leaving the community college system, etc.

[1] The study had six specific goals, which will not be recounted here. The interested reader is referred to the reports themselves and to an excellent analysis of the utility of the studies for assessing the impact of EOPS written by Jennifer Franz (1981).

To this date, three annual reports have been published and a fourth is nearing completion. These reports paint an interesting and detailed picture of certain aspects of the community college student population in California.

Information About EOPS Students and Programs. Unfortunately, the analyses have limited direct bearing on EOPS programs, in large part due to the strong underrepresentation of EOPS students in the survey sample. This may be accounted for in part by the fact that the data collection was done primarily by telephone interviews. In Part I of the report, Sheldon and Hunter explain that, "There was a great deal of difficulty on campuses with a high percentage of lower socio-economic level students...with respect to disconnected telephones, wrong phone numbers, and moving with no forwarding address."

In addition, EOPS status was not determined directly. "Financially disadvantaged" and "educationally disadvantaged" were the categories used in the study, and these cannot be taken as perfect surrogate measures for EOPS participation. Franz (1981), in her analysis of the report, is quite specific on this point, and adds an additional significant criticism:

"Thorough and consistent interviewer training is the sine qua non of valid telephone survey research, and there is no evidence that the training done for this study was either.

...In sum, the training and utilization of interviewers would appear to be a major weakness of this study. One would need access to the original data to prove that it affected the results, but it would be noteworthy if it did not." (pp. 10-11)

As a consequence of all these problems, the specific results of the longitudinal study provide little information about EOPS students.

Relevance to an EOPS Evaluation System. Despite the criticisms of the Statewide Longitudinal Study, a number of general findings are relevant to the development of an evaluation system for EOPS. One of the strongest findings of the study is that our old-fashioned stereotypes of the typical community college student are not valid. As Sheldon and Hunter note:

"The underlying assumption is that the student body is homogeneous--describing one or two 'typical' students will be enough to provide an understanding of all students enrolled in the college. This assumption is unwarranted...

Community college students are heterogeneous in every respect: demographic, vocational and educational. To determine the nature of these students and thus define the educational service that the community college provides, it is necessary to develop something other than the usual 'student profile.'" (pp. 70-71)

These findings reflect very closely many of the comments made by the colleges study group. Members of this panel often reiterated the notion that the students being served by EOPS displayed a wide diversity of interests and academic goals. They rejected the notion that one could characterize college students in a simple manner and the resultant implication that one could base an assessment of the success of the program on simple measures of course completion and transfer to other institutions. Sheldon and Hunter's conclusion on this point is unequivocal:

"The data collected the first year of the Statewide Longitudinal Study are certainly sufficient to dispel the myth that:

1. California Community Colleges are primarily two-year colleges.

2. Most students who matriculate finish their lower division requirements and transfer to a university as juniors.

3. A large proportion of students enroll in and complete a two-year vocational program.

The myth is difficult to destroy because some of the students do carry out the program listed above; but, considering the hundreds and thousands of potential students who are admitted to California Community Colleges for the first time each year, the stereotypical two-year program describes the activities of very few."

It is to Hunter and Sheldon's credit that they have gone beyond merely destroying an old stereotype and have attempted to create a set of more accurate prototypic descriptions of community college students. They group these student prototypes into three broad headings--transfer, vocational and leisure time--but within these three broad categories they have identified 17 distinct student prototypes. These descriptive classifications range from the "full-time transfer" and the "unmotivated transfer" to the "career program completer" and the "leisure skills student."

Sheldon and Hunter believe they can classify students into prototypes with up to 85% accuracy based on telephone discussions about their goals in the community college and a review of transcripts. This raises an exciting possibility for capturing some of the diversity of EOPS students. A more detailed classification of student types, such as that suggested by Sheldon and Hunter's prototypes, might yield far more useful and appropriate information on the success of EOPS. Certainly it would allow a more accurate formulation of student goals than that afforded by item S11 of USRS. This possibility will be discussed in greater depth in Chapter VI.

The classification procedure employed in the Statewide Longitudinal Study has another, broader implication about the mission of the community colleges. In Part III of the report we find the following issue raised:

"The importance of the findings lies in their capacity of increased understanding of the community college 'mission.' It has become evident during the study that students tend to generate their own goals, enroll in and drop classes at will, glean from courses the information they want, and leave when their own personal goals have been achieved or are not being met. This indicates that the mission of the community colleges is determined to a far greater extent by the students than by other groups, e.g., Boards of Trustees, Academic Senate, College Administration." (p. 56)

This echoes many of the comments made by the colleges study group. They conceive of the mission of the community college in very broad terms, and see EOPS fulfilling a wide variety of functions. Some students are successful by their own measure when they have learned enough to obtain the job promotion that they hoped to achieve. They drop out of college without completing a degree program of any sort; yet in their own terms their tenure has been a complete success. Other EOPS students have a specific goal of transferring to a four-year college and completing a Bachelor's Degree. A lengthier period of academic course work resulting in a successful transfer to a four-year college may be the only outcome that one can accept as a successful realization of this particular goal. Whatever evaluation system is adopted should respond to this wide diversity of interests and purposes among EOPS students.

,.w

f ) e Iri: °a...

ei

9- W 0 .o t h e r ,useful paints arr.te gleaned frOm )the analyses: them.--, _, V

r serves°. ' The firs1,s.contradicts a traditionateassvmptiori that vocational

. * 4 r. '9 and non-yocational students have markedly" different academic .gbars.

.A' . 1 1s'.

.The *edond` points out the poor validity of meakres of -student goals--

in particular measures related to desire to transfer to° a ,four year.41

college.

One finding of the Statewide Longitudinal Study illustrates theII

dgficulty of using the traditional student goal lapels of vocational and

f.

a."83#.,s.

4

ti

Page 81: DOCUMENT RESUME ED 206 362 JC 810 519 · 2014-02-18 · DOCUMENT RESUME: t ED 206 362 JC 810 519. AUTHOR. Alkin, Marvin C.: Stecher, Brian ti. TITLE Evaluating EOPS: A User Oriented

78

non-vocational. Overall 30$ of the, respondents in the study indicated

that they wanted to transfer to a four year ,college, while 36$ said that

they were- attending school for job related reasons. (General interest

accounted for 20$ of the students, AA degree for 6%, and home or

hobby skills for the remaining 2$.) The stereotypic view of the com-

munity colleges would place the vocational students among the '3E% vkillo

a rte attending for job related reasons, while non - vocational students

Would form the bulk of the 30$ who are planning to transfer to four

year colleges.

The results of their study indicate that this view is completely erroneous. Vocational and non-vocational students were almost equally distributed among the categories: 25% of the vocational students said that they planned to transfer to four-year colleges. In fact, most vocational programs in the community college system are set up so that the course work is directly transferable to UC and CSUC campuses. Equally surprisingly, 28% of the non-vocational students indicated that they were attending the community college for job-related reasons.

The authors of the report suggest that misclassification of respondents according to the SAM criteria may be responsible for some of these data, but this would represent only a small percentage. The conclusion we draw from these data is that one must be extremely careful when categorizing community college students. By conjuring up incorrect stereotypes, descriptive labels may obscure rather than illuminate the nature of students' interests in attending community colleges.

After studying the individual student's transcript and having a lengthy telephone conversation, the interviewers were asked to interpret the student's ability to articulate his specific goals. The results on the question GOAL CLARITY are interesting. Overall, only 51% of the respondents were judged to have articulated specific goals. The other half of the sample had a general idea of what they were interested in (37%) or had no clear idea at all (12%). This suggests that the community college could provide an opportunity for students to clarify their own specific educational and career goals. As Hunter and Sheldon conclude, "this finding points out the need for more and better career counseling on community college campuses in California" (p. 37).

Review of responses from other questions further illustrates the uncertainty of student goals. When students were asked, "Do you plan to attend a four-year college?" the responses differed markedly from those reported in the previous discussion: 53% of all students said "yes" (up from the 30% reported above). Sheldon and Hunter's explanation of this change was that students responded to this question in terms of "long term aspirations," while the figures above represent more reliable, short-term predictions. (As will be reported below, the ETI study found an even higher percentage of students who aspired to attend a four-year college.) The conclusion we draw from all of this research is that student goals are a very tenuous measure and must be used with great care.

Conclusion. The Statewide Longitudinal Study suggests a number of issues that are relevant to an evaluation system for EOPS. It reinforces the notion that the community colleges have a diverse and complex mission. EOPS students comprise a wide variety of student prototypes, and each may have a different set of goals. Under such conditions it would be difficult to measure "the success" of an EOPS program.

Attention must be paid to the concern of the colleges study group that EOPS provides a number of services to students--counseling, tutoring, financial aid--which in and of themselves may go a long way toward serving student needs regardless of their particular educational goals. The Sheldon-Hunter study also reminds us that transfer to a four-year college is a commonly held goal among community college students. The legislative/executive study group's interest in transfer rates is legitimate. A number of the EOPS students have as their immediate or their long-term aspiration transfer to a four-year college, and no evaluation system would be complete without at least considering these statistics.

What Happened to the EOPS Students of Fall 1973? (A pilot follow-up study comparing EOPS and all other first-time entering Fall 1973 day students at San Jose City College)

Description. In June 1979, Paul Preising of San Jose City College completed a study on EOPS students who entered the college for the first time in the Fall of 1973. The study had two purposes: first, to develop a model which might be used by other community colleges to follow up the performance of EOPS students over time; and second, to compare academic performance with that of the general college population at San Jose City College. In particular, he was interested in looking at students' degree objectives, their success in meeting objectives, the amount of academic work they completed, grade point averages, and the number who actually transferred to four-year colleges. To some extent he achieved both of his goals, but the difficulties he encountered point up the fact that there are no easy ways to obtain longitudinal data on students.

The study was conducted through examination of transcripts and other records in the San Jose Community College Admissions Office. The study team went back six years and obtained a list of all first-time students who obtained EOPS grants, funds for tutoring, or funds for work study in Fall of 1973. The analysis was based on the official records of these students over the ensuing six years. Each student's record could have contained over 30 different types of information. It is interesting to note that of the 49 student records that were used in the study, only 29 were relatively complete on all these categories. However, as Preising points out, all of the students' grade records, with relevant data, were complete. Thus "the questions posed for this study could be answered" (p. ).

Results. The bulk of the report examines five comparisons between EOPS and non-EOPS students. (The data on non-EOPS students were taken from an earlier study, What Happened to the Class of Fall 1973?, conducted at San Jose City College.) The study finds that upon enrollment, EOPS students had lower expectations than non-EOPS students, yet they received a higher percentage of AA degrees, had a higher retention rate, completed more units, had higher GPA's, and a higher percentage enrolled in four-year colleges after leaving the community college. Preising attributes these rather startling findings to the additional attention, support, and academic assistance provided by the EOPS program.

Follow-up Telephone Interviews. In a subsequent effort, Preising attempted to contact the 49 students by telephone and conduct follow-up interviews. He based his interview format on the interview guide used in the Statewide Longitudinal Study. The college's records included

telephone numbers for all of the students; however, they proved to be extremely unreliable. The study team was able to contact only 7 of the 49 EOPS students who had enrolled for the first time in 1973. In addition, it was able to get information on 5 others from family members who answered the telephone calls. Because the sample was so small, they reported only anecdotal evidence from these telephone contacts and did not attempt to draw any broader conclusions.

Implications for an EOPS Evaluation System. This study suggests that EOPS programs have a lot to offer, and that EOPS students may in fact be receiving tremendous benefits from the additional assistance they are given. In this regard it reinforces the importance of developing a reliable evaluation system for the EOPS program. It reinforces the notion that a longitudinal study is necessary to really understand the long-term benefits of EOPS, while at the same time adding evidence to the argument that such studies are not easily accomplished. Reexamination of past records is time consuming and not easily undertaken on a large scale.

Students are a particularly mobile group, especially after they leave college. Thus the low response rate to the follow-up telephone interviews is not surprising. Sheldon and Hunter had much greater success by beginning their contacts with students while they were still enrolled in school and then continuing the follow-up contacts in subsequent years. Preising makes a similar recommendation for others who consider doing longitudinal follow-up on community college students:

"To get a high rate of response from former EOPS students will require keeping contact with at least a sample of students as they progress through college. If students understand the importance of follow-up and, while in college, have been advised that such follow-up will be made, their cooperation should facilitate high response rates and reasonably good data." (p. 34)

There is one additional point to be drawn from the Preising study regarding an EOPS evaluation system. This relates to the number of EOPS students who transferred to four-year colleges. In What Happened to the EOPS Students of Fall 1973?, transfer rates are inferred from the students' requests that transcripts be sent to a college. This is a necessary but hardly sufficient condition for transferring to a four-year school. It is a clever method of obtaining an indirect indication of a student's enrollment in a four-year school, but it is hardly reliable. One of the questions that must be addressed in an evaluation system is how to obtain a better, more accurate appraisal of transfer rates.

Conclusions. This study was an excellent attempt to document precisely what happened to a group of EOPS students over time, and we applaud its intention as well as the efficient and insightful methods that were used. Two points emerge strongly. First, longitudinal data must be built into ongoing evaluation systems if they are to be obtained reliably. Second, a more direct and accurate measure of transfer status is necessary. If an evaluation system can incorporate both of these aspects, then the kinds of striking EOPS successes reported by Preising at San Jose City College may be documentable on EOPS students statewide.

College Going Rates in California (Knoell Report)

_ACollege Going Rates In California (Knoell Report).

..° . . . ...-°°

Description. Since 1977. the Califo,ia Post-Secondary Education. .., ...

Commiision has published anNannual report on the college going rates of ,

0Californie high, school graduates. The main focus of the report has°. i . , .

been the three segments of the California public collegb and urziNtirsitiest4 . . ,' systems--the University of California, Californja State University -ed

ofiege sy .stem, and the California Cmm,

-unity 'Collegs.' f addition,.

rates for accredited independent colleges and universities in the state of California have been obtained yearly since 1977. Data for the various tables and graphs in the report are obtained from the colleges' admissions offices rather than from student questionnaires. As a result, the report provides a very accurate indication of precisely which students were admitted and began attending the various post-secondary educational institutions.

The analyses have grown more detailed each year. Initially, rates were reported separately for men and women. Beginning in 1979 they have also been broken down by ethnicity. In the last two years, transfer rates for students who leave the community college system and enter the university or the state university and college system have become available as well. These are also broken down by gender and ethnicity. County-by-county aggregations are made as well as overall statewide totals.

Results. The commission's report contains a number of interesting figures and tables describing the various classifications of first-time college freshmen and high school seniors in California. One table of particular interest for EOPS relates to the ethnic distribution of 1979 graduates and first-time freshmen in California. The most striking difference is the increasing proportion of minority students who are admitted as one moves from the University of California to the California State University and College System, and finally to the California Community Colleges System. For example, 3.3% of the men and 5.4% of the women admitted to the University of California in 1979 were Black. The same figures for the California State University and College System are 6.8% and 10.4% respectively, while at the California Community College

System, 10.5% of the men and 10.8% of the women who were admitted for the first time in 1979 were Black. The reverse pattern holds if one examines the percentages of the various freshmen classes that were White. At the University of California roughly 74% of the freshmen were White, while in the California State University and College System the percentage dropped to roughly 69%, and in the Community College System to a similar 69%. The pattern for Hispanic and American Indian freshmen follows almost exactly the ratios displayed among Black freshmen. (One must use the figures from the California State University and College System with some caution, however, because fully 30% of the ethnic data from this source was missing. No explanation of this large amount of missing data is provided in the summary report. However, missing data is not a problem with the University of California or the California Community Colleges.)

In 1979, the report also included valuable information about students who transfer from the community colleges to either the University of California or the California State University and College System. In fact, Knoell presents the 15-year trend in community college transfer students compared with the total number of first-time freshmen. Since 1975 the number of community college students transferring to U.C. has declined from about 8,000 to 5,690. During the same period the number transferring to CSUC declined from about 35,000 to 10,000. During this period the entering freshman classes at UC and CSUC remained about the same size overall. Thus the transfers for the last five years have represented an ever so slightly decreasing percentage of new admissions for the two post-secondary institutions.
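The arithmetic behind that observation can be made explicit. A minimal sketch in Python, using only the transfer counts cited above (computing an admission-share figure would additionally require the freshman class totals, which are not reproduced here):

```python
# Percentage decline in community college transfer counts, using the
# figures cited above from the Knoell report (1975 vs. most recent year).
transfers = {
    "UC":   {"start": 8_000,  "recent": 5_690},
    "CSUC": {"start": 35_000, "recent": 10_000},
}

for system, n in transfers.items():
    change = (n["recent"] - n["start"]) / n["start"]
    print(f"{system}: {n['start']:,} -> {n['recent']:,} ({change:.0%})")
# UC: 8,000 -> 5,690 (-29%)
# CSUC: 35,000 -> 10,000 (-71%)
```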

"-91.

Page 89: DOCUMENT RESUME ED 206 362 JC 810 519 · 2014-02-18 · DOCUMENT RESUME: t ED 206 362 JC 810 519. AUTHOR. Alkin, Marvin C.: Stecher, Brian ti. TITLE Evaluating EOPS: A User Oriented

4tt

l8-6

., .

Also of interest is a breakdown of the 1979 transfers by ethnicity. While Black students represent 12.2% of the full-time community college enrollment, they represent only 3.3% of those who transferred to the University and 6.8% of those who transferred to the State University and Colleges. Similarly, Hispanic students are 12.1% of the overall community college enrollment but only 7.5% and 9.7% of those students who transfer to UC and CSUC respectively. It is only among Asians and Whites that this pattern is reversed. Thus it appears that while the community colleges are serving a proportionately larger percentage of the minority students, these students are not using the community college system as a stepping stone to another post-secondary institution.
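One way to quantify the disparity is a representation ratio: each group's share of transfers divided by its share of community college enrollment, so that a value below 1.0 indicates under-representation among transfers. A minimal sketch using the percentages cited above:

```python
# Representation ratios for 1979 transfers: (share of transfers) /
# (share of full-time community college enrollment). Values below
# 1.0 indicate under-representation among transfer students.
groups = {
    #           enrollment %, UC transfer %, CSUC transfer %
    "Black":    (12.2, 3.3, 6.8),
    "Hispanic": (12.1, 7.5, 9.7),
}

for name, (enroll, uc, csuc) in groups.items():
    print(f"{name}: UC {uc / enroll:.2f}, CSUC {csuc / enroll:.2f}")
# Black: UC 0.27, CSUC 0.56
# Hispanic: UC 0.62, CSUC 0.80
```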

There are many ways to interpret these numbers. Without further longitudinal study of students at the community colleges, it is difficult if not impossible to know what these disparities in transfer rates represent.

Implications for an EOPS Evaluation System. The data in the Knoell report provide valuable baseline information on what is happening overall to the student population in the various California post-secondary institutions. Transfer rates by gender and ethnicity are, in fact, available for each individual community college as well as for the county and statewide totals referred to above. However, while these data help us describe what has taken place at a statistical level, they do very little to answer the question of what the trends represent in terms of programs. In many respects, the data ask more questions than they answer. Because there is no identification of student goals nor of the particular program that a student was enrolled in, it is difficult to make any but the most general interpretations.


At present, there is no coding to identify a student as a participant in EOPS. It is not required as part of most college application processes, and it is not one of the mandatory pieces of information that is reported to the California Post-Secondary Education Commission. However, this situation may be changing within the next two years. Our conversation with individuals at CPEC indicates that they are very interested in obtaining an EOPS identifier on newly admitted college transfer students in the near future. They hope to expand their current data base in many ways, and the EOPS identifier is high on the list of additional data they would like to collect. This addition would provide the most accurate indication of EOPS transfer rates available from any source.

The data bank from which College Going Rates in California is compiled is available to qualified users. Thus, it would be possible to access this highly relevant and accurate data source in an EOPS evaluation system. There is one drawback, however: it takes CPEC quite some time to collect, verify, and clean the data. Colleges are asked to provide information by January 1st on the class that entered the previous fall. It takes some time for these data to be screened and verified. In fact, two years ago the final data tapes were not available until September of the following year, and for the current year (1980-81) they were not available until this summer. Thus, for example, a student who completed an EOPS program in June and transferred to the California State University and College System in the fall would not show up in the CPEC data base as a verified transfer student until the following summer.


While it may be two or three years before EOPS identifiers are adopted by CPEC, and while there is an inconvenient time lag involved in obtaining accurate data on community college transfer rates, the CPEC data base from which the Knoell report is derived appears to be a very promising source of information in a future EOPS evaluation system.

The Study of Extended Opportunity Programs and Services in California Community Colleges

Description. In 1975, the Evaluation and Training Institute conducted a study designed to provide the first "external" audit, or evaluation, of EOPS in its statewide system of community colleges. James W. Trent of UCLA was director and Ronald W. Farland, then a full-time professional associate of ETI, was associate director. EEA has commissioned Trent to write a brief summary of that study and consider its implications for an ongoing evaluation of EOPS. (See Appendix B.)

Information about EOPS. The data base for the evaluation was comprehensive, including the following: (1) extensive, carefully pre-tested surveys of statistically representative (mostly random) samplings of all constituent groups in the 93 participating colleges--EOPS students, non-EOPS students (for comparative, "control" purposes), faculty, administrators and chairs of advisory committees; (2) institutional and program characteristics obtained from a "Basic Data Sheet" designed by the Project staff and reports submitted by the colleges to the Chancellor's Office; and (3) case studies based on thorough, accreditation-type site visits of 12 colleges, representing the spectrum of California's community colleges.


Several important strategies were executed which helped to assure that the data base described above was a rich one: the college presidents appointed their own college liaison officers to assist the Project staff; a series of regional workshops was held for the Project staff; liaison officers met regularly with the statewide EOPS Advisory Committee; and pre-site visits were made and site-team orientations were held to assure that all elements of the case studies would run smoothly. These strategies without question contributed to remarkable results. The survey response rates ranged from 70% of the EOPS students to 94% of the EOPS directors. Almost all of the data sheets and other documents requested were received. The site visits went even better than planned. On the other hand, the project was constrained by typical limitations of time and fiscal resources.

Relevance for an EOPS Evaluation System. The study suggested a number of points concerning procedures, criteria and issues that may well deserve consideration in future evaluations of EOPS. A comment on several major points follows. Trent notes in the Appendix that among the procedural points most commonly mentioned was the lack of key definitions, especially operational definitions--e.g., eligibility for EOPS--that would contribute to data validity. A series of multivariate analyses in the study clarified that there is no one measure of "success" of EOPS. Instead, there are a variety of criteria deserving consideration. This has also been noted in this EEA study and is reflected in the user group questions which have been posed.

Trent notes that a number of other points to be derived from the study pertain to the role of the Chancellor's Office in the evaluation of EOPS. These include: (1) the need for greater, more informative, more prompt and more responsive consulting; (2) the need for much more prompt and meaningful feedback of collective and comparative information; (3) a definite reduction in, and justified rationale for, the data reports required of the colleges; and (4) greater statewide coordination of all these efforts. Certainly, these points must be attended to in the evaluation system that we will devise.

Three concluding points are noted by Trent as emerging from the evaluation. First, ongoing evaluations should reflect appreciation for the diverse ways and settings--and the quality of their diversity--through which California's community colleges carry out their program and overall missions. Second, evaluations of the kind under consideration should include a systematic assessment of the larger organizational and community environments in which the evaluated programs operate. A third point noted--more difficult and perhaps not relevant to our work--is that legislation for such programs as EOPS must, itself, be continuously evaluated for its appropriateness in light of changing economic and social conditions.


Chapter VI

EOPS EVALUATION SYSTEM RECOMMENDATIONS:

DATA COLLECTION


Having surveyed the available sources for evaluation data and the expressed needs of various evaluation users--and the coincidences and discrepancies between the two--we now describe data collection for the proposed evaluation system. The proposal derives from all of the sources we have surveyed (Chapters II-V), as well as from valuable conversations with people in the Chancellor's Office, in the California Post-Secondary Education Commission, at individual community colleges, and with other evaluators.

The proposed evaluation system utilizes many of the existing data sources--some with modifications--and includes two new surveys of students. These data bases are summarized in Table V, and the correspondence to the six evaluation questions is shown.

Information for question 1 is generally available from various of the USRS segments, along with some recommended additions to USRS-EOPS. Similarly, in question 1a, most of the relevant information is already provided by USRS; we recommend that certain supplemental data be made available by expanding USRS to include the collection of annual data on all students.

Question 2 will be based on information from Form 10 and supplemental data from the recommended expansion of the Operational Program Review. Comparison of EOPS services with those offered in the total college (question 2a) has been dropped because appropriate data cannot be made available in a feasible manner at this time. However, we feel this is a potentially important question that might be addressed in some manner in the future.


TABLE V

THE EOPS EVALUATION SYSTEM:
PROPOSED AND CURRENTLY AVAILABLE INFORMATION

1. What are the background characteristics of the population served by EOPS?
     Relevant information:      Prior school experience (high school diploma/
                                certificate, high school GPA); age; sex; race;
                                student financial need
     Available and useable(1):  USRS-census & annual; USRS-EOPS (financial
                                need)

1a. How do these compare with non-EOPS students?
     Relevant information:      Prior school experience (high school diploma/
                                certificate, high school GPA); age; sex; race
     Available and useable(1):  USRS-census
     Proposed additions/
     revisions:                 USRS-annual

2. What are the elements of the EOPS Program?
     Relevant information:      Community needs; activity goals; program
                                subcomponents; number and type of activities
                                offered; organizational structure
     Available and useable(1):  Form 10(2)
     Proposed additions/
     revisions:                 Operational Program Review (OPR)

2a. How do these compare with non-EOPS students?
     DELETE--judged to be beyond the limits of the evaluation system at
     this time.

3. What level of services are provided in EOPS programs?
     Relevant information:      Estimated and actual number of students
                                served per component; instructional support
                                activities; tutoring hours; counseling hours;
                                administration activities; other activities;
                                financial aid to students; work-study grants
                                to students
     Available and useable(1):  Form 10(2); USRS-EOPS; Form A-1
     Proposed additions/
     revisions:                 Form A-1 revised; USRS-EOPS revised

4. How are resources expended?
     Relevant information:      Expenditure per component (100-900); projected
                                cost per component (100-900); projected cost
                                per activity (37 levels); district contribution
     Available and useable(1):  Form 10(2); Form A-1
     Proposed additions/
     revisions:                 Form A-1 revised

5. How successful are EOPS students during their tenure at the community
   college?
     Relevant information:      Students' own academic goals; success in
                                terms of their own goals; success relative
                                to: units attempted, units completed, weekly
                                student contact hours, positive attendance
                                enrollment, student level, academic standing
                                at beginning and end of term, annual GPA,
                                cumulative GPA, drop-out/retention rate,
                                AA or AS certificate attained
     Available and useable(1):  USRS-census & annual; USRS-EOPS
     Proposed additions/
     revisions:                 Supplemental Survey; Longitudinal Study;
                                USRS-EOPS revised

5a. How do these compare with non-EOPS students?
     Relevant information:      The same success measures as in question 5
     Available and useable(1):  USRS-census; USRS-annual

6. How successful are EOPS students when they leave the community college?
     Relevant information:      Success in terms of their own goals; success
                                relative to: job placements, transfer to
                                four-year institutions, success after transfer
     Proposed additions/
     revisions:                 Longitudinal Study; CPEC (if planned
                                revisions are made)

6a. How do these compare with non-EOPS students?
     Relevant information:      Success relative to: transfer to four-year
                                institutions, success after transfer
     Proposed additions/
     revisions:                 CPEC (if planned revisions are made)

(1) Currently available and useable with only minor modifications or
improvements in data access.

(2) We suggest that a supplemental analysis be conducted to determine the
correspondence between budget levels used in program planning on Form 10
and actual expenditures as reported on Form A-1. Form 10 data may not be
useable if this correspondence is low.



Data for question 3 are currently available from USRS, Form 10, and Form A-1, with some minor revisions recommended for USRS to make the information more meaningful. Question 4 will be answered at a fairly broad level by elements gleaned from Form A-1 and Form 10. The academic success data for question 5 are available from USRS and from a new Supplemental Survey of EOPS students focused on students' own academic goals. To answer question 5a we are recommending that the USRS annual survey be expanded to include all students. Data for question 6 may be available through the California Post-Secondary Education Commission (CPEC), but the evaluation system relies most heavily on newly proposed EOPS longitudinal studies.

For the remainder of the chapter, we will describe these modifications in greater detail, considering USRS, Form A-1, the CPEC data base, and OPR. Additionally, we will suggest adding the Supplemental Survey and the Longitudinal Study of EOPS students to the evaluation system. Finally, we will describe a practical data collection timetable.

Delineation of Data Sources: Additions, Deletions, Modifications

When we began our investigation of the existing EOPS information network, we reviewed each of the six data sources for reliability, validity, and accessibility, and we asked how successfully they answered the evaluation needs established by the study groups. Two goals directed our work: to develop a data collection system that produced information necessary for answering as many evaluation questions as possible, and to do so frugally and efficiently. Thus, we have retained (with some


changes) five of the original six data sources. In addition, we have added two new instruments which provide important unduplicated information. These seven data sources--and the evaluation questions they address--will be described in the next sections.

USRS. Three distinct sub-systems of USRS provide useful information for us.

The USRS census data system is completed once each term on all students enrolled in California community colleges.

The USRS-EOPS student data survey is conducted once each year on EOPS students only.

The USRS annual student data segment is administered annually on EOPS, handicapped, and vocational education students only.

The USRS-Census gathers information about each student's background characteristics, academic level, and current course load. Of the 20 items, 18 are required. (Because experience predicts a lower response on optional items, we will not use them.) Of these 18, six are useful in their present form as measures of background characteristics (questions 1 and 1a). Three others provide information about a student's academic activities--but only very incomplete information. They measure course enrollment, but not course completion.

Table VI lists all 20 USRS-Census data items, noting which are required. The nine items we can use are cross-referenced to specific evaluation questions.

The USRS-EOPS attempts to assess student status and EOPS program impact at the end of the academic year, rather than to measure on-going service levels. All 20 items of this assessment provide some information toward answering the evaluation questions. To be sure, some items produce redundant information and others require revision. But on the whole, the USRS-EOPS is very useful.


TABLE VI

CENSUS STUDENT DATA SYSTEM

Data Element Name                             Element     Required   Related Evaluation
                                              Number(1)              Question #

College Code                                  X1          X          -
Report Period                                 X3          X          -
Record Number                                 X2          X(2)       -
Birthdate                                     S1          X          Q1 and 1a
Sex                                           S10         X          Q1 and 1a
Racial/Ethnic Code                            --          X          Q1 and 1a
Citizenship                                   --          X          Q1 and 1a
Residence Code                                S9          X          Q1 and 1a
High School of Last Attendance                S6          X          -
College of Last Attendance                    S3          X          -
High School Education                         S5          X          Q1 and 1a
Enrollment Status                             S4          X          -
Student Level                                 S12         X          -
Student Goal                                  S11                    -
Student Major                                 S13                    -
Positive Attendance Enrollment                S7          X          Q5 and 5a
Weekly Student Contact Hours (WSCH)           S17         X          Q5 and 5a
Total Potential Hours of Attendance (TPHA)    --          X          -
Units Attempted                               S15         X          Q5 and 5a
Veteran's Aid Status                          S16         X          -

(1) From the Chancellor's Office Information System Data Element Dictionary, May 1981.
(2) Required only when submitting data on 80-column cards.



Five items measure the students' academic success (question 5). One item, which measures students' financial need for the term, is relevant to assessing background characteristics (question 1). Two other items measure levels of academic support services--counseling and tutoring (question 3); one item has to do with previous EOPS experience. Eleven items measure financial aid, distinguishing grants, scholarships, aids, and work-study. For our purpose, 11 separate figures are unnecessary. One grant total would do.
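Mechanically, the consolidation we have in mind is trivial. A minimal sketch (the field names are shortened stand-ins for the Table VII aid elements, and the sample record is hypothetical):

```python
# Collapsing the separate USRS-EOPS financial aid fields into one
# total, as proposed. Field names follow Table VII loosely; the
# sample record is hypothetical.
AID_FIELDS = ["eops_grant", "beog_grant", "seog_grant", "ndsl",
              "cog", "scholarship", "other_financial_aid"]

def total_aid(record):
    """Sum all grant/aid fields, treating missing values as zero."""
    return sum(record.get(field, 0) or 0 for field in AID_FIELDS)

student = {"eops_grant": 600, "beog_grant": 1_200, "scholarship": 250}
print(total_aid(student))  # 2050
```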

Table VII lists the 20 items of USRS-EOPS and, again, cross-references them to the evaluation questions. We also mark the items that need addition or revision with an asterisk. (Table IX gives a concise compilation of these additions and revisions. Appendix C presents each revised item and the full text of each revised data element dictionary page.)

The USRS-annual student data segment consists of the same 20 items as the USRS-Census, but here annualized to provide, in the words of the Chancellor's Office, "demographic and cumulative workload measures to support the information in . . . the EOPS student data system." This annualization gives us a baseline against which the yearly USRS-EOPS data can be interpreted. In addition to small modifications in the USRS-annual survey, we recommend that a similar USRS survey be administered to all students; without such an addition, it is not possible to answer the two comparative evaluation questions (questions 1a and 5a).


TABLE VII

EOPS STUDENT DATA SEGMENT

Data Element Name                             Element     Required   Related Evaluation
                                              Number(1)              Question #        Revision

EOPS Status                                   --          X          -
Academic Standing, Beginning                  E1          X          Q5 and 5a         **
Academic Standing, Ending                     E2          X          Q5 and 5a         * **
Units Completed                               E20         X          Q5 and 5a         * **
Total Financial Need                          E17         X          Q1
EOPS Grant                                    E8          X          Q3
BEOG Eligibility Status                       E4          X          Q3
BEOG Grant                                    E3          X          Q3
SEOG Grant                                    E16         X          Q3
NDSL                                          E12         X          Q3
COG                                           E5          X          Q3
Scholarship                                   E15         X          Q3
Other Financial Aid                           E14         X          Q3
EOPS State Funded Work Study Money Earned     E9          X          Q3
Non-EOPS State Funded Work Study
  Money Earned                                E13         X          Q3
Total Work Study Hours Worked                 E18         X          Q3
GPA for Academic Year                         E11         X          Q5 and 5a         **
Cumulative GPA                                E7          X          Q5 and 5a         **
Tutorial Hours                                E19         X          Q3                *
Counseling Hours                              E6                     Q3                *

(1) From the Chancellor's Office Information System Data Element Dictionary, May 1981.
* Revision recommended (see Table IX and Appendix B).
** Addition of an annual measure of this element on all students is recommended (see Table IX and Appendix B).


In Table VIII we note the pertinent items of USRS-annual, cross-reference them to the evaluation questions, and indicate where additions and revisions are recommended.

Form A-1. The final EOPS claim form, submitted at the end of the academic year, presents a complete accounting of all program expenditures reported by object class code. Expenditures are classified in three categories: (a) program development and maintenance; (b) student services; and (c) direct payment to students. This is minimally adequate information for our needs (questions 2 and 4). However, we recommend that Form A-1 be modified to require reporting by component (100-900) rather than just the three categories. In particular, this would provide both expenditure totals and number of students served in the nine different program areas designated on the original program application. In addition, we recommend that attention be paid to replacing the present non-automatic analysis procedure with a system for coding and analyzing Form A-1 data quickly in order to avoid evaluation delays.

CPEC Data Base. CPEC (the California Post-Secondary Education Commission) is responsible for collecting and reporting data on students enrolled in California colleges and universities. A partial list of the data elements currently maintained on Commission tapes is presented in Appendix D. This data base contains one item potentially very important for EOPS evaluation--transfer rates of community college students to four-year institutions in the state (questions 6 and 6a). Unfortunately, CPEC currently lumps all transfer students together. Thus it is not now possible to compare EOPS transfer rates to overall rates. However, among several proposed additions to the CPEC data base is an EOPS identification element.


TABLE VIII

ANNUAL STUDENT DATA SEGMENT

Data Element Name                             Element     Required   Related Evaluation
                                              Number(1)              Question #        Revision

College Code                                  X1          X          -
Report Period                                 X3          X          -
Record Number                                 X2          X          -
Birthdate                                     S1          X          Q1
Sex                                           S10         X          Q1
Racial/Ethnic Code                            --          X          Q1
Citizenship Code, Annual                      SA4         X          Q1
Residence Code, Annual                        SA6         X          Q1
High School of Last Attendance                S6                     -
College of Last Attendance                    S3                     -
High School Education                         S5                     Q1
High School GPA (new item)                    --                     Q1 and 1a         *
Enrollment Status, Annual                     --          X          -
Student Level, Annual                         SA7         X          Q5 and 5a         **
Student Goal                                  S11                    -
Student Major                                 S13                    -
Positive Attendance Enrollment, Annual        SA5         X          Q5 and 5a         **
Annual Average Weekly Student Contact Hours   SA2         X          Q5 and 5a         **
Actual Hours of Attendance                    SA1         X          -
Units Attempted, Annual                       SA8         X          Q5 and 5a
Veteran's Aid Status                          S16         X          -

(1) From the Chancellor's Office Information System Data Element Dictionary, May 1981.
* Revision recommended (see Table IX and Appendix B).
** Addition of an annual measure of this element on all students is recommended (see Table IX and Appendix B).


TABLE IX

PROPOSED CHANGES TO USRS:
REVISIONS, ADDITIONS & DELETIONS

USRS                                                                                            Relevant to
Segment   Title                              Item #          Change                             Evaluation Question #

annual    high school GPA                    (new item)      add this item                      Q1

EOPS      tutorial hours                     E19             revise to separate EOPS            Q3
                                                             and non-EOPS hours
EOPS      counseling hours                   E6              revise to separate EOPS            Q3
                                                             and non-EOPS hours
EOPS      units completed                    E20             revise into same format as         Q5
                                                             units attempted--item SA8
EOPS      academic standing at end of term   E2              revise to include AA/AS            Q5
                                                             if no longer enrolled

annual    high school GPA                    (new item)                                         Q1a
annual    units completed, annual            E20 (revised)                                      Q5a
annual    weekly student contact hours       SA2             expand the annual survey           Q5a
annual    positive attendance enrollment     SA5             to include all students            Q5a
annual    academic standing, beginning       E1              (including these items             Q5a
annual    academic standing, end             E2 (revised)    as a minimum set)                  Q5a
annual    annual GPA                         E11                                                Q5a
annual    cumulative GPA                     E7                                                 Q5a
annual    student level                      SA7                                                Q1a


We have been told in conversation that this new item has high priority. Under the assumption that EOPS identifiers will be added to CPEC, we are relying on this source as the primary basis for assessing transfer rates. In the event that this change does not take place, it will be possible to obtain similar information through the Longitudinal Study, though this is a slower and more limited procedure.

Operational Program Review. The OPR we foresee will be a multiple-day, on-site, team review of each EOPS program. The team(s) will review approximately 20-25% of the colleges each year on a rotating basis. The OPR will address two evaluation concerns: compliance with existing regulations and effectiveness of program activities. The narrative descriptions of planned program activities submitted as part of the program application (Form 10) will be useful in guiding parts of this review. This OPR program activities review will be directly responsive to question 2. In addition, we would encourage expanding OPR's scope to gain as much first-hand information as possible about how EOPS programs are being conducted and about the effectiveness of support services to EOPS students (question 2). To strengthen the overall review process, we would suggest that concurrent informal, open-ended interviews with EOPS students and staff be added. Potential interviewees in critical program areas and responsible student spokespersons might be identified by administering a brief questionnaire to all EOPS students and staff shortly before the OPR visit. In addition, these questionnaire responses would provide a rich source of qualitative data about the program.


Finally, an overall aggregate report on all EOPS programs reviewed during the year should supplement the individual school reports. Guided by question 2, Chancellor's Office staff or an external contractor could easily compile this summary analysis.

EOPS Form 10. The Annual Program Application (Form 10) will provide much of the data relating to the elements of the EOPS programs (questions 2, 3, and 4). However, because Form 10 data represent projections rather than actual expenditures, they will be useful only if there is a high correspondence between the actual distribution of funds and the projected budgets. During our early analysis, we proceeded on the assumption that there was a high correspondence.

At the same time, though, we tested that assumption. With the assistance of Jennifer Franz, we compared the projections of Form 10 to the annual expenditure reports of A-1 in a small sampling of colleges for 1979-80. The discrepancies were so large in the sample that they would render Form 10 useless for EOPS program evaluation. (The full sub-study analysis is presented in Appendix F.) Beginning in 1980-81, however, a new planning procedure was implemented, and the current program applications are based on reasonable projections of actual allocations.

These conflicting reports on Form 10 leave some doubt in our minds. While we have included Form 10 as a part of our evaluation system, we suggest a similar analysis be done on the correspondence between Form 10 and Form A-1 for the 1980-81 school year.
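Such an analysis could take the following shape. A sketch only, with hypothetical component figures and a simple correlation used as the correspondence measure (the actual sub-study method appears in Appendix F):

```python
# Correspondence between Form 10 budget projections and Form A-1
# actual expenditures for one college, by program component (100-900).
# Figures are hypothetical; the statistic is a plain Pearson
# correlation across components. Requires Python 3.10+.
from statistics import correlation

components = [100, 200, 300, 400, 500, 600, 700, 800, 900]
form_10 = [40_000, 12_000, 55_000, 30_000, 8_000, 3_000, 9_000, 4_000, 90_000]
form_a1 = [38_500, 15_000, 51_000, 33_500, 6_000, 2_500, 8_000, 5_500, 88_000]

r = correlation(form_10, form_a1)
print(f"projection/expenditure correlation: r = {r:.3f}")
# A low r across many colleges would argue against using Form 10
# data in the evaluation, as the 1979-80 sub-study found.
```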

The Supplemental Survey of EOPS Students. We have designed the Supplemental Survey to provide new information about student goals and academic plans (question 5). Based on the Statewide Longitudinal Study (of Chapter V), our survey uses Sheldon and Hunter's student prototypes. Students will classify themselves according to their academic goals. (A draft of the questionnaire will be found in Appendix E.) This brief survey, which will be administered each fall, will take only five minutes to complete. We suggest that a random sample of EOPS students at each college be selected to receive the Supplemental Survey, thus reducing bother without reducing validity or generalizability.
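Drawing such a sample is straightforward; a minimal sketch, with a sampling fraction chosen arbitrarily for illustration since we do not fix one here:

```python
# Simple random sample of EOPS students at one college for the
# Supplemental Survey. The 25% fraction is illustrative only.
import random

def draw_sample(student_ids, fraction=0.25, seed=None):
    """Sample without replacement; at least one student is chosen."""
    rng = random.Random(seed)
    k = max(1, round(len(student_ids) * fraction))
    return rng.sample(student_ids, k)

eops_roster = [f"S{n:04d}" for n in range(1, 401)]   # hypothetical IDs
print(len(draw_sample(eops_roster, seed=1981)))      # 100
```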

Longitudinal Study. We have designed the Longitudinal Study to collect data on students' long-term activities which reflect their participation in EOPS programs. We suggest a 10 or 15 campus sub-sample, selected from students who are already participating in the Supplemental Survey.

Students will participate in the study for four or five years. They will receive an annual questionnaire each spring designed to monitor their academic activities and assess any changes in their personal and academic goals. The questionnaire will incorporate a branching structure based on student orientation: each goal category in the Supplemental Survey would have a unique set of follow-up questions relevant to it.
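The branching structure amounts to a simple lookup from goal category to follow-up questions. A sketch, with both the category names and the question wording invented as placeholders (the real instrument would use the Sheldon-Hunter prototypes):

```python
# Branching questionnaire: each goal category keys its own follow-up
# questions. Categories and wording are hypothetical placeholders.
FOLLOW_UPS = {
    "transfer":   ["Did you apply to a four-year institution this year?",
                   "Which campuses did you apply to?"],
    "vocational": ["Are you employed in the field you trained for?",
                   "Did a certificate help you obtain your current job?"],
    "undecided":  ["Have your educational goals changed this year?"],
}

def questions_for(goal_category):
    common = ["Are you still enrolled at a community college?"]
    return common + FOLLOW_UPS.get(goal_category, [])

for q in questions_for("transfer"):
    print(q)
```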

The Longitudinal Study will provide information about how academic and career aspirations change, what affects these decisions, and what happens to students after EOPS (whether they finish or drop out). It will also help to answer salient questions about the community colleges' mission and how that mission prescribes EOPS services.

Because community college students are a highly mobile group, we can safely predict that the Longitudinal Study poses the greatest difficulties (cf. the problems of Sheldon and Hunter or Preising reported in Chapter V). Therefore, we suggest delaying the study for a year to allow more planning and coordination with the EOPS offices at the community colleges. It might be worthwhile to expand the survey we have suggested--the Statewide Longitudinal Study illustrates a number of additional issues that might be addressed.

Reporting Schedule of EOPS Data Sources

Figure 1 illustrates the data collection timetable for the seven sources in the proposed EOPS evaluation system, indicating two critical dates. The triangle denotes the date of data collection; the circle denotes the date on which analyzed data might be available for use in the evaluation system. The line between triangle and circle represents the time required for coding and processing the data.

USRS-EOPS and USRS-annual data segments, which reflect information from the previous year, are collected in the fall and submitted to the Chancellor's Office in October. Verification and confirmation of the data require two to three months, and the final totals should be available in December. (Assuming that the recommended changes in USRS-annual will be made, we have not included USRS-Census data in the timetable.)

EOPS Form A-1. Expenditure totals for the academic year are reported on EOPS Form A-1 and submitted to the Chancellor's Office soon after the spring term. These data are presently analyzed by hand, a process which delays any evaluation based on A-1 data until the next fall.

[Figure 1 appears here: Timeline for Data Collection. For each source--USRS-annual, EOPS Form A-1, Form 10, OPR, the Supplemental Student Survey, the Longitudinal Study (Parts I and II), and CPEC--a triangle marks the date of data collection and a circle the date on which coded and analyzed data become available across the academic year.]

CPEC Data Base. The California Post-Secondary Education Commission collects enrollment data from all three elements of the California college system--UC, CSUC, community colleges--during the winter for students who enrolled during the fall. Thus, all students who transfer from EOPS to four-year schools after one year's fall data collection will show up in the CPEC data base as new transfers in the next year's winter reporting. Theoretically, the final data on these transfers are due at the CPEC office by January 1. Practically, however, the last two years' deadlines have not been met. Coupled with a period of up to eight months for putting data in an accurate form, this suggests that one might not be able to access information on a transfer student for a year or more after transfer.

OPR. Since individual operational program reviews are scheduled for two-day to four-day periods throughout the academic year, the OPR exit interviews and final reports for individual schools will be available at various times from October to June. An overall summary, however, could not be compiled until summer and would not be available until late summer.

EOPS Form 10. Program applications are developed in the spring prior to the new academic year. Final Form 10s are not submitted until after the new state budget is approved on July 1. Thus, the program descriptive narratives are available for use on a one-by-one basis by early fall, but the computer-analyzed fiscal section will not be available until later in the fall.

Supplemental Survey of EOPS Students. The Supplemental Survey questionnaire we are recommending will be administered to a sample of EOPS students in the fall only. The data will be transmitted to the Chancellor's Office in October and available for evaluation system use after a month or two.


Longitudinal Study. The Longitudinal Study we propose will require contact with students twice a year--fall and spring--during their tenure at the community college, and an annual follow-up after they leave. Though the complete longitudinal analysis will take four or five years and should be reported in five-year cycles, the questionnaires yield annual information as well. After the spring data collection is analyzed, a longitudinal study report will be issued each summer.

Now that the limits imposed by the data collection timetable are understood, we can turn to the evaluation reports themselves and the analyses that will produce them.


Chapter VII

EOPS EVALUATION SYSTEM RECOMMENDATIONS:
REPORTING & ANALYSIS

In this chapter we will recommend a multiple report evaluation system for EOPS. We will describe six evaluation reports: which data sources inform them, which evaluation needs they respond to, how their analyses are to proceed, and when they should be available.

Multiple Report System.

Two major factors contribute to the practicality of a multiple report system. First, the evaluation questions analyzed in Chapters III and IV address EOPS on three levels: the individual EOPS student, the individual college EOPS program, and EOPS systemwide. The natural division argues for separate, smaller documents.

Second, the evaluation data operate on different--and irreconcilable--timetables. It seems to us totally foolish to wait several months for a complete report when useful data is ready and waiting. Even though the fact that relevant data for any given academic year are currently collected over a two-year period dictates a two-year evaluation period, we recommend making information available in separate documents as rapidly as possible.

The six reports we suggest are:

the College Characteristics Report
the OPR Report
the Program Oriented Report
the Longitudinal Study Report
the Student Oriented Report, and
the Program-Student Interaction Report

The reporting timetable is presented in Figure 2.

The College Characteristics Report. The Supplemental Survey of EOPS students will serve as the basis for the College Characteristics (CC) Report. In the survey, students classify themselves as one of seventeen prototypes according to academic and career goals (question 5). The CC Report will then abstract a collective profile of the EOPS population at each community college, which will prove a valuable tool for assessing the correspondence between the needs of EOPS students and the services provided by the various EOPS programs.

The collective profiles will be produced by tabulating the Supplemental Surveys and presenting frequency counts, converted to percentages for easy comparison. Graphs representing the distribution of student types at each school and systemwide will make it possible to compare the profile of each campus to the overall profile of EOPS students. In addition, schools will be clustered according to the similarity of their student profiles. College membership in similar clusters will be useful in further analyses, particularly in the Program Oriented Report and the Program-Student Interaction Report.
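The tabulation itself is elementary. A minimal sketch, with hypothetical prototype labels standing in for the seventeen Sheldon-Hunter categories:

```python
# College profile from Supplemental Survey self-classifications:
# frequency counts converted to percentages, comparable to the
# systemwide distribution. Labels are hypothetical placeholders.
from collections import Counter

def profile(responses):
    counts = Counter(responses)
    total = len(responses)
    return {proto: 100 * n / total for proto, n in counts.items()}

college = ["transfer", "transfer", "vocational", "undecided", "transfer"]
print(profile(college))
# {'transfer': 60.0, 'vocational': 20.0, 'undecided': 20.0}
```

Schools whose percentage profiles lie close together under any standard distance measure would then fall into the same cluster.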

When the Supplemental Survey is collected in October, it will be sent to the Chancellor's Office for analysis. The large number of student questionnaires may slow down data analysis. In any case, the College Characteristics Report should be drafted and edited by winter; a February delivery date seems reasonable.

The OPR Report. OPR teams will provide individual school reports throughout the year (cf. Chapter VI). By OPR Report, we refer to a summary of all the college reports written during the year.


[Figure 2 appears here: Timeline of Proposed Evaluation Reports, showing delivery dates across the academic year for the College Characteristics Report, OPR Report, Student Oriented Report, Longitudinal Report, Program Oriented Report, and Program-Student Interaction Report.]


The OPR Report summarizes systemwide program characteristics and activities, focusing on personnel and organizational structures in the various EOPS programs (question 2).

The first portion of the OPR Report will focus on the several sources of easily quantifiable data, such as the "parameters" documents, which are checklists that indicate which services are incorporated into each EOPS program. However, we do not view this analysis as of most value. Instead, we would prefer to see the first part of the report focused more heavily on data which show the correspondence between actual program characteristics, activities, structures, etc., and those put forth in the Form 10 application. In addition, descriptive statistics of systemwide program characteristics will also be derived from Form 10.

The second portion of this report is more difficult to produce. The data on which it will be based include personal impressions, narrative descriptions and prose analyses, which are more difficult to aggregate. The analyses of these will be guided by question 2. The analyses will be further guided by recurring observations in the various individual OPR college reports, and an attempt to determine whether particular program elements correlate with these recurring patterns. Finally, the report will investigate "outliers"--programs that received unusually strong positive or negative comments--in an attempt to identify key features that contributed to these striking impressions.

The annual OPR site visits will occur between October and May. The summary OPR Report should be available in September.

The Program Oriented Report. The Program Oriented (PO) Report will draw on data from Form A-1, Form 10, and the College Characteristics Report to compare actual expenditures and the collective student


profiles at each school. The report will not only illuminate how well individual EOPS programs meet student needs (questions 3 and 4), but also permit intercampus comparisons of program service levels.

The report depends on coding the A-1 expenditure records from each school for computer analysis or analyzing the data by hand. Several variables from each school will be reported, including total expenditure per category, expenditure by object class code (aggregated to the 1000 level) per category, financial aid to students, and district contributions to coordinated services. In addition, a systemwide aggregation of these variables will be made. Charts and tables will facilitate comparison of college expenditure patterns.
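The coding step amounts to rolling expenditure records up to the 1000 level. A minimal sketch, with hypothetical records and object class codes:

```python
# Aggregating Form A-1 expenditure records to the totals the Program
# Oriented Report would display. Records are hypothetical; codes are
# aggregated to the 1000 level as described above.
from collections import defaultdict

records = [  # (college, object_class_code, amount)
    ("College A", 1100, 42_000), ("College A", 1200, 8_000),
    ("College A", 2300, 15_500), ("College B", 1100, 37_000),
]

totals = defaultdict(float)
for college, code, amount in records:
    totals[(college, code // 1000 * 1000)] += amount

for (college, code_group), amount in sorted(totals.items()):
    print(f"{college}  {code_group}: ${amount:,.0f}")
```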

The College Characteristics Report should be ready in February, but the Form A-1 data will not be available until the end of summer. Thus, we suggest that the PO Report be completed by October.

The Longitudinal Report. The Longitudinal Report will summarize the activities--jobs, promotions, transfers--of EOPS students as they depart from the program (question 6). It will also provide useful information about students' changing perspective on their school experience and their academic and career goals (question 5).

The data from the Supplemental Survey will serve as a baseline for comparing students' plans and goals when they enter the program to their end-of-the-year (and following year) objectives. The comparison will tell us something about the patterns of personal expectations and progress toward goals.

The most important Longitudinal Study result will be information about students' success after they leave EOPS. Graphs could display various categories of endeavor and the relationship between stated goals


and post-college activities. Eventually, after the first cohort group finishes, it will be possible to investigate the relationship between background variables, program variables, academic achievement variables, and success outside the program.

The Longitudinal Study we propose will operate in a four- or five-year cycle, and this cycle should be repeated with a new group of students every four or five years. While it may take three or four years to see the first group through to post-EOPS program results, Longitudinal Reports will be issued every year, analyzing student perceptions about goals and participation in the program. The Longitudinal Report will be issued in the fall.

The Student Oriented Report. The Student Oriented (SO) Report will be based on data from USRS and the CPEC data base. It will analyze background characteristics (question 1) and academic success (question 5). The SO Report will be one of the most detailed of the six documents. We assume here that our proposed additions and revisions to USRS will be adopted, thus allowing for EOPS/non-EOPS student comparisons.

The SO Report will first compute systemwide descriptive statistics on all background variables for the EOPS and non-EOPS populations. Most of this analysis will require simple percentages, frequency counts and univariate statistics. Graphs or tables will display the distribution of EOPS and non-EOPS students by age, sex, financial need, high school GPA, etc. Individual college aggregations may be compared to the systemwide patterns.


The SO Report will also analyze data related to student success at the community college using descriptive statistics. Here too, graphs or tables will display the distribution of EOPS and non-EOPS students by units attempted, units completed, course enrollment levels, etc.
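A sketch of one such univariate comparison, on hypothetical data:

```python
# Comparing EOPS and non-EOPS students on one success measure
# (units completed), of the kind the SO Report would tabulate.
from statistics import mean, median

units = {
    "EOPS":     [12, 9, 15, 6, 12, 10],
    "non-EOPS": [15, 12, 9, 16, 13, 12],
}
for group, values in units.items():
    print(f"{group}: mean {mean(values):.1f}, median {median(values)}")
```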

Both USRS and CPEC submit their annual reports in the fall following the academic year (Chapter VI). Thus, assuming that they report on schedule and that CPEC includes an EOPS identifier, the SO Report will be ready in March following the academic year under review.

The Program-Student Interaction Report. The Program-Student Interaction (PSI) Report, the final document in the proposed evaluation report system, combines the analyses of the Program Oriented Report and the Student Oriented Report. The PSI, then, subsumes Form A-1, Form 10, the Supplemental Survey, USRS and the CPEC data base. The combination of all these data sources will make possible general analyses about how programs differ in response to students' needs and about how program characteristics affect student outcomes. Cross tabulations, correlations and other bivariate descriptive statistics are PSI's dominant analytic tools.

One set of analyses in the interaction report will determine the relationships between student entry characteristics and the types of EOPS services provided. For example, the Program-Student Interaction Report may provide data to answer the question: "Do EOPS programs differ between schools whose student populations have significantly different goal profiles?"

The second set of analyses will seek to determine if there are any links between program characteristics and student academic variables, such as course enrollment levels, units completed, GPA, awarding of certificates or degrees, etc. In some instances it may be possible to use multiple regression analyses to determine which program service variables contribute most to the variation in student outcomes. For example, the relationships between counseling hours, tutoring hours and number of units completed could be analyzed to determine the relative importance of these services to course success.
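A minimal sketch of that regression, using ordinary least squares on hypothetical student records (counseling hours, tutoring hours, and units completed):

```python
# OLS of units completed on counseling and tutoring hours -- a sketch
# of the multiple regression analysis suggested above, on
# hypothetical data.
import numpy as np

counseling = np.array([2.0, 5.0, 1.0, 8.0, 4.0, 6.0])
tutoring   = np.array([10.0, 3.0, 0.0, 12.0, 6.0, 9.0])
units      = np.array([9.0, 12.0, 6.0, 15.0, 11.0, 13.0])

# Design matrix with an intercept column.
X = np.column_stack([np.ones_like(counseling), counseling, tutoring])
coef, *_ = np.linalg.lstsq(X, units, rcond=None)
print(f"intercept {coef[0]:.2f}, counseling {coef[1]:.2f}, "
      f"tutoring {coef[2]:.2f}")
```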

As soon as the Student Oriented Report is completed, probably in March, evaluators can turn to the PSI Report. It could then be completed by April or May.

Postnote

This completes the recommendations for the EOPS evaluation system. We have tried to develop an evaluation mechanism that is responsive to the concerns of both the legislative and college study groups while working primarily within the existing data framework. We believe the proposals offered here, if implemented effectively, will provide the Chancellor's Office and the community college system with a powerful tool for program description, analysis and improvement.


APPENDIX A

ACTIVITY TYPES

1 --- Act. 110 - Project Administrative Functions
2 --- Act. 120 - Project Support Functions
3 --- Act. 130 - Management Information and Evaluation Functions
4 --- Act. 140 - Other Management Functions
5 --- Act. 210 - Recruitment Services Functions
6 --- Act. 220 - Early Outreach Functions
7 --- Act. 230 - Pre-entry Services Functions
8 --- Act. 240 - Consortial Outreach Functions
9 --- Act. 250 - Other Outreach Functions
10 --- Act. 310 - Curriculum and Course Development Functions
11 --- Act. 320 - Instructional Services Functions
12 --- Act. 330 - Tutoring Services Functions
13 --- Act. 340 - Other Instructional Functions
14 --- Act. 410 - Educational and Academic Counseling Functions
15 --- Act. 420 - Career and Vocational Counseling Functions
16 --- Act. 430 - Personal Counseling Functions
17 --- Act. 440 - Testing, Diagnostic, and Interpretive Functions
18 --- Act. 450 - Information and Advisement Functions
19 --- Act. 460 - Other Counseling Functions
20 --- Act. 510 - Transfer Transition Functions
21 --- Act. 520 - Employment Transition Functions
22 --- Act. 530 - Consortial Transition Functions
23 --- Act. 540 - Other Transition Functions
24 --- Act. 610 - Interagency Functions
25 --- Act. 620 - Consulting Service Functions
26 --- Act. 630 - Other Special Functions
27 --- Act. 710 - EOPS Eligibility Determination Functions
28 --- Act. 720 - Other Financial Aid Coordination Functions
29 --- Act. 810 - Project Staff Development and Training Functions
30 --- Act. 820 - College Staff Development and Training Functions
31 --- Act. 830 - Other Staff Development and Training Functions
32 --- Act. 910 - EOPS Student Grants Function
33 --- Act. 920 - EOPS Student Work-Study Function
34 --- Act. 930 - EOPS Student Loans Function
35 --- Act. 940 - EOPS Student CWS Matching Function
36 --- Act. 950 - EOPS Student NDSL Matching Function
37 --- Act. 960 - Other Direct Aid

e

A) 131

APPENDIX B

INTRODUCTION

It is an interesting soul-searching experience to reflect upon one's evaluation of a major statewide educational program. It is also rewarding to have the opportunity to ferret out the different implications for the ongoing evaluation of programs such as the one to be discussed that are generated by this reflection. The following paper is an accounting of these perceived implications.

In 1975 the State of California commissioned a study, under the writer's direction, designed to provide a comprehensive evaluation of EOPS in its statewide system of community colleges. The study was designed to result in conclusions concerning the extent to which the community colleges were meeting the three interrelated sets of objectives established for EOPS: those in the enabling legislation (Senate Bill 164); in the Statement of Policy and Guidelines adopted by the California Community College Board of Governors; and in the individual community colleges' applications for EOPS funding.

The study, completed in the spring of 1976, was, indeed, comprehensive. It included the following:

(1) Extensive surveys, in each of the 93 participating colleges, of representative samples of EOPS students, non-EOPS students, faculty, administrators, EOPS directors and chairpersons of local EOPS community advisory committees. The surveys emphasized the respondents' self-reported personal and academic characteristics, attitudes, perceptions, experiences and anticipated outcomes. The survey instruments were extensively pre-tested and refined, and considerable attention was given to assure that their items and measures reflected EOPS objectives and related issues. A considerable majority of all targeted groups responded, ranging from 70 percent of the EOPS students to 90 percent of the EOPS directors.

(2) Analyses of institutional and program characteristics obtained from two document sources: (a) a "Basic Data Sheet" designed by the Project staff and (b) program self-reports routinely submitted to the statewide Chancellor's Office.

(3) Case studies based on site visits to twelve colleges located throughout the state, and representing the spectrum of California community colleges in terms of such factors as size, student characteristics and district characteristics. The case studies comprised the pooled reports from the teams that conducted the site visits. With few exceptions, each team consisted of a college president, dean of student personnel services, faculty member and/or counselor and a current or former EOPS student, in addition to the Project staff. In all cases reports were developed from interviews with the colleges' presidents and a cross-section of other key administrators, faculty, counselors, current and former EOPS students, representatives of local advisory committees and district superintendents. The interviews covered the complete range of objectives, processes and issues pertaining to EOPS.

This data pool provided considerable, wide-sweeping knowledge on the background, personal and experiential characteristics of the constituent groups as well as important institutional and program characteristics. It also made possible a feature found very important to a program of this kind: an examination of the interaction of the target groups with significant other groups and their environment.

Much was learned from this evaluative study that has implications for the increasing effectiveness of ongoing systemwide (and institutional) evaluations of EOPS. These implications bear on three areas: (1) procedures for the evaluations; (2) appropriate criteria through which to assess the effectiveness of the evaluated programs; and (3) various issues which the evaluations might well take into account. As indicated at the outset, the intent of this paper is to discuss major implications of the former statewide evaluation of EOPS within each of these areas.


PROCEDURES

The evaluation actually led to judgments about three aspects of the evaluation process: those having to do with the Project's own procedures, those having to do with the evaluation of EOPS per se, and those having to do with how the statewide office handles the evaluation process.

PROJECT PROCEDURES

Several important strategies were executed which helped to assure that the data base described above was a rich one. Each college president was asked to (and did) appoint his own college's liaison officer to assist the Project staff to implement the evaluation study on each campus. In addition to considerable correspondence, the liaison officers were acquainted with the nature of this study and their tasks through a series of regional workshops held by the Project's staff. The Project's directors presented the intent and procedures of the evaluation personally to the EOPS directors and student personnel deans at their statewide meetings, and they met regularly with the statewide EOPS Advisory Committee. The site team members were selected from nominations the Project staff solicited from involved individuals throughout the state - a tedious process, but one which helped to assure an important sense of participation and good will among the participants in the evaluation. Finally, pre-site visits were made and pre-site team orientations were conducted to assure that all aspects of the case studies would be taken care of, and as smoothly as possible.

These strategies unquestionably contributed to remarkable results. The survey response rates were unprecedented, ranging from 70 percent of the EOPS students to 94 percent of the EOPS directors. Almost all of the data sheets and other documents requested were received. The site visits went even better than planned. Cooperation and responsiveness prevailed throughout.

Two research strategies, however, were imposed upon the Project which were debilitating. First, both the time allowed (nine months from start-up to final report) and funding were unrealistic for such a massive project. Second, the state's prescriptions for the evaluation precluded a longitudinal study (examining the same individuals from their entrance to a program through their termination and/or beyond), the only way real programmatic impact can be determined.

THE EVALUATION

Primary among the points addressed to the increased effectiveness of the evaluation of EOPS were: (1) the clarification of definitions; (2) revisions of state office guidelines; (3) validity of measurements; and (4) discrepancy among group perceptions.

(1) The clarification of definitions. Throughout the evaluation constant questions were raised about key elements of EOPS as articulated by the state office. A majority of the questions had to do with the need for clarification leading to more consistent classifications of dependent, independent and family-emancipated students.

But the problem exceeded this issue. The enabling legislation (SB 164) stipulated that EOPS be for "...students affected by language, social and economic handicaps." The consensus of the site teams was that this stipulation indicated that EOPS was to be a program for the "multiple disadvantaged." In addition, the legislation indicated that participating colleges should develop means for identifying such students as well as the means for subsequently recruiting them.

Thus, as indicated in the evaluation report, identification is taken as a way of developing and ranking the criteria for eligibility, presumably based on a systematic assessment of the needs of potential EOPS students in each college's actual district. Recruitment, in turn, is the actual means whereby students so identified are informed about and enrolled in the colleges.

The findings were, however, that, with few exceptions, the colleges did not have a defined, articulated identification process, other than financial need. Nor did the state office compensate for this omission by providing identification criteria to the colleges. Therefore, recruitment went on without any real, systematic or consistent reference to criteria clearly identifying the "multiple disadvantaged." Obviously, the full assessment of a program's effectiveness is impossible when the characteristics of the constituents are either undefined or inconsistently defined.

Another observation resulting from the study concerns actually identifying EOPS students, however defined. A number of colleges refrained from labeling any program or student as part of EOPS out of the intent to prevent any negative stereotyping that might be associated with these programs. While this intent might well be justified in some instances, it eliminates expedient and valid methods for tracking EOPS students and programs for purposes of any kind of meaningful evaluation of them.

(2) Revisions of state office guidelines. A series of multivariate analyses of the survey data made it very clear that there were a number of indicators of success achieved by EOPS students. These included academic achievement, persistence, use of EOPS services, self-perception of skills and personal traits, academic motivation and educational aspiration. To the researchers' surprise, these indicators were statistically quite independent of one another, which led to the conclusion that any systematic evaluation of a program like EOPS ought to allow for, even look for, a variety of criteria, all of which, while relevant to the effectiveness of the program, may not be that much related to one another.
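The reported independence of these indicators is the kind of finding a simple correlation matrix can display. The sketch below uses invented scores for four of the indicators named above; it illustrates the check, not the original multivariate analyses.

```python
# Sketch: pairwise correlations among success indicators. Near-zero
# off-diagonal values would support the "quite independent" finding.
import numpy as np

rng = np.random.default_rng(0)
n = 200  # hypothetical number of surveyed EOPS students

# Invented, independently drawn scores for four indicators named above.
names = ["achievement", "persistence", "service use", "aspiration"]
scores = rng.normal(size=(len(names), n))

corr = np.corrcoef(scores)  # rows are indicators, so corr is 4x4

for i in range(len(names)):
    for j in range(i + 1, len(names)):
        print(f"r({names[i]}, {names[j]}) = {corr[i, j]:+.2f}")
```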

Another matter along these lines came from the personnel interviewed during the site visits. A strong consensus -- verified by the site teams -- was that systematic evaluation must take into account local and changing needs and situations. Guidelines for individual program effectiveness and evaluative criteria should be flexible. The final judgment of the present writer was that there was a great deal of quality in how the EOPS objectives were being met and that this quality in diversity should be recognized. In other words, any overall evaluation of EOPS should consider different institutions' diverse ways of meeting general objectives in their own right. Yet, at the point of the evaluation, state guidelines did not provide sufficiently for either multiplicity or flexibility of procedures and criteria. (Those interviewed considered this true even though there was exhibited general satisfaction with the guidelines.) The consensus was that the guidelines needed to be revised to better accommodate the points seen by so many as crucial.

(3) Validity of measurements. The previous evaluation included a variety of measures of students' self-concept regarding their perceived aptitudes, skills, interests, self-esteem and ability to master their environment -- dimensions important to objectives of EOPS. Some of these measures were based on previously standardized and validated scales and others were devised for the purposes of this study.

Of moment here is the fact that most of these measures did not discriminate adequately among the various student groups, whether incoming students compared to continuing students, EOPS compared to non-EOPS students, "successful" students compared to less successful, or whatever.

The problem is that most of the students rated themselves unduly high on most of the measures, a phenomenon the writer also discovered a few years earlier in a cross-country study of community and junior colleges.* Apparently, the types of items used in these measures (and commonly used in much if not most research of higher education student characteristics) are quite subject to positive response bias or social desirability. The importance of the dimensions intentionally being measured is not in question, only the accuracy of the measurements. Perhaps the solution is to move toward more subtle or less obvious, behaviorally-based items. In any event, what is essential is to find a way of reliably and validly assessing the kind of student characteristics under discussion.

(4) Discrepancy among group perceptions. A final finding of the evaluation offered nothing new to those involved in the evaluation of the educational process, but rather served as an important reminder: that great care must be taken in interpreting different groups' varying responses to common items. For example, when asked to rate the level of effort that had been made "... to recruit a faculty, administration and staff whose racial and ethnic composition reflects that of the community your college serves," 93 percent of the administrators compared to 46 percent of the faculty and only 25 percent of the EOPS directors indicated that a "major effort" had been made.

As another example, when asked if separate personal counseling was available for EOPS students, 31 percent of the administrators said yes, compared to 23 percent of the faculty and only 20 percent of the EOPS students, but 50 percent of the EOPS directors. Now, where does "truth" exist among these responses? Probably in all directions, as perceived by the different groups. Perception, of course, possesses its own reality. But the function of evaluation is manifold: it distinguishes what is perceived from what actually occurs; it assesses the effectiveness of what occurs; and it determines the reasons for any repercussions of discrepant perceptions.

* Trent, J.W. and Associates. The Study of Junior Colleges, Vol. II. Los Angeles: Center for the Study of Evaluation, University of California, Los Angeles, 1972.

STATEWIDE ACTION

A number of responses from both the surveys and site visit interviews referred to the statewide practices regarding the implementation and evaluation of EOPS. These remarks especially centered upon the state office's consulting function and reporting procedures.

(1) Consulting. Strong sentiment was expressed both through the surveys and interviews that the specialists in the Chancellor's Office ought to be more involved with the campuses' programs. Only 33 percent of the administrators and 50 percent of the EOPS directors surveyed could even agree that the specialists' campus visits were enough to maintain an effective link between the state office and the individual EOPS.

The administrators and directors were much more divided on two other issues regarding the state office visits: only 37 percent of the administrators considered the visits helpful, compared to 64 percent of the directors; 2 percent of the administrators felt that the specialists were fair in their assessments, compared to 76 percent of the directors. These, obviously, are two additional striking examples of discrepancy of perceptions between constituent groups which deserve additional attention in the present context.

These data, however, are not to be taken to mean that the administrators wanted to see a reduction in the roles of the specialists. The interviews, in particular, led to the firm conclusion that there should be more specialists; that they should have a greater consultative role and not simply serve as monitors; that they should provide prompt, constructive evaluative reactions to program proposals; and that they should provide equally prompt information about budget approvals.

More specifically, the request was that there be more specialists to serve as resources helping the campus EOPS personnel by providing the following: a source for sharing ideas, problems and successes across the state; orientation for new directors; frequent on-site reactions and suggestions contributing to program effectiveness; and assistance in gaining skills in the following areas: program and budget planning; management; data collection and analysis; curriculum and instructional development; and recruitment and identification techniques.

(2) Reporting. There was also consensus about the reporting procedures of the Chancellor's Office. A substantial majority of involved individuals surveyed rejected the statement that the annual data required by the state office for the evaluation of EOPS was necessary to judge the effectiveness of these programs. An even greater majority (72 percent of the administrators and 66 percent of the directors) rejected the statement that the data reporting procedures are efficient.

Still, only a considerable minority saw no use for the data at all. Their main concern was not that there be no evaluative data, but that there be a revamping of reporting practices. Commonly expressed were the perceptions that the current level of data required from individual EOPS far exceeded that of other programs; that these excessive requirements waste time and funds; that reporting practices on the whole be carefully reviewed, planned and coordinated at the state level; that such coordination include the "written" evaluation material on EOPS performance with that of the specialists' on-site visits; and, perhaps most commonly expressed, that feedback be prompt enough to be of some use to the colleges.

CRITERIA

As indicated previously, the criteria fundamental to the assessment of the statewide EOPS were themselves questioned. Mostly the questions had to do with clarification of "success" indicators and their operational definitions. Since criteria constitute an essential component of evaluative research or procedures, several difficulties concerning the criteria indicating the success of EOPS were treated in the above discussion on Procedures. Hence, it should suffice at this point to enumerate these points and consider a few additional criteria not treated above.

1. There were inputs from all respondents that there was need for better definitions of all criteria and other key elements of the mandated programs.

2. It was most evident -- as indicated in the above discussion of quality in diversity -- that EOPS objectives can be met in different ways, and that, therefore, there should be diverse evaluative criteria to reflect these different ways.

3. Also evident from the data was the fact that there should be multiple criteria for "success." Again, as discussed above, multivariate analyses, imperfect in themselves, still revealed a definite diversity of quite independent indicators of success. As just one example, the evidence is that "success" should be considered more in terms of persistence than questionably achieved grade point average.

4. Enough has already been said about the need for established, workable and multiple activities for the identification of potential EOPS students and their recruitment.

5. There is also the matter of recruitment versus support criteria. The success of EOPS cannot be based just on the extent to which these programs recruit "disadvantaged" students and offer them financial assistance. The success of these programs must also be based upon the extent to which they offer supportive services that help the financially and otherwise disadvantaged students to advance through community college and beyond.

6. Finally, the surveys and interviews brought up one other major concern. Those questioned generally felt that statewide guidelines for evaluation were reasonably flexible, but they were most adamant about the continued need for this flexibility, so that statewide imperatives would not smother local needs.

ISSUES

The study pointed out unresolved issues important to the evaluation. Some of them have already been indicated because they bear so heavily on the evaluation process and criteria for evaluation. So do the others. Thus they will be reiterated and summarized in that fashion.

THE EVALUATION PROCESS

Again, evaluation in a situation of the kind under discussion cannot be a one-way process. At issue is not only the state's evaluation of the effectiveness of EOPS operating in individual institutions, but also the effectiveness of the statewide office responsible for EOPS.

Statewide Office Procedures. Probably the most prominent issue arising from the study had to do with data requirements. Campus officials found them to be far too frequent, too burdensome, too uncoordinated, and reported back too poorly with too little meaning.

A second issue emphasized the statewide guidelines for EOPS. Although the involved respondents generally were positive about them and considered them sufficiently flexible, their great concern was that they remain flexible enough to take into account local needs and situations. They were also concerned that a statewide evaluation, such as the one that was being conducted, would rigidify the guidelines.

Although the respondents were generally positive toward the statewide specialists, certain problems were stressed. These were the need for more quality consultative time from the specialists, greater and quicker responsiveness to such matters as budget proposals and data reports, and more orientation service. These problems, among others, contributed to the decided conclusion that the statewide unit responsible for EOPS should also be regularly and systematically evaluated.

Institutional Procedures. The lack of adequate operational definitions was a problem at the state level and continued to the institutional level. This lack led to a considerable amount of confusion over who is really eligible for EOPS. In many cases financial need was the sole index used to determine eligibility, though the intent as interpreted through the evaluation was that EOPS students be multiply disadvantaged. Yet, as indicated, these multiple disadvantages were never operationally defined.

A related issue was whether recruitment should be considered in any ongoing evaluation. A common criticism from the directors was that normal enrollment, precedence and word-of-mouth were already attracting more EOPS-eligible students than could be accommodated under existing funding.

Inappropriate funding also figured into the lack of long-range institutional master plans for EOPS which were to be submitted to the Chancellor's Office. Only a few campus master plans for EOPS existed, and most of those were sparse and incomplete. The contention of those involved was that this situation was the case and would remain the case until they achieved long-range funding. They argued that they could not make plans for the future without knowing ahead of time the level of funding to be anticipated making those plans possible to implement.

Funding came up in another important way. Even before Proposition 13 was an issue there was a widespread concern that EOPS was insufficiently funded to carry out its many mandated objectives. The question was raised then -- and seems even more relevant now -- as to whether each campus should be required to at least attempt to seek supplemental funds from private sources. Should such an activity be mandated, it, too, would be an important element to evaluate. This situation would be even more true if the evaluation led to the identification of effective, replicable techniques for fund raising and the identification of the better sources for funding.

A final issue concerning institutional evaluation centered on the roles of the EOPS directors. Generally, they were rated quite positively ("excellent" or "good") by other administrators and the faculty on all of a variety of criteria. They received comparatively low ratings, however, on their working relations with the faculty, interacting with the community, recruiting EOPS students, planning and evaluating EOPS, and publicizing EOPS. Considering the presumed importance of some of these criteria, the meaning of the relatively low ratings deserves further inquiry.

-""fliE CRITERIA

Enough has been said above about the issue concerning quality in diversity, but its importance dictates its reiteration here. A major finding was that there are different ways to the same end and that the diverse ways may well be determined by varying situations. Consequently, evaluation criteria should be sensitive to this diversity. How this process can best be accomplished while maintaining statewide program integrity may in itself constitute an important evaluation.

These criteria should also be sensitive to the overall environment which influences and/or is influenced by EOPS. EOPS is an interactive phenomenon. The quality of that interaction is critical, and equally critical to assess. Two cases may be illustrative. Much of the effectiveness of EOPS is dependent upon a supportive faculty willing and able to assist EOPS students with the different "disadvantages" that they come with by design. Yet, both the surveys and site visits made it evident that the faculty in general were largely ignorant of the nature of EOPS and very resistant to faculty development programs that would be most likely to help them work with EOPS students. Hence, faculty behavior, itself, may be one of the most important criteria of all to consider.

Then, since much of EOPS does or should constitute an outreach program, the quality of the programs' community relations -- both relations with the campus community and with its surrounding "service" community -- forms two other important criteria. Here, too, are criteria deserving special consideration, since community relations was an issue found to be somewhat problematic for EOPS in ways discussed previously.

Under the circumstances, a major conclusion of the previous evaluation appears as relevant now as it did a few years ago; namely, that the knowledge gained through the evaluation resulted in:

1) A heightened appreciation of the diversity of settings in which California community colleges carry out their missions;

2) a correspondingly heightened awareness that evaluations or assessments of state-mandated programs -- educational or otherwise -- need always to include a systematic assessment of the larger organizational and community contexts in which the programs operate; and

3) confirmation of the need for legislation that initiates programs like EOPS itself to be continuously reviewed, evaluated and revised in light of changing economic and social conditions which affect the appropriateness and/or precision of its language.

EOPS reaches many important levels. This program's importance is such that it must be adequately evaluated at each of these levels. Consequently, continued effort should be made to assure the adequacy of such evaluation.


Appendix C

Revisions and Additions to USRS Data Elements

We propose to revise four of the existing USRS-EOPS student data elements, add one new element to the existing USRS-annual subsystem, and add at least nine elements to a new USRS-annual segment which would be collected on all students, not just special program participants. These modifications will be described briefly in the following paragraphs, and proposed data element dictionary language for each will be presented.

Description of Modifications. The measurement of units completed (item E20) should take the same form as the measurement of units attempted, which is currently part of the USRS-annual student data elements. As presently constituted, the same distinctions are not made in the two elements. Consequently, we are suggesting revisions to E20 to make these two elements parallel.

As currently written the requests for the number of counseling hours (item E6) and the number of tutorial hours (item E19) provided to each EOPS student are ambiguous. Each college has the option of including or excluding non-EOPS funded counseling and tutoring hours in its totals and is not required to report which option it has selected. As a result the data obtained from different colleges are not comparable. We suggest that the distinctions between EOPS-funded and non-EOPS funded hours be made explicit in both data items.

There are currently seven options offered to indicate the student's academic standing at the end of the term (item E2). These include three distinctions between students who are still enrolled and three distinctions between students who withdrew or were dismissed. We propose to add two additional distinctions to those applicable to students no longer enrolled in order to describe their status in greater detail. These additional categories would designate students who had left the college in a normal manner after completing a certificate or degree program.

Currently there is no measure of academic achievement prior to enrolling in a post-secondary institution. To provide some common baseline achievement data we propose to add an item reflecting high school GPA to the USRS-annual data elements.

Finally, we are proposing that an annual survey of all students be added to the current USRS system. At least nine items should be included in this survey to provide important comparisons between EOPS students and the general college population. The nine items we suggest for inclusion in this survey are: high school GPA, units completed, weekly student contact hours, positive attendance enrollment, academic standing at beginning of year, academic standing at end of year, annual GPA, cumulative GPA, and student level.
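For concreteness, the nine proposed items can be pictured as a single per-student record, as in the sketch below. The field names and types are hypothetical shorthand; the authoritative encoding is the fixed-width dictionary language on the following pages.

```python
# Illustrative record for the proposed annual survey of all students.
# Field names mirror the nine items above; types are hypothetical.
from dataclasses import dataclass
from typing import Optional

@dataclass
class AnnualSurveyRecord:
    high_school_gpa: Optional[float]  # blank (None) if no transcript GPA
    units_completed: float            # summed over the course sub-elements
    weekly_contact_hours: float       # WSCH, day plus extended day
    positive_attendance: int          # coded 0-4 per the dictionary entry
    standing_beginning: int           # 1=good, 2=special admit, 3=probation, 9=other
    standing_end: int                 # codes 1-9 per the dictionary entry
    annual_gpa: float                 # rounded to the nearest 1/100
    cumulative_gpa: Optional[float]   # blank (None) if not computed
    student_level: int                # codes 1-6, 9=unknown

example = AnnualSurveyRecord(2.85, 24.0, 12.5, 1, 1, 1, 2.90, 2.88, 2)
print(example)
```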


Data element dictionary definitions for each of the modified elements will be found on the following pages.

REVISED

CHANCELLOR'S OFFICE
CALIFORNIA COMMUNITY COLLEGES
DATA ELEMENT DICTIONARY

ELEMENT NUMBER: E7
ELEMENT TITLE: Units Completed

DEFINITION:
Total number of units of credit generated by credit courses in which the student was actively enrolled during the previous academic year.

CODES, CATEGORIES, AND COMMENTS:
There are eight (8) sub-elements for this data element. Report units of credit completed for those types of courses in which the student was actively enrolled during the last academic year:

0. Full-term credit courses -- day courses.
1. Full-term credit courses -- extended day (night) courses.
2. Positive attendance credit courses.
3. (Not used.)
4. Instructional Television courses (ITV).
5. Apprenticeship courses.
6. Independent studies.
7. Work experience.

Zero fill those sub-elements that are not applicable for the student.

If a variable number of units of credit are awarded in a course, report the units of credit for which the student was enrolled.

Note: These data elements are mutually exclusive; i.e., if units completed in ITV are reported, the same units should not be reported in the first two data elements, full-term or positive attendance credit courses.

Obtained from college course master file/registration office.

COBOL PICTURE: 99v9 occurs 8 times

USES:
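The PICTURE clause "99v9 occurs 8 times" denotes eight three-character numeric fields with an implied decimal point ("v"); no period is actually stored. The sketch below shows one way such fields could be decoded outside COBOL; the sample record slice is invented.

```python
# Sketch: decoding a "99v9 occurs 8 times" field from a fixed-width record.
# The "v" marks an implied decimal point, so the characters "123" mean 12.3.
def decode_99v9(field: str) -> float:
    """Decode one 3-character 99v9 field: two digits, one implied decimal."""
    return int(field) / 10.0

raw = "120060000000000000045060"  # invented 24-character slice: 8 fields of width 3
units = [decode_99v9(raw[i:i + 3]) for i in range(0, len(raw), 3)]
print(units)       # [12.0, 6.0, 0.0, 0.0, 0.0, 0.0, 4.5, 6.0]
print(sum(units))  # total units completed across the eight sub-elements
```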


REVISED

CHANCELLOR'S OFFICE
CALIFORNIA COMMUNITY COLLEGES
DATA ELEMENT DICTIONARY

ELEMENT NUMBER: E19
ELEMENT TITLE: Tutorial Hours

DEFINITION:
The total hours spent by the student being tutored this term. Report separately EOPS and non-EOPS tutoring time, if good estimates are available. Include hours spent in the study center working with or under the direction of a tutor under the appropriate category.

CODES, CATEGORIES, AND COMMENTS:
There are two (2) occurrences of this data element. The total tutorial hours (rounded to the nearest hour) should be reported separately for:

1. EOPS-funded tutoring
2. Non-EOPS funded tutoring

If the student has not received any tutorial assistance under one or both categories, zero fill the appropriate data.

DATA LENGTH: 3
DATA CLASS: Numeric
COBOL PICTURE: 999 occurs two times

USES:
DATE ISSUED:


REVISED

CHANCELLOR'S OFFICE
CALIFORNIA COMMUNITY COLLEGES
DATA ELEMENT DICTIONARY

ELEMENT NUMBER: E6
ELEMENT TITLE: Counseling Hours

DEFINITION:
The total number of hours spent by the student this term receiving counseling. Report separately EOPS and non-EOPS counseling hours if good estimates are available.

CODES, CATEGORIES, AND COMMENTS:
There are two (2) occurrences of this data element. The total hours (rounded to the nearest hour) of counseling should be reported for:

1. EOPS counseling
2. Non-EOPS counseling

If the student has not received any counseling assistance under one or both categories, zero fill the appropriate data.

DATA LENGTH: 3
DATA CLASS: Numeric
COBOL PICTURE: 999 occurs two times

DATE ISSUED:


REVISED

CHANCELLOR'S OFFICE
CALIFORNIA COMMUNITY COLLEGES
DATA ELEMENT DICTIONARY

ELEMENT NUMBER: E2
ELEMENT TITLE: Academic Standing at End of Year

DEFINITION:
The student's academic standing at the end of the previous academic year according to the following codes.

CODES, CATEGORIES, AND COMMENTS:
1 = good standing
2 = special admit
3 = probation
4 = withdrew upon completion of certificate program
5 = withdrew upon receipt of A.S. or A.A. degree
6 = withdrew without certificate or degree, but according to procedure
7 = withdrew without notice
8 = dismissed
9 = other

DATA LENGTH: 1
DATA CLASS: Numeric
COBOL PICTURE: 9

USES:


ADDITION TO USRS-ANNUAL

CHANCELLOR'S OFFICE
CALIFORNIA COMMUNITY COLLEGES
DATA ELEMENT DICTIONARY

ELEMENT NUMBER:
ELEMENT TITLE: High School GPA

DEFINITION:
High school GPA as shown on student's transcript and determined by Registrar.

CODES, CATEGORIES, AND COMMENTS:
If high school GPA is not available, leave blank.
Round to nearest 1/100.
Obtain from Registrar.

COBOL PICTURE: 9v99

USES:


ANNUAL SURVEY OF STUDENTS

CHANCELLOR'S OFFICE
CALIFORNIA COMMUNITY COLLEGES
DATA ELEMENT DICTIONARY

ELEMENT NUMBER:
ELEMENT TITLE: Units Completed

DEFINITION:
Total number of units of credit generated by credit courses in which the student was actively enrolled during the previous academic year.

CODES, CATEGORIES, AND COMMENTS:
There are eight (8) sub-elements for this data element. Report units of credit completed for those types of courses in which the student was actively enrolled during the last academic year:

0. Full-term credit courses -- day courses.
1. Full-term credit courses -- extended day (night) courses.
2. Positive attendance credit courses.
3. (Not used.)
4. Instructional Television courses (ITV).
5. Apprenticeship courses.
6. Independent studies.
7. Work experience.

Zero fill those sub-elements that are not applicable for the student.

If a variable number of units of credit are awarded in a course, report the units of credit for which the student was enrolled.

Note: These data elements are mutually exclusive; i.e., if units completed in ITV are reported, the same units should not be reported in the first two data elements, full-term or positive attendance credit courses.

Obtained from college course master file/registration office.

COBOL PICTURE: 99v9 occurs 8 times

USES:
DATE ISSUED:


NEW

CHANCELLOR'S OFFICE
CALIFORNIA COMMUNITY COLLEGES
DATA ELEMENT DICTIONARY

ELEMENT NUMBER:
ELEMENT TITLE: Weekly Student Contact Hours (WSCH) - Full-Term Credit Course(s), Day and Extended Day

DEFINITION:
Average number of class hours for which the student is actively enrolled in full-term credit course(s) during the past academic year.

CODES, CATEGORIES, AND COMMENTS:
There are two (2) occurrences of this data element. (Refer to record formats in Appendix G.) Report WSCH for day courses and extended day (evening) courses separately.

This data element is to be reported only for students actively enrolled in full-term credit course(s). If the student is not actively enrolled in a full-term credit course, zero fill the data field.

DATA LENGTH: 3
DATA CLASS: Numeric Display
COBOL PICTURE: 99v9

USES:


ANNUAL SURVEY OF ALL STUDENTS

CHANCELLOR'S OFFICE
CALIFORNIA COMMUNITY COLLEGES
DATA ELEMENT DICTIONARY

NEW

ELEMENT NUMBER:
ELEMENT TITLE: Positive Attendance Enrollment, Annual

DEFINITION:
A code to indicate active enrollment in positive attendance course(s) at any time during the past academic year.

CODES, CATEGORIES, AND COMMENTS:
CODE  CATEGORY
0     Student was not enrolled in a positive attendance course.
1     Student was enrolled in a positive attendance credit course.
2     Student was enrolled in a positive attendance noncredit course.
3     Student was enrolled in both positive attendance credit and noncredit courses.
4     Student was enrolled in a non-state supported class.

There are two (2) occurrences of this data element. Report enrollment in positive attendance courses separately for day and extended day (evening). An extended day course is any course starting at or after 4:30 p.m.

DATA LENGTH: 1
DATA CLASS: Numeric Display
COBOL PICTURE: 9


ANNUAL SURVEY OF ALL STUDENTS

CHANCELLOR'S OFFICE
CALIFORNIA COMMUNITY COLLEGES
DATA ELEMENT DICTIONARY

NEW

ELEMENT NUMBER:
ELEMENT TITLE: Academic Status at Beginning of Year

DEFINITION:
Student academic standing at the beginning of the first term in which the student was enrolled during the past academic year, according to the following codes.

CODES, CATEGORIES, AND COMMENTS:
1 = good standing
2 = special admit
3 = probation
9 = other/unknown

Collected by EOPS Director at beginning of term.

DATA LENGTH: 1
DATA CLASS: Numeric
COBOL PICTURE: 9

USES:
DATE ISSUED:


ANNUAL SURVEY OF ALL STUDENTS

CHANCELLOR'S OFFICE
CALIFORNIA COMMUNITY COLLEGES
DATA ELEMENT DICTIONARY

ELEMENT NUMBER:
ELEMENT TITLE: Academic Standing at End of Year

DEFINITION:
The student's academic standing at the end of the previous academic year according to the following codes.

CODES, CATEGORIES, AND COMMENTS:
1 = good standing
2 = special admit
3 = probation
4 = withdrew upon completion of certificate program
5 = withdrew upon receipt of A.S. or A.A. degree
6 = withdrew without certificate or degree, but according to procedure
7 = withdrew without notice
8 = dismissed
9 = other

DATA LENGTH: 1
DATA CLASS: Numeric
COBOL PICTURE: 9

USES:


ANNUAL SURVEY OF ALL STUDENTS

CHANCELLOR'S OFFICE
CALIFORNIA COMMUNITY COLLEGES
DATA ELEMENT DICTIONARY

NEW

ELEMENT NUMBER:
ELEMENT TITLE: Annual GPA

DEFINITION:
GPA earned during the previous academic year, as determined by the college Registrar.

CODES, CATEGORIES, AND COMMENTS:
Round to nearest 1/100.
Obtain from Registrar at the end of the academic year.

DATA LENGTH: 3
DATA CLASS: Numeric
COBOL PICTURE: 9v99

USES:
DATE ISSUED:


ANNUAL SURVEY OF ALL STUDENTS

CHANCELLOR'S OFFICE
CALIFORNIA COMMUNITY COLLEGES
DATA ELEMENT DICTIONARY

NEW

ELEMENT NUMBER:
ELEMENT TITLE: Cumulative GPA

DEFINITION:
Cumulative GPA at the end of the previous academic year as shown on student's transcript and determined by Registrar.

CODES, CATEGORIES, AND COMMENTS:
If cumulative GPA is not computed, leave blank.
Round to nearest 1/100.
Obtain from Registrar at the end of the academic year.

DATA LENGTH: 3
DATA CLASS: Numeric
COBOL PICTURE: 9v99


ANNUAL SURVEY OF ALL STUDENTS

CHANCELLOR'S OFFICE
CALIFORNIA COMMUNITY COLLEGES
DATA ELEMENT DICTIONARY

NEW

ELEMENT NUMBER:
ELEMENT TITLE: Student Level, Annual

DEFINITION:
The total accredited work completed as of the first census week of the first term in which the student was enrolled during the past academic year, which reflects the student's level of academic achievement.

CODES, CATEGORIES, AND COMMENTS:
CODE  CATEGORY
1     High school student
2     Freshman
3     Sophomore with 30-59 semester units or 45-89 quarter units
4     Student with 60 or more semester units or 90 or more quarter units
5     Associate Degree
6     Bachelor's Degree or higher
9     Unknown

DATA LENGTH: 1
DATA CLASS: Numeric Display
COBOL PICTURE: 9

USES:
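The unit thresholds in this table amount to a small classification rule. The sketch below covers codes 2 through 4 only, assuming semester units (codes 1, 5, 6 and 9 depend on information other than completed units); it is an illustration, not part of the proposed dictionary language.

```python
# Sketch: assigning Student Level codes 2-4 from completed semester units,
# following the thresholds above (quarter units would use 45 and 90).
def student_level_code(semester_units: float) -> int:
    if semester_units < 30:
        return 2  # Freshman
    if semester_units < 60:
        return 3  # Sophomore: 30-59 semester units
    return 4      # 60 or more semester units

assert student_level_code(12) == 2
assert student_level_code(45) == 3
assert student_level_code(72) == 4
```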


CHANCELLOR'S OFFICE
CALIFORNIA COMMUNITY COLLEGES
DATA ELEMENT DICTIONARY

ELEMENT NUMBER:
ELEMENT TITLE: High School GPA

DEFINITION:
High school GPA as shown on student's transcript and determined by Registrar.

CODES, CATEGORIES, AND COMMENTS:
If high school GPA is not available, leave blank.
Round to nearest 1/100.
Obtain from Registrar.

COBOL PICTURE: 9v99

USES:
DATE ISSUED:


APPENDIX D

STATE OF CALIFORNIA
CALIFORNIA POSTSECONDARY EDUCATION COMMISSION
EDMUND G. BROWN, JR., Governor

Student Oriented Data Bases Maintained by the Commission

Data bases are available for all years between 1976 and 1980. Data elements noted with an "*" are available since 1979.

A. Student Enrollment Data Bases

1. Public Segments - One record is maintained for each student enrolled in the Fall term. Each record contains the following data elements:

a. campus where the student is enrolled,
b. student level (e.g., freshman, sophomore),
c. student status (e.g., first-time, transfer, continuing),
d. student major,
e. fee status (resident/non-resident),
f. citizenship (e.g., U.S. citizen, not U.S. citizen),
g. full/part-time status,
h. credit load (undergraduates only),
i. sex,
j. ethnicity (six categories defined by OCR plus Filipino, No Response, and "Other"),
k. student program (e.g., Biology, Mathematics, Sociology),
l. age,
m. high school of origin* (for students classified as first-time freshmen only),
n. institution last attended* (for students classified as transfer students only),
o. permanent residence* (zip code, county, or Community College district).

2. Independent Colleges and Universities - A single record is maintained for each campus. Each record supports the following data elements:

a. campus,
b. student level (e.g., freshman, sophomore),
c. sex,
d. ethnicity (six categories defined by OCR; does not include Filipino, No Response, or Other classifications).

B. Degrees Conferred Data Bases

1. University of California and the California State University and Colleges - One record is maintained for each student receiving a degree during an academic year. Each record supports the following data elements:

a. campus where the degree was awarded,
b. degree level (e.g., bachelors, masters),
c. sex,
d. ethnicity (six categories defined by OCR plus Filipino, No Response, and "Other"),
e. program (e.g., Biology, Mathematics, Sociology),
f. age,
g. high school of origin* (for students entering as "native" students only),
h. institution last attended* (for students entering as transfer students only),
i. permanent residence* (zip code, county, or Community College district).

2. California Community Colleges and Independent Colleges and Universities - A single record is maintained for each campus. It contains aggregated information arranged in the following manner:

a. campus where degree was awarded,
b. the number of degrees conferred by:
1. degree level (e.g., associate, bachelors),
2. sex,
3. ethnicity (six categories defined by OCR only),
4. program (e.g., Biology, Mathematics).


APPENDIX E

WHICH TYPE OF STUDENT ARE YOU?

There are many reasons for attending a community college. We would like you to classify yourself in terms of the following descriptive statements. Choose one. Of course there are no right answers. Students of all types attend classes here.

VOCATIONAL ORIENTATION

I. I am enrolled in a one or two year career program.

II. I am primarily interested in obtaining skills for an entry level job, and I may leave school as soon as I am able to find such a position.

III. I already have a job with career potential, and I am attending college part-time to increase my skills for a promotion or raise.

IV. I have a career already, and I am attending college (part-time) to obtain skills for a second career or an additional after-hours job, e.g. real estate.

V. I am taking just a single course or two that are required to renew the license for my particular job.

TRANSFER ORIENTATION

VI. My goal is a BA or BS degree. I am a full-time student, and I have a good GPA. All together it will probably take me between 2 and 5 semesters to transfer.

VII. My goal is a BA or BS degree. I am a part-time student, and I have a good GPA. All together it will probably take me more than five semesters to transfer.

VIII. I think I want to obtain a BA or BS degree, but my coursework isn't that strong and my GPA may not be good enough to transfer.

IX. I'm hoping to get a BA, but my real interest in transferring is to play intercollegiate sports.

X. I want a BA or BS degree, though I am proceeding through my course work slowly. I'll be able to transfer eventually if I can continue to get financial aid long enough so I can stay in school.

XI. I am already attending a four year college, and I'm currently enrolled here to make up selective credits or take additional courses.

XII. My goal is a BA or BS degree. I am concentrating on vocational courses that are transferable to a four year program.


OTHER ORIENTATION

XIII. I'm not really that interested in credits. I'm attending college part-time to get better skills related to my personal interests and hobbies.

XIV. I'm not really that interested in credits. I'm attending college part-time because I just like learning. Most of the courses I take are academic rather than skills.

XV. I have little interest in credit. I am interested in cultural experiences and take courses in art, history, drama, music appreciation, etc.

XVI. I need basic skills in math, reading or writing (e.g. someone who might be seeking a GED).

XVII. I am taking basic preparatory courses that I need to transfer to a vocational program that is not available at this college.*

*Adapted from prototypes developed by Steven Sheldon and Russell Hunter.

Page 165: DOCUMENT RESUME ED 206 362 JC 810 519 · 2014-02-18 · DOCUMENT RESUME: t ED 206 362 JC 810 519. AUTHOR. Alkin, Marvin C.: Stecher, Brian ti. TITLE Evaluating EOPS: A User Oriented

4

APPENDIX F

COMPARISON OF EOPS APPLICATIONS AND FINAL CLAIMS*


*Data for this report collected by Jennifer Franz. Report written by Jennifer Franz and Marvin C. Alkin.


Background

In the course of this study we have identified a variety of state and local information needs and existing documents or data sources which might meet those needs. A primary potential resource identified was the EOPS application form, or "College Plan for the Extended Opportunity Programs and Services Project," commonly referred to as Form 10. The questions and information items which Form 10 addressed pertained to a description of the elements of the EOPS program, levels of services provided in the EOPS program, and resources expended. What was not clear, however, was the extent to which the estimates and proposed plans contained in a college's application reflected actual project operations and levels of service. We therefore determined that it would be necessary to compare proposed and actual efforts before incorporating Form 10 into the evaluation system.

Approach

The two forms used to determine the congruence between what colleges proposed to undertake and what they actually did were the application and the year-end "Budget Approval and Final Claim" (Form 7). Because final claims for 1980-81 had not yet been submitted when the comparison needed to be made, it was necessary to use data from 1979-80, before Form 10 had been developed. However, its predecessor (Form 10 & 12) contained many of the same elements and was deemed sufficiently similar to provide a valid measure.

The colleges to be included in the comparison were selected by applying numbers derived from a random number table to the alphabetical listing of community colleges in existence in 1979-80 (California Community College Directory: 1979-80). Of the 10% of all colleges so selected (n=11), two did not have final claims for 1979-80 on file. They were replaced with two more randomly selected colleges, of which one did not have a final claim on file. The final sample was thus developed via a third random selection process.
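For readers who want to see the selection logic compactly, it can be sketched in code. The snippet below is only a minimal modern equivalent of the random-number-table procedure described above; the college list, the has_final_claim check, and the seed are illustrative assumptions, not part of the original study.

    import random

    def draw_sample(colleges, has_final_claim, fraction=0.10, seed=1979):
        """Draw a simple random sample from an alphabetical listing,
        replacing any college that lacks a final claim on file.
        (Illustrative stand-in for the random-number-table method.)"""
        rng = random.Random(seed)
        target = round(len(colleges) * fraction)   # 10% of all colleges
        remaining = sorted(colleges)               # alphabetical listing
        sample = []
        while len(sample) < target and remaining:
            pick = remaining.pop(rng.randrange(len(remaining)))
            if has_final_claim(pick):              # keep only usable colleges
                sample.append(pick)                # otherwise draw again
        return sample

The repeated draws to replace unusable colleges correspond to the second and third selection passes mentioned in the text.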

Table 1 shows the colleges included in the sample along with information concerning their size and location (single campus/multi-campus; northern/southern California). As this table suggests, although the sample colleges are probably not totally representative of all community colleges in the state, they do include colleges of all types - i.e., small, medium and large; single and multi-campus; and from the northern and southern parts of California.

Once the colleges in the sample had been identified, their applications and final claim folders[1] were pulled and selected data items on the two sets of forms were compared. Because the amount of information on the claim forms was substantially less than the amount of information in the applications, the comparisons which could be made were limited to proposed and actual expenditures and numbers of students served. Accordingly, the findings which follow essentially encompass only two of the information items included in Table 3.


TABLE 1

Characteristics of Sample Colleges

College                          Fall 1979      Multi/Single    North/
                                 Enrollment*    Campus          South

Bakersfield College                 12,210        Multi          South
Cerro Coso Community College         3,650        Multi          South
Cuyamaca College                     2,033        Multi          South
Laney College                       10,130        Multi          North
Lassen College                       3,044        Single         North
College of Marin                    11,315        Multi          North
Saddleback College                  21,644        Multi          South
San Jose City College               13,761        Multi          North
Santa Rosa Junior College           19,501        Single         North
Sierra College                       8,370        Single         North
Solano Community College             8,907        Single         North

*Largest enrollment Fall 1979 = 28,308 (Long Beach); smallest enrollment Fall 1979 = 573 (Palo Verde).


Findings


As noted above, most of the information contained in the EOPS application is not repeated in subsequent documents. It is therefore possible only to speculate about the extent to which program plans are congruent with actual program operations. However, the evidence with respect to planned vs. actual expenditures and numbers of students served clearly implies substantial differences between what EOPS projects propose to do and what they actually accomplish. This evidence is discussed in detail below.

Table 2 shows the amounts of EOPS funds awarded to the sample colleges, including their initial grants, any augmentations and all "recycled" funds (unencumbered monies reallocated from one college to another before the end of the fiscal year). Table 3 compares the "adjusted totals" derived from these various transactions both to the amounts applied for and to the amounts actually expended.

As these tables indicate, total expenditures for all but one institution (Laney College) were exceedingly close to their adjusted total EOPS grants (including augmentations and the "recycling" of unencumbered funds). However, they were only occasionally similar to the amounts originally proposed. All eleven colleges spent less than they applied for, and more than half (n=6, or 55%) expended over $100,000 less. In one instance, the difference was over $1 million.


TABLE 2

EOPS Funds Awarded to Sample Colleges

College                          Funded     Augmented    Recycled    Adjusted Total

Bakersfield College              234,523       1,304      -10,000        225,827
Cerro Coso Community College     106,237          --           --        106,237
Cuyamaca College                  57,927         306           --         58,233
Laney College                    406,372       2,308      -56,000        352,680
Lassen College                   168,912         932      +36,000        205,844
College of Marin                 215,638       1,206      +14,000        230,844
Saddleback College               100,088         544           --        100,632
San Jose City College            187,779       1,053           --        188,832
Santa Rosa Junior College        287,640          --           --        287,640
Sierra College                   174,237         958      +15,500
                                                          +14,000        204,695
Solano Community College         163,658         905       -7,500        157,063
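Each adjusted total in Table 2 is simply the row's funded amount plus augmentations plus recycled funds. A one-line check, using the Laney row as transcribed (an illustration only, not part of the original report):

    # Laney College: funded + augmented + recycled = adjusted total
    funded, augmented, recycled = 406_372, 2_308, -56_000
    assert funded + augmented + recycled == 352_680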


TABLE 3

EOPS Funds Applied For, Granted and Expended

College                        Application    Adjusted       Expended    Returned to
                                    Amount    Total Grant                   Treasury

Bakersfield College                278,466      225,827       225,827           --
Cerro Coso Community College       106,237      106,237       104,176        2,061
Cuyamaca College                    71,816       58,233        58,233           --
Laney College                    1,503,800      352,680       306,647       46,033
Lassen College                     366,000      205,844       205,844           --
College of Marin                   353,299      230,844       230,844           --
Saddleback College                 483,094      100,632        97,474*       3,567*
San Jose City College              363,408      188,832       188,832           --
Santa Rosa Junior College          287,640      287,640       275,011*      12,329*
Sierra College                     382,866      204,695       204,695           --
Solano Community College           184,956      157,063       156,544          519

*It is unclear why the funds expended and returned to the treasury for Saddleback and Santa Rosa do not add to the adjusted total grant. However, the differences are small and do not substantially alter the relationship between proposed and actual expenditures.
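The footnote's caveat can be verified mechanically: for each row, expended plus returned funds should equal the adjusted total grant. A small sketch using four rows as transcribed above (an illustration, not part of the original analysis):

    # (adjusted total grant, expended, returned to treasury) from Table 3
    rows = {
        "Cerro Coso": (106_237, 104_176, 2_061),
        "Laney":      (352_680, 306_647, 46_033),
        "Santa Rosa": (287_640, 275_011, 12_329),
        "Solano":     (157_063, 156_544, 519),
    }
    for college, (adjusted, expended, returned) in rows.items():
        gap = adjusted - (expended + returned)
        print(college, "OK" if gap == 0 else f"off by ${gap:,}")

Cerro Coso, Laney and Solano reconcile exactly; Santa Rosa comes out $300 short, consistent with the footnote.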


The discrepancy between proposed and actual expenditures of district funds, displayed in Table 4, is somewhat similar to that for EOPS funds: only two colleges spent what they proposed to spend, while another three were within $10,000 and a fourth was within $20,000. The remainder (45%) were off by over $100,000, and all but one of these was off by more than $200,000. However, although all eleven colleges spent fewer EOPS dollars than they proposed to, two spent more district funds than they proposed: one by relatively little ($5,881 - Saddleback), but one by a substantial sum ($693,809 - Cuyamaca).
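The grouping used here (exact match, within $10,000, within $20,000, off by over $100,000) amounts to a simple classification rule. A sketch with two illustrative cases; the Saddleback actual figure is reconstructed from the $5,881 overage cited above:

    def band(proposed, actual):
        """Classify a proposed-vs-actual difference the way the text does."""
        diff = abs(actual - proposed)
        if diff == 0:
            return "spent exactly what was proposed"
        if diff <= 10_000:
            return "within $10,000"
        if diff <= 20_000:
            return "within $20,000"
        return "off by over $100,000" if diff > 100_000 else "in between"

    print(band(73_381, 767_190))    # Cuyamaca: off by over $100,000
    print(band(296_522, 302_403))   # Saddleback (reconstructed): within $10,000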

Perhaps predictably, given the differences between proposed and actual expenditures, no college served the numbers of students it proposed to, as Table 5 indicates. Differences ranged from the relatively insignificant (e.g., Cerro Coso - 32 students) to the almost incredible (Cuyamaca, which served only 9.6% of those proposed, and Laney, which served 22.1%).[2] Among all eleven colleges, service was proposed for 5,950 students and actually delivered to 3,730 - a net decrease of 37%.

Tables 6 through 8 depict proposed and actual EOPS expenditures and students served by Part. (Student figures are not given for Part A because Part A funds are not used to provide direct services.)
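The percentages above follow directly from Table 5's counts; a quick arithmetic check (illustrative only):

    proposed = {"Cuyamaca": 250, "Laney": 1_000}
    actual   = {"Cuyamaca": 24,  "Laney": 221}
    for college in proposed:
        pct = 100 * actual[college] / proposed[college]
        print(f"{college}: served {pct:.1f}% of students proposed")
    # Cuyamaca: 9.6%; Laney: 22.1%

    total_proposed, total_actual = 5_950, 3_730
    print(f"{100 * (total_proposed - total_actual) / total_proposed:.0f}%")  # 37%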



TABLE 4

Proposed and Actual Expenditures of District Funds for EOPS*

College                          Proposed          Actual
                                 Expenditures      Expenditures

Bakersfield College                  486,230          379,497
Cerro Coso Community College          13,604          (illegible)
Cuyamaca College                      73,381          767,190
Laney College                      1,740,750        1,103,462
Lassen College                       498,600          205,844
College of Marin                     345,002          341,871
Saddleback College                   296,522          302,403
San Jose City College                587,838          587,838
Santa Rosa Junior College            730,418          514,731
Sierra College                       290,209          274,630
Solano Community College             835,022          835,022

*Proposed expenditure data are taken from estimates recorded on the approved budget since estimates reported in the application combine local, federal and other funds.


TABLE 5

Proposed and Actual Students Served (Unduplicated Count)

College                          Students Proposed    Actual Students
                                    To Be Served           Served

Bakersfield College                       300                350
Cerro Coso Community College              125                 93
Cuyamaca College                          250                 24
Laney College                           1,000                221
Lassen College                            375                312
College of Marin                        1,800                814
Saddleback College                        400                278
San Jose City College                     400                309
Santa Rosa Junior College                 500                545
Sierra College                            400                178
Solano Community College                  400                606


TABLE 6

PART A: Proposed and Actual Expenditures

College                          Proposed          Actual*
                                 Expenditures      Expenditures

Bakersfield College                   20,645           31,250
Cerro Coso Community College           8,448           11,081
Cuyamaca College                      21,722           27,328**
Laney College                         73,100           57,726
Lassen College                        83,000           39,430
College of Marin                         -0-              -0-
Saddleback College                    41,350           44,956
San Jose City College                  7,400           38,006
Santa Rosa Junior College             15,734           22,896
Sierra College                        26,815           31,359
Solano Community College              33,429           17,343

*Actual expenditures are presumed to be those claimed.
**$8,128 transferred from Part A ($6,262) and Part B ($1,866) to Part C after the final claim was filed.


TABLE 7

PART B: Proposed and Actual Expenditures and Students Served

College                            Proposed                    Actual
                               Expenditures  Students    Expenditures  Students

Bakersfield College                  92,821    "300+"         78,383       350
Cerro Coso Community College         51,789      125          59,156        93
Cuyamaca College                     35,094      250          15,657*       24
Laney College                       333,700    1,000         127,736        24
Lassen College                      183,000      375          52,839       312
College of Marin                    222,049      700         156,844       823
Saddleback College                  153,744      400          22,853        65
San Jose City College               160,308      400          43,391       309
Santa Rosa Junior College           166,406      500         137,620       545
Sierra College                       98,551      400          83,406       178
Solano Community College             84,527      400          56,296       478

*$8,128 transferred from Part A ($6,262) and Part B ($1,866) to Part C after the final claim was filed.


TABLE 8

PART C: Proposed and Actual Expenditures and Students Served

College                            Proposed                    Actual
                               Expenditures  Students    Expenditures  Students

Bakersfield College                 165,000    "300+"        116,194       300
Cerro Coso Community College         46,000      125          33,939       (illegible)
Cuyamaca College                     15,000       75          15,248*       24
Laney College                     1,097,000    1,000         121,185       221
Lassen College                      100,000      375         113,575       157
College of Marin                    131,250      170          74,000       241
Saddleback College                  288,000      400          29,665        65
San Jose City College               145,700      220         107,435       212
Santa Rosa Junior College           105,500      500         114,495       545
Sierra College                      257,500      400          89,930       178
Solano Community College             67,000      100          82,905       128

*$8,128 transferred from Part A ($6,262) and Part B ($1,866) to Part C after the final claim was filed.


From the data we note that no college spent what it proposed to in any Part or (with the possible exception of Bakersfield College's somewhat vague proposal) served the numbers of students it planned to. Generally speaking, less was spent and fewer students were served than specified on the application, but this is not universally the case in any Part. Moreover, the magnitudes of the changes range from lows of $248 and 16 students (in different colleges) to highs of $975,815 and 976 students (in the same college).

Finally, it should be noted that although in most colleges the total amounts expended were the same as or very similar to the amounts awarded, with the result that approved budgets are quite reliable indicators of actual program costs, this is not as often the case with respect to expenditures under the various Parts. Colleges have the option of transferring funds from one Part to another during the year, and all but one of the colleges in the sample did so in 1979-80. The amounts transferred, which are shown in Figure 1, tended to be fairly small, but in four instances were quite substantial: $9,743 by Bakersfield College; $10,000 by Cerro Coso College; $47,375 by Laney College; and $20,000 by Solano College. These figures represent 4.3%, 9.4%, 13.4% and 12.7%, respectively, of the colleges' adjusted total EOPS budgets.
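Those percentages are simply each college's transfers divided by its adjusted total grant from Table 2; for example (an illustrative check only):

    transfers = {
        "Bakersfield": (9_743, 225_827),
        "Cerro Coso":  (10_000, 106_237),
        "Laney":       (47_375, 352_680),
        "Solano":      (20_000, 157_063),
    }
    for college, (moved, budget) in transfers.items():
        print(f"{college}: {100 * moved / budget:.1f}% of adjusted EOPS budget")
    # 4.3%, 9.4%, 13.4% and 12.7%, as stated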


Page 179: DOCUMENT RESUME ED 206 362 JC 810 519 · 2014-02-18 · DOCUMENT RESUME: t ED 206 362 JC 810 519. AUTHOR. Alkin, Marvin C.: Stecher, Brian ti. TITLE Evaluating EOPS: A User Oriented

e 0

4.

t

._

FIGURE

Transfers of Funds Among Parts A, B 8 C.

. 4

Bakersfield College

',.$2,094 to Part $377 from girt A &1$2,72-7.fram Part C:. $9,743 from Part C. $7,226 to. Part B.&$2,477 to Part

0

C614-ealo Community College

$382. from Part B to Part A$W;000 fromPart4^: 3496 to Part A & $6,504 to Part

T1.#

Cuyamaca College

$8,128 to Part C: $6,462 from,Part A & $1,866 from Part B

.

Laney College

$47,,375 from Parts A & B to Part C

Lassen College

$4,796 from'Part B to Part C.

College of Marin

None

11`

SaddlebaCk College

$289 from Part B to'>Part A

$5;799 from Part B to Part A

San JOse City College

'$300 from Part A to Part B

$5;236 from Part A:. $2,827 to "Part$2,027 from Part $7;173T.toPart

Santa Rosa Junior College

$7,350 from Part B: '$3,.350 to Part

B and $2,409 to' PartA and $868 to Partfl

A & $4,000 to,part'C

Sierra College '

$3,964 to Part d: $474 from Part A &$4,464 to Part C: $974 from Part A ,t$3,000 from Partt C to Part A-

Solano Community College$20,000 to Part C: $17,857 from$1,000 from Part B.t.00Rart A

4.

fro

C

$3,490 froM Part B$3,490 from Part B.

?ft

Part A & from Part B'


Conclusions

The data discussed in the preceding section strongly suggest that EOPS applications bear little relationship to actual program expenditures and operations. Both dollar amounts expended and numbers of students served vary substantially from proposed to actual in most colleges. In general, the changes are in a negative direction - i.e., less money spent and fewer students served - but this is not universally the case. Moreover, the trends of the two variables are not necessarily parallel: colleges may serve more students with fewer dollars or fewer students with more dollars.

Approved budgets are quite reliable indicators of actual expenditures for the large majority of colleges, in terms of total dollar amounts. However, because colleges have the option of transferring funds among Parts, these budgets do not accurately reflect expenditures by activity. Furthermore, they tend not to show planned decreases in the numbers of students to be served.


In sum, beginning-of-year documents do not appear to be a reliable source of information about what EOPS programs actually do, at least insofar as expenditures and numbers of students served are concerned. Whether the other information items included in the application are less subject to change cannot be determined from final claims and related documents, but it would appear probable that as funds and numbers of students change, other aspects of the program would of necessity change as well. Accordingly, it is highly likely that EOPS project applications are of minimal utility in program evaluation.


FOOTNOTES

[1] Final claim folders include all program claims for funds along with the program's approved budget, the documentation of transfers among Parts A, B & C, other amendments to the approved budget, and related correspondence. All of these items were reviewed, and some of them served to enhance the analysis of application reliability. Those which proved to be pertinent are discussed under Findings.

[2] It should be noted that although Laney College served roughly one-fifth of the students it planned to serve with somewhat over two-fifths (43.5%) of the state and local funds it applied for, Cuyamaca spent 569% of what it proposed while serving one-tenth the number of students.
